Remote Data Jobs

Job listings

  • Design, develop, and deploy end-to-end Computer Vision solutions using classical and deep learning approaches.
  • Build and optimize models for image classification, object detection, segmentation, and OCR/document understanding.
  • Collaborate on data annotation strategies, data augmentation pipelines, and approaches to handle imbalance, noise, and domain shift.

Blend is an AI services provider committed to co-creating meaningful impact for its clients through data science, AI, technology, and people. Their mission is to fuel bold visions by aligning human expertise with artificial intelligence to unlock value and foster innovation.

  • Build and enhance Power BI reports/dashboards in line with provided wireframes and agreed KPI definitions.
  • Translate business requirements into reporting logic, clarifying metric definitions, filters, and calculation rules with stakeholders.
  • Perform robust self-QA: reconciliation to source data, validation of KPI outputs, edge-case testing, and consistency checks across visuals/pages.

NEC Software Solutions is part of the global tech giant NEC Corporation. They provide software solutions to dispatch ambulances, support families, keep trains on the move, locate missing people and even test the hearing of newborn babies, working to support amazing public services.

$170,000–$200,000/yr
Global 7w PTO

  • Help build the discovery feed, the core recommendations engine.
  • Partner with the CEO and co-founders to execute on a vision that redefines discovery.
  • Design, implement, and iterate on large-scale ML systems to reach product/market fit.

Circle is building an all-in-one platform for online communities. They are a fully remote company of around 200 team members from 30+ countries, focusing on asynchronous work and documentation.

  • Architect and implement Databricks Lakehouse solutions for large-scale data platforms.
  • Design and optimize batch & streaming data pipelines using Apache Spark (PySpark/SQL).
  • Implement Delta Lake best practices (ACID, schema enforcement, time travel, performance tuning).

They are looking for a Databricks Architect to design and lead modern Lakehouse data platforms using Databricks. The role focuses on building scalable, high-performance data pipelines and enabling analytics and AI use cases on cloud-native data platforms.

  • Monitor and support ETL processes moving data from on-premises or hosted servers into Snowflake.
  • Design and maintain aggregate and reporting tables optimized for Tableau and Power BI dashboards.
  • Optimize Snowflake performance and cost, including warehouse usage, query tuning, and table design.

Affinitiv is the largest provider of end-to-end, data-driven marketing and software solutions exclusively focused on the automotive customer lifecycle. Backed by 20+ years of automotive and marketing expertise, they work with over 6,500 dealerships and every major manufacturer in the country.

  • Analyze large-scale telematics and execution data across fleets, lanes, and time.
  • Build and maintain large-scale sandboxes using masked or synthetic data.
  • Surface data-driven insights that influence roadmap priorities.

Catena is building a universal data API that lets brokers, TMSs, fintechs, and fleets connect to truck and trailer data through a single integration. Catena sits beneath the freight ecosystem, normalizing real-time telematics and execution data so platforms can automate workflows, reduce risk, and make better decisions.

  • Improve the quality of pretraining datasets by leveraging your previous experience, intuition, and training experiments.
  • Focus on generating synthetic data at scale and determining the best strategies to leverage such data into training large models.
  • Closely collaborate with other teams like Pretraining, Post-training, Evals, and Product to define high-quality data needs.

Poolside aims to be the company that builds a world where AI will be the engine behind economically valuable work and scientific progress. They are a remote-first team across Europe and North America that values the quality of their systems.

US Unlimited PTO

  • Design, build, and maintain pipelines that power all data use cases.
  • Develop intuitive, performant, and scalable data models that support product features.
  • Pay down technical debt, improve automation, and follow best practices in data modeling.

Patreon is a media and community platform where over 300,000 creators give their biggest fans access to exclusive work and experiences. They are leaders in the space, with over $10 billion generated by creators since Patreon's inception, with a team passionate about their mission.

$82,000–$158,000/yr

  • Develop repeatable data ingestion and transformation logic to standardize and prepare data for analytics.
  • Design, rapidly prototype, and implement one or more Power BI semantic models integrating multiple contact center technologies and CRM platforms.
  • Create user-friendly dashboards and visualizations that show which business processes are supported at each call center.

LMI is a digital solutions provider dedicated to accelerating government impact with innovation and speed. They invest in technology and prototypes ahead of need, bringing commercial-grade platforms and mission-ready AI to federal agencies at commercial speed.