Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, Amazon EMR, Amazon S3, and AWS Lambda, to enable efficient data retrieval and analysis
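To make the AWS toolchain above concrete, here is a minimal, hedged sketch of one common pattern: land an extract in Amazon S3, then bulk-load it into Amazon Redshift through the Redshift Data API. The bucket, table, cluster, role, and file names are placeholders, not details from this listing.

```python
"""Minimal sketch: stage a file in S3, then COPY it into Redshift."""
import boto3

s3 = boto3.client("s3")
redshift_data = boto3.client("redshift-data")

# Stage the raw extract in S3 so Redshift can bulk-load it.
s3.upload_file("daily_orders.csv", "example-raw-bucket", "orders/2024-01-01.csv")

# Issue a COPY through the Redshift Data API (no persistent connection needed).
copy_sql = """
    COPY analytics.orders
    FROM 's3://example-raw-bucket/orders/2024-01-01.csv'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
    CSV IGNOREHEADER 1;
"""
response = redshift_data.execute_statement(
    ClusterIdentifier="example-cluster",  # placeholder cluster
    Database="analytics",
    DbUser="loader",
    Sql=copy_sql,
)
print("Statement id:", response["Id"])
```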
Build and optimize scalable, efficient ETL and data lake processes.
Own the ingestion, modeling, and transformation of structured and unstructured data.
Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows.
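The monitoring and anomaly-detection work mentioned above often starts with something as simple as flagging an unusual daily load volume. A minimal sketch follows, assuming daily row counts are already collected somewhere; the history values and threshold are illustrative only.

```python
"""Flag a daily load whose row count deviates strongly from recent history."""
import statistics

def row_count_is_anomalous(today: int, history: list[int], z_threshold: float = 3.0) -> bool:
    if len(history) < 7:
        return False  # not enough history to judge
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return today != mean
    return abs(today - mean) / stdev > z_threshold

# Example: two weeks of daily row counts vs. today's suspiciously small load.
recent = [10_120, 10_340, 9_980, 10_055, 10_410, 10_200, 10_150,
          10_300, 10_090, 10_220, 10_180, 10_275, 10_060, 10_330]
print(row_count_is_anomalous(4_500, recent))  # True -> alert / block downstream refresh
```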
Launch Potato is a digital media company that connects consumers with brands through data-driven content and technology. They have a remote-first team spanning over 15 countries and have built a high-growth, high-performance culture.
Design, build, maintain, and operate scalable streaming and batch data pipelines.
Work with AWS services, including Redshift, EMR, and ECS, to support data processing and analytics workloads.
Develop and maintain data workflows using Python and SQL.
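For the streaming side of the pipelines described above, here is a minimal poller sketch. It assumes Amazon Kinesis as the stream source, which is not named in the listing (it mentions Redshift, EMR, and ECS); the stream name is a placeholder.

```python
"""Poll a Kinesis stream and hand each record to a transform step."""
import json
import time

import boto3

kinesis = boto3.client("kinesis")
STREAM = "example-events"  # hypothetical stream name

shard_id = kinesis.describe_stream(StreamName=STREAM)["StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=STREAM, ShardId=shard_id, ShardIteratorType="LATEST"
)["ShardIterator"]

# Bounded loop for the sketch; a real consumer would run continuously.
for _ in range(10):
    batch = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in batch["Records"]:
        event = json.loads(record["Data"])
        # Transform / route the event here (e.g. micro-batch to S3 or Redshift).
        print(event)
    iterator = batch["NextShardIterator"]
    time.sleep(1)
```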
Southworks helps companies with software development and digital transformation. They focus on solving complex problems and delivering innovative solutions.
Design, build, and maintain ETL/ELT pipelines and integrations across legacy and cloud systems.
Model, store, and transform data to support analytics, reporting, and downstream applications.
Build API-based and file-based integrations across enterprise platforms.
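As an illustration of the API-based integrations mentioned above, here is a hedged sketch that pages through a REST endpoint and writes newline-delimited JSON for a downstream loader. The URL, authentication, and pagination scheme are all hypothetical.

```python
"""Page through a REST API and write JSONL for a downstream loader."""
import json

import requests

BASE_URL = "https://api.example.com/v1/invoices"  # placeholder endpoint

def extract_to_file(path: str, page_size: int = 500) -> int:
    rows = 0
    page = 1
    with open(path, "w", encoding="utf-8") as out:
        while True:
            resp = requests.get(
                BASE_URL,
                params={"page": page, "per_page": page_size},
                headers={"Authorization": "Bearer <token>"},
                timeout=30,
            )
            resp.raise_for_status()
            records = resp.json()
            if not records:
                break  # empty page means we are done
            for record in records:
                out.write(json.dumps(record) + "\n")
                rows += 1
            page += 1
    return rows

print(extract_to_file("invoices.jsonl"))
```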
Jobgether is a platform that uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.
Architect our AWS-based data warehouse and ingestion pipelines.
Transform high-volume simulation outputs into clean, trusted datasets.
Establish schema standards and data contracts with engineering.
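Schema standards and data contracts are often enforced as validation at the ingestion boundary. A minimal sketch using jsonschema is below; the field names are illustrative, not the actual contract agreed with engineering.

```python
"""Validate an incoming record against a simple data contract."""
from jsonschema import ValidationError, validate

RUN_RESULT_CONTRACT = {
    "type": "object",
    "required": ["run_id", "scenario", "started_at", "metrics"],
    "properties": {
        "run_id": {"type": "string"},
        "scenario": {"type": "string"},
        "started_at": {"type": "string"},
        "metrics": {
            "type": "object",
            "required": ["duration_s"],
            "properties": {"duration_s": {"type": "number", "minimum": 0}},
        },
    },
    "additionalProperties": False,
}

def conforms(record: dict) -> bool:
    try:
        validate(instance=record, schema=RUN_RESULT_CONTRACT)
        return True
    except ValidationError:
        return False

print(conforms({"run_id": "r-1", "scenario": "baseline",
                "started_at": "2024-01-01T00:00:00Z",
                "metrics": {"duration_s": 12.5}}))  # True
```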
Onebrief provides collaboration and AI-powered workflow software designed for military staffs, making them faster, smarter, and more efficient. Founded in 2019, the company values ownership and excellence, with a team spanning veterans and technologists; it has raised more than $320M from investors and is valued at $2.15B.
Design and build robust, highly scalable data pipelines and lakehouse infrastructure with PySpark, Databricks, and Airflow on AWS (see the sketch below).
Improve the data platform development experience for Engineering, Data Science, and Product by creating intuitive abstractions, self-service tooling, and clear documentation.
Own and maintain core data pipelines and models that power internal dashboards, ML models, and customer-facing products.
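A minimal sketch of the kind of PySpark lakehouse job named in this listing: read raw events from S3, deduplicate, and write a partitioned Delta table. Paths and column names are placeholders; on Databricks the Delta format is available without extra setup, and Airflow would typically schedule the job.

```python
"""Bronze-to-silver PySpark job: dedupe raw events and write Delta."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_bronze_to_silver").getOrCreate()

raw = spark.read.json("s3://example-lake/bronze/events/")  # placeholder path

silver = (
    raw.dropDuplicates(["event_id"])                      # keeps re-runs idempotent
       .withColumn("event_date", F.to_date("event_ts"))   # partition column
       .filter(F.col("event_type").isNotNull())
)

(silver.write
       .format("delta")
       .mode("overwrite")
       .partitionBy("event_date")
       .save("s3://example-lake/silver/events/"))
```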
Parafin aims to grow small businesses by providing them with the financial tools they need through the platforms they already sell on. They are a Series C company backed by prominent venture capitalists, with a tight-knit team of innovators from companies like Stripe, Square, and Coinbase.
Design and evolve scalable data pipelines and architectures.
Act as the primary anchor for data ingestion, transformation, and storage solutions.
Ensure mission-critical data is accessible and reliable.
CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.
Design, build, and optimize scalable data architectures that power marketing analytics and survey measurement initiatives.
Deliver automated, high-impact data solutions and insights that enhance decision-making across teams.
Build robust pipelines, dashboards, and analytical frameworks in fast-paced environments.
ItD is a consulting and software development company that blends diversity, innovation, and integrity with real business results. They reject rigid hierarchy, empowering teams to deliver excellent results in a woman- and minority-led firm.
Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows (see the sketch below), optimise data models, and enable self-serve analytics and AI.
End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.
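The ETL/ELT automation above can be illustrated with a hedged Snowflake sketch: stage a file, COPY it into a raw table, then run an in-warehouse transform. Account, stage, and table names are placeholders, and credentials would normally come from a secrets manager rather than appearing inline.

```python
"""Stage a file into Snowflake, bulk-load it, and run an ELT transform."""
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",   # placeholder account identifier
    user="loader",
    password="<secret>",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()

# Stage the local extract, then bulk-load it (assumes a named stage exists).
cur.execute("PUT file://daily_orders.csv @raw_stage AUTO_COMPRESS=TRUE")
cur.execute("""
    COPY INTO raw.orders
    FROM @raw_stage/daily_orders.csv.gz
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# A simple in-warehouse transform (ELT rather than ETL).
cur.execute("""
    CREATE OR REPLACE TABLE analytics.orders_daily AS
    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM raw.orders
    GROUP BY order_date
""")
conn.close()
```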
Trustonic provides smartphone locking technology, enabling global access to devices and digital finance. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. They celebrate diversity and aim to do the right thing for each other, the community, and the planet.
Identify structural weaknesses and eliminate operational fragility.
Define clear ingestion, validation, and testing standards across the platform.
Drive ambiguous initiatives from concept to production-ready outcomes.
Life360's mission is to keep people close to the ones they love. By continuing to innovate and deliver for their customers, they have become a household name and the must-have mobile-based membership for families. Life360 has more than 500 (and growing!) remote-first employees.
Lead the design and implementation of scalable ETL pipelines and data lakes in AWS
Develop and optimise data architectures for terabyte-scale relational and distributed data systems
Collaborate with Data Scientists, Software Engineers, and Architects to integrate data solutions into analytics platforms and applications
Smart Working connects skilled professionals with outstanding global teams for full-time, long-term roles. They help discover meaningful work with teams that invest in your success, empowering you to grow personally and professionally in a remote-first world.
Assemble and manage large, complex datasets that meet both functional and non-functional business requirements.
Identify, design, and implement internal process improvements to enhance scalability, optimize data delivery, and automate manual processes.
Build and maintain optimal data pipeline architecture for efficient extraction, transformation, and loading of data from various sources.
Sanford Health is one of the largest and fastest-growing not-for-profit health systems in the United States, dedicated to the work of health and healing. The organization has 53,000 employees and serves over 2 million patients across the upper Midwest.
Design, develop, and maintain ETL/ELT pipelines on cloud-based data platforms.
Build data ingestion, transformation, and orchestration workflows using tools such as Azure Data Factory, Airflow, Fivetran, or similar.
Develop transformations and data processing logic using platforms such as Databricks, Snowflake, or equivalent.
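Among the orchestration tools listed, Airflow is the easiest to sketch in a few lines. Below is a minimal, illustrative DAG with stubbed extract, transform, and load tasks; the DAG id and schedule are hypothetical, and the `schedule=` keyword requires Airflow 2.4 or newer.

```python
"""Skeleton Airflow DAG: extract -> transform -> load, run daily."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # pull from the source system into a staging area

def transform():
    ...  # run the Databricks / Snowflake transformation logic

def load():
    ...  # publish curated tables for reporting

with DAG(
    dag_id="daily_warehouse_refresh",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```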
Ankura Consulting Group, LLC is an independent global expert services and advisory firm. They deliver services and end-to-end solutions to help clients at critical inflection points related to conflict, crisis, performance, risk, strategy, and transformation, and the firm consists of more than 2,000 professionals.
Create and maintain optimal data pipeline architecture
Extend our machine learning platform by designing tools that interface with cloud services
Build the infrastructure required for optimal extraction, transformation, and loading of data
NinjaHoldings aims to revolutionize how Americans interact with financial services. They have a lean and innovative team that empowers people overlooked by traditional financial institutions through digital banking and lending products.
Build and maintain scalable data pipelines from ingestion through transformation and delivery.
Design, build, and maintain our data warehouse and data marts.
Partner with stakeholders to translate business needs into clean data models.
Gurobi Optimization focuses on mathematical optimization. They empower customers to expand their use of the technology to make smarter decisions and solve some of the world's toughest and most impactful business problems.
Design, build, and maintain robust ETL/ELT pipelines to ingest large-scale datasets and high-frequency streams.
Lead the design and evolution of our enterprise data warehouse, ensuring it is scalable and performant.
Manage our data transformation layer using Dataform (preferred) or dbt to orchestrate complex, reliable workflows.
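One lightweight way to orchestrate the transformation layer described above is to drive the dbt CLI from a Python script (the same idea applies to Dataform's CLI). A minimal sketch follows; the model selector is entirely hypothetical.

```python
"""Run a dbt model plus everything downstream of it, then test it."""
import subprocess
import sys

def run_models(select: str) -> None:
    # The trailing "+" is dbt's graph operator: the model and its children.
    for args in (["run", "--select", f"{select}+"],
                 ["test", "--select", f"{select}+"]):
        result = subprocess.run(["dbt", *args], capture_output=True, text=True)
        print(result.stdout)
        if result.returncode != 0:
            sys.exit(result.returncode)  # fail fast so the scheduler can alert

run_models("stg_events")  # hypothetical staging model
```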
UW provides utilities all in one place, including energy, broadband, mobile, and insurance. They aim to double in size and offer savings to customers, fostering a culture that values imaginative and pragmatic problem-solvers.
Design, develop, and maintain a core Python ETL framework.
Develop and optimize an automated refresh pipeline orchestrated through AWS Batch, Lambda, Step Functions, and EventBridge.
Build Python integrations with external systems that are robust, testable, and reusable.
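A minimal sketch of the orchestration glue described above: an EventBridge-scheduled Lambda that starts a Step Functions execution for the refresh pipeline. The state machine ARN comes from an environment variable and is a placeholder.

```python
"""Lambda entry point that kicks off the Step Functions refresh."""
import json
import os
from datetime import datetime, timezone

import boto3

sfn = boto3.client("stepfunctions")

def handler(event, context):
    execution_input = {
        "refresh_date": datetime.now(timezone.utc).date().isoformat(),
        "trigger": event.get("source", "manual"),  # EventBridge sets "source"
    }
    response = sfn.start_execution(
        stateMachineArn=os.environ["REFRESH_STATE_MACHINE_ARN"],  # placeholder ARN
        input=json.dumps(execution_input),
    )
    return {"executionArn": response["executionArn"]}
```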
BlastPoint is a B2B data analytics startup that helps companies engage with customers more effectively by discovering insights in their data. Founded in 2016 by Carnegie Mellon Alumni, they are a tight-knit, forward-thinking team that serves diverse industries including energy, finance, retail, and transportation.
Designing, building and maintaining data pipelines and data warehouses.
Developing ETL ecosystem tooling with Python, external APIs, Airbyte, Snowflake, dbt, PostgreSQL, MySQL, and Amazon S3.
Helping to define automated solutions to complex problems in understanding data, users, and the market.
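As one hedged example of the toolset in this listing, the sketch below pulls rows from PostgreSQL and stages them in Amazon S3 as CSV for a downstream loader such as Snowflake, Airbyte, or dbt. Connection details, table, and bucket names are placeholders.

```python
"""Extract a slice of a Postgres table and stage it in S3 as CSV."""
import csv
import io

import boto3
import psycopg2

conn = psycopg2.connect(host="db.example.internal", dbname="app",
                        user="reader", password="<secret>")
cur = conn.cursor()
cur.execute("SELECT id, email, created_at FROM customers WHERE created_at >= %s",
            ("2024-01-01",))

# Build the CSV in memory; larger extracts would stream to a temp file instead.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["id", "email", "created_at"])
writer.writerows(cur.fetchall())

boto3.client("s3").put_object(
    Bucket="example-staging-bucket",
    Key="customers/2024-01-01.csv",
    Body=buffer.getvalue().encode("utf-8"),
)
conn.close()
```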
Neighborhoods.com provides real estate software platforms, including 55places.com and neighborhoods.com. They strive to foster an inclusive environment that is comfortable for everyone and will not tolerate harassment or discrimination of any kind.
Design, build, and optimize data pipelines to support AI and ML projects.
Integrate data from various sources to provide a unified data view for AI applications.
Implement processes to ensure data quality, consistency, and accuracy across systems.
The Tyndale Company is a leading national supplier of arc-rated flame-resistant clothing (FRC) to the energy sector. They are a family-owned business, 9x Top Workplace winner in PA and 5x winner in TX, providing a retail-style apparel experience.
Partner with BI Analysts, Operations, Product, and Engineering teams to define and assess data requirements.
Design, implement, and maintain ETL/ELT pipelines and data integrations.
Build and manage data architecture, including relational and dimensional databases or cloud data warehouses.
Jobgether is an AI-powered matching service that ensures applications are reviewed quickly, objectively, and fairly. They identify top-fitting candidates and share this shortlist directly with the hiring company.