Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, Amazon EMR, Amazon S3, and AWS Lambda, to enable efficient data retrieval and analysis.
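A common concrete task behind a bullet like this is staging files in Amazon S3 and loading them into Redshift with a COPY command. A minimal sketch of a statement builder in Python; the table name, bucket, and IAM role ARN below are hypothetical placeholders, not details from the posting:

```python
def build_copy_statement(table: str, s3_uri: str, iam_role: str,
                         fmt: str = "PARQUET") -> str:
    """Render a Redshift COPY statement that loads files from S3.

    All identifiers passed in (table, S3 URI, IAM role ARN) are
    hypothetical placeholders for illustration only.
    """
    return (
        f"COPY {table} FROM '{s3_uri}' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS {fmt};"
    )

# Example: load Parquet files from a staging prefix into a fares table.
stmt = build_copy_statement(
    "fares", "s3://example-bucket/staging/fares/",
    "arn:aws:iam::000000000000:role/redshift-load",
)
```

In practice the statement would be executed against the cluster via a client such as psycopg2 or the Redshift Data API; the builder only shows the shape of the load step.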
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.
Design, develop, and maintain scalable data pipelines using cloud data services.
Serve as a technical leader, defining data engineering standards and best practices.
Lead the design and implementation of optimized data models in our cloud data warehouse.
Constant Contact empowers people by giving them the help and tools they need to grow online. They are energized by new challenges and possibilities, and they celebrate diversity and inclusion with programs in place to bring people together.
Build and maintain scalable data pipelines from ingestion through transformation and delivery.
Design, build, and maintain our data warehouse and data marts.
Partner with stakeholders to translate business needs into clean data models.
Gurobi Optimization focuses on mathematical optimization. They empower customers to expand their use of mathematical optimization technology in order to make smarter decisions and solve some of the world's toughest and most impactful business problems.
Design, build, and optimize scalable data architectures that power marketing analytics and survey measurement initiatives.
Deliver automated, high-impact data solutions and insights that enhance decision-making across teams.
Build robust pipelines, dashboards, and analytical frameworks in fast-paced environments.
ItD is a consulting and software development company that blends diversity, innovation, and integrity with real business results. They reject rigid hierarchy, empowering teams to deliver excellent results in a woman- and minority-led firm.
Build and maintain data pipelines, transforming raw data into reliable models.
Develop Tableau dashboards that put insights in front of clients.
Work directly with clients and shape how their platform evolves.
DataDrive is a fast-growing managed analytics service provider. They support ongoing training, adoption, and growth of their clients’ data cultures and offer a unique team-oriented environment.
Architect, develop, and deploy robust, scalable data solutions using Azure tools.
Design and optimize ETL/ELT data pipelines using Python, PySpark, and SQL.
Build and manage modern data architectures, including data lakes and warehouses.
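At its core, the transform step of an ETL/ELT pipeline like the ones described is a function from raw records to cleaned records. A minimal pure-Python sketch (a PySpark version would express the same logic as DataFrame operations); the `id` and `email` field names are invented for illustration:

```python
def transform(raw_rows):
    """Drop records without an id and normalize email addresses.

    `id` and `email` are hypothetical field names used for illustration.
    """
    cleaned = []
    for row in raw_rows:
        if not row.get("id"):
            continue  # reject records that fail the minimal contract
        row = dict(row)  # copy so the raw input is left untouched
        row["email"] = row.get("email", "").strip().lower()
        cleaned.append(row)
    return cleaned
```

Keeping the transform a pure function of its input makes it easy to unit-test before wiring it into the pipeline scheduler.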
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. The system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Experience integrating data from multiple data sources.
Experience with various database technologies such as SQL Server, Redshift, Postgres, and RDS.
Experience designing, building, and maintaining data pipelines.
Bluelight Consulting is a leading software consultancy dedicated to designing and developing innovative technology that enhances users' lives. With a presence across the United States and Central/South America, Bluelight is in an exciting phase of expansion, continually seeking exceptional talent to join its dynamic and diverse community.
Design and build robust, highly scalable data pipelines and lakehouse infrastructure with PySpark, Databricks, and Airflow on AWS.
Improve the data platform development experience for Engineering, Data Science, and Product by creating intuitive abstractions, self-service tooling, and clear documentation.
Own and maintain core data pipelines and models that power internal dashboards, ML models, and customer-facing products.
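Pipelines like these are typically expressed as a DAG of tasks, which is Airflow's core abstraction. The idea can be sketched without Airflow using the standard library's topological sorter; the task names below are hypothetical:

```python
from graphlib import TopologicalSorter  # Python 3.9+

def run_pipeline(tasks, deps):
    """Execute tasks in dependency order.

    tasks: dict of name -> callable taking the results accumulated so far.
    deps:  dict of name -> set of upstream task names.
    """
    results = {}
    for name in TopologicalSorter(deps).static_order():
        results[name] = tasks[name](results)
    return results
```

An orchestrator like Airflow adds scheduling, retries, and observability on top, but the dependency-ordered execution is the same shape.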
Parafin aims to grow small businesses by providing them with the financial tools they need through the platforms they already sell on. They are a Series C company backed by prominent venture capitalists, with a tight-knit team of innovators from companies like Stripe, Square, and Coinbase.
Architect our AWS-based data warehouse and ingestion pipelines.
Transform high-volume simulation outputs into clean, trusted datasets.
Establish schema standards and data contracts with engineering.
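A data contract is, at its simplest, a schema both producer and consumer agree to, enforced by a check at the boundary. A minimal sketch; the field names and types in the usage example are hypothetical:

```python
def contract_violations(record, contract):
    """Check one record against a contract of field -> expected type.

    Returns a list of human-readable violations (empty list = passes).
    """
    errors = []
    for field, expected_type in contract.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(record[field]).__name__}"
            )
    return errors
```

Real deployments usually reach for a schema registry or a validation library, but the boundary check itself looks like this.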
Onebrief provides collaboration and AI-powered workflow software designed for military staffs, making them faster, smarter, and more efficient. The company, founded in 2019, values ownership and excellence, with a team spanning veterans and technologists; it has raised $320M+ from investors and is valued at $2.15B.
Design and evolve scalable data pipelines and architectures.
Act as the primary anchor for data ingestion, transformation, and storage solutions.
Ensure mission-critical data is accessible and reliable.
CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.
Build, optimize, and maintain data pipelines that power our business.
Define and build abstracted, reusable data sets for Business Intelligence, Marketing, and Data Science research.
Design, build, and evangelize a federated data validation framework to monitor potential data inconsistencies.
Garner Health strives to transform the healthcare economy, delivering accessible, high-quality healthcare. They are a fast-growing healthcare technology company dedicated to making a meaningful impact on healthcare at scale with a team of talented, mission-driven individuals.
Build and maintain production quality data pipelines between operational systems and BigQuery.
Implement data quality and freshness checks and monitoring processes to ensure data accuracy and consistency.
Maintain and contribute to our ingestion framework that leverages various purpose-built data load tool (dlt) connectors.
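A freshness check like the one described reduces to comparing the latest load timestamp against an allowed age. A minimal sketch; the six-hour threshold in the test is an invented example, not a value from the posting:

```python
from datetime import datetime, timedelta, timezone

def is_fresh(last_loaded_at, max_age, now=None):
    """True if the most recent load happened within max_age of now."""
    if now is None:
        now = datetime.now(timezone.utc)
    return now - last_loaded_at <= max_age
```

Accepting `now` as a parameter keeps the check deterministic in tests; in production it would run on a schedule and alert when it returns False.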
Grafana Labs is a remote-first, open-source powerhouse that provides visualization tools. They help companies manage their observability strategies with the Grafana LGTM Stack and have a global collaborative culture with a passion for meaningful work.
Collaborate with analytics and finance teams to enable trusted metrics and dashboards in Looker.
Design, build, and maintain scalable data pipelines using Python and dbt.
Develop and optimize BigQuery data models for analytics and product use cases.
UJET delivers a cloud platform that redefines customer experience with AI and a mobile-first approach. They are an innovative company committed to exceptional interactions and accelerated growth in the AI-driven world.
Build and maintain Azure Data Factory pipelines for data ingestion.
Write Python code in Databricks for data cleaning and transformation.
Monitor daily jobs and troubleshoot pipeline failures to ensure reliability.
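Keeping daily jobs reliable usually starts with a retry policy around each flaky step, with the final failure re-raised so it still surfaces to monitoring. A minimal sketch with hypothetical names:

```python
import time

def run_with_retries(job, attempts=3, delay_seconds=0.0):
    """Call job(); on failure, retry up to `attempts` times in total.

    Re-raises the last exception so failures still reach monitoring.
    """
    for attempt in range(1, attempts + 1):
        try:
            return job()
        except Exception:
            if attempt == attempts:
                raise
            time.sleep(delay_seconds)
```

Orchestrators such as Airflow and Azure Data Factory offer this as built-in retry configuration; the sketch only shows the underlying behavior.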
Jobgether is a platform that helps candidates find relevant jobs through AI-powered matching. The company ensures applications are reviewed quickly, objectively, and fairly against the role's core requirements.
Own the delivery of scalable internal data solutions.
Translate business needs into clear technical designs and working systems.
Build and improve data pipelines, integrations, and automation.
Transparent Hiring is recruiting for a fast-growing reinsurance company operating across Germany and the United States. The environment is collaborative and driven by a strong “build and ship” mindset.
Partner with BI Analysts, Operations, Product, and Engineering teams to define and assess data requirements.
Design, implement, and maintain ETL/ELT pipelines and data integrations.
Build and manage data architecture, including relational and dimensional databases or cloud data warehouses.
Jobgether is an AI-powered matching service that helps candidates get reviewed quickly, objectively, and fairly. They identify top-fitting candidates and share this shortlist directly with the hiring company.
Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines.
Architect and implement robust data infrastructure capable of handling high-volume data ingestion and processing.
Implement automated data quality checks, validation rules, and monitoring frameworks.
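Automated quality checks like these can be modeled as named rule predicates applied row by row, with failures collected per rule for monitoring. A minimal sketch; the rule names and fields are illustrative only:

```python
def run_quality_checks(rows, rules):
    """Apply rule predicates to every row.

    rules: dict of rule name -> predicate(row) -> bool.
    Returns dict of rule name -> list of failing row indexes.
    """
    failures = {name: [] for name in rules}
    for index, row in enumerate(rows):
        for name, predicate in rules.items():
            if not predicate(row):
                failures[name].append(index)
    return failures
```

A monitoring framework would then turn non-empty failure lists into alerts or quarantine the offending rows.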
ShyftLabs is a data product company founded in early 2020 that works with Fortune 500 companies. They deliver digital solutions to help accelerate the growth of businesses across various industries through innovation; they also value strong business awareness.
Assemble and manage large, complex datasets that meet both non-functional and functional business requirements.
Identify, design, and implement internal process improvements to enhance scalability, optimize data delivery, and automate manual processes.
Build and maintain optimal data pipeline architecture for efficient extraction, transformation, and loading of data from various sources.
Sanford Health is one of the largest and fastest-growing not-for-profit health systems in the United States, dedicated to the work of health and healing. The organization has 53,000 employees and serves over 2 million patients across the upper Midwest.