Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
Collaborate with Data Science, Product Managers, and Software Engineers to build robust ETL pipelines.
Contribute to architecture decisions, observability tooling, and data quality initiatives.
Contribute to a scalable internal framework for managing prompt engineering pipelines and AI workflows.
Federato is on a mission to defend the right to efficient, equitable insurance for all by enabling insurers to provide affordable coverage. They are well funded by the investors behind Salesforce, Veeva, Zoom, Box, and others, and they value learning and the ability to change their minds.
Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
Implement data quality checks, monitoring, and validation processes.
Automate manual processes into centralized and scalable solutions.
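Data quality checks like those above are often simple assertions over batch statistics run before load. A minimal sketch, assuming hypothetical column names and thresholds:

```python
# Minimal data quality check sketch: validate a batch of records before
# loading it downstream. Field names and thresholds here are hypothetical.

def run_quality_checks(rows, required_fields=("id", "amount"), max_null_rate=0.01):
    """Return a list of failed-check messages; an empty list means the batch passes."""
    failures = []
    if not rows:
        failures.append("empty batch")
        return failures
    # Null-rate check per required field.
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        null_rate = nulls / len(rows)
        if null_rate > max_null_rate:
            failures.append(f"{field}: null rate {null_rate:.2%} exceeds {max_null_rate:.2%}")
    # Uniqueness check on the primary key.
    ids = [r["id"] for r in rows if r.get("id") is not None]
    if len(ids) != len(set(ids)):
        failures.append("id: duplicate keys found")
    return failures
```

A check runner like this would typically be wired into the pipeline's monitoring so that a non-empty result blocks the load and raises an alert.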
Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2,000 colleagues worldwide, traded on Nasdaq as part of Informa PLC.
Build, optimize, and maintain data pipelines that power our business
Define and build out abstracted reusable data sets to be used for Business Intelligence, Marketing, and Data Science Research
Design, build, and evangelize a federated data validation framework to monitor potential data inconsistencies
Garner Health strives to transform the healthcare economy, delivering accessible, high-quality healthcare. They are a fast-growing healthcare technology company dedicated to making a meaningful impact on healthcare at scale with a team of talented, mission-driven individuals.
Design, implement, and maintain robust, scalable data pipelines to support AI, analytics, and operational reporting
Own and evolve the data warehouse architecture, ensuring it meets performance, flexibility, and governance needs
Ensure data integrity, availability, lineage, and observability across complex pipelines
Remote People is building the infrastructure to power borderless teams. Their technology handles global payroll, benefits, taxes, and compliance, enabling businesses to compliantly hire anyone anywhere at the push of a button. They are a growing, international family.
Build and maintain scalable data pipelines from ingestion through transformation and delivery.
Design, build, and maintain our data warehouse and data marts.
Partner with stakeholders to translate business needs into clean data models.
Gurobi Optimization focuses on mathematical optimization. They empower customers to expand their use of mathematical optimization technology in order to make smarter decisions and solve some of the world's toughest and most impactful business problems.
QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.
Work cross-functionally with Product and subject matter experts to conceptualize, prototype, and build data solutions
Connect disparate datasets (e.g., claims, contract rates, demographic data) to empower internal and external stakeholders
Build and maintain data engineering systems that support AI use cases, including scalable ingestion pipelines, feature generation, and downstream products
Turquoise Health aims to make healthcare pricing simpler, more transparent, and lower cost. They are a Series B startup backed by top VCs with an accomplished group of folks with a passion for improving healthcare.
Design, develop, and maintain scalable data pipelines using cloud data services.
Serve as a technical leader, defining data engineering standards and best practices.
Lead the design and implementation of optimized data models in our cloud data warehouse.
Constant Contact empowers people by giving them the help and tools they need to grow online. They are energized by new challenges and possibilities, and they celebrate diversity and inclusion with programs in place to bring people together.
Build core infrastructure software (pipelines, APIs, data modeling) as part of our client's data platform team.
Coach & mentor other engineers to support the growth of their technical expertise.
Implement appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.
YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.
Architect and sustain self-healing pipelines using Astronomer/Airflow to ensure 24/7 data availability.
Design and optimize event-driven API ingestion frameworks leveraging AWS Lambda and DLT (Data Load Tool).
Manage high-performance modeling within AWS Redshift, using dbt to transform raw transactional data into high-fidelity business intelligence.
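Much of the "self-healing" behavior described above comes down to bounded retries with backoff around each task, which Airflow exposes per task via settings such as `retries` and `retry_delay`. A minimal framework-agnostic sketch, with hypothetical names:

```python
import time

def run_with_retries(task, max_retries=3, base_delay=1.0, sleep=time.sleep):
    """Run a task callable, retrying with exponential backoff on failure.

    Mirrors the per-task retry semantics an orchestrator like Airflow
    applies; `sleep` is injectable so the backoff can be tested without
    actually waiting.
    """
    for attempt in range(max_retries + 1):
        try:
            return task()
        except Exception:
            if attempt == max_retries:
                raise  # out of retries: surface the failure for alerting
            sleep(base_delay * 2 ** attempt)  # back off 1s, 2s, 4s, ...
```

Transient failures (a flaky API, a brief warehouse lock) are absorbed by the retries, while persistent failures still escalate to on-call.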
Odisea helps close the opportunity gap between Colombia and the United States by redefining nearshoring. They are building a passionate team of professionals committed to this purpose.
Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning
Partner cross-functionally with analytics, product, engineering, and operations to deliver high-quality data solutions that drive measurable business impact
Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response
Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers, and risk-takers who know the value of peace of mind in an unpredictable world.
Design and evolve scalable data pipelines and architectures.
Act as the primary anchor for data ingestion, transformation, and storage solutions.
Ensure mission-critical data is accessible and reliable.
CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.
Design, implement, and maintain scalable, high-performance data architectures connecting relational and non-relational systems.
Manage end-to-end data pipelines, ensuring seamless ingestion from scrapers to AI/ML workflows.
Audit and optimize existing workflows for efficiency, accuracy, and flexibility.
Jobgether is a pioneering HR Tech startup operating entirely remotely, and leading the revolution in the world of work. They are a job search engine designed exclusively for remote workers, with a team of 30 individuals located across the globe.
Architect, build, and operate data infrastructure that powers Tebra’s intelligent features.
Translate business requirements into software solutions that accelerate our ability to deploy AI.
Monitor data pipelines, detect anomalies, and implement automated recovery systems.
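One common form of anomaly detection on pipeline output is flagging a daily metric (such as row count) that deviates sharply from its recent baseline. A minimal z-score sketch, with hypothetical thresholds:

```python
import statistics

def is_anomalous(history, today, z_threshold=3.0):
    """Flag today's metric if it sits more than z_threshold standard
    deviations from the mean of the recent history of that metric."""
    if len(history) < 2:
        return False  # not enough baseline to judge
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return today != mean  # flat baseline: any change is anomalous
    return abs(today - mean) / stdev > z_threshold
```

A detector like this would feed the automated-recovery step: an anomalous load can be quarantined and replayed rather than propagated downstream.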
Tebra unites Kareo and PatientPop, providing a digital backbone for practice well-being, supporting both products with a shared vision for modernized care. Over 100,000 providers trust Tebra to elevate patient experience and grow their practice, building the future of well-being with compassion and humanity.
Design, build, and maintain scalable data pipelines.
Develop and optimize ETL/ELT processes using cloud data technologies.
Partner with teams to understand data requirements and improve data capture strategies.
Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams of people with diverse skillsets and backgrounds.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.
Design, build, and optimize scalable data architectures that power marketing analytics and survey measurement initiatives.
Deliver automated, high-impact data solutions and insights that enhance decision-making across teams.
Build robust pipelines, dashboards, and analytical frameworks in fast-paced environments.
ItD is a consulting and software development company that blends diversity, innovation, and integrity with real business results. They reject any strong hierarchy, empowering teams to deliver excellent results in a woman- and minority-led firm.
Build and maintain production quality data pipelines between operational systems and BigQuery.
Implement data quality and freshness checks and monitoring processes to ensure data accuracy and consistency.
Maintain and contribute to our ingestion framework that leverages various purpose-built data load tool (dlt) connectors.
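A freshness check like the one mentioned above typically compares the latest loaded timestamp for a table against an allowed lag. A minimal sketch, assuming a hypothetical SLA:

```python
from datetime import datetime, timedelta, timezone

def check_freshness(last_loaded_at, max_lag=timedelta(hours=6), now=None):
    """Return (is_fresh, lag) comparing a table's latest load timestamp
    against its freshness SLA; `now` is injectable for testing."""
    now = now or datetime.now(timezone.utc)
    lag = now - last_loaded_at
    return lag <= max_lag, lag
```

Run on a schedule per table, a check like this turns silent pipeline stalls into alerts with the observed lag attached.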
Grafana Labs is a remote-first, open-source powerhouse that provides visualization tools. They help companies manage their observability strategies with the Grafana LGTM Stack and have a global collaborative culture with a passion for meaningful work.
Build and maintain Azure Data Factory pipelines for data ingestion.
Write Python code in Databricks for data cleaning and transformation.
Monitor daily jobs and troubleshoot pipeline failures to ensure reliability.
Jobgether is a platform that helps candidates find relevant jobs through AI-powered matching. The company ensures applications are reviewed quickly, objectively, and fairly against the role's core requirements.