Design, build, and maintain scalable and reliable data pipelines.
Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.
Curinos empowers financial institutions to make better, faster, and more profitable decisions through industry-leading proprietary data, technologies, and insights.
Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks. Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows. Lead large-scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.
This position is posted by Jobgether on behalf of a partner company.
Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.
The Sr Data Engineer, DevX creates the best developer experience for data and application engineers at Basis. They design, implement, and maintain deployment and ETL pipelines for data products, and integrate diverse data sources and vendor products, including databases, APIs, and third-party services.
Basis Technologies empowers agencies and brands with cutting-edge software that automates digital media operations, offering flexible work options across the U.S.
As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
Design, build, and maintain high-volume ETL pipelines and ingestion systems.
Architect reusable data integrations to replace legacy, bucket-based custom reporting.
Deliver BI-ready datasets consumed in warehouses such as Snowflake and by tools such as Tableau, Looker, or internal platforms.
Ubiminds partners with American software product companies to scale their development footprint by offering staff augmentation and employer-of-record services.
Build end-to-end data solutions that include ingest, logging, validation, cleaning, transformation, and security. Lead the design, development, and delivery of scalable data pipelines and ETL processes. Design and evolve robust data models and storage patterns that support analytics and efficiency use cases.
Founded in 1997, Expression provides data fusion, data analytics, AI/ML, software engineering, information technology, and electromagnetic spectrum management solutions.
Architect and maintain robust data pipelines to transform diverse data inputs.
Integrate data from various sources into a unified platform.
Build APIs with AI assistance to enable secure access to consolidated insights.
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
Develop and maintain scalable data pipelines and ETL processes.
Design, build, and optimize data models and databases.
Perform data analysis, data mining, and statistical modeling.
We’re supporting a global fintech and digital currency platform in their search for a Senior Data Engineer to help scale and optimize their analytics and data infrastructure.
Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data.
Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional and non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Tivity Health, Inc. provides healthy life-changing solutions, including SilverSneakers®, Prime® Fitness, and WholeHealth Living®. They help adults improve their health and support them on life's journey by providing access to in-person and virtual physical activity, social and mental enrichment programs. Tivity Health is an equal employment opportunity employer and is committed to a proactive program of diversity development.
Migrate data and analytics workloads from BigQuery to Snowflake
Develop and optimize ETL/ELT pipelines using Python and SQL
Build analytics-ready datasets for reporting and dashboards
Egen is a fast-growing and entrepreneurial company with a data-first mindset. They bring together the best engineering talent working with the most advanced technology platforms to help clients drive action and impact through data and insights.
Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
Develop and optimize data models in Snowflake or similar platforms.
Implement ETL/ELT processes using Python and modern data tools.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Jobgether identifies the top-fitting candidates and shares this shortlist directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by the hiring company's internal team.
Design and implement scalable, high-performance data architectures to support business needs.
Develop, automate, and maintain production-grade data pipelines using modern data stack tools and best practices.
Optimize data workflows and implement observability frameworks to monitor pipeline performance, reliability, and accuracy.
As a key member of our Data Engineering team, you will: Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling that support business initiatives. Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture. Work with the team to meet performance, scalability, and reliability goals.
PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.