Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, Amazon EMR, Amazon S3, and AWS Lambda, to enable efficient data retrieval and analysis.
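Pipelines like those described above typically follow an extract-transform-load shape. A minimal sketch in Python, using `sqlite3` as a stand-in for a warehouse such as Amazon Redshift (the fare schema and sample data are illustrative, not ATPCO's):

```python
import csv
import io
import sqlite3

# Extract: read raw fare records (an in-memory CSV stands in for a real source).
raw = io.StringIO("origin,dest,fare_usd\nJFK,LHR,523.00\nSFO,NRT,n/a\nLAX,CDG,610.50\n")
rows = list(csv.DictReader(raw))

# Transform: drop records whose fare is not numeric, normalize types.
def clean(row):
    try:
        return (row["origin"], row["dest"], float(row["fare_usd"]))
    except ValueError:
        return None

records = [r for r in map(clean, rows) if r is not None]

# Load: write into the warehouse (sqlite3 stands in for Redshift here).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE fares (origin TEXT, dest TEXT, fare_usd REAL)")
db.executemany("INSERT INTO fares VALUES (?, ?, ?)", records)
count = db.execute("SELECT COUNT(*) FROM fares").fetchone()[0]
```

In production the same three stages would map onto services like AWS Glue (transform) and Amazon S3/Redshift (storage), with the invalid-record handling routed to a dead-letter location rather than silently dropped.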
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Enable efficient consumption of domain data as a product by delivering and promoting strategically designed, actionable datasets and data models.
Build, maintain, and improve rock-solid data pipelines using a broad range of technologies such as Amazon Redshift, Trino, Spark, Airflow, and Kafka streaming for real-time processing.
Support teams without data engineers in building decentralized data solutions and product integrations, for example around DynamoDB.
Act as a data ambassador, promoting the value of data and our data platform among engineering teams and enabling cooperation.
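Real-time processing of the kind mentioned above (e.g. Kafka streams) often reduces to windowed aggregation over an event stream. A toy sketch in plain Python with no Kafka dependency (the event shape and window size are illustrative):

```python
from collections import defaultdict

# Toy event stream: (timestamp_seconds, event_type) pairs, as a Kafka
# consumer loop might yield them. Values are made up for illustration.
events = [(1, "click"), (3, "view"), (7, "click"), (8, "click"), (14, "view")]

WINDOW = 5  # tumbling-window length in seconds

def windowed_counts(stream, window):
    """Count events per event_type within each tumbling window."""
    counts = defaultdict(int)
    for ts, kind in stream:
        bucket = ts // window  # which window this event falls into
        counts[(bucket, kind)] += 1
    return dict(counts)

result = windowed_counts(events, WINDOW)
```

A real deployment would replace the list with a consumer loop and handle late-arriving events; the aggregation logic itself stays this simple.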
OLX operates consumer brands that facilitate trade to build a more sustainable world. They have colleagues around the world who serve millions of people every month.
Partner with clients and implementation teams to understand data distribution requirements.
Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, they're tackling big challenges in an industry that’s ready for change with a bold, curious, and collaborative team.
Develop and maintain scalable data pipelines and ETL processes.
Design, build, and optimize data models and databases.
Perform data analysis, data mining, and statistical modeling.
We’re supporting a global fintech and digital currency platform in their search for a Senior Data Engineer to help scale and optimize their analytics and data infrastructure.
Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
Develop and optimize data models in Snowflake or similar platforms.
Implement ETL/ELT processes using Python and modern data tools.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by their internal team.
Design, develop, and maintain scalable data pipelines and data warehouses.
Develop ETL/ELT processes using Python and modern data tools.
Ensure data quality, reliability, and performance across systems.
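Data-quality guarantees like those above are often enforced as lightweight checks between pipeline stages: compute a small report on each batch and gate the load on it. A minimal sketch (the rules and record shape are illustrative):

```python
# Minimal data-quality gate: reject a batch that violates basic invariants.
batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": "b@example.com"},
    {"id": 3, "email": "c@example.com"},
]

def quality_report(rows):
    """Summarize a batch against simple invariants before loading it."""
    ids = [r["id"] for r in rows]
    return {
        "row_count": len(rows),
        "null_emails": sum(1 for r in rows if not r.get("email")),
        "duplicate_ids": len(ids) - len(set(ids)),
    }

report = quality_report(batch)
ok = report["null_emails"] == 0 and report["duplicate_ids"] == 0
```

Frameworks such as Great Expectations or dbt tests formalize the same idea; the core pattern is a declarative report plus a pass/fail gate.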
3Pillar Global is dedicated to engineering solutions that challenge conventional norms. They are an elite team of visionaries that actively shapes the tech landscape for their clients and sets global standards along the way.
Design, build, and maintain scalable, high-quality data pipelines.
Implement robust data ingestion, transformation, and storage using cloud-based technologies.
Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have partnerships with more than 1,000 clients and value diversity, fostering a diverse, inclusive, and safe work environment.
As a key member of our Data Engineering team, you will:
Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives.
Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture.
Collaborate with the team to meet performance, scalability, and reliability goals.
PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.
Work alongside Caylent’s Architects, Engineering Managers, and Engineers to deliver AWS solutions.
Build solutions defined in project backlogs, writing production-ready, well-tested, and documented code across cloud environments.
Participate in Agile ceremonies such as daily standups, sprint planning, retrospectives, and demos.
Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). They are a global company and operate fully remotely, with employees in Canada, the United States, and Latin America, fostering a community of technological curiosity.
Optimize SQL queries to maximize system performance.
RefinedScience is dedicated to delivering high-quality emerging tech solutions. While the job description does not contain company size or culture information, the role seems to value innovation and collaboration.
Responsible for designing, building, and maintaining scalable data pipelines and warehouse architectures.
Integrate, transform, and manage high-volume datasets across multiple platforms.
Focus on ensuring data quality, performance, and security while driving innovation through the adoption of modern tools and technologies.
This position is posted by Jobgether on behalf of a partner company.
Design, build, and maintain scalable data pipelines and workflows in Snowflake.
Integrate and ingest data from multiple systems into Snowflake.
Develop and optimize SQL queries, views, and materialized datasets.
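Views and materialized datasets of the kind mentioned above can be sketched in standard SQL; here `sqlite3` stands in for Snowflake, and the schema is illustrative:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [("acme", 100.0), ("acme", 50.0), ("globex", 75.0)])

# A view gives downstream consumers a stable, curated interface
# over the raw table.
db.execute("""
    CREATE VIEW customer_totals AS
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
""")

# "Materializing" the view: persist its result as a table for fast reads.
# (Snowflake has first-class materialized views; this emulates the idea.)
db.execute("CREATE TABLE customer_totals_mat AS SELECT * FROM customer_totals")
totals = dict(db.execute("SELECT customer, total FROM customer_totals_mat"))
```

The trade-off is the usual one: the plain view is always fresh but recomputes on every read, while the materialized table reads fast but must be refreshed when `orders` changes.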
GTX Solutions is a consulting firm specializing in modern data architecture, Customer Data Platforms (CDPs), and marketing technology enablement. They work with enterprise clients across industries including Retail, Travel, Hospitality, and Financial Services to design and implement scalable data ecosystems.
Design, develop, and maintain scalable data pipelines using Snowflake and dbt.
Write and optimize advanced SQL queries for performance and reliability.
Implement ETL/ELT processes to ingest and transform data from multiple sources.
Nagarro is a digital product engineering company that is scaling in a big way and builds products, services, and experiences that inspire, excite, and delight.
Support our managed analytics service clients through their data-driven journeys and deliver measurable business value through data modeling, API integration, SQL scripting, and data pipeline development.
Bridge the important gap between data applications and insightful business reports.
Participate in building our data platform from the ground up by exploring new technologies and vendors within our cloud-first environment.
DataDrive is a fast-growing managed analytics service provider that delivers modern cloud analytics data platforms to data-driven organizations, while also supporting ongoing training, adoption, and growth of their clients’ data cultures. DataDrive offers a unique team-oriented environment where one can develop their skills and work directly with some of the most talented analytics professionals in the business.
Analyze and interpret complex data to identify and resolve anomalies.
Design, develop, and maintain scalable ETL pipelines using Matillion ETL.
Optimize SQL queries to improve efficiency and reduce processing time.
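One common form of the SQL optimization described above is adding an index so a frequent lookup stops scanning the whole table. A sketch using `sqlite3`'s `EXPLAIN QUERY PLAN` (the claims schema is illustrative, not RxSense's):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE claims (claim_id INTEGER, member_id INTEGER, amount REAL)")
db.executemany("INSERT INTO claims VALUES (?, ?, ?)",
               [(i, i % 100, 10.0) for i in range(1000)])

query = "SELECT COUNT(*) FROM claims WHERE member_id = ?"

# Without an index, the planner falls back to a full table scan.
plan_before = db.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchone()[-1]

# Adding an index on the filtered column lets the same query seek instead.
db.execute("CREATE INDEX idx_claims_member ON claims(member_id)")
plan_after = db.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchone()[-1]
```

The same diagnose-then-index workflow applies in larger engines, where the equivalent tools are `EXPLAIN`/`EXPLAIN ANALYZE` and, in warehouses, clustering or partitioning keys rather than row-level indexes.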
RxSense is a healthcare technology company providing platforms and solutions to improve the management and access of cost-effective pharmacy benefits. They are a leader in SaaS technology for healthcare that connects the pharmacy ecosystem.