Design, develop, test, and maintain scalable applications using modern frameworks.
Actively participate in Agile/Scrum ceremonies, contributing to planning, estimation, and continuous improvement.
Contribute to architectural design discussions, test planning, and operational excellence initiatives.
Tealium is a trusted leader in real-time Customer Data Platforms (CDP), helping organizations unify their customer data to deliver more personalized, privacy-conscious experiences. Tealium has team members in nearly 20 countries and serves customers in more than 30, with a culture built on respect and appreciation.
Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data (a minimal ETL sketch follows this listing).
Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
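As a concrete, heavily simplified illustration of the pipeline and ETL responsibilities in the listing above: a minimal sketch in Python using pandas. The file names, columns, and cleanup rules are assumptions for illustration, not details from the posting.

```python
import pandas as pd

def run_etl(source_csv: str, target_parquet: str) -> None:
    """Toy ETL: ingest a CSV, normalize it, and persist it as Parquet."""
    # Extract: read the raw source file.
    df = pd.read_csv(source_csv)

    # Transform: illustrative cleanup steps over assumed columns.
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.drop_duplicates()
    df["ingested_at"] = pd.Timestamp.now(tz="UTC")

    # Load: write a columnar file for efficient downstream analysis.
    df.to_parquet(target_parquet, index=False)

if __name__ == "__main__":
    run_etl("fares_raw.csv", "fares_clean.parquet")
```

A real pipeline would add incremental loads, schema validation, and an orchestrator, but the extract-transform-load shape stays the same.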
Lead and mentor a team of data engineers, fostering innovation, collaboration, and continuous improvement.
Design, implement, and optimize scalable data pipelines and ETL processes to meet evolving business needs.
Ensure data quality, governance, security, and compliance with industry standards and best practices.
Jobgether is a platform that connects job seekers with companies. They use an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements.
Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions.
Design and oversee the data architecture and infrastructure, ensuring scalability, performance, and security.
Create scalable and efficient data processing frameworks, including ETL processes and data pipelines.
Lingaro has been on the market since 2008, with 1,500+ talents on board across 7 global sites, and emphasizes career growth and skills development.
Design, code, test, and debug data-driven applications in accordance with established coding standards and best practices.
Develop supporting data tooling, including data pipelines and APIs, for those applications, ensuring performant, scalable, and secure solutions.
Integrate with various data sources and technologies to ensure the right data is available for the solutions in a timely manner.
EasyPost, founded in 2012, is a YC unicorn with the mission to simplify shipping for businesses, from startups to Fortune 500 companies. They provide a developer-friendly REST API for shipping. The team is rapidly growing and fosters a culture of builders and problem-solvers who value elegant architecture and fast decisions.
Design, build, and maintain scalable, high-quality data pipelines.
Implement robust data ingestion, transformation, and storage using cloud-based technologies.
Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have partnerships with more than 1,000 clients and foster a diverse, inclusive, and safe work environment.
Partner with clients and implementation teams to understand data distribution requirements.
Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity (a Snowflake loading sketch follows this listing).
Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, they're tackling big challenges in an industry that's ready for change, with a bold, curious, and collaborative team.
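On the Snowflake side, the integration work in this listing might look like the hedged sketch below, assuming the snowflake-connector-python package; the account, credentials, table, and upstream file are all placeholders, and the target table is assumed to already exist.

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Placeholder credentials: a real deployment would use a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",
    password="...",         # never hard-code credentials in practice
    warehouse="ETL_WH",
    database="CLAIMS",
    schema="PUBLIC",
)

df = pd.read_parquet("claims_clean.parquet")  # assumed upstream output

# Bulk-load the frame into an existing Snowflake table.
success, n_chunks, n_rows, _ = write_pandas(conn, df, table_name="CLAIMS_STAGE")
print(f"loaded={success} rows={n_rows}")
conn.close()
```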
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis (a Glue/S3 sketch follows this listing).
ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries, and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
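For the AWS services named in this listing, a hedged boto3 sketch shows the common stage-then-process shape: land raw data in S3, then trigger a Glue job. The bucket, key, and job name are hypothetical, and the Glue job itself must already be defined.

```python
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# Stage a raw extract in S3 (bucket and key are illustrative).
s3.upload_file("fares_raw.csv", "example-data-lake", "raw/fares/fares_raw.csv")

# Kick off a pre-defined Glue ETL job against the staged data.
# "fares-etl" is a hypothetical job name that must already exist in Glue.
run = glue.start_job_run(
    JobName="fares-etl",
    Arguments={"--source_path": "s3://example-data-lake/raw/fares/"},
)
print("started Glue run:", run["JobRunId"])
```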
Optimize SQL queries to maximize system performance (an index-tuning sketch follows this listing).
RefinedScience is dedicated to delivering high-quality emerging-tech solutions. The posting shares little about company size or culture, but the role emphasizes innovation and collaboration.
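Query tuning usually starts with the planner's output. A self-contained sketch using Python's built-in sqlite3 module (the table and query are illustrative) shows how an index turns a full scan into an index search; the same EXPLAIN-driven loop applies to production engines such as PostgreSQL or Redshift.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, kind TEXT)")

query = "SELECT COUNT(*) FROM events WHERE user_id = ?"

# Without an index, SQLite plans a full table scan.
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchall())

# With an index on the filter column, the plan becomes an index search,
# the kind of change that dominates day-to-day query tuning.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchall())
```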
Design, build, and oversee the deployment of technology for managing structured and unstructured data.
Develop tools leveraging AI, ML, and big-data to cleanse, organize, and transform data.
Design and maintain CI/CD pipelines using GitHub Actions to automate deployment, testing, and monitoring (a smoke-test sketch follows this listing).
NBCUniversal is one of the world's leading media and entertainment companies creating world-class content across film, television, streaming, theme parks, and more.
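GitHub Actions workflows themselves are YAML, so none is reproduced here, but the monitoring step such a pipeline might run can be sketched as a standard-library Python smoke test that fails the job when a deployed service is unhealthy. The endpoint URL is a placeholder assumption.

```python
import sys
import urllib.request

HEALTH_URL = "https://example.internal/healthz"  # hypothetical endpoint

def main() -> int:
    """Return 0 if the deployed service answers 200, else fail the CI job."""
    try:
        with urllib.request.urlopen(HEALTH_URL, timeout=10) as resp:
            if resp.status == 200:
                print("healthy")
                return 0
            print(f"unexpected status: {resp.status}")
    except OSError as exc:
        print(f"health check failed: {exc}")
    return 1

if __name__ == "__main__":
    sys.exit(main())
```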
Architect and maintain robust data pipelines to transform diverse data inputs.
Integrate data from various sources into a unified platform.
Build APIs with AI assistance to enable secure access to consolidated insights (a token-guarded endpoint sketch follows this listing).
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
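For the secure-access API work in this listing, here is a minimal sketch using FastAPI (an assumption; the posting names no framework) with a shared-token guard in front of a read endpoint. The token handling and payload are illustrative only.

```python
from fastapi import Depends, FastAPI, Header, HTTPException

app = FastAPI()

API_TOKEN = "change-me"  # hypothetical; load from a secrets store in practice

def require_token(x_api_token: str = Header(...)) -> None:
    # Reject any request missing the expected shared-secret header.
    if x_api_token != API_TOKEN:
        raise HTTPException(status_code=401, detail="invalid token")

@app.get("/insights", dependencies=[Depends(require_token)])
def get_insights() -> dict:
    # Placeholder payload standing in for consolidated abuse insights.
    return {"events_last_hour": 1234, "top_source": "198.51.100.7"}
```

Serve it with an ASGI server such as uvicorn; a production system would use OAuth2 or mTLS rather than a static header token.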
Identify, prioritize, and execute tasks in the software development life cycle.
Develop tools and applications by producing clean, efficient code.
Work with distributed big-data technologies such as Apache Hudi and Trino for large-scale processing (a Trino query sketch follows this listing).
PointClickCare is a health tech company that helps providers deliver exceptional care. They empower their employees to push boundaries, innovate, and shape the future of healthcare. They are a founder-led and privately held company, recognized by Forbes as a top private cloud company.
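Hudi-managed tables are commonly queried through an engine like Trino. A hedged sketch with the trino Python client follows; the host, catalog, schema, and table are hypothetical.

```python
# Assumes the `trino` client package and an existing Trino cluster.
from trino.dbapi import connect

conn = connect(host="trino.example.internal", port=8080, user="etl")
cur = conn.cursor()

# Query a hypothetical Hudi-backed table registered in the Hive catalog.
cur.execute(
    "SELECT kind, COUNT(*) FROM hive.events.user_events "
    "GROUP BY kind ORDER BY 2 DESC LIMIT 10"
)
for kind, n in cur.fetchall():
    print(kind, n)
```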
Write and deploy crawling scripts to collect source data from the web.
Write and run data transformers in Scala Spark to standardize bulk data sets (a PySpark analogue is sketched after this listing).
Write and run modules in Python to parse entity references and relationships from source data.
Sayari is a risk intelligence provider equipping sectors with visibility into commercial relationships, delivering corporate and trade data from over 250 jurisdictions. Headquartered in Washington, D.C., its solutions are trusted globally and recognized for growth and workplace culture.
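The bulk standardization step above is sketched here in PySpark rather than Scala, to keep one language across these examples; the DataFrame API is nearly identical in both. Paths, columns, and rules are illustrative assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("standardize").getOrCreate()

# Illustrative bulk source path; any Spark-readable format works the same way.
df = spark.read.json("s3a://example-bucket/raw/entities/")

# Normalize assumed columns and drop duplicate entity records.
standardized = (
    df.withColumn("name", F.trim(F.lower(F.col("name"))))
      .withColumn("country", F.upper(F.col("country")))
      .dropDuplicates(["entity_id"])
)

standardized.write.mode("overwrite").parquet("s3a://example-bucket/clean/entities/")
```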
As a key member of our Data Engineering team, you will:
Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives.
Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture.
Collaborate with the team to meet performance, scalability, and reliability goals.
PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.
Collaborate with engineering, data science, ML, data engineering, and product analytics teams to understand and shape the future needs of our data platform and infrastructure.
Define, drive, and implement the future live ingestion layer of data into our data platform (e.g., Kafka, Kinesis; a minimal producer sketch follows this listing).
Define and evolve standards for storage, compute, data management, provenance, and orchestration.
Inspiren offers the most complete and connected ecosystem in senior living.
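At its smallest, a Kafka-based live ingestion layer is producers publishing events onto a topic. A minimal sketch with the kafka-python package; the broker address, topic, and event shape are assumptions.

```python
import json
from kafka import KafkaProducer  # assumes the kafka-python package

# Broker address and topic name are illustrative.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish one live event into the ingestion layer for downstream consumers.
producer.send("device-events", {"device_id": "a1", "heart_rate": 72})
producer.flush()
```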
Build, manage, and operationalize data pipelines for marketing use cases.
Develop a comprehensive understanding of customer and marketing data requirements.
Transform large data sets into targeted customer audiences for personalized experiences (an audience-segmentation sketch follows this listing).
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Our system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
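Turning large data sets into targeted audiences, as in the listing above, is often a filter-and-export step. A minimal pandas sketch; the dataset, columns, and audience rule are invented for illustration.

```python
import pandas as pd

customers = pd.read_parquet("customers.parquet")  # assumed marketing dataset

# Illustrative audience rule: recent, high-value customers who opted in.
mask = (
    (customers["last_purchase_days"] <= 30)
    & (customers["lifetime_value"] >= 500)
    & customers["email_opt_in"]
)
audience = customers.loc[mask, ["customer_id", "email"]]

# Hand the audience off to an activation channel as a flat file.
audience.to_csv("audience_high_value_recent.csv", index=False)
```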
Architect and lead the evolution of our modern data platform.
Design and build production LLM pipelines and infrastructure that power intelligent operations (an LLM-call sketch follows this listing).
Own end-to-end data acquisition and integration architecture across diverse sources.
Brightwheel is the largest, fastest-growing, and most loved platform in early education, trusted by millions of educators and families every day. The team is passionate, talented, and customer-focused, and embodies their Leadership Principles in their work and culture.
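One building block of a production LLM pipeline is a model call that turns unstructured text into a routable label. A hedged sketch assuming the OpenAI Python SDK (v1); the posting names no provider, and the model, prompt, and labels are illustrative.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def classify_note(note: str) -> str:
    """Route a free-text support note into a coarse category."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "Classify the note as one of: billing, enrollment, other."},
            {"role": "user", "content": note},
        ],
    )
    return resp.choices[0].message.content.strip()

print(classify_note("Parent asked why last month's invoice doubled."))
```

A production pipeline would wrap this with retries, timeouts, output validation, and evaluation; this shows only the call itself.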
Lead and guide the design and implementation of scalable streaming data pipelines.
Engineer and optimize real-time data solutions using frameworks such as Apache Kafka, Apache Flink, and Spark Streaming (a Structured Streaming sketch follows this listing).
Drive adoption of best practices in data governance, observability, and performance tuning for streaming workloads.
PointClickCare is a leading health tech company that’s founder-led and privately held, empowering employees to push boundaries, innovate, and shape the future of healthcare.
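A streaming pipeline of the kind this listing describes can be sketched with Spark Structured Streaming reading from Kafka. Running it requires the spark-sql-kafka connector package, and the broker and topic names are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream").getOrCreate()

# Read a live Kafka topic (broker and topic are illustrative).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "care-events")
    .load()
)

# Minute-by-minute event counts; Kafka values arrive as raw bytes.
counts = (
    events.select(F.col("value").cast("string").alias("raw"), F.col("timestamp"))
    .groupBy(F.window("timestamp", "1 minute"))
    .count()
)

# Emit results to the console; a production job would target a sink table.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```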