We are looking for a Senior Data Engineer to strengthen our Content and Data team, with a focus on data quality. Your work will directly improve the accuracy of our job listings, classification systems, and matching algorithms. This is a hands-on role where you will design and improve data pipelines, manage non-relational databases, apply NLP techniques, and support ML workflows using existing models.
20 jobs similar to Senior Data Engineer, ranked by similarity
Build and improve processes to clean, enrich, and structure large datasets. Integrate and manage existing ML models used for classification and enrichment. Apply NLP techniques to understand and categorize job descriptions and candidate profiles.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements.
Build end-to-end data solutions covering ingestion, logging, validation, cleaning, transformation, and security. Lead the design, development, and delivery of scalable data pipelines and ETL processes. Design and evolve robust data models and storage patterns that support analytics and efficiency use cases.
Founded in 1997, Expression provides data fusion, data analytics, AI/ML, software engineering, information technology, and electromagnetic spectrum management solutions.
- Develop and maintain scalable data pipelines and ETL processes.
- Design, build, and optimize data models and databases.
- Perform data analysis, data mining, and statistical modeling.
We’re supporting a global fintech and digital currency platform in their search for a Senior Data Engineer to help scale and optimize their analytics and data infrastructure.
- Design, build, and maintain highly scalable, reliable, and efficient ETL/ELT pipelines.
- Ingest data from a multitude of sources and transform raw data into clean, structured, and AI/ML-ready formats.
- Work closely with data scientists, machine learning engineers, and business analysts to understand their data needs.
Valtech exists to unlock a better way to experience the world by blending crafts, categories, and cultures, helping brands create new value in an increasingly digital world.
- Design, build, and maintain scalable and reliable data pipelines.
- Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
- Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.
Curinos empowers financial institutions to make better, faster, and more profitable decisions through industry-leading proprietary data, technologies, and insights.
Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks. Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows. Lead large-scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.
This position is posted by Jobgether on behalf of a partner company.
- Architect and maintain robust data pipelines to transform diverse data inputs.
- Integrate data from various sources into a unified platform.
- Build APIs with AI assistance to enable secure access to consolidated insights.
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
- Design, build, and maintain scalable, high-quality data pipelines.
- Implement robust data ingestion, transformation, and storage using cloud-based technologies.
- Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have partnerships with more than 1,000 clients and value diversity, fostering a diverse, inclusive, and safe work environment.
- Develop data models and pipelines for customer-facing applications, research, reporting, and machine learning.
- Optimize data models to support efficient data storage and retrieval processes for performance and scalability.
- Optimize ETL processes for ingesting, processing, and transforming large volumes of structured and unstructured data into our data ecosystem.
Inspiren offers the most complete and connected ecosystem in senior living, bringing peace of mind to residents, families, and staff.
- Design, build, and scale data pipelines across a variety of source systems and streams.
- Implement appropriate design patterns while optimizing for performance, cost, security, scale, and end-user experience.
- Collaborate with cross-functional teams to understand data requirements and develop efficient data acquisition and integration strategies.
NBCUniversal is one of the world's leading media and entertainment companies, creating world-class content across film, television, streaming, and theme parks.
- Work alongside Caylent’s Architects, Engineering Managers, and Engineers to deliver AWS solutions.
- Build solutions defined in project backlogs, writing production-ready, well-tested, and documented code across cloud environments.
- Participate in Agile ceremonies such as daily standups, sprint planning, retrospectives, and demos.
Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). They are a global company and operate fully remotely, with employees in Canada, the United States, and Latin America, fostering a community of technological curiosity.
Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.
- Train and tune machine learning models.
- Explore data and conduct feature engineering.
- Apply natural language processing to understand text and aid in conversational interfaces.
BambooHR is building a people intelligence platform that transforms HR and sets people free to do great work.
- Design, develop, and deliver AI-driven data products in partnership with engineering and product leadership.
- Translate model outputs into clear, actionable insights that drive customer understanding and decision-making.
- Contribute to the evolution of data science standards, tools, and best practices across the organization.
- Design, develop, and maintain robust data processes and solutions.
- Develop and maintain data models, databases, and data warehouses.
- Collaborate with stakeholders to gather requirements and provide data solutions.
Highmark Health is a national, blended health organization that includes one of America’s largest Blue Cross Blue Shield insurers.
As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
- Design, develop, and implement end-to-end data pipelines to support data collection and transformation.
- Lead the architecture and development of scalable and maintainable data solutions.
- Collaborate with data scientists and analysts to provide clean and accessible data.
DexCare optimizes time in healthcare, streamlining patient access, reducing waits, and enhancing overall experiences.
- Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
- Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data.
- Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
- Partner with clients and implementation teams to understand data distribution requirements.
- Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
- Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, they're tackling big challenges in an industry that’s ready for change with a bold, curious, and collaborative team.
- Design, build, and maintain cloud-native data infrastructure using Terraform for IaC.
- Develop and optimize data pipelines leveraging AWS services and Snowflake.
- Build and maintain LLM frameworks, ensuring high-quality and cost-effective outputs.
ClickUp is building the first truly converged AI workspace, unifying tasks, docs, chat, calendar, and enterprise search, all supercharged by context-driven AI.