Source Job
Design, develop, and manage modern, scalable data solutions across federal and commercial environments. Build robust data pipelines, integrate data across multiple sources, and ensure high-quality, reliable data for analytics and operational use. Collaborate with cross-functional teams including Architects, Data Scientists, and DevOps engineers to deliver secure and efficient data solutions.
20 jobs similar to Senior Consultant Data Engineer (DHS Public Trust)
Jobs ranked by similarity.
- Design, build, and maintain scalable and reliable data pipelines.
- Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
- Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.
Curinos empowers financial institutions to make better, faster, and more profitable decisions through industry-leading proprietary data, technologies, and insights.
As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
- Design, build, and maintain scalable, high-quality data pipelines.
- Implement robust data ingestion, transformation, and storage using cloud-based technologies.
- Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have partnerships with more than 1,000 clients and value diversity, fostering a diverse, inclusive, and safe work environment.
- Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
- Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large datasets.
- Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Responsible for designing, building, and maintaining scalable data pipelines and warehouse architectures. Integrate, transform, and manage high-volume datasets across multiple platforms. Focus on ensuring data quality, performance, and security while driving innovation through the adoption of modern tools and technologies.
This position is posted by Jobgether on behalf of a partner company.
- Design and implement scalable, high-performance data architectures to support business needs.
- Develop, automate, and maintain production-grade data pipelines using modern data stack tools and best practices.
- Optimize data workflows and implement observability frameworks to monitor pipeline performance, reliability, and accuracy.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.
- Partner with clients and implementation teams to understand data distribution requirements.
- Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
- Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, their bold, curious, and collaborative team is tackling big challenges in an industry that’s ready for change.
- Build and monitor Cribl’s core data tech stack including data pipelines and data warehouse.
- Develop cloud-native services and infrastructure that power scalable and reliable data systems.
- Support Cribl’s growing data science and agentic initiatives by preparing model-ready datasets.
Cribl provides a data engine for IT and Security across a wide range of industries.
- Design, build, and maintain highly scalable, reliable, and efficient ETL/ELT pipelines.
- Ingest data from a multitude of sources and transform raw data into clean, structured, and AI/ML-ready formats.
- Work closely with data scientists, machine learning engineers, and business analysts to understand their data needs.
Valtech exists to unlock a better way to experience the world by blending crafts, categories, and cultures, helping brands unlock new value in an increasingly digital world.
- Design and develop scalable data pipelines and infrastructure to process large volumes of data efficiently.
- Collaborate with cross-functional teams to ensure data integrity, accessibility, and usability.
- Implement and maintain data quality measures throughout the data lifecycle.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have a culture that values diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment.
- Partner with our customer teams to develop engineering plans for implementations at our health system partners.
- Build and support robust batch and streaming pipelines.
- Evolve the maturity of our monitoring systems and processes to improve visibility and failure detection in our infrastructure.
Paradigm is rebuilding the clinical research ecosystem by enabling equitable access to trials for all patients. Incubated by ARCH Venture Partners and backed by leading healthcare and life sciences investors, Paradigm’s seamless infrastructure, implemented at healthcare provider organizations, will bring potentially life-saving therapies to patients faster.
- Design, develop, and implement end-to-end data pipelines to support data collection and transformation.
- Lead the architecture and development of scalable and maintainable data solutions.
- Collaborate with data scientists and analysts to provide clean and accessible data.
DexCare optimizes time in healthcare, streamlining patient access, reducing waits, and enhancing overall experiences.
The Sr Data Engineer, DevX creates the best developer experience for data and application engineers at Basis. They design, implement, and maintain deployment and ETL pipelines for data products and integrate diverse data sources and vendor products, including databases, APIs, and third-party services.
Basis Technologies empowers agencies and brands with cutting-edge software that automates digital media operations, offering flexible work options across the U.S.
- Design, build, and execute data pipelines.
- Build a configurable ETL framework.
- Optimize SQL queries to maximize system performance.
RefinedScience is dedicated to delivering high-quality emerging tech solutions. While the job description does not include company size or culture details, the role appears to emphasize innovation and collaboration.
- Lead and mentor a team of data engineers, fostering innovation, collaboration, and continuous improvement.
- Design, implement, and optimize scalable data pipelines and ETL processes to meet evolving business needs.
- Ensure data quality, governance, security, and compliance with industry standards and best practices.
Jobgether is a platform that connects job seekers with companies. They use an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements.
- Design, develop, and maintain scalable and robust data pipelines.
- Create solutions for data ingestion, transformation, and modeling using Databricks, Spark/PySpark, Cloudera, and Azure Data Factory (ADF).
- Ensure the quality, integrity, and usability of data throughout the entire pipeline.
CI&T specializes in technological transformation, uniting human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters worldwide, they have partnered with over 1,000 clients during their 30-year history, with a focus on Artificial Intelligence.
Own the design, build, and optimization of end-to-end data pipelines. Establish and enforce best practices in data modeling, orchestration, and system reliability. Collaborate with stakeholders to translate requirements into robust, scalable data solutions.
YipitData is the leading market research and analytics firm for the disruptive economy and most recently raised $475M from The Carlyle Group at a valuation of over $1B.
- Architect and maintain robust data pipelines to transform diverse data inputs.
- Integrate data from various sources into a unified platform.
- Build APIs with AI assistance to enable secure access to consolidated insights.
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
- Architect, design, and lead the implementation of highly complex, scalable, and resilient data solutions in the cloud.
- Quickly build subject matter expertise in a specific business area and data domain.
- Support defining and executing the overarching strategy for the analytics engineering function.
Huntress is a fully remote, global team of passionate experts and ethical badasses on a mission to break down the barriers to cybersecurity. Founded in 2015 by former NSA cyber operators, Huntress protects all businesses with enterprise-grade, fully owned, and managed cybersecurity products.