Source Job

US

  • Develop big data applications for Synchrony in Hadoop ecosystem.
  • Participate in the agile development process including backlog grooming, coding, code reviews, testing and deployment.
  • Work independently to develop analytic applications leveraging technologies such as: Hadoop, NoSQL, In-memory Data Grids, Kafka, Spark, Ab Initio.

Hadoop SQL Kafka Spark

20 jobs similar to Big Data Engineer

Jobs ranked by similarity.

US

  • Identify, prioritize and execute tasks in the software development life cycle.
  • Develop tools and applications by producing clean, efficient code.
  • Work with distributed computing systems like Apache Hudi and Trino for big data processing.

PointClickCare is a health tech company that helps providers deliver exceptional care. They empower their employees to push boundaries, innovate, and shape the future of healthcare. They are a founder-led and privately held company, recognized by Forbes as a top private cloud company.

$59,520–$77,520/yr
EMEA

  • Design, develop, test, and maintain scalable applications using modern frameworks.
  • Actively participate in Agile/Scrum ceremonies, contributing to planning, estimation, and continuous improvement.
  • Contribute to architectural design discussions, test planning, and operational excellence initiatives.

Tealium is a trusted leader in real-time Customer Data Platforms (CDP), helping organizations unify their customer data to deliver more personalized, privacy-conscious experiences. Team Tealium has team members present in nearly 20 countries worldwide, serving customers across more than 30 countries, winning together with respect and appreciation.

$150,000–$160,000/yr
US

  • Design, build and execute data pipelines.
  • Build a configurable ETL framework.
  • Optimize SQL queries to maximize system performance.

RefinedScience is dedicated to delivering high-quality emerging tech solutions. The job description does not include company size or culture information, but the role appears to value innovation and collaboration.

$85,000–$90,000/yr
US 4w PTO

  • Write and deploy crawling scripts to collect source data from the web
  • Write and run data transformers in Scala Spark to standardize bulk data sets
  • Write and run modules in Python to parse entity references and relationships from source data

Sayari is a risk intelligence provider equipping sectors with visibility into commercial relationships, delivering corporate and trade data from over 250 jurisdictions. Headquartered in Washington, D.C., its solutions are trusted globally and recognized for growth and workplace culture.

US

  • Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark.
  • Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
  • Collaborate with cross-functional teams to deliver impactful data solutions.

Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. They connect top-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.

$145,290–$185,000/yr
Unlimited PTO

  • Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large data.
  • Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

US

  • Utilize strong SQL & Python skills to engineer sound data pipelines and conduct routine and ad hoc analysis.
  • Build reporting dashboards and visualizations to design, create, and track campaign/program KPIs.
  • Perform analyses on large data sets to understand drivers of operational efficiency.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. This company fosters a supportive and inclusive work environment.

US

  • Design, build, and maintain infrastructure for data ingestion, processing, and analysis.
  • Translate business requirements into technical solutions in collaboration with stakeholders.
  • Ensure data quality, integrity, and security throughout the data lifecycle.

Jobgether is a platform that connects job seekers with companies. They use AI-powered matching to ensure applications are reviewed quickly and fairly.

US

  • Lead and mentor a team of data engineers, fostering innovation, collaboration, and continuous improvement.
  • Design, implement, and optimize scalable data pipelines and ETL processes to meet evolving business needs.
  • Ensure data quality, governance, security, and compliance with industry standards and best practices.

Jobgether is a platform that connects job seekers with companies. They use an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements.

$150,000–$165,000/yr
US Unlimited PTO 11w maternity

  • Partner with our customer teams to develop engineering plans for implementing our health system partners
  • Build and support robust batch and streaming pipelines
  • Evolve the maturity of our monitoring systems and processes to improve visibility and failure detection in our infrastructure

Paradigm is rebuilding the clinical research ecosystem by enabling equitable access to trials for all patients. Incubated by ARCH Venture Partners and backed by leading healthcare and life sciences investors, Paradigm's seamless infrastructure, implemented at healthcare provider organizations, will bring potentially life-saving therapies to patients faster.

$135,000–$165,000/yr
US Unlimited PTO

  • Design, build, and maintain scalable data pipelines.
  • Develop and implement data models for analytical use cases.
  • Implement data quality checks and governance practices.

MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.

$190,800–$267,100/yr
US

  • Be the Analytics Engineering lead within the Sales and Marketing organization.
  • Be the data steward for Sales and Marketing: architect and improve data collection.
  • Develop and maintain robust data pipelines and workflows for data ingestion and transformation.

Reddit is a community-driven platform built on shared interests and trust, fostering open and authentic conversations. With over 100,000 active communities and approximately 116 million daily active unique visitors, it serves as a major source of information on the internet.

US Europe

  • Design and develop scalable data pipelines and infrastructure to process large volumes of data efficiently
  • Collaborate with cross-functional teams to ensure data integrity, accessibility, and usability
  • Implement and maintain data quality measures throughout the data lifecycle

CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have a culture that values diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment.

$110,572–$145,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis

ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

$125,000–$150,000/yr
US

  • Design, implement, and optimize robust and scalable data pipelines using SQL, Python, and cloud-based ETL tools such as Databricks.
  • Enhance our overarching data architecture strategy, assisting in decisions related to data storage, consumption, integration, and management within cloud environments.
  • Partner with data scientists, BI teams, and other engineering teams to understand and translate complex data requirements into actionable engineering solutions.

The New York Blood Center is a medical organization looking for a Senior Data Engineer to join their team.

South America

  • Design, develop, and maintain scalable and robust data pipelines.
  • Create solutions for data ingestion, transformation, and modeling using Databricks, Spark/PySpark, Cloudera, and Azure Data Factory (ADF).
  • Ensure the quality, integrity, and usability of data throughout the entire pipeline.

CI&T specializes in technological transformation, uniting human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters worldwide, they have partnered with over 1,000 clients during their 30-year history, with a focus on Artificial Intelligence.

Up to $200,000/yr
North America Latin America

  • Architect and maintain robust data pipelines to transform diverse data inputs.
  • Integrate data from various sources into a unified platform.
  • Build APIs with AI assistance to enable secure access to consolidated insights.

Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.

$175,000–$225,000/yr
US

  • Lead requirements-gathering efforts for product and advanced analytics.
  • Work with analytics, data science, and wider engineering teams to automate data analysis and visualization.
  • Build a scalable technology platform to support a growing business and deliver high-quality code to production.

Achieve is a leading digital personal finance company that helps everyday people move from struggling to thriving by providing innovative, personalized financial solutions. They have over 3,000 employees in mostly hybrid and 100% remote roles across the United States with hubs in Arizona, California, and Texas and a culture of putting people first.

US

  • Design and engineer robust data pipelines using technologies like Databricks, Azure Data Factory, Apache Spark, and Delta Lake.
  • Craft healthcare data solutions: process massive healthcare datasets, optimize performance, and ensure data is accurate and secure.
  • Communicate technical concepts to non-technical stakeholders, manage multiple priorities, and meet deadlines.

Gentiva offers compassionate care in the comfort of patients' homes as a national leader in hospice, palliative, home health care, and advanced illness management. They have nearly 600 locations and thousands of clinicians across 38 states, offering rewarding careers in a collaborative environment.

$115,000–$145,000/yr
US

  • Collaborate with business leaders, engineers, and product managers to understand data needs.
  • Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, as well as cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

NBCUniversal is one of the world's leading media and entertainment companies, creating world-class content distributed across a portfolio of film, television, and streaming, and brought to life through global theme park destinations, consumer products, and experiences. They champion an inclusive culture and strive to attract and develop a talented workforce to create and deliver a wide range of content reflecting the world.