Source Job

Global

  • Strong experience in Snowflake.
  • Strong programming skills in Python and Linux Bash for automation and data workflows.
  • Expertise in Hadoop ecosystem tools and managing SQL databases for data storage and query optimization.

Python SQL Snowflake AWS Linux

20 jobs similar to Senior Data Engineer India

Jobs ranked by similarity.

US Unlimited PTO

  • Guide clients on optimizing their data environment.
  • Develop system engineering, integrations, and architectures based on client needs.
  • Implement and advise on data warehouse solutions, ETL pipelines, and BI reporting tools.

Jobgether helps candidates get their applications reviewed quickly and objectively. Its AI-powered matching process evaluates each application fairly against the role's core requirements.

$120,000–$160,000/yr
US

  • You will join a team of talented engineers working closely with Data Scientists to build and scale our next-generation Ad EnGage data pipeline.
  • You will work with large-scale datasets (hundreds of TBs to petabyte-scale systems) using a modern data stack centered on AWS, Airflow, dbt, and Snowflake.
  • You’ll contribute to building reliable, high-quality data pipelines and improving the performance, scalability, and observability of our data platform.

EDO is the TV outcomes company. Their leading measurement platform connects convergent TV airings to the ad-driven consumer behaviors most predictive of future sales. They are headquartered in New York City and Los Angeles, with office space in San Francisco, and recognize the benefits of hybrid working.

US

  • Prepare and manage pre-stage files for backbook conversion activities.
  • Support and execute data ingestion tasks in alignment with scheduled project events, including key mock events.
  • Monitor and ensure data ingestion completion within defined SLA windows.

Kunai builds full-stack technology solutions for banks, credit and payment networks, infrastructure providers, and their customers. They help their clients modernize, capitalize on emerging trends, and evolve their business for the coming decades by remaining tech-agnostic and human-centered.

US

  • Own organizational-wide data architecture, defining standards and designs.
  • Design and develop data pipelines, integrations, and platform features.
  • Partner with product managers to define new data features and capabilities.

They offer a connected equipment platform for managing mixed assets. The company values quality, continuous learning, and collaboration within a dynamic team environment.

$111,700–$148,900/yr
US

  • Lead, manage, and mentor a group of data engineers.
  • Own the design and development of data pipelines and systems.
  • Partner cross-functionally with Data Science and Product managers.

TrueML is a mission-driven financial software company that aims to create better customer experiences for distressed borrowers. The TrueML team includes inspired data scientists, financial services industry experts and customer experience fanatics building technology to serve people.

Global

  • Design, develop, and maintain databases supporting IVR and contact center systems
  • Design and maintain relational data models for IVR event, routing, and call data
  • Ensure database availability, integrity, performance, and scalability in production environments

Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation, supporting digital transformation for some of the world's largest enterprises. They employ nearly 1,000 full-time professionals, have an annual growth rate exceeding 25%, and foster a values-driven organization with a culture of relentless performance.

Latin America

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements, automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.

Coderoad is a software development company that provides end-to-end software development services. It provides an opportunity to work on exciting, real-world projects in a supportive environment, offering staff augmentation, dedicated IT teams, and general software engineering.

$100,649–$174,459/yr
US 4w PTO

  • Design, build, and maintain scalable data platforms using AWS to support analytics, machine learning, and emerging generative AI use cases.
  • Collaborate with data scientists, analysts, and engineering teams to translate business and AI requirements into scalable data solutions.
  • Work with large-scale datasets to build and optimize data pipelines using AWS services such as EMR (Spark, Trino), S3, Glue, Athena, and Airflow

Experian is a global data and technology company, powering opportunities for people and businesses around the world. They invest in people and new advanced technologies to unlock the power of data and to innovate. A FTSE 100 Index company listed on the London Stock Exchange, they have a team of 23,300 people across 32 countries.

Europe

  • Develop engineering expertise within the Dataiku Platform to help maintain and develop system integrations, platform automations, and platform configurations.
  • Build and maintain Python and SQL data replication and data pipelines on large, often complex data sets.
  • Identify opportunities for improvement and optimization for greater scalability and delivery velocity.

Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.

Global

  • Lead data architecture design, API assessment, and ETL requirements during the Discovery & Design phase.
  • Develop / configure CMIC ERP API integration to establish reliable data exchange between the ERP system and the AWS platform.
  • Design/implement data pipelines using AWS Glue for ETL processing of subcontractor documents and ERP data.

Capnexus is a comprehensive services provider with a team of experienced professionals in designing, building, and supporting retail software. They operate as a build-as-a-service provider with a culture built on outcomes and delivery.

Latin America

  • Design, build, and maintain data pipelines using Snowflake, Airflow, and dbt.
  • Lead architectural discussions around the modern data stack.
  • Develop scalable ETL and ELT processes using Python and SQL.

They are a well-funded healthcare technology company using AI and modern data infrastructure to transform how healthcare and public health decisions are made. The team is small, mission-driven, and building systems that turn raw healthcare data into actionable intelligence at scale.

Latin America

  • Design, build, and maintain data pipelines (ETL/ELT) in batch and streaming environments.
  • Develop solutions for ingesting and processing large volumes of structured, unstructured, and semi-structured data.
  • Create data products that respond to the analytical needs of the business.
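A common pattern behind bullets like these is flattening semi-structured records (e.g., JSON events) into tabular rows an analytics team can query. A minimal sketch in Python, standard library only — the field names are illustrative, not from any system described in the listing:

```python
import json

def flatten_event(raw: str) -> dict:
    """Flatten one semi-structured JSON event into a flat row.

    Field names here are hypothetical examples for illustration.
    """
    event = json.loads(raw)
    user = event.get("user", {})
    payload = event.get("payload", {})
    return {
        "event_id": event["id"],
        "user_id": user.get("id"),
        "country": user.get("country"),
        "amount": payload.get("amount", 0.0),
    }

# One raw event in, one flat row out.
raw = '{"id": 1, "user": {"id": 42, "country": "BR"}, "payload": {"amount": 9.5}}'
row = flatten_event(raw)
```

Real pipelines add schema validation and dead-letter handling around this step, but the ingest-then-flatten shape is the same in batch and streaming.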

EX Squared LATAM builds high-impact digital solutions, working with exceptional talent throughout Latin America. They foster a culture of collaboration, continuous learning, and technical excellence.

  • Collaborate with stakeholders to build robust services using data pipeline and ETL tools, and Snowflake data warehouse.
  • Translate advanced business data and analytics problems into technical approaches that yield actionable recommendations.
  • Communicate results and educate others through visualizations, reports, and presentations.

CNG Holdings, Inc. serves consumers by providing financial solutions which fill a need and deliver value. They strive to make a difference in their customers’ lives and the communities they serve.

Global

  • Design, build, and maintain efficient data pipelines (ETL processes) to integrate data from various source systems into the data warehouse.
  • Develop and optimize data warehouse schemas and tables to support analytics and reporting needs.
  • Write and refine complex SQL queries and use scripting (e.g., Python) to transform and aggregate large datasets.
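The SQL-plus-scripting bullet above typically means driving aggregation queries from Python against the warehouse. A minimal sketch using the standard-library `sqlite3` module as a stand-in for the warehouse — table and column names are illustrative, not from the listing:

```python
import sqlite3

# In-memory SQLite stands in for the data warehouse here.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'US', 120.0),
        (2, 'US', 80.0),
        (3, 'EU', 50.0);
""")

# Aggregate raw rows into a reporting-friendly summary per region.
rows = conn.execute("""
    SELECT region, COUNT(*) AS n_orders, SUM(amount) AS revenue
    FROM orders
    GROUP BY region
    ORDER BY region
""").fetchall()
```

In a production ETL job, the same query shape runs against the warehouse driver (e.g., a Snowflake connector) and the Python layer handles scheduling, parameterization, and loading the results onward.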

Deel is an all-in-one payroll and HR platform tailored for global teams. As one of the largest globally distributed companies, Deel's 7,000 team members span over 100 countries, fostering a dynamic culture of continuous learning and innovation.

US

  • Own the data engineering roadmap.
  • Lead, mentor, and scale a high-performing data engineering team.
  • Design and evolve our core data infrastructure on AWS, Apache Airflow, and Apache Spark.

Tekmetric is an all-in-one, cloud-based platform helping auto repair shops run smarter, grow faster, and serve customers better. Officially founded in Houston in 2017, Tekmetric has grown from a single shop’s vision to the industry’s leading solution. They value transparency, integrity, innovation, and a service-first mindset.

$100,000–$140,000/yr
US

  • Design, build, and maintain scalable data pipelines for clients across industries.
  • Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
  • Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.

NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They are a growth-stage company looking to drive additional value from the data they are sitting on and value humility, intellectual rigor, and stewardship.

US Unlimited PTO

  • Serve as the embedded technical lead for Databricks customer engagements.
  • Own Databricks platform architecture, design decisions, and technical standards.
  • Lead delivery of complex data pipelines and analytics workloads on Databricks.

540 is a forward-thinking company that the government turns to in order to #getshitdone. They break down barriers, build impactful technology, and solve mission-critical problems.

South America

  • Design, develop, and optimize scalable data pipelines using SQL and Python/PySpark.
  • Build and maintain analytics-oriented data models (e.g., Star Schema and OBTs).
  • Ensure data quality, consistency, and governance across the entire pipeline.
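The star schema mentioned above is a fact table of events keyed into smaller dimension tables. A plain-Python sketch of that join, with purely illustrative table contents (a real implementation would live in SQL or PySpark):

```python
# Dimension tables: small lookup tables keyed by surrogate key.
dim_product = {
    1: {"name": "widget", "category": "tools"},
    2: {"name": "gadget", "category": "toys"},
}
dim_date = {20240101: {"year": 2024, "month": 1}}

# Fact table: one row per sale, referencing dimensions by key.
fact_sales = [
    {"date_key": 20240101, "product_key": 1, "amount": 30.0},
    {"date_key": 20240101, "product_key": 2, "amount": 20.0},
    {"date_key": 20240101, "product_key": 1, "amount": 10.0},
]

# Analytical query: 2024 revenue by product category,
# resolved by joining fact rows to their dimensions.
revenue_by_category: dict[str, float] = {}
for sale in fact_sales:
    if dim_date[sale["date_key"]]["year"] != 2024:
        continue
    category = dim_product[sale["product_key"]]["category"]
    revenue_by_category[category] = (
        revenue_by_category.get(category, 0.0) + sale["amount"]
    )
```

The design choice this illustrates: keeping measures in one wide fact table and descriptive attributes in dimensions makes such group-by questions a single join away.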

CI&T specializes in technological transformation, uniting human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters worldwide, they have partnered with more than 1,000 clients throughout their 30-year history, highlighting that Artificial Intelligence is a key aspect of their operations.

$110,000–$145,000/yr

  • Assist in delivering on internal data and business intelligence initiatives.
  • Design, implement, and maintain ETL processes to facilitate warehousing and systems integration needs.
  • Develop and enhance data models to deliver value to the organization.

CRB delivers life-changing solutions for manufacturers in the life sciences and food and beverage industries. They have over 1,100 expert professionals, and their mission, vision, and core values center around client satisfaction and employee experience.