Source Job

$103,100–$154,700/yr
US 3w PTO

  • Play a key role in designing, developing, and delivering modern data solutions that drive business insight and innovation.
  • Implement scalable, high-performing cloud architectures that support analytics, AI, and operational excellence.
  • Be responsible for technical delivery, authoring solution documentation, and ensuring data pipelines and models meet enterprise standards for performance, reliability, and cost efficiency.

Databricks Azure Python PySpark

20 jobs similar to Senior Consultant - Data Engineering

Jobs ranked by similarity.

$120,000–$160,000/yr
US

  • Design and implement scalable, reliable, and efficient data pipelines to support clinical, operational, and business needs.
  • Optimize data storage and processing in data lakes and cloud data warehouses (Azure, Databricks).
  • Proactively suggest improvements to infrastructure, processes, and automation to improve system efficiency, reduce costs, and enhance performance.

Care Access is dedicated to ensuring that every person has the opportunity to understand their health, access the care they need, and contribute to the medical breakthroughs of tomorrow. They are working to make the future of health better for all and have hundreds of research locations, mobile clinics, and clinicians across the globe.

US

  • Design, build, and maintain scalable ETL pipelines for large-scale data processing.
  • Implement data transformations and workflows using PySpark at an intermediate to advanced level (a brief sketch follows this list).
  • Optimize pipelines for performance, scalability, and cost efficiency across environments.
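
As a rough illustration of the PySpark transformation work these bullets describe, here is a minimal sketch. The dataset, paths, and column names are invented for the example, not taken from the listing.

```python
# Minimal PySpark transformation sketch; all paths and column names
# (orders, order_ts, amount, region) are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

orders = spark.read.parquet("s3://bucket/raw/orders/")  # assumed source

daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("customer_id").alias("customers"),
    )
)

# Partitioning by date keeps downstream scans narrow, one common
# lever for the performance and cost tuning the listing mentions.
(daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://bucket/curated/daily_revenue/"))
```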

Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York. Their team of 600+ highly skilled tech professionals, based in Latin America, drives digital disruption by partnering with U.S. companies on their most impactful projects.

US

  • Architect and implement Databricks Lakehouse solutions for large-scale data platforms.
  • Design and optimize batch & streaming data pipelines using Apache Spark (PySpark/SQL).
  • Implement Delta Lake best practices (ACID, schema enforcement, time travel, performance tuning); see the sketch below.
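
A hedged sketch of the Delta Lake features this bullet names; the table path and data are placeholders, and the snippet assumes a Spark session with the Delta extensions (the delta-spark package) available.

```python
# Delta Lake feature sketch; path, schema, and data are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("delta-sketch")
    .config("spark.sql.extensions",
            "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

path = "/tmp/delta/events"  # assumed table location
df = spark.createDataFrame([(1, "click"), (2, "view")], ["id", "event"])

# ACID: every write is an atomic, versioned commit.
df.write.format("delta").mode("overwrite").save(path)

# Schema enforcement: appends must match the table schema; a frame
# with an incompatible schema would be rejected rather than silently
# corrupting the table.
df.write.format("delta").mode("append").save(path)

# Time travel: read the table as of an earlier version.
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)

# Performance tuning: compact small files into larger ones.
spark.sql(f"OPTIMIZE delta.`{path}`")
```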

They are looking for a Databricks Architect to design and lead modern Lakehouse data platforms using Databricks. The role focuses on building scalable, high-performance data pipelines and enabling analytics and AI use cases on cloud-native data platforms.

Brazil

  • Write highly maintainable and performant Python/PySpark code.
  • Work in cloud environments, particularly Microsoft Azure, and with data orchestration systems.
  • Work with data lakes and understand common data transformation and storage formats.

YLD helps clients build the skills and capabilities they need to stay ahead of the competition. They are a remote-first consultancy specializing in software engineering, product design, and data with teams based across London, Lisbon, and Porto.

$185,000–$200,000/yr
US

  • Build and maintain Azure Data Factory pipelines to ingest data from multiple sources.
  • Write Python code in Databricks to clean raw data and move it into the silver layer, handling deduplication, type casting, and validation (sketched after this list).
  • Monitor daily jobs and troubleshoot any failures to ensure pipeline stability.
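
For illustration, a minimal bronze-to-silver cleaning step in PySpark; the table paths, columns, and validation rules are assumptions, not details from the listing.

```python
# Hypothetical bronze-to-silver cleaning step; every name below
# (paths, patient_id, visit_id, visit_date, age) is a placeholder.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.format("delta").load("/mnt/bronze/patients")

silver = (
    bronze
    .dropDuplicates(["patient_id", "visit_id"])         # deduplication
    .withColumn("visit_date", F.to_date("visit_date"))  # type casting
    .withColumn("age", F.col("age").cast("int"))
    .filter(F.col("patient_id").isNotNull())            # validation
    .filter(F.col("age").between(0, 120))
)

silver.write.format("delta").mode("overwrite").save("/mnt/silver/patients")
```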

Jobgether is a platform that leverages AI to connect job seekers with employers. They focus on ensuring fair and efficient application reviews, connecting top candidates directly with hiring companies.

Latam

  • Design and implement data pipelines using Databricks, PySpark, and Delta Lake.
  • Work closely with business stakeholders and analysts to understand KPIs.
  • Model and structure data using dimensional modeling techniques (see the sketch below).
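
As a toy example of the dimensional modeling this role calls for, here is a star-schema sketch in PySpark; the source table and all column names are invented for illustration.

```python
# Toy star-schema build: one dimension plus a fact table.
# Every table and column name here is hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
sales = spark.read.format("delta").load("/mnt/silver/sales")

# Dimension: one row per product, with a surrogate key.
dim_product = (
    sales.select("product_id", "product_name", "category")
    .dropDuplicates(["product_id"])
    .withColumn("product_key", F.monotonically_increasing_id())
)

# Fact: measures plus a foreign key into the dimension.
fact_sales = (
    sales.join(dim_product.select("product_id", "product_key"), "product_id")
    .select("product_key", "store_id", "sold_at", "quantity", "amount")
)

dim_product.write.format("delta").mode("overwrite").save("/mnt/gold/dim_product")
fact_sales.write.format("delta").mode("overwrite").save("/mnt/gold/fact_sales")
```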

Clear Tech specializes in Data, Analytics, and Artificial Intelligence, helping companies around the world transform their data into real business value. Their team combines highly skilled talent in Latin America with global best practices across cloud technologies and delivers end-to-end projects.

US

  • Design and implement data solutions for enterprise customers.
  • Create and maintain technical documentation and architectural diagrams.
  • Ensure quality and governance standards are met throughout the engineering lifecycle.

Jobgether is a company that helps candidates get hired. They use an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements.

APAC

  • Design and develop scalable, maintainable, and reusable software components with a strong emphasis on performance and reliability.
  • Collaborate with product managers to translate requirements into well-architected solutions, owning features from design through delivery.
  • Build intuitive and extensible user experiences using modern UI frameworks, ensuring flexibility for customer-specific needs.

ServiceNow is a global market leader that brings innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500. Their intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work.

Europe

  • Design and evolve the enterprise Azure Lakehouse architecture.
  • Lead the transformation of classic Data Warehouse environments into modern Lakehouse.
  • Define and implement architecture principles, standards, patterns, and best practices for data engineering and analytics platforms.

Deutsche Telekom IT Solutions Slovakia, formerly T-Systems Slovakia, has been a part of the Košice region since 2006. They have grown to be the second-largest employer in eastern Slovakia, with over 3,900 employees, aiming to provide innovative information and communication technology services.

$150,000–$170,000/yr
US

  • Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
  • Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
  • Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.

Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

India

  • Design, build, and optimize data pipelines and workflows.
  • Drive scalable data solutions to support business decisions.
  • Contribute to architectural decisions and provide technical leadership.

Jobgether is a platform that uses AI to match candidates with jobs. They focus on ensuring fair and objective reviews of applications by using AI to identify top-fitting candidates for hiring companies.

South America

  • Design, develop, and maintain ETL/ELT pipelines on cloud-based data platforms.
  • Build data ingestion, transformation, and orchestration workflows using tools such as Azure Data Factory, Airflow, Fivetran, or similar.
  • Develop transformations and data processing logic using platforms such as Databricks, Snowflake, or equivalent.

Ankura Consulting Group, LLC is an independent global expert services and advisory firm of more than 2,000 professionals. They deliver services and end-to-end solutions to help clients at critical inflection points related to conflict, crisis, performance, risk, strategy, and transformation.

Europe

  • Maintain, configure, and optimize the existing data warehouse platform and pipelines.
  • Design and implement incremental data integration solutions prioritizing data quality, performance, and cost-efficiency.
  • Drive innovation by experimenting with new technologies and recommending platform improvements.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements.

US Unlimited PTO

  • Design and implement robust data infrastructure in AWS, using Spark with Scala.
  • Evolve our core data pipelines to scale efficiently with our massive growth.
  • Store data in optimal engines and formats, matching your designs to our performance needs and cost factors.

tvScientific is the first CTV advertising platform purpose-built for performance marketers. They leverage data and cutting-edge science to automate and optimize TV advertising to drive business outcomes. tvScientific is built by industry leaders with history in programmatic advertising, digital media, and ad verification.

$86,000–$138,000/yr
US

  • Work with the system engineering team to understand customer business needs and priorities.
  • Implement Azure cloud analytics and databases, leveraging tools such as Azure Synapse, Azure SQL, Synapse pipelines, Databricks, and related frameworks.
  • Develop and maintain database solutions for managing structured and unstructured data.

Peraton is a national security company that drives missions of consequence spanning the globe. They deliver trusted, highly differentiated solutions and technologies to protect our nation and allies, serving as a valued partner to essential government agencies, supporting every branch of the U.S. armed forces.

$135,500–$200,000/yr
US

  • Architect, design, implement, and operate end-to-end data engineering solutions.
  • Develop and manage robust data integrations with external vendors.
  • Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.

SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. Reaching over 59 million people each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.

$194,400–$305,500/yr
US

  • Play a senior tech lead and architect role to build world-class data solutions and applications that power crucial business decisions throughout the organization.
  • Enable a world-class engineering practice, drive the approach with which we use data, develop backend systems and data models to serve insight needs, and play an active role in building Atlassian's data-driven culture.
  • Maintain a high bar for operational data quality and proactively address performance, scale, complexity, and security considerations.

At Atlassian, they're motivated by a common goal: to unleash the potential of every team. Their software products help teams all over the planet and their solutions are designed for all types of work. They ensure that their products and culture continue to incorporate everyone's perspectives and experience, and never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.

US 3w PTO 2w paternity

  • Serve as a primary advisor to identify technical improvements and automation opportunities.
  • Build advanced data pipelines using the medallion architecture in Snowflake.
  • Write advanced ETL/ELT scripts to integrate data into enterprise data stores.

Spring Venture Group is a digital direct-to-consumer sales and marketing company focused on the senior market. They have a dedicated team of licensed insurance agents and leverage technology to help seniors navigate Medicare.

$125,000–$145,000/yr
US

  • Design, build, and scale the lakehouse architecture that underpins analytics, machine learning, and AI.
  • Modernize our data ecosystem, making it discoverable, reliable, governed, and ready for self-service and intelligent automation.
  • Operate anywhere along the data lifecycle, from ingestion and transformation to metadata, orchestration, and MLOps.

OnX is a pioneer in digital outdoor navigation with a suite of apps. With more than 400 employees, they have created regional “Basecamps” to help remote employees find connection and inspiration.

$180,000–$220,000/yr
US Unlimited PTO

  • Design and implement robust, production-grade pipelines using Python, Spark SQL, and Airflow (a minimal DAG sketch follows this list).
  • Lead efforts to canonicalize raw healthcare data into internal models.
  • Onboard new customers by integrating their raw data into internal pipelines and canonical models.
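
A minimal Airflow 2.x DAG sketch for orientation; the task bodies, names, and schedule are placeholders, and a real pipeline would submit Spark SQL jobs rather than print.

```python
# Hypothetical two-step pipeline: ingest raw data, then canonicalize it.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def healthcare_pipeline():
    @task
    def ingest() -> str:
        # A real task would land raw customer files; this one just
        # returns a made-up path for the downstream step.
        return "/data/raw/latest"

    @task
    def canonicalize(raw_path: str) -> None:
        # Stand-in for a Spark SQL job that maps raw records onto
        # internal canonical models.
        print(f"canonicalizing {raw_path}")

    canonicalize(ingest())


healthcare_pipeline()
```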

Machinify is a healthcare intelligence company delivering value, transparency, and efficiency to health plan clients. They serve over 85 health plans, including many of the top 20, representing more than 270 million lives, with an AI-powered platform and expertise.