Source Job

Philippines

  • Expertise in designing and implementing logical and physical data models for cloud and hybrid data warehouse environments
  • Implementing data architectures to support a variety of data formats and structures including structured, semi-structured and unstructured data
  • Experience with multiple full life-cycle data warehouse implementations

Python SQL PySpark API Data Modeling

20 jobs similar to Senior Consultant - Data Architecture & Engineering

Jobs ranked by similarity.

$96,500–$144,800/yr
US 3w PTO

  • Lead support of the client’s Azure Data platform and Power BI environment, including responding to escalations and helping analyze and resolve incidents in customers’ environments.
  • Consult, develop, and advise on solutions in Microsoft Azure with tools such as Synapse, Data Factory, Databricks, Azure ML, Data Lake, Data Warehouse, and Power BI.
  • Consistently learn, apply, and refine skills around data engineering and data analytics.

3Cloud hires people who aren’t afraid to experiment or fail and who are willing to give direct and candid feedback. They hire people who challenge and hold each other accountable for living 3Cloud’s core values because they know that it will result in amazing experiences and solutions for clients.

$103,100–$154,700/yr
US 3w PTO

  • Play a key role in designing, developing, and delivering modern data solutions that drive business insight and innovation.
  • Implement scalable, high-performing cloud architectures that support analytics, AI, and operational excellence.
  • Be responsible for technical delivery, authoring solution documentation, and ensuring data pipelines and models meet enterprise standards for performance, reliability, and cost efficiency.

3Cloud is a company where people aren’t afraid to experiment or fail. They hire people who care about the collective growth and success of the company and who challenge each other to live by 3Cloud’s core values, resulting in amazing experiences and solutions for clients and each other.

Europe

  • Design and evolve the enterprise Azure Lakehouse architecture.
  • Lead the transformation of classic Data Warehouse environments into a modern Lakehouse architecture.
  • Define and implement architecture principles, standards, patterns, and best practices for data engineering and analytics platforms.

Deutsche Telekom IT Solutions Slovakia, formerly T-Systems Slovakia, has been a part of the Košice region since 2006. They have grown to be the second-largest employer in eastern Slovakia, with over 3900 employees, aiming to provide innovative information and communication technology services.

$120,000–$160,000/yr
US

  • Design and implement scalable, reliable, and efficient data pipelines to support clinical, operational, and business needs.
  • Optimize data storage and processing in data lakes and cloud data warehouses (Azure, Databricks).
  • Proactively suggest improvements to infrastructure, processes, and automation to improve system efficiency, reduce costs, and enhance performance.

Care Access is dedicated to ensuring that every person has the opportunity to understand their health, access the care they need, and contribute to the medical breakthroughs of tomorrow. They are working to make the future of health better for all and have hundreds of research locations, mobile clinics, and clinicians across the globe.

Brazil

  • Writing highly maintainable and performant Python/PySpark code.
  • Understanding of Cloud environments, particularly Microsoft Azure and data orchestration systems.
  • Working with data lakes and understanding common data transformation and storage formats.

YLD helps clients build the skills and capabilities they need to stay ahead of the competition. They are a remote-first consultancy specializing in software engineering, product design, and data with teams based across London, Lisbon, and Porto.

South America

  • Design, develop, and maintain ETL/ELT pipelines on cloud-based data platforms.
  • Build data ingestion, transformation, and orchestration workflows using tools such as Azure Data Factory, Airflow, Fivetran, or similar.
  • Develop transformations and data processing logic using platforms such as Databricks, Snowflake, or equivalent.

Ankura Consulting Group, LLC is an independent global expert services and advisory firm. They deliver services and end-to-end solutions to help clients at critical inflection points related to conflict, crisis, performance, risk, strategy, and transformation, and consists of more than 2000 professionals.

North America

  • Define and govern enterprise data architecture standards.
  • Ensure interoperability and high data quality.
  • Drive innovation and strategic architecture direction.

Jobgether connects job seekers with partner companies through an AI-powered matching process. Their system quickly and fairly reviews applications against core requirements, ensuring top candidates are shared with hiring companies.

$230,000–$265,000/yr
US Unlimited PTO

  • Design and build robust, highly scalable data pipelines and lakehouse infrastructure with PySpark, Databricks, and Airflow on AWS.
  • Improve the data platform development experience for Engineering, Data Science, and Product by creating intuitive abstractions, self‑service tooling, and clear documentation.
  • Own and maintain core data pipelines and models that power internal dashboards, ML models, and customer-facing products.

Parafin aims to grow small businesses by providing them with the financial tools they need through the platforms they already sell on. They are a Series C company backed by prominent venture capitalists, with a tight-knit team of innovators from companies like Stripe, Square, and Coinbase.

$200,000–$220,000/yr
US Unlimited PTO

  • Design, develop, and maintain dbt data models that support our healthcare analytics products.
  • Integrate and transform customer data to conform to our data specifications and pipelines.
  • Design and execute initiatives that improve data platform and pipeline automation and resilience.

SmarterDx, a Smarter Technologies company, builds clinical AI that is transforming how hospitals translate care into payment. Founded by physicians in 2020, the company connects clinical context with revenue intelligence, helping health systems recover millions in missed revenue, improve quality scores, and appeal every denial.

India

  • Lead the design and implementation of scalable ETL pipelines and data lakes in AWS
  • Develop and optimise data architectures for terabyte-scale relational and distributed data systems
  • Collaborate with Data Scientists, Software Engineers, and Architects to integrate data solutions into analytics platforms and applications

Smart Working connects skilled professionals with outstanding global teams for full-time, long-term roles. They help discover meaningful work with teams that invest in your success, empowering you to grow personally and professionally in a remote-first world.

$150,000–$170,000/yr
US

  • Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
  • Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
  • Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.

Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

US

  • Architect and implement Databricks Lakehouse solutions for large-scale data platforms.
  • Design and optimize batch & streaming data pipelines using Apache Spark (PySpark/SQL).
  • Implement Delta Lake best practices (ACID, schema enforcement, time travel, performance tuning).
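
One of the Delta Lake practices named above, schema enforcement, can be sketched without Spark at all. The snippet below is a plain-Python illustration of the idea (reject writes whose rows don't match the table's declared schema), not the actual Delta Lake API; the schema and field names are made up.

```python
# Illustrative sketch of Delta Lake-style schema enforcement in plain
# Python (no Spark or Delta dependency): batches whose rows do not match
# the table's declared schema are rejected instead of silently corrupting
# downstream data. EXPECTED_SCHEMA is a hypothetical example.
EXPECTED_SCHEMA = {"event_id": int, "user_id": int, "amount": float}

def enforce_schema(rows):
    """Validate a batch against the expected schema before 'writing' it."""
    for i, row in enumerate(rows):
        if set(row) != set(EXPECTED_SCHEMA):
            raise ValueError(
                f"row {i}: columns {sorted(row)} != {sorted(EXPECTED_SCHEMA)}"
            )
        for col, typ in EXPECTED_SCHEMA.items():
            if not isinstance(row[col], typ):
                raise TypeError(f"row {i}: {col!r} must be {typ.__name__}")
    return rows

good = [{"event_id": 1, "user_id": 7, "amount": 9.99}]
enforce_schema(good)                                 # accepted
try:
    enforce_schema([{"event_id": 2, "user_id": 8}])  # missing "amount"
except ValueError as err:
    print("rejected:", err)
```

In real Delta Lake this check happens at write time against the table's stored schema; the sketch just makes the rule explicit.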

They are looking for a Databricks Architect to design and lead modern Lakehouse data platforms using Databricks. The role focuses on building scalable, high-performance data pipelines and enabling analytics and AI use cases on cloud-native data platforms.

US

  • Design, build, and optimize scalable data lakes, warehouses, and data pipelines using Snowflake and modern cloud platforms.
  • Develop and maintain robust data models (ELT/ETL), ensuring clean, reliable, and well-documented datasets.
  • Design and manage end-to-end data pipelines that ingest, transform, and unify data from multiple systems.

Cobalt Service Partners is building the leading commercial access and security integration business in North America. Backed by Alpine Investors, with $15B+ in AUM, Cobalt has scaled rapidly since launch through acquisitions and is building a differentiated, data-driven platform.

$86,000–$138,000/yr
US

  • Work with the system engineering team to understand customer business needs and priorities.
  • Implement Azure cloud analytics and databases, leveraging tools such as Azure Synapse, Azure SQL, Synapse pipelines, Databricks, and related frameworks.
  • Develop and maintain database solutions for managing structured and unstructured data.

Peraton is a national security company that drives missions of consequence spanning the globe. They deliver trusted, highly differentiated solutions and technologies to protect our nation and allies, serving as a valued partner to essential government agencies, supporting every branch of the U.S. armed forces.

$90,000–$150,000/yr
US

  • Lead discovery conversations to understand client goals.
  • Design and deliver technical roadmaps for data platform adoption.
  • Build modern, reliable data pipelines and ETL/ELT frameworks.

InterWorks is a tech consultancy that empowers clients with customized, collaborative solutions. They value unique contributions, and their people are the glue that holds their business together as they pursue innovation alongside people who inspire them.

US Unlimited PTO

  • Design and implement scalable, performant data models.
  • Develop and optimize processes to improve the correctness of third-party data.
  • Implement data quality principles to raise the bar for reliability of data.
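
The "data quality principles" bullet above can be made concrete with a small rule-based check. This is a generic sketch, not SmithRx's pipeline; the rules and field names (NDC drug code, quantity) are hypothetical examples chosen to fit a pharmacy-claims setting.

```python
# Minimal sketch of rule-based data-quality checks: each rule is a
# predicate over a record, and records failing any rule are quarantined
# with the names of the rules they broke. Field names are hypothetical.
def check_quality(records, rules):
    """Split records into those passing every rule and those failing any."""
    passed, failures = [], []
    for rec in records:
        broken = [name for name, rule in rules.items() if not rule(rec)]
        if broken:
            failures.append({"record": rec, "broken_rules": broken})
        else:
            passed.append(rec)
    return passed, failures

RULES = {
    "ndc_present": lambda r: bool(r.get("ndc")),         # drug code exists
    "qty_positive": lambda r: r.get("quantity", 0) > 0,  # positive quantity
}

claims = [
    {"ndc": "00002-3227", "quantity": 30},
    {"ndc": "", "quantity": 30},
]
ok, bad = check_quality(claims, RULES)
print(len(ok), len(bad))  # → 1 1
```

Keeping rules as named predicates makes failures auditable: each quarantined record carries the exact checks it failed.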

SmithRx is a venture-backed Health-Tech company disrupting the Pharmacy Benefit Management (PBM) sector with a next-generation drug acquisition platform. They have a mission-driven and collaborative culture that inspires employees to transform the U.S. healthcare system.

US

  • Design and implement data solutions for enterprise customers.
  • Create and maintain technical documentation and architectural diagrams.
  • Ensure quality and governance standards are met throughout the engineering lifecycle.

Jobgether is a company that helps candidates get hired. They use an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements.

Slovakia

  • Analyzes complex data elements, systems, data flows, dependencies, and relationships to contribute to conceptual, logical, and physical data models.
  • Designs, builds, implements, and maintains database structures to support business needs, including integrating vendor database solutions into the data environment.
  • Develops and manages scalable data processing platforms used for exploratory data analysis and real-time analytics.

Deutsche Telekom IT Solutions Slovakia has been part of the Košice region since 2006. They have grown to be the second-largest employer in the eastern part of the country, with more than 3900 employees. Their goal is to proactively find new ways to improve and to continuously transform into a company providing innovative information and communication technology services.

US

  • Design, build, and maintain scalable data pipelines using Microsoft Fabric and Apache Airflow
  • Ingest, transform, and integrate data from a variety of sources, including relational systems, APIs, and MongoDB
  • Design and maintain analytical data models, including fact and dimension tables, to support reporting and analytics
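
The fact and dimension tables mentioned above follow the classic star-schema pattern: a narrow fact table of measures referencing descriptive dimension tables by surrogate key. The toy example below illustrates the shape only; the table and field names are hypothetical, not Theoria's actual model.

```python
# Tiny star-schema illustration: fact rows hold keys and measures,
# dimensions hold descriptive attributes, and a report resolves the
# keys at read time. All names here are made-up examples.
dim_facility = {1: {"facility_name": "Sunrise Post-Acute", "state": "MI"}}
dim_date = {20240105: {"year": 2024, "month": 1}}

fact_visits = [
    {"facility_key": 1, "date_key": 20240105, "visit_count": 3},
]

# A reporting query joins the fact table to its dimensions by key.
report = [
    {
        "facility": dim_facility[f["facility_key"]]["facility_name"],
        "month": dim_date[f["date_key"]]["month"],
        "visits": f["visit_count"],
    }
    for f in fact_visits
]
print(report)  # → [{'facility': 'Sunrise Post-Acute', 'month': 1, 'visits': 3}]
```

Separating measures from attributes this way keeps the fact table compact and lets dimensions change without rewriting facts.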

Theoria Medical is a comprehensive medical group and technology company dedicated to serving patients across the care continuum with an emphasis on post-acute care and primary care. Theoria serves facilities across the United States with a multitude of services to improve the quality of care delivered, refine facility processes, and enhance critical relationships.

$125,000–$150,000/yr
US

  • Build production-grade data systems: write clean, modular, well-tested Python code for data pipelines and platform services.
  • Design and maintain data models: implement relational data models aligned with medallion architectures (bronze/silver/gold).
  • Own transformation and data quality: implement idempotent transformation logic using SQLMesh/Tobiko (preferred) or dbt.
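
The medallion and idempotency ideas in the bullets above can be sketched together: a silver-layer transform that upserts by a natural key, so replaying the same bronze input never duplicates rows. This is a generic pure-Python illustration under assumed names, not SQLMesh, dbt, or Axle's actual pipeline.

```python
# Toy medallion flow with an idempotent silver step: cleaning is
# deterministic, and keying the write by record id means re-running the
# transform (e.g. after a backfill or retry) overwrites rather than
# appends. Table and field names are hypothetical.
def to_silver(bronze_rows, silver_table):
    """Upsert cleaned bronze rows into the silver table, keyed by id."""
    for row in bronze_rows:
        cleaned = {"id": row["id"], "name": row["name"].strip().title()}
        silver_table[cleaned["id"]] = cleaned  # overwrite, never append
    return silver_table

bronze = [
    {"id": 1, "name": "  ada lovelace "},
    {"id": 1, "name": "  ada lovelace "},  # duplicate delivery from source
]
silver = {}
to_silver(bronze, silver)
to_silver(bronze, silver)  # replay: same result, no duplicates
print(silver)  # → {1: {'id': 1, 'name': 'Ada Lovelace'}}
```

In SQLMesh or dbt the same property comes from incremental models with a unique key, where a rerun merges on the key instead of inserting blindly.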

Axle is a bioscience and information technology company that offers advancements in translational research, biomedical informatics, and data science applications to research centers and healthcare organizations nationally and abroad. With experts in biomedical science, software engineering, and program management, they focus on developing and applying research tools and techniques to empower decision-making and accelerate research discoveries.