20 jobs similar to Databricks Architect

Jobs ranked by similarity.

US

  • Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark.
  • Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
  • Collaborate with cross-functional teams to deliver impactful data solutions.

Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. They connect top-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.

US

  • Design, build, and maintain scalable ETL pipelines for large-scale data processing.
  • Implement data transformations and workflows using PySpark at an intermediate to advanced level.
  • Optimize pipelines for performance, scalability, and cost efficiency across environments.

Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York. Their team of 600+ highly skilled tech professionals, based in Latin America, drives digital disruption by partnering with U.S. companies on their most impactful projects.

$120,000–$160,000/yr
US

  • Design and implement scalable, reliable, and efficient data pipelines to support clinical, operational, and business needs.
  • Optimize data storage and processing in data lakes and cloud data warehouses (Azure, Databricks).
  • Proactively suggest improvements to infrastructure, processes, and automation to improve system efficiency, reduce costs, and enhance performance.

Care Access is dedicated to ensuring that every person has the opportunity to understand their health, access the care they need, and contribute to the medical breakthroughs of tomorrow. They are working to make the future of health better for all and have hundreds of research locations, mobile clinics, and clinicians across the globe.

Europe

  • Design and evolve the enterprise Azure Lakehouse architecture.
  • Lead the transformation of classic Data Warehouse environments into modern Lakehouse architectures.
  • Define and implement architecture principles, standards, patterns, and best practices for data engineering and analytics platforms.

Deutsche Telekom IT Solutions Slovakia, formerly T-Systems Slovakia, has been a part of the Košice region since 2006. They have grown to be the second-largest employer in eastern Slovakia, with over 3900 employees, aiming to provide innovative information and communication technology services.

$90,200–$130,800/yr
US 3w PTO

  • Lead support of the client’s Azure Data platform and Power BI environment.
  • Consult, develop, and advise on solutions in Microsoft Azure and Power BI.
  • Proactively mentor junior team members and actively give feedback.

3Cloud is a company where people roll up their sleeves to take on tough problems together. They hire people who aren’t afraid to experiment or fail and who are willing to give direct and candid feedback, so they can deliver amazing experiences and solutions to their clients.

Canada

  • Define the modern cloud-based data architecture, owning key architectural decisions.
  • Architect and operationalize Medallion data models, ensuring governance and reusability.
  • Design, develop, and deploy modern data warehouse and Lakehouse solutions.

Kinaxis is a tech company that provides supply chain management solutions. They have grown to a global organization with over 2000 employees and are recognized as one of Canada’s Top Employers, fostering a culture of innovation and collaboration.

  • Architect and evolve our data platform using Snowflake and dbt.
  • Design and implement batch and streaming pipelines in AWS.
  • Define and enforce data governance, security, and privacy policies.

BlueMatrix and Street Context develop web applications for the authoring, distribution, and analysis of investment research, and for internal knowledge management and digital communications. The BlueMatrix and Street Context teams are made up of ambitious and passionate people who have turned technology development and client service into an art.

$125,000–$150,000/yr
US

  • Design, implement, and optimize robust and scalable data pipelines using SQL, Python, and cloud-based ETL tools such as Databricks.
  • Enhance our overarching data architecture strategy, assisting in decisions related to data storage, consumption, integration, and management within cloud environments.
  • Partner with data scientists, BI teams, and other engineering teams to understand and translate complex data requirements into actionable engineering solutions.

The New York Blood Center is a medical organization. They are looking for a Senior Data Engineer to join their team.

US

  • Define and enforce data architecture and engineering standards across products and teams.
  • Design scalable, secure, and optimized cloud data architectures that align with business goals.
  • Collaborate with engineering and product teams to translate requirements into high-level and detailed system designs.

Jobgether connects candidates with partner companies using an AI-powered matching process. They ensure applications are reviewed quickly and fairly, and their system identifies top candidates to share with the hiring company.

Global

  • Assist in designing and implementing Snowflake-based analytics solutions.
  • Build and maintain data pipelines adhering to enterprise architecture principles.
  • Act as a technical leader within the team, ensuring quality deliverables.

Jobgether is a company that uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

India

  • Design, build, and optimize data pipelines and workflows.
  • Drive scalable data solutions to support business decisions.
  • Contribute to architectural decisions and provide technical leadership.

Jobgether is a platform that uses AI to match candidates with jobs. They focus on ensuring fair and objective reviews of applications by using AI to identify top-fitting candidates for hiring companies.

$194,400–$305,500/yr
US

  • Play a senior tech lead and architect role, building world-class data solutions and applications that power crucial business decisions throughout the organization.
  • Enable a world-class engineering practice, drive the approach with which data is used, develop backend systems and data models that serve insights needs, and play an active role in building Atlassian's data-driven culture.
  • Maintain a high bar for operational data quality and proactively address performance, scale, complexity, and security considerations.

At Atlassian, they're motivated by a common goal: to unleash the potential of every team. Their software products help teams all over the planet and their solutions are designed for all types of work. They ensure that their products and culture continue to incorporate everyone's perspectives and experience, and never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.

$135,000–$165,000/yr
US Unlimited PTO

  • Design, build, and maintain scalable data pipelines.
  • Develop and implement data models for analytical use cases.
  • Implement data quality checks and governance practices.

MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.

$110,572–$145,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis.

ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

US Unlimited PTO 20w maternity 14w paternity

  • Build and operate data pipelines using AWS-native data tools and distributed processing frameworks.
  • Operate and improve core data platform services, addressing incidents, performance issues, and operational toil.
  • Partner with data producers and consumers to onboard pipelines, troubleshoot issues, and improve platform usability.

Fetch is a platform where millions of people earn rewards for buying brands they love, and a whole lot more. With investments from SoftBank, Univision, and Hamilton Lane, and partnerships with Fortune 500 companies, it is reshaping how brands and consumers connect in the marketplace. Ranked as one of America’s Best Startup Employers by Forbes, Fetch fosters a people-first culture rooted in trust, accountability, and innovation.

$190,000–$220,000/yr
US

  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real-time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.

TRM is a blockchain intelligence company on a mission to build a safer financial system. They are a lean, high-impact team tackling critical challenges, and they empower governments, financial institutions, and crypto companies.

US Unlimited PTO

  • Design, build, and maintain pipelines that power all data use cases.
  • Develop intuitive, performant, and scalable data models that support product features.
  • Pay down technical debt, improve automation, and follow best practices in data modeling.

Patreon is a media and community platform where over 300,000 creators give their biggest fans access to exclusive work and experiences. They are leaders in the space, with over $10 billion generated by creators since Patreon's inception, with a team passionate about their mission.

US Unlimited PTO

  • Design and implement robust data infrastructure in AWS, using Spark with Scala.
  • Evolve our core data pipelines to efficiently scale for our massive growth.
  • Store data in optimal engines and formats, matching your designs to our performance needs and cost factors.

tvScientific is the first CTV advertising platform purpose-built for performance marketers. They leverage data and cutting-edge science to automate and optimize TV advertising to drive business outcomes. tvScientific is built by industry leaders with history in programmatic advertising, digital media, and ad verification.

US

  • Build and maintain scalable data pipelines using Snowflake OpenFlow and related Snowflake-native tools.
  • Develop and maintain Snowflake semantic views that support analytics and reporting needs.
  • Deliver clean, governed data sets for Sigma dashboards and embedded analytics use cases.

They are building the next-generation analytics stack centered on Snowflake (AWS) and Sigma. They value diverse perspectives and innovation.

US

  • Shape the design and implementation of scalable data architecture solutions.
  • Collaborate with cross-functional teams to translate business needs into technical designs.
  • Drive enterprise-wide strategies and optimize data solutions.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. The system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.