Design and implement scalable data ingestion and transformation pipelines using Databricks and cloud platforms
Lead architecture decisions for modern data platforms, including Medallion Architecture and Lakehouse patterns
Build and maintain ETL/ELT pipelines using Python and SQL, following engineering best practices
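The Medallion Architecture named above layers data as bronze (raw), silver (cleaned), and gold (business-level). As a rough, library-free sketch of that flow — plain Python stands in for Spark/Delta tables, and all field names and records here are invented:

```python
# Minimal sketch of the Medallion pattern: raw events land in "bronze",
# are cleaned into "silver", and aggregated into "gold".
# Plain Python stands in for Spark/Delta; names and records are invented.

def to_bronze(raw_records):
    # Bronze: keep raw input as-is, tagging provenance.
    return [{**r, "_source": "ingest"} for r in raw_records]

def to_silver(bronze):
    # Silver: drop malformed rows and normalize types.
    silver = []
    for r in bronze:
        if r.get("user_id") is None or r.get("amount") is None:
            continue
        silver.append({"user_id": str(r["user_id"]),
                       "amount": float(r["amount"])})
    return silver

def to_gold(silver):
    # Gold: business-level aggregate (total spend per user).
    totals = {}
    for r in silver:
        totals[r["user_id"]] = totals.get(r["user_id"], 0.0) + r["amount"]
    return totals

raw = [{"user_id": 1, "amount": "9.50"},
       {"user_id": None, "amount": "3.00"},  # malformed; dropped in silver
       {"user_id": 1, "amount": "0.50"}]
gold = to_gold(to_silver(to_bronze(raw)))
print(gold)  # {'1': 10.0}
```

In a real Databricks deployment each layer would be a Delta table written by a scheduled pipeline, not in-memory lists.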
AOT Technologies is a boutique consulting firm that helps enterprises, startups, and governments bring their ideas to life, partnering with them to solve complex, mission-critical challenges. Their teams are collaborative and their leadership is transparent.
Design and implement scalable data architectures to support business needs.
Build and optimize data pipelines, ensuring data accessibility and security.
Develop and maintain data models, databases, and data lakes, with robust data governance.
Terawatt Infrastructure delivers large-scale, turnkey charging solutions for companies rapidly deploying AV and EV fleets. With a growing portfolio of sites across the US, Terawatt is building the permanent transportation and logistics infrastructure of tomorrow through capital, real estate, development, and site operations solutions.
Understand the full picture and help operations with monitoring and dashboards.
Architect, implement, and maintain scalable data architectures.
Develop, optimize, and maintain ETL processes.
Optimize data processing and query performance.
Blueprint Technologies is a technology solutions firm that helps organizations unlock value from existing assets by leveraging cutting-edge technology. Their teams have unique perspectives and years of experience across multiple industries. They believe in unique perspectives and build teams of people with diverse skillsets and backgrounds.
Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimize data models, and enable self-serve analytics and AI.
End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.
Trustonic makes smartphones affordable, enabling global access to devices and digital finance through secure smartphone locking technology. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. The company celebrates its diversity and is looking to do the right thing: for each other, the community and the planet.
Build and lead a team of 4-5 data engineers focused on reusable product artifacts
Own the product data engineering backlog in partnership with product management
Define and enforce technical standards for notebooks, pipelines, QC modules, and documentation
Qualified Health is redefining what’s possible with Generative AI in healthcare. They provide the guardrails for safe AI governance, healthcare-specific agent creation, and real-time algorithm monitoring, working alongside leading health systems to drive real change. They are a fast-growing company backed by premier investors.
Lead the implementation of a resilient, privacy-first data platform architecture.
Lead the design, infrastructure, and tooling decisions for platform optimization.
Develop AI-ready architecture by creating semantic layers that define and standardize business logic.
Headspace provides access to lifelong mental health support. They combine evidence-based content, clinical care, and innovative technology to help millions of members around the world get support that’s effective and personalized. They value connecting with courage, ownership, and iterating to great.
Design, develop, and maintain data pipelines using Azure Databricks
Build and optimize data transformations using PySpark and SQL in Databricks
Implement and maintain Lakehouse architectures using Delta Lake
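Lakehouse work on Delta Lake typically centers on `MERGE` (upsert) into Delta tables — in PySpark, via `DeltaTable.merge`. A plain-Python sketch of just the merge semantics (matched rows updated, unmatched rows inserted); the key and row contents are invented:

```python
# Sketch of Delta-style MERGE (upsert) semantics in plain Python.
# A real implementation would use DeltaTable.merge in PySpark; this only
# illustrates the matched-update / not-matched-insert logic.

def merge_upsert(target, updates, key):
    # target, updates: lists of row dicts; key: the join column name.
    merged = {row[key]: row for row in target}
    for row in updates:
        # Matched key: overlay new values. Unmatched key: insert as new row.
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "status": "old"}, {"id": 2, "status": "old"}]
updates = [{"id": 2, "status": "new"}, {"id": 3, "status": "new"}]
print(merge_upsert(target, updates, "id"))
# [{'id': 1, 'status': 'old'}, {'id': 2, 'status': 'new'}, {'id': 3, 'status': 'new'}]
```

Delta Lake additionally makes this atomic via its transaction log, which the sketch does not model.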
Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation, supporting digital transformation for some of the world's largest enterprises. They employ nearly 1,000 full-time professionals, and their annual growth rate exceeds 25%.
Lead, mentor, and scale a high-performing data engineering team.
Design and evolve our core data infrastructure on AWS, Apache Airflow, and Apache Spark.
Tekmetric is an all-in-one, cloud-based platform helping auto repair shops run smarter, grow faster, and serve customers better. Officially founded in Houston in 2017, Tekmetric has grown from a single shop’s vision to the industry’s leading solution. They value transparency, integrity, innovation, and a service-first mindset.
Own and evolve the data infrastructure that powers Clever's core data products.
Maintain and improve data pipeline reliability, monitoring and resolving pipeline failures.
Design and implement ingestion for new operational data sources that support Clever's speed-to-match initiative.
Clever Real Estate is a venture-backed technology company aiming to revolutionize real estate transactions. They have built a leading online education platform helping consumers save money and have earned a 4.9 TrustPilot rating with over 3,800 reviews.
Contribute to architecture and implement robust data pipelines.
Drive the creation of a secure, compliant, and privacy-focused data warehousing solution.
Partner with the data analytics team to deliver a data platform that supports accurate, actionable reporting.
Headspace provides access to lifelong mental health support. They combine evidence-based content, clinical care, and innovative technology to help millions of members around the world get support that’s effective and personalized. The company values making the mission matter, iterating to great, owning the outcome, and connecting with courage.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
Build and optimize data storage and processing systems (data warehouses, data lakes, and big data platforms) using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, enabling efficient data retrieval and analysis
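The warehouse-style load-and-query step these bullets describe can be sketched with the standard library — here `sqlite3` stands in for Amazon Redshift, and in practice the load would go through S3 plus `COPY` or an AWS Glue job; table and column names are invented:

```python
# Sketch of loading rows into a warehouse-style store and running an
# analytical query. sqlite3 is a stand-in for Amazon Redshift; the real
# load path would be S3 + COPY or AWS Glue. Schema is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL)")

# "Ingest" step: bulk-load transformed rows.
rows = [(1, "us-east", 20.0), (2, "us-east", 5.0), (3, "eu-west", 7.5)]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

# "Analysis" step: per-region totals for reporting.
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"))
print(totals["us-east"], totals["eu-west"])  # 25.0 7.5
```

The same shape scales up: the pipeline writes to the warehouse, and analysts query aggregates rather than raw events.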
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.
Design, build, and maintain scalable data pipelines.
Develop and optimize ETL/ELT processes using cloud data technologies.
Partner with teams to understand data requirements and improve data capture strategies.
Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams of people with diverse skillsets and backgrounds.
Shape, scale, and govern our modern data ecosystem.
Deliver high quality data products that power clinical, operational, financial, and analytical outcomes.
Work closely with teams across the organization to deliver governed, high-quality, analytics-ready data at scale.
Interwell Health is a kidney care management company that partners with physicians. They aim to reimagine healthcare and help patients live their best lives, driven by a mission to help people and create better ways where none exist.
Design and build mission-critical data pipelines with a highly scalable distributed architecture.
Help continually improve ongoing reporting and analysis processes, simplifying self-service support for business stakeholders.
Build and support reusable frameworks to ingest, integrate, and provision data
StockX is a Detroit-based technology leader focused on the online market for sneakers, apparel, accessories, electronics, collectibles, trading cards, and more. They employ 1,000 people across offices and verification centers around the world, and their platform connects buyers and sellers using dynamic pricing mechanics.
Maintain and continuously improve your technical expertise to be an Airflow expert.
Work with customers to educate and guide them regarding Airflow best practices.
Collaborate with team members to design, prototype, and implement engineering solutions.
Astronomer empowers data teams to bring software, analytics, and AI to life and is behind Astro, the unified DataOps platform powered by Apache Airflow®. They are trusted by more than 800 of the world's leading enterprises, letting businesses do more with their data.
Design, develop, and maintain scalable data pipelines using cloud data services.
Serve as a technical leader, defining data engineering standards and best practices.
Lead the design and implementation of optimized data models in our cloud data warehouse.
Constant Contact empowers people by giving them the help and tools they need to grow online. They are energized by new challenges and possibilities, and they celebrate diversity and inclusion with programs in place to bring people together.