Design, develop, and optimize EDP data pipelines using Python, Airflow, DBT, and Snowflake for scalable financial data processing.
Build performant Snowflake data models and DBT transformations following best practices, standards, and documentation guidelines.
Own ingestion, orchestration, monitoring, and SLA-driven workflows; proactively troubleshoot failures and improve reliability.
Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation, supporting digital transformation for some of the world's largest enterprises. They employ nearly 1,000 full-time professionals, and their culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed.
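A minimal sketch of the kind of SLA-driven Airflow orchestration this role describes; the DAG id, schedule, and script paths are hypothetical:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily load: ingest raw files, then run dbt models in Snowflake.
default_args = {
    "owner": "data-eng",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
    "sla": timedelta(hours=1),  # surface task runs finishing >1h past schedule
}

with DAG(
    dag_id="edp_daily_load",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_files",
        bash_command="python /opt/pipelines/ingest.py",    # hypothetical script
    )
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/edp",  # hypothetical path
    )
    ingest >> transform
```

In practice the bash tasks would likely be provider operators, but the SLA and retry wiring that makes failures visible is the same.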
Design, build, and optimize scalable data lakes, warehouses, and data pipelines using Snowflake and modern cloud platforms.
Develop and maintain robust data models (ELT/ETL), ensuring clean, reliable, and well-documented datasets.
Design and manage end-to-end data pipelines that ingest, transform, and unify data from multiple systems.
Cobalt Service Partners is building the leading commercial access and security integration business in North America. Backed by Alpine Investors, with $15B+ in AUM, Cobalt has scaled rapidly since launch through acquisitions and is building a differentiated, data-driven platform.
Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Serve as a primary advisor to identify technical improvements and automation opportunities.
Build advanced data pipelines using the medallion architecture in Snowflake.
Write advanced ETL/ELT scripts to integrate data into enterprise data stores.
Spring Venture Group is a digital direct-to-consumer sales and marketing company focused on the senior market. They have a dedicated team of licensed insurance agents and leverage technology to help seniors navigate Medicare.
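A minimal sketch of a bronze-to-silver promotion in the medallion pattern described above, using the Snowflake Python connector; the account details and table/column names are hypothetical:

```python
import os

import snowflake.connector

# Hypothetical connection; credentials would come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="EDW",
)

# Medallion layering: bronze holds raw landed rows; silver holds cleansed,
# deduplicated records; gold (not shown) holds business-level aggregates.
with conn.cursor() as cur:
    cur.execute("""
        CREATE OR REPLACE TABLE silver.customers AS
        SELECT customer_id,
               TRIM(LOWER(email))       AS email,
               TRY_TO_DATE(signup_date) AS signup_date
        FROM bronze.customers_raw
        QUALIFY ROW_NUMBER() OVER (
            PARTITION BY customer_id ORDER BY loaded_at DESC
        ) = 1  -- keep only the latest record per customer
    """)
conn.close()
```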
Monitor and support ETL processes moving data from on-premises or hosted servers into Snowflake.
Design and maintain aggregate and reporting tables optimized for Tableau and Power BI dashboards.
Optimize Snowflake performance and cost, including warehouse usage, query tuning, and table design.
Affinitiv is the largest provider of end-to-end, data-driven marketing and software solutions exclusively focused on the automotive customer lifecycle. Backed by 20+ years of automotive and marketing expertise, they work with over 6,500 dealerships and every major manufacturer in the country.
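A sketch of two of the cost and performance levers named above, assuming the Snowflake Python connector; the warehouse name and thresholds are hypothetical:

```python
import snowflake.connector  # connection setup as in any Snowflake ETL job

# Find the queries worth tuning via ACCOUNT_USAGE before touching anything.
EXPENSIVE_QUERIES_SQL = """
    SELECT query_id,
           warehouse_name,
           total_elapsed_time / 1000 AS seconds,
           partitions_scanned,
           partitions_total
    FROM snowflake.account_usage.query_history
    WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 20
"""

def tune(conn: snowflake.connector.SnowflakeConnection) -> list:
    with conn.cursor() as cur:
        # Stop paying for idle time between dashboard refreshes.
        cur.execute(
            "ALTER WAREHOUSE REPORTING_WH SET AUTO_SUSPEND = 60 AUTO_RESUME = TRUE"
        )
        # Worst offenders first; a high scanned-to-total partition ratio on a
        # reporting table suggests missing clustering or an unselective filter.
        cur.execute(EXPENSIVE_QUERIES_SQL)
        return cur.fetchall()
```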
Design, build, and optimize scalable data pipelines using Databricks, Apache Spark, Delta Lake, and Unity Catalog.
Develop ingestion frameworks for structured and semi‑structured data from multiple enterprise sources.
Implement data governance, data quality, and security controls across the data lifecycle.
Bridgenext is a digital consulting services leader that helps clients innovate with intention and realize their digital aspirations by creating digital products, experiences, and solutions around what real people need. Their global consulting and delivery teams facilitate highly strategic digital initiatives through digital product engineering, automation, data engineering, and infrastructure modernization services.
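A minimal PySpark sketch of the kind of Databricks pipeline described above, landing semi-structured data in a Delta table; the landing path and Unity Catalog names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

# Assumes a Databricks cluster, where Delta Lake is available by default.
spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Ingest semi-structured JSON, normalize types, and deduplicate on the key.
raw = spark.read.json("/Volumes/main/landing/orders/")  # hypothetical path

clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
       .dropDuplicates(["order_id"])
)

# Write a Delta table under a Unity Catalog three-level name (hypothetical),
# so governance and access controls apply from the first load.
(clean.write
      .format("delta")
      .mode("append")
      .saveAsTable("main.sales.orders"))
```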
Design, develop, and optimize data architecture and pipelines aligned with ETL/ELT principles.
Architect workflows using DBT to convert raw data into actionable analytics.
Maintain production data pipelines with Python, DBT, Matillion, and Snowflake.
Jobgether is a platform that connects job seekers with partner companies. They use AI-powered matching to ensure applications are reviewed quickly and fairly.
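One way to express a DBT transformation in this stack is a dbt Python model, which dbt can run on Snowflake via Snowpark; the model, ref, and column names here are hypothetical:

```python
# models/marts/customer_revenue.py — a dbt Python model (the SQL-model form
# is more common; this is the Snowpark variant dbt supports on Snowflake).
import snowflake.snowpark.functions as F

def model(dbt, session):
    dbt.config(materialized="table")
    orders = dbt.ref("stg_orders")  # hypothetical upstream dbt model
    # Convert raw order rows into an analytics-ready per-customer rollup.
    return (
        orders.group_by("customer_id")
              .agg(F.sum("amount").alias("lifetime_revenue"),
                   F.count("order_id").alias("order_count"))
    )
```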
Build and Maintain Bronze/Silver Layer Pipelines: ensure data from core sources lands accurately, on time, and with full lineage.
Lead Data Ingestion, Transformation, and Enrichment: own the end-to-end pipeline from raw file landing through cleansed, conformed staging tables, including deduplication, standardization, code mapping, and entity resolution.
Develop Automated Ingestion Pipelines: use Snowpipe, Matillion, or custom solutions with reliability, observability, and minimal manual intervention in mind.
Precision AQ is building a centralized Data Hub to consolidate fragmented data infrastructure, establish enterprise-wide data governance, and enable AI-ready analytics across their life sciences portfolio. They value innovation and a collaborative environment.
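A minimal pandas sketch of the cleansing step between raw landing and conformed staging described above (standardization, code mapping, deduplication); the columns and code map are hypothetical:

```python
import pandas as pd

# Hypothetical source-code mapping applied during bronze-to-silver cleansing.
CODE_MAP = {"M": "male", "F": "female", "U": "unknown"}

def bronze_to_silver(raw: pd.DataFrame) -> pd.DataFrame:
    df = raw.copy()
    df["npi"] = df["npi"].str.strip()            # standardize identifiers
    df["gender"] = df["gender"].map(CODE_MAP).fillna("unknown")
    df["last_name"] = df["last_name"].str.title()
    # Naive entity resolution: the latest record per provider identifier wins.
    df = (df.sort_values("loaded_at")
            .drop_duplicates(subset=["npi"], keep="last"))
    return df
```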
Lead discovery conversations to understand client goals.
Design and deliver technical roadmaps for data platform adoption.
Build modern, reliable data pipelines and ETL/ELT frameworks.
InterWorks is a tech consultancy that empowers clients with customized, collaborative solutions. They value unique contributions, and their people are the glue that holds their business together as they pursue innovation alongside people who inspire them.
Lead the foundational setup of new data environments.
Design, build, and manage scalable data pipelines.
Develop and maintain robust DBT data models to support semantic layer build-outs.
CompassX is a boutique business and technology consulting firm that helps Fortune 500 and high-growth clients deliver their most strategic initiatives through digital and data-driven projects. Recognized as a three-time winner of Consulting Magazine’s Best Boutique Firms to Work For, consultants value the freedom to shape their client work and maintain a direct line to leadership.
Build and maintain scalable data pipelines from ingestion through transformation and delivery.
Design, build, and maintain our data warehouse and data marts.
Partner with stakeholders to translate business needs into clean data models.
Gurobi Optimization develops mathematical optimization software. They empower customers to expand their use of the technology in order to make smarter decisions and solve some of the world's toughest and most impactful business problems.
Assemble and manage large, complex datasets that meet both non-functional and functional business requirements.
Identify, design, and implement internal process improvements to enhance scalability, optimize data delivery, and automate manual processes.
Build and maintain optimal data pipeline architecture for efficient extraction, transformation, and loading of data from various sources.
Sanford Health is one of the largest and fastest-growing not-for-profit health systems in the United States, dedicated to the work of health and healing. The organization has 53,000 employees and serves over 2 million patients across the upper Midwest.
Build and maintain reliable pipelines and datasets that enable Flex models.
Drive pragmatic improvements in architecture, tooling, and operating practices.
Implement testing, lineage/definitions, and guardrails so stakeholders can trust outputs.
Enode is a platform that powers the next generation of green energy apps by connecting and optimizing the world's energy devices through its APIs, enabling flexible demand that prioritizes renewable energy. It is backed by leading investors like Y Combinator, Lowercarbon Capital, and Creandum, and fosters a mission-driven, passionate team.
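A minimal sketch of the kind of guardrail checks mentioned above, assuming a pandas dataset with hypothetical column names; a failed assertion halts the publish step rather than silently shipping bad data:

```python
import pandas as pd

def run_guardrails(df: pd.DataFrame) -> None:
    # Basic trust checks on a device-readings dataset before publishing.
    assert not df.empty, "dataset is empty"
    assert df["device_id"].notna().all(), "null device_id found"
    assert df["device_id"].is_unique, "duplicate device_id found"
    # Freshness check; assumes reported_at is a tz-aware UTC timestamp column.
    lag = pd.Timestamp.now(tz="UTC") - df["reported_at"].max()
    assert lag < pd.Timedelta(hours=2), f"data is stale by {lag}"
```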
Architect and implement Databricks Lakehouse solutions for large-scale data platforms.
Design and optimize batch & streaming data pipelines using Apache Spark (PySpark/SQL).
Implement Delta Lake best practices (ACID, schema enforcement, time travel, performance tuning).
They are looking for a Databricks Architect to design and lead modern Lakehouse data platforms using Databricks. The role focuses on building scalable, high-performance data pipelines and enabling analytics and AI use cases on cloud-native data platforms.
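A short PySpark sketch of two of the Delta Lake practices listed above, schema enforcement with opt-in evolution and time travel; the table name and version number are hypothetical:

```python
from pyspark.sql import Row, SparkSession

# Assumes a Databricks cluster with Delta Lake available by default.
spark = SparkSession.builder.appName("delta_practices").getOrCreate()

new_rows = spark.createDataFrame([Row(device_id="d-1", reading=21.5)])

# Schema enforcement: Delta rejects appends whose schema drifts from the
# table's; mergeSchema opts in to additive evolution for this write only.
(new_rows.write
    .format("delta")
    .mode("append")
    .option("mergeSchema", "true")
    .saveAsTable("main.iot.readings"))  # hypothetical Unity Catalog name

# Time travel: ACID-versioned history lets you query a prior snapshot,
# e.g. to audit a bad load or rebuild a downstream table.
snapshot = spark.sql("SELECT * FROM main.iot.readings VERSION AS OF 42")
```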
Designing, developing, and maintaining robust, scalable, and well-documented data pipelines.
Collaborating with the Tech Lead to ensure quality, performance, and maintainability of Data products.
Contributing to the continuous improvement of engineering practices (CI/CD, automation, testing, documentation).
Accor Tech & Digital is the power engine of Accor technology, digital business, and transformation. Their 5,000 professionals are committed to delivering the best tech and digital experiences to guests, hotels, and staff across 110 countries and to shaping the future of hospitality.
Design, develop, and maintain ETL/ELT pipelines on cloud-based data platforms.
Build data ingestion, transformation, and orchestration workflows using tools such as Azure Data Factory, Airflow, Fivetran, or similar.
Develop transformations and data processing logic using platforms such as Databricks, Snowflake, or equivalent.
Ankura Consulting Group, LLC is an independent global expert services and advisory firm with more than 2,000 professionals. They deliver end-to-end services and solutions to help clients at critical inflection points related to conflict, crisis, performance, risk, strategy, and transformation.
Design, build, and maintain ETL/ELT pipelines and integrations across legacy and cloud systems.
Model, store, and transform data to support analytics, reporting, and downstream applications.
Build API-based and file-based integrations across enterprise platforms.
Jobgether is a platform that uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.
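A minimal sketch of a paginated, API-based extraction like those described above; the endpoint, token, and page parameters are placeholders, not a real vendor API:

```python
import requests

BASE_URL = "https://api.example.com/v1/invoices"  # hypothetical endpoint

def fetch_all(token: str) -> list:
    """Pull every page from a paginated REST source into memory."""
    headers = {"Authorization": f"Bearer {token}"}
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL,
            headers=headers,
            params={"page": page, "per_page": 500},
            timeout=30,
        )
        resp.raise_for_status()  # fail loudly on auth or server errors
        batch = resp.json()
        if not batch:            # an empty page signals the end of the data
            break
        records.extend(batch)
        page += 1
    return records
```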
Design, build, and maintain robust ETL/ELT pipelines to ingest large-scale datasets and high-frequency streams.
Lead the design and evolution of our enterprise data warehouse, ensuring it is scalable and performant.
Manage our data transformation layer using Dataform (preferred) or dbt to orchestrate complex, reliable workflows.
UW provides utilities all in one place, including energy, broadband, mobile, and insurance. They aim to double in size and offer savings to customers, fostering a culture that values imaginative and pragmatic problem-solvers.
Design, develop, and maintain dbt data models that support our healthcare analytics products.
Integrate and transform customer data to conform to our data specifications and pipelines.
Design and execute initiatives that improve data platform and pipeline automation and resilience.
SmarterDx, a Smarter Technologies company, builds clinical AI that is transforming how hospitals translate care into payment. Founded by physicians in 2020, the company built a platform that connects clinical context with revenue intelligence, helping health systems recover millions in missed revenue, improve quality scores, and appeal every denial.