Build and Maintain Bronze/Silver Layer Pipelines: ensure core data sources land accurately, on time, and with full lineage.
Lead Data Ingestion, Transformation, and Enrichment: own the end-to-end pipeline from raw file landing through cleansed, conformed staging tables, including deduplication, standardization, code mapping, and entity resolution.
Develop Automated Ingestion Pipelines: use Snowpipe, Matillion, or custom solutions with reliability, observability, and minimal manual intervention in mind.
Design, develop, and optimize EDP data pipelines using Python, Airflow, DBT, and Snowflake for scalable financial data processing.
Build performant Snowflake data models and DBT transformations following best practices, standards, and documentation guidelines.
Own ingestion, orchestration, monitoring, and SLA-driven workflows; proactively troubleshoot failures and improve reliability.
Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation, supporting digital transformation for some of the world's largest enterprises. They employ nearly 1,000 full-time professionals, and their culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed.
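For a role like the one above, a minimal sketch of the Bronze-to-Silver flow as an Airflow DAG might look like the following; the DAG id, task names, loader function, and dbt selector are all hypothetical, and the actual ingestion could just as well be driven by Snowpipe or Matillion:

```python
# Minimal Airflow DAG sketch for a Bronze -> Silver flow (illustrative only;
# the loader function, dbt selector, and schedule are assumptions).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def load_raw_files_to_bronze(**context):
    # Placeholder: in practice this could call Snowpipe, Matillion, or a
    # custom loader that copies newly landed files into Bronze tables.
    print("Loading newly landed files into the Bronze layer...")


with DAG(
    dag_id="bronze_silver_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    land_bronze = PythonOperator(
        task_id="load_raw_files_to_bronze",
        python_callable=load_raw_files_to_bronze,
    )

    # Silver-layer cleansing (dedup, standardization, code mapping) via dbt;
    # the "silver" selector is an assumption.
    build_silver = BashOperator(
        task_id="dbt_build_silver",
        bash_command="dbt run --select silver",
    )

    land_bronze >> build_silver
```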
Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design, develop, and maintain dbt data models that support our healthcare analytics products.
Integrate and transform customer data to conform to our data specifications and pipelines.
Design and execute initiatives that improve data platform and pipeline automation and resilience.
SmarterDx, a Smarter Technologies company, builds clinical AI that is transforming how hospitals translate care into payment. Founded by physicians in 2020, our platform connects clinical context with revenue intelligence, helping health systems recover millions in missed revenue, improve quality scores, and appeal every denial.
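As a rough illustration of how dbt model builds for a product like this are often automated, here is a small sketch using dbt's programmatic invocation API (available in dbt-core 1.5 and later); the selector name is hypothetical:

```python
# Sketch: build and test a group of dbt models programmatically.
# Requires dbt-core >= 1.5; the "healthcare_analytics" selector is hypothetical.
from dbt.cli.main import dbtRunner

runner = dbtRunner()

# Build the selected models, then run their tests.
for command in (["run", "--select", "healthcare_analytics"],
                ["test", "--select", "healthcare_analytics"]):
    result = runner.invoke(command)
    if not result.success:
        raise RuntimeError(f"dbt {command[0]} failed: {result.exception}")
```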
Build and evolve our semantic layer; design, document, and optimize dbt models.
Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.
Design and develop data integration pipelines across cloud and legacy systems.
Lead and support data engineering implementation efforts.
Apply advanced analytics techniques to support business insights and decision-making.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company, with the final decision managed by the internal team.
Assist in designing and implementing Snowflake-based analytics solutions.
Build and maintain data pipelines adhering to enterprise architecture principles.
Act as a technical leader within the team, ensuring quality deliverables.
Jobgether is a company that uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Architect, design, implement, and operate end-to-end data engineering solutions.
Develop and manage robust data integrations with external vendors.
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.
SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. With over 59 million people reached each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.
Contribute to the maintenance of data pipelines to improve ingestion and data quality.
Develop and maintain Python-based ingestion pipelines integrating data from APIs and third-party systems.
Maintain and optimize dbt transformation workflows to support curated data models.
National Debt Relief was founded in 2009 with the goal of helping consumers deal with overwhelming debt. They are one of the most trusted and best-rated consumer debt relief providers in the United States, having helped over 450,000 people.
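A minimal sketch of a Python-based API ingestion step of the kind described above; the endpoint, pagination scheme, and output path are hypothetical, and loading the landed file into the warehouse for the dbt transformations would happen downstream:

```python
# Sketch: pull records from a third-party API and land them as newline-delimited
# JSON for downstream loading. Endpoint, token, and paths are hypothetical.
import json
import pathlib

import requests

API_URL = "https://api.example.com/v1/accounts"       # hypothetical endpoint
OUTPUT_PATH = pathlib.Path("landing/accounts.ndjson")


def ingest(token: str) -> int:
    headers = {"Authorization": f"Bearer {token}"}
    records, page = [], 1
    while True:
        resp = requests.get(API_URL, headers=headers,
                            params={"page": page}, timeout=30)
        resp.raise_for_status()
        batch = resp.json()
        if not batch:            # an empty page signals the end (assumption)
            break
        records.extend(batch)
        page += 1

    OUTPUT_PATH.parent.mkdir(parents=True, exist_ok=True)
    with OUTPUT_PATH.open("w") as f:
        for record in records:
            f.write(json.dumps(record) + "\n")
    return len(records)
```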
Design and implement scalable data pipelines and solutions using Snowflake.
Develop and optimize SQL queries, stored procedures, and views for performance and efficiency.
Integrate Snowflake with ETL tools and cloud platforms.
Cayuse Commercial Services, LLC delivers technical solutions that meet customer needs. They foster teamwork and prioritize quality and inclusivity in deliverables.
Design and develop robust ETL/ELT pipelines using Python, Airflow, and DBT.
Build and optimize Snowflake data models for performance, scalability, and cost efficiency.
Implement ingestion pipelines for internal and external financial datasets (Market, Securities, Pricing, ESG, Ratings).
Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation. They support digital transformation for some of the world's largest enterprises. Miratech employs nearly 1,000 full-time professionals, has an annual growth rate exceeding 25%, and offers a ForeverRemote work culture.
Works as a positive team member to deliver quality applications and components within scope.
Proactively identifies emerging technologies and promotes their adoption.
Develops strategies for managing complex data sets by maintaining data standards.
Emory University is a leading research university fostering excellence and attracting world-class talent. They focus on innovation and preparing leaders for the future, welcoming candidates who contribute to their academic community's excellence.
Re-architect the data warehouse and drive the expansion of our data platform.
Migrate data pipelines to a modern architecture, improving governance and enabling better internal data sharing.
Lead a high-performing data engineering team by driving collaboration, growth, and accountability.
Engine is transforming business travel into something personalized, rewarding, and simple. They are building a platform that brings together corporate travel, a charge card, and modern spend management in one place, and more than 20,000 companies already rely on Engine. Engine is cash flow positive with rapid growth and has been recognized as one of the fastest-growing travel and fintech platforms in North America.
Contribute to the technical integrity and evolution of the Data Platform tech stack.
Design and implement core features and enhancements within the Data Platform.
Build and maintain robust data extraction, loading, and transformation processes.
Dimagi is an award-winning social enterprise and certified B-corp building software solutions and providing technology consulting services to improve the quality of essential services for underserved populations. They are passionate and committed to tackling complex health and social inequities and working towards a brighter future for all.
Design and launch our first production-grade data warehouse.
Implement reliable ingestion from core clinical and operational systems.
Establish data quality and monitoring standards used across engineering.
Zócalo Health is a tech-enabled, community-oriented primary care organization. They serve people who have historically been underserved by the one-size-fits-all healthcare system and are scaling rapidly across states and populations.
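One common way to express the kind of data quality and monitoring standards mentioned above is a small set of declarative checks run after each load; the table names, columns, and thresholds below are hypothetical, and run_query stands in for whatever warehouse client is used:

```python
# Sketch: simple post-load data quality checks. Table and column names are
# hypothetical; run_query is a placeholder for a warehouse client call that
# returns a single integer.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Check:
    name: str
    sql: str
    passes: Callable[[int], bool]


CHECKS = [
    Check("patients_not_empty",
          "SELECT COUNT(*) FROM staging.patients",
          lambda n: n > 0),
    Check("no_null_patient_ids",
          "SELECT COUNT(*) FROM staging.patients WHERE patient_id IS NULL",
          lambda n: n == 0),
    Check("encounters_loaded_today",
          "SELECT COUNT(*) FROM staging.encounters WHERE loaded_at >= CURRENT_DATE",
          lambda n: n > 0),
]


def run_checks(run_query: Callable[[str], int]) -> None:
    failures = [c.name for c in CHECKS if not c.passes(run_query(c.sql))]
    if failures:
        raise RuntimeError(f"Data quality checks failed: {failures}")
```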
Design and implement robust, production-grade pipelines using Python, Spark SQL, and Airflow.
Lead efforts to canonicalize raw healthcare data into internal models.
Onboard new customers by integrating their raw data into internal pipelines and canonical models.
Machinify is a healthcare intelligence company delivering value, transparency, and efficiency to health plan clients. They serve over 85 health plans, including many of the top 20, representing more than 270 million lives, with an AI-powered platform and expertise.
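A small PySpark sketch of the canonicalization step described above, turning raw customer claims data into an internal model; the paths, column names, and cleaning rules are hypothetical:

```python
# Sketch: normalize raw claims data into a canonical internal model with
# Spark SQL. Paths, columns, and cleaning rules are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("canonicalize_claims").getOrCreate()

raw = spark.read.parquet("s3://raw-bucket/customer_x/claims/")  # hypothetical path
raw.createOrReplaceTempView("raw_claims")

canonical = spark.sql("""
    SELECT
        UPPER(TRIM(member_id))                 AS member_id,
        CAST(service_date AS DATE)             AS service_date,
        UPPER(TRIM(diagnosis_code))            AS diagnosis_code,
        CAST(allowed_amount AS DECIMAL(12, 2)) AS allowed_amount
    FROM raw_claims
    WHERE member_id IS NOT NULL
""").dropDuplicates(["member_id", "service_date", "diagnosis_code"])

canonical.write.mode("overwrite").parquet("s3://curated-bucket/canonical/claims/")
```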
Monitor and support ETL processes moving data from on-premises or hosted servers into Snowflake.
Design and maintain aggregate and reporting tables optimized for Tableau and Power BI dashboards.
Optimize Snowflake performance and cost, including warehouse usage, query tuning, and table design.
Affinitiv is the largest provider of end-to-end, data-driven marketing and software solutions exclusively focused on the automotive customer lifecycle. Backed by 20+ years of automotive and marketing expertise, they work with over 6,500 dealerships and every major manufacturer in the country.
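A sketch of rebuilding one such dashboard-facing aggregate table with the Snowflake Python connector; all connection parameters, schemas, and column names are assumptions:

```python
# Sketch: rebuild a reporting aggregate in Snowflake for BI dashboards.
# Connection parameters, schemas, and column names are hypothetical.
import snowflake.connector

AGGREGATE_SQL = """
CREATE OR REPLACE TABLE reporting.daily_campaign_summary AS
SELECT
    campaign_id,
    DATE_TRUNC('day', sent_at) AS send_date,
    COUNT(*)                   AS messages_sent,
    SUM(IFF(opened, 1, 0))     AS messages_opened
FROM analytics.fct_messages
GROUP BY campaign_id, DATE_TRUNC('day', sent_at)
"""

conn = snowflake.connector.connect(
    account="my_account",        # hypothetical connection details
    user="etl_service",
    password="...",
    warehouse="REPORTING_WH",
    database="ANALYTICS_DB",
)
try:
    conn.cursor().execute(AGGREGATE_SQL)
finally:
    conn.close()
```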
Design scalable data models and transformation patterns across core data marts.
Partner cross-functionally to translate business problems into data solutions.
Define and drive data governance and certification standards.
AuditBoard is a leading audit, risk, ESG, and InfoSec platform, exceeding $300M ARR and experiencing continued growth. They empower over 50% of the Fortune 500 with their technology, fostering clarity and agility, and are recognized as one of the fastest-growing tech companies in North America.
Serve as a primary advisor to identify technical improvements and automation opportunities.
Build advanced data pipelines using the medallion architecture in Snowflake.
Write advanced ETL/ELT scripts to integrate data into enterprise data stores.
Spring Venture Group is a digital direct-to-consumer sales and marketing company focused on the senior market. They have a dedicated team of licensed insurance agents and leverage technology to help seniors navigate Medicare.
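A compact sketch of the Bronze/Silver/Gold progression that the medallion architecture implies, expressed as ordered SQL steps driven from Python; the stage, table, and column names are hypothetical, and execute stands in for a Snowflake connector cursor or Snowpark session call:

```python
# Sketch: medallion-style layering (Bronze -> Silver -> Gold) as ordered SQL
# steps. All object names are hypothetical; execute is a placeholder for a
# Snowflake connector cursor or Snowpark session call.
MEDALLION_STEPS = [
    # Bronze: land raw JSON files as-is into a single VARIANT column ("raw").
    """COPY INTO bronze.leads_raw
       FROM @landing_stage/leads/ FILE_FORMAT = (TYPE = 'JSON')""",
    # Silver: cleanse, standardize, and deduplicate.
    """CREATE OR REPLACE TABLE silver.leads AS
       SELECT DISTINCT
           UPPER(TRIM(raw:email::string)) AS email,
           TRY_TO_DATE(raw:dob::string)   AS date_of_birth,
           raw:state::string              AS state
       FROM bronze.leads_raw
       WHERE raw:email IS NOT NULL""",
    # Gold: business-level aggregates for reporting.
    """CREATE OR REPLACE TABLE gold.leads_by_state AS
       SELECT state, COUNT(*) AS lead_count
       FROM silver.leads
       GROUP BY state""",
]


def run_medallion(execute) -> None:
    for step in MEDALLION_STEPS:
        execute(step)
```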
Design, build, and maintain enterprise-scale data pipelines on Snowflake Data Platform.
Design, build, and maintain cloud-native AI/ML solutions (AWS, Azure) that support advanced analytics and decision making.
Implement best practices for data quality, observability, lineage, and governance.
FUJIFILM Biotechnologies is dedicated to making a real difference in people’s lives. They partner with innovative biopharma companies to advance vaccines, cures, and gene therapies, fostering a culture that fuels passion and drive, known as Genki.