Collaborate with stakeholders to build robust services using data pipeline and ETL tools, and Snowflake data warehouse.
Translate advanced business data and analytics problems into technical approaches that yield actionable recommendations.
Communicate results and educate others through visualizations, reports, and presentations.
CNG Holdings, Inc. serves consumers by providing financial solutions that fill a need and deliver value. They strive to make a difference in their customers’ lives and the communities they serve.
Design and implement scalable data models in Snowflake.
Build and maintain transformation pipelines using dbt.
Develop optimized star/snowflake schemas for analytics and reporting.
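As a toy illustration of the star-schema pattern these bullets describe, the sketch below uses Python's built-in SQLite as a stand-in for a warehouse like Snowflake; all table and column names (`dim_customer`, `fct_orders`, etc.) are hypothetical, not from any of these employers.

```python
# Star schema sketch: one fact table joined to a dimension table.
# SQLite stands in for Snowflake; names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes, one row per customer.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_name TEXT,
        region TEXT
    )
""")

# Fact table: one row per order, foreign key into the dimension.
cur.execute("""
    CREATE TABLE fct_orders (
        order_id INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount REAL
    )
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "US"), (2, "Globex", "EU")])
cur.executemany("INSERT INTO fct_orders VALUES (?, ?, ?)",
                [(100, 1, 50.0), (101, 1, 25.0), (102, 2, 40.0)])

# Typical analytics query: join fact to dimension, aggregate by region.
rows = cur.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fct_orders f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # [('EU', 40.0), ('US', 75.0)]
```

The separation keeps the fact table narrow and append-friendly while descriptive attributes live once in the dimension, which is the property that makes star schemas efficient for reporting.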
We are looking for a highly skilled Snowflake Data Engineer to work closely with business stakeholders and deliver high-quality data models and insights.
Partner closely with business stakeholders to understand their challenges and design end-to-end architecture.
Design, develop, and own robust, efficient, and scalable data models in Snowflake and Iceberg using dbt and advanced SQL.
Build and manage reliable data pipelines and CI/CD workflows using tools like Airflow, Python, and Terraform.
Motive empowers people who run physical operations with tools to make their work safer, more productive, and more profitable. Motive serves nearly 100,000 customers and provides complete visibility and control across a wide range of industries.
Design, build, and maintain databases that power Hologram's operations.
Build and maintain ETL pipelines that move and transform data reliably.
Audit existing pipelines and data models, identify unnecessary complexity, and refactor poor design decisions.
Hologram is building the future of IoT connectivity, delivering internet access to millions of connected devices worldwide. They process over 5 billion transactions per month across their global infrastructure and value a fun, upbeat, remote-first team united by their mission.
Develop engineering expertise in the Dataiku Platform to maintain and extend system integrations, platform automations, and platform configurations.
Build and maintain Python and SQL data replication and data pipelines on large, often complex data sets.
Identify opportunities for improvement and optimization for greater scalability and delivery velocity.
Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.
Design, develop, and maintain production-quality dbt models with a focus on performance and readability.
Translate business requirements into clean, well-documented data models that analysts and downstream consumers can rely on.
Collaborate with Data Engineering on ingestion pipelines and architecture decisions, with an understanding of Snowflake performance and costs.
Loop provides a commerce operations suite that helps merchants make smarter decisions with every transaction, focusing on returns and exchanges. They empower over 5,000 brands to protect margins and delight customers by providing data-driven insights and tools.
Primarily responsible for analyzing data integrity challenges and performing root cause analysis.
Craft client code that is efficient, performant, testable, scalable, and secure.
Actively participate in agile software development, including daily stand-ups and sprint planning.
3Pillar is a company where senior software engineers can collaborate with industry leaders and spearhead transformative projects that redefine urban living, establish new media channels, or drive innovation in healthcare. They are a global team that values well-being and offers flexible work environments.
Enable self-service analytics for all team members by designing clean, intuitive data models and metrics through dbt, empowering employees to make informed, data-driven decisions.
Develop and refine custom data pipelines that ingest data from operational systems to our analytics platform, handling both streaming and batch data using third-party tooling and home-grown solutions.
Maintain and optimize the data platform infrastructure, focusing on data quality, ELT efficiency, and platform hygiene.
Auto Integrate makes leased vehicle maintenance frictionless for millions of customers in the USA and Canada. The business is managed by a small, global team within Fleetio, combining the resources of a scaled SaaS company with the agility of a niche market leader.
Design and develop data pipelines, with an eye toward how they fit into the broader data architecture.
Make data modeling and schema design decisions, choosing the right structure for how data is stored, accessed, and extended.
Think beyond the immediate task: consider how today's integration point becomes tomorrow's platform surface area.
Cross Screen Media was founded by industry veterans and gives customers a new way to plan and execute video advertising campaigns. This requires an amazing team, and they offer a collaborative and creative atmosphere with inspired leadership.
Lead, manage, and mentor a group of data engineers.
Own the design and development of data pipelines and systems.
Partner cross-functionally with Data Science and Product managers.
TrueML is a mission-driven financial software company that aims to create better customer experiences for distressed borrowers. The TrueML team includes inspired data scientists, financial services industry experts, and customer experience fanatics building technology to serve people.
Design, build, and maintain data pipelines using Snowflake, Airflow, and dbt.
Lead architectural discussions around the modern data stack.
Develop scalable ETL and ELT processes using Python and SQL.
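The extract-transform-load pattern named in these bullets can be sketched in plain Python; this is a stdlib-only toy (SQLite instead of Snowflake, no Airflow), and the `extract`/`transform`/`load` functions and the `payments` table are invented for illustration, not anyone's production code.

```python
# Minimal ETL sketch: extract raw records, normalize them, load into a
# warehouse table. SQLite stands in for the warehouse; names are illustrative.
import sqlite3

def extract():
    # In practice this would pull from an API or a source database.
    return [
        {"id": 1, "amount": "19.99", "status": "PAID"},
        {"id": 2, "amount": "5.00",  "status": "refunded"},
    ]

def transform(rows):
    # Normalize types and casing before loading.
    return [(r["id"], float(r["amount"]), r["status"].lower()) for r in rows]

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (id INTEGER, amount REAL, status TEXT)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute(
    "SELECT SUM(amount) FROM payments WHERE status = 'paid'"
).fetchone()[0]
print(total)  # 19.99
```

In an ELT variant, the same `transform` step would instead run as SQL (for example, a dbt model) inside the warehouse after the raw load.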
They are a well-funded healthcare technology company using AI and modern data infrastructure to transform how healthcare and public health decisions are made. The team is small, mission-driven, and building systems that turn raw healthcare data into actionable intelligence at scale.
Design, build, and maintain scalable data pipelines for clients across industries.
Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.
NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They are a growth-stage company looking to drive additional value from the data they are sitting on, and they value humility, intellectual rigor, and stewardship.
Develop and maintain dbt transformation pipelines under the direction of a senior technical lead.
Write and optimize SQL used for transformation logic, data preparation, and warehouse-based development work.
Work in Snowflake to query, transform, and manage data in support of production delivery.
FormativGroup operates within the critical middle layer of business technology, where applications and systems connect infrastructure to business processes. With deep technical expertise across cloud architecture, system integration, AI, and data strategy, they bridge the gap between business goals and modern platforms.
Lead and manage a team of ~6 data engineers, driving execution, performance, and career development.
Own Kin’s data platform, including ingestion, storage, transformation, pipeline orchestration, and governance.
Build and optimize scalable data pipelines and architectures using tools like Snowflake, Databricks, dbt, and Airflow.
Kin simplifies homeowners' lives with smarter insurance, expanding to meet all homeowner needs. They employ Kinfolk across 35+ states and are recognized for growth, customer satisfaction, and a focus on long-term sustainability, fostering a culture of meaningful work and real impact.
You will join a team of talented engineers working closely with Data Scientists to build and scale our next-generation Ad EnGage data pipeline.
You will work with large-scale datasets (hundreds of TBs to petabyte-scale systems) using a modern data stack centered on AWS, Airflow, dbt, and Snowflake.
You’ll contribute to building reliable, high-quality data pipelines and improving the performance, scalability, and observability of our data platform.
EDO is the TV outcomes company. Their leading measurement platform connects convergent TV airings to the ad-driven consumer behaviors most predictive of future sales. They are headquartered in New York City and Los Angeles, with an office in San Francisco, and recognize the benefits of hybrid working.
Building and maintaining production-grade data pipelines in cloud data warehouses.
Designing and developing dbt models across bronze, silver, and gold layers.
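The bronze/silver/gold ("medallion") layering mentioned above can be illustrated with a small sketch; in dbt these layers would be SQL models, so the plain-Python functions and sample records below are purely hypothetical stand-ins.

```python
# Medallion layering sketch: bronze = raw as ingested, silver = cleaned
# and deduplicated, gold = aggregated, analysis-ready. Names illustrative.
bronze = [  # raw records: duplicates, inconsistent casing
    {"user": "Ana", "event": "CLICK"},
    {"user": "Ana", "event": "CLICK"},
    {"user": "Ana", "event": "view"},
    {"user": "bob", "event": "view"},
]

def to_silver(raw):
    # Silver: normalize casing and drop exact duplicates.
    seen, out = set(), []
    for r in raw:
        key = (r["user"].lower(), r["event"].lower())
        if key not in seen:
            seen.add(key)
            out.append({"user": key[0], "event": key[1]})
    return out

def to_gold(silver):
    # Gold: a reporting-ready metric, distinct event types per user.
    counts = {}
    for r in silver:
        counts[r["user"]] = counts.get(r["user"], 0) + 1
    return counts

gold = to_gold(to_silver(bronze))
print(gold)  # {'ana': 2, 'bob': 1}
```

Keeping each layer a pure function of the previous one is what lets downstream dashboards rebuild from raw data when cleaning rules change.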
Crafting easy-to-understand visualizations and dashboards in Looker or equivalent BI tools.
Plume is a trans-founded, mission-driven company with a vision to transform healthcare for every trans life by making gender-affirming hormone therapy easily accessible. They offer an affirming, trans-centered, culturally inclusive, and fun work environment filled with purpose.
Design, build, and maintain scalable data infrastructure to support analytics and reporting across the organization.
Develop and operate ETL pipelines to ingest, transform, and deliver large-scale datasets.
Partner closely with Data Analysts and cross-functional stakeholders to provide reliable datasets and guide them in using data effectively.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York. With over two decades of experience, they deliver top-tier technology solutions to companies of all sizes. Their team of 600+ highly skilled tech professionals, based in Latin America, drives digital disruption by partnering with U.S. companies in their projects.
Extend, optimize, and maintain core data models for reports, machine learning, and generative AI.
Implement automation and operationalize ML models to streamline operational processes and improve efficiency.
Partner with engineering, product, and analytics teams to deliver seamless integrations and customer-facing data products.
Boulevard provides a client experience platform for appointment-based, self-care businesses, helping customers enhance client experiences. They value diversity and inclusivity, offering equal opportunities and aiming to create a supportive work environment.
Guide clients on optimizing their data environment.
Develop system engineering, integrations, and architectures based on client needs.
Implement and advise on data warehouse solutions, ETL pipelines, and BI reporting tools.
Jobgether helps candidates get their applications reviewed quickly and objectively. Their AI-powered matching process ensures each application is assessed fairly against the role's core requirements.