Partner closely with business stakeholders to understand their challenges and design end-to-end architecture.
Design, develop, and own robust, efficient, and scalable data models in Snowflake and Iceberg using dbt and advanced SQL.
Build and manage reliable data pipelines and CI/CD workflows using tools like Airflow, Python, and Terraform.
Motive empowers people who run physical operations with tools to make their work safer, more productive, and more profitable. Motive serves nearly 100,000 customers and provides complete visibility and control across a wide range of industries.
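The pipeline and CI/CD work in this role centers on orchestration tools like Airflow, which run tasks in dependency order. A toy sketch of that DAG-style ordering in plain Python (task names are hypothetical, not from any actual Motive pipeline):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG wires extract -> transform -> load.
deps = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_marts": {"extract_orders", "extract_customers"},
    "load_snowflake": {"transform_marts"},
}

# static_order() yields tasks so that every dependency runs first.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

An orchestrator adds scheduling, retries, and monitoring on top of exactly this ordering guarantee.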
Collaborate with stakeholders to build robust services using data pipeline and ETL tools and the Snowflake data warehouse.
Translate advanced business data and analytics problems into technical approaches that yield actionable recommendations.
Communicate results and educate others through visualizations, reports, and presentations.
CNG Holdings, Inc. serves consumers by providing financial solutions which fill a need and deliver value. They strive to make a difference in their customers’ lives and the communities they serve.
Design, build, and maintain databases that power Hologram's operations.
Build and maintain ETL pipelines that move and transform data reliably.
Audit existing pipelines and data models, identify complexity, and refactor bad decisions.
Hologram is building the future of IoT connectivity, delivering internet access to millions of connected devices worldwide. They process over 5 billion transactions per month across their global infrastructure and value a fun, upbeat, remote-first team united by their mission.
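The ETL bullets above boil down to extract, validate/transform, and load steps. A minimal sketch using Python's built-in sqlite3 as a stand-in for a production warehouse (table and column names are invented for illustration):

```python
import sqlite3

# Extract: raw rows as they might arrive from a source system.
raw_rows = [("dev-1", "2.5"), ("dev-2", "7.0"), ("dev-3", "bad")]

# Transform: coerce types and drop rows that fail validation.
clean = []
for device_id, usage in raw_rows:
    try:
        clean.append((device_id, float(usage)))
    except ValueError:
        pass  # in production this row would be logged or quarantined

# Load: write the validated rows to the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE usage (device_id TEXT, mb_used REAL)")
conn.executemany("INSERT INTO usage VALUES (?, ?)", clean)
total = conn.execute("SELECT SUM(mb_used) FROM usage").fetchone()[0]
print(total)  # 9.5
```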
Design, build, and maintain scalable data pipelines.
Act as a strategic partner in designing scalable data solutions.
Develop reliable data models.
Optro is the leading audit, risk, ESG, and InfoSec platform on the market, surpassing $300M ARR and continuing to grow. More than 50% of the Fortune 500 leverage their award-winning technology. They innovate and are proud of what they are producing, assisting each other and breaking through barriers.
Evaluate and optimize the reporting layer of our Snowflake data warehouse to enhance cost efficiency.
Utilize advanced analytic engineering techniques within Snowflake to optimize data transformations and computations.
Optimize query performance by tuning Snowflake configurations and query execution plans.
The Motley Fool is a purpose-driven financial services company with a mission to make the world smarter, happier, and richer. They are a fast-moving, collaborative team that values high-quality work, curiosity, and initiative, and cares deeply about the impact of their work.
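Query tuning of the kind described above usually starts from the execution plan. Snowflake exposes this through its query profile; the idea can be sketched with sqlite3's EXPLAIN QUERY PLAN as a lightweight stand-in (the schema here is made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT)")
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")

# Inspect how the engine plans to execute a filtered query: with the
# index in place, the plan should search idx_events_user rather than
# scanning the whole table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 42"
).fetchall()
for row in plan:
    print(row)
```

The same habit — read the plan, then adjust indexing/clustering or rewrite the query — carries over to warehouse engines.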
Design and implement backend APIs that handle hundreds of millions of rows while maintaining high speed and security.
Work on projects that enable data access through modern APIs and AI channels (chatbots, natural language querying).
Design data models, tune query performance, and directly shape how we utilize the Snowflake platform.
IDC provides data and analytical insights. They are building channels that deliver data quickly and reliably, creating a 'single source of truth' for IDC’s data products and modern API layers, including AI tools.
Design and implement scalable data models in Snowflake
Build and maintain transformation pipelines using dbt
Develop optimized star/snowflake schemas for analytics and reporting
We are looking for a highly skilled Snowflake Data Engineer. We work closely with business stakeholders and deliver high-quality data models and insights.
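A star schema like the one described keeps numeric measures in a fact table keyed to descriptive dimension tables. A tiny illustration with sqlite3 (table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: one row per product, descriptive attributes only.
conn.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
# Fact table: measures plus foreign keys into the dimensions.
conn.execute("CREATE TABLE fact_sales (product_id INTEGER, amount REAL)")

conn.execute("INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget')")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [(1, 10.0), (1, 5.0), (2, 7.5)])

# Typical analytics query: join fact to dimension and aggregate.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 7.5), ('widget', 15.0)]
```

A snowflake schema differs only in normalizing the dimensions further (e.g., splitting product category into its own table).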
Build and maintain data pipelines, transform raw data into reliable models.
Develop Tableau dashboards that put insights in front of clients.
Work directly with clients and shape how their platform evolves.
DataDrive is a fast-growing managed analytics service provider. They support ongoing training, adoption, and growth of their clients’ data cultures and offer a unique team-oriented environment.
Design, build, and maintain production data pipelines and multi-phase algorithmic workflows using Python and an orchestration framework such as Prefect, Airflow, or Jenkins.
Build and optimize advanced SQL transformations in Snowflake, including window functions, CTEs, stored procedures, UDFs, and semi-structured data processing.
Build and maintain dbt models for data transformation, identity resolution, and slowly changing dimension (SCD Type 2) tracking across 80+ models and multiple pipeline stages.
Kalibri helps to redefine and rebuild the hotel industry. They are looking for passionate, energetic, and hardworking people with an entrepreneurial spirit, who dream big and challenge the status quo; their team is working on cutting-edge solutions for the industry.
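The advanced SQL listed above (window functions, CTEs) can be illustrated with sqlite3, which supports both. Here a CTE feeds a ROW_NUMBER() window to keep each guest's most recent stay — a common dedup pattern that also underlies SCD-style current-row selection (the schema is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stays (guest TEXT, night TEXT, rate REAL)")
conn.executemany("INSERT INTO stays VALUES (?, ?, ?)", [
    ("ana", "2024-01-01", 120.0),
    ("ana", "2024-03-05", 150.0),
    ("bo",  "2024-02-10", 90.0),
])

# The CTE ranks each guest's stays by recency; the outer query keeps
# only the most recent row per guest.
latest = conn.execute("""
    WITH ranked AS (
        SELECT guest, night, rate,
               ROW_NUMBER() OVER (PARTITION BY guest ORDER BY night DESC) AS rn
        FROM stays
    )
    SELECT guest, night, rate FROM ranked WHERE rn = 1 ORDER BY guest
""").fetchall()
print(latest)  # [('ana', '2024-03-05', 150.0), ('bo', '2024-02-10', 90.0)]
```

In dbt, SCD Type 2 tracking of this kind is typically handled with snapshots, which add validity-interval columns rather than discarding history.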
Build core infrastructure software (pipelines, APIs, data modelling) as part of our client's data platform team.
Coach & mentor other engineers to support the growth of their technical expertise.
Implement appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.
YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.
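The batch-versus-streaming distinction in the bullets above can be sketched with a Python generator that yields fixed-size batches from an incoming stream, so downstream consumers process records as they arrive rather than waiting for the full dataset (batch size and data are arbitrary):

```python
from itertools import islice

def batched(source, size):
    """Yield lists of up to `size` items from an iterable source."""
    it = iter(source)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# A consumer handles each batch as soon as it is full -- the
# soft real-time pattern, as opposed to one big end-of-day batch.
events = range(7)
batches = list(batched(events, 3))
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```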
Design, build, and maintain data pipelines using Snowflake, Airflow, and DBT
Lead architectural discussions around the modern data stack
Develop scalable ETL and ELT processes using Python and SQL
They are a well-funded healthcare technology company using AI and modern data infrastructure to transform how healthcare and public health decisions are made. The team is small, mission-driven, and building systems that turn raw healthcare data into actionable intelligence at scale.
Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
SmartAsset is an online destination for consumer-focused financial information and advice, whose mission is helping people make smart financial decisions, reaching an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.
Design fault-tolerant dbt models to synthesize data from multiple sources into mart tables
Design and implement Sigma dashboards and Streamlit apps to provide clear insights into performance
Automate regular reporting workflows to reduce manual effort and increase data consistency
Weedmaps is a global leader in the cannabis industry. They are dedicated to transparency, education, and community, and serve cannabis consumers and businesses in the U.S. and worldwide.
Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
Implement data quality checks, monitoring, and validation processes.
Automate manual processes into centralized and scalable solutions.
Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2,000 colleagues worldwide, traded on Nasdaq as part of Informa PLC.
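Data quality checks like those in the bullets above are often simple row-level predicates plus an aggregate report over the failures. A pure-Python sketch (the rules and records are illustrative):

```python
def validate(record):
    """Return a list of failed check names for one record."""
    failures = []
    if not record.get("id"):
        failures.append("missing_id")
    if record.get("amount", 0) < 0:
        failures.append("negative_amount")
    return failures

records = [
    {"id": "a1", "amount": 10},
    {"id": "", "amount": 5},
    {"id": "a3", "amount": -2},
]

# Aggregate report: only records with at least one failed check.
report = {r.get("id") or "<blank>": validate(r) for r in records}
bad = {k: v for k, v in report.items() if v}
print(bad)  # {'<blank>': ['missing_id'], 'a3': ['negative_amount']}
```

In production these checks typically run inside the pipeline (e.g., as dbt tests or a validation step) and feed monitoring rather than a print statement.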
QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.
Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
Increase the robustness of existing production pipelines, identify bottlenecks, and set up robust monitoring, testing processes, and documentation templates.
Build custom applications and integrations to automate manual tasks related to customer operations, helping Product Operations / Support / SRE in their day-to-day activities.
Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.
Lead and manage a team of ~6 data engineers, driving execution, performance, and career development.
Own Kin’s data platform, including ingestion, storage, transformation, pipeline orchestration, and governance.
Build and optimize scalable data pipelines and architectures using tools like Snowflake, Databricks, DBT, and Airflow.
Kin simplifies homeowners' lives with smarter insurance, expanding to meet all homeowner needs. They employ Kinfolk across 35+ states and are recognized for growth, customer satisfaction, and a focus on long-term sustainability, fostering a culture of meaningful work and real impact.
Lead and grow a team of data engineers, providing mentorship and technical guidance.
Own execution of customer integrations across multiple product lines, ensuring on-time delivery.
Improve data quality and pipeline reliability by investing in better alerting and resilience.
Afresh is the leading AI company in fresh food, partnering with grocers to order billions of dollars of fresh food. They are on a mission to eliminate food waste and make fresh food accessible to all, and have saved 200M lbs of food waste in 2025 alone.
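Pipeline reliability work of the kind described often comes down to retrying transient failures with backoff before raising an alert. A minimal sketch in plain Python (the flaky fetch is simulated; real systems would page on the final failure):

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn, retrying with exponential backoff; re-raise after the last try."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of retries: surface the error (alert/page here)
            time.sleep(base_delay * 2 ** i)

calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = with_retries(flaky_fetch)
print(result, calls["n"])  # ok 3
```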
Handle technical and business requests from analysts related to our core tools
Participate in analysts' code reviews and identify suboptimal processes
Monitor load and alerts from our services
P2P.org is the largest institutional staking provider with a TVL of over $10B and a market share exceeding 20% in restaking. They unite talented individuals globally, sharing a passion for decentralized finance, to shape the future of finance through code, learning, and connection.