Design, build, and maintain reliable ETL pipelines, integrating data from multiple sources into the Google Cloud Data Warehouse.
Own the product data structure, mapping product features and behaviors to analytics-ready data models, and define meaningful KPIs.
Act as the primary bridge between Backend Engineering and BI, owning the flow from data production to analytics consumption.
TuoTempo, part of the Docplanner group since 2019, develops the market-leading CRM solution dedicated to hospitals, medical centers, and health insurance providers. The platform manages and automates the entire patient journey, centralizing contacts, communications, and processes in a single modular system integrated with the software already used by organizations.
Understand clients’ business goals and technical requirements, and turn complex problems into understandable, achievable solutions using Tinybird.
Help and teach customers to work in real-time at scale; work on complex optimization problems to reduce latencies and to increase overall solution performance.
Develop quick prototypes to illustrate how things work; writing efficient SQL is essential, as the platform is heavily based on SQL.
Tinybird helps developers and data teams unlock the power of real-time data to build data products faster. They are growing quickly and expanding their marketing team to amplify their reach and accelerate their demand engine.
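Quick SQL prototyping of the kind this role calls for can be sketched with nothing but the standard library: an in-memory SQLite database stands in for the real-time platform, and the table and column names below are invented for illustration.

```python
import sqlite3

# Prototype an aggregation in-memory before porting it to the real platform.
# Schema and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, latency_ms REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("a", 120.0), ("a", 80.0), ("b", 200.0)],
)

# Aggregate once per user so downstream consumers never scan raw rows.
rows = conn.execute(
    "SELECT user_id, COUNT(*) AS n, AVG(latency_ms) AS avg_ms "
    "FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()
print(rows)  # [('a', 2, 100.0), ('b', 1, 200.0)]
```

Pushing the aggregation into SQL, rather than fetching raw rows into application code, is the kind of latency optimization the posting describes.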
Design, build, and optimize scalable data architectures that power marketing analytics and survey measurement initiatives.
Deliver automated, high-impact data solutions and insights that enhance decision-making across teams.
Build robust pipelines, dashboards, and analytical frameworks in fast-paced environments.
ItD is a consulting and software development company that blends diversity, innovation, and integrity with real business results. They reject any strong hierarchy, empowering teams to deliver excellent results in a woman- and minority-led firm.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.
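The transform step of such an ETL pipeline can be sketched as a pure function; in production this logic would typically run inside an AWS Glue job or on EMR, and the field names here ("carrier", "fare_usd") are illustrative, not ATPCO's real schema.

```python
# Minimal sketch of an ETL transform: normalize raw records into an
# analytics-ready shape, dropping incomplete rows. Field names are assumed.
def transform(records):
    out = []
    for r in records:
        if r.get("fare_usd") is None:   # drop incomplete rows
            continue
        out.append({
            "carrier": r["carrier"].strip().upper(),
            "fare_usd": round(float(r["fare_usd"]), 2),
        })
    return out

raw = [
    {"carrier": " ba ", "fare_usd": "199.999"},
    {"carrier": "AA", "fare_usd": None},
]
clean = transform(raw)
```

Keeping the transform a pure, testable function makes it easy to validate before wiring it into Glue or Lambda.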
Build and maintain data pipelines, transforming raw data into reliable models.
Develop Tableau dashboards that put insights in front of clients.
Work directly with clients and shape how their platform evolves.
DataDrive is a fast-growing managed analytics service provider. They support ongoing training, adoption, and growth of their clients’ data cultures and offer a unique team-oriented environment.
Design fault-tolerant dbt models to synthesize data from multiple sources into mart tables
Design and implement Sigma dashboards and Streamlit apps to provide clear insights into performance
Automate regular reporting workflows to reduce manual effort and increase data consistency
Weedmaps is a global leader in the cannabis industry. They are dedicated to transparency, education, and community and serve cannabis to consumers and businesses in the U.S. and worldwide.
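Automating a recurring report often reduces to one deterministic aggregation that a dashboard (Sigma, Streamlit, or similar) then displays. A minimal sketch, with invented metric names:

```python
from collections import defaultdict

# Hypothetical recurring report: sum order amounts by region so the
# same numbers appear every run, with no manual spreadsheet work.
def weekly_report(orders):
    totals = defaultdict(float)
    for o in orders:
        totals[o["region"]] += o["amount"]
    return dict(sorted(totals.items()))

report = weekly_report([
    {"region": "west", "amount": 10.0},
    {"region": "east", "amount": 5.0},
    {"region": "west", "amount": 2.5},
])
```

Because the aggregation is deterministic, rerunning the workflow always yields consistent figures — the "increase data consistency" goal above.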
Build, optimize, and maintain data pipelines that power our business
Define and build out abstracted reusable data sets to be used for Business Intelligence, Marketing, and Data Science Research
Design, build, and evangelize a federated data validation framework used to monitor potential data inconsistencies
Garner Health strives to transform the healthcare economy, delivering accessible, high-quality healthcare. They are a fast-growing healthcare technology company dedicated to making a meaningful impact on healthcare at scale with a team of talented, mission-driven individuals.
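A data validation framework of the sort described above can be sketched as a registry of named predicates run over every record, with failures collected (rather than raised) so they can feed monitoring. Rule and field names here are assumptions.

```python
# Illustrative rule-based validation: each rule is a named predicate;
# failures are collected as (record_index, rule_name) pairs for monitoring.
RULES = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}

def validate(records):
    failures = []
    for i, rec in enumerate(records):
        for name, check in RULES.items():
            if not check(rec):
                failures.append((i, name))
    return failures

issues = validate([{"id": 1, "amount": 5}, {"id": None, "amount": -2}])
```

Federating the framework would mean letting each team register its own rules into the shared registry.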
Design and maintain scalable data models that support reporting, experimentation, and machine learning.
Build and maintain reliable data pipelines using modern transformation frameworks.
Power the dashboards and reporting used across marketing, product, operations, and leadership.
Mood is building the future of legal cannabis in the U.S. through a digital-first, customer-obsessed approach. They are growing fast and passionate about transforming the cannabis industry through seamless, personalized customer experiences and innovative use of analytics, machine learning, and AI.
Own the delivery of scalable internal data solutions.
Translate business needs into clear technical designs and working systems.
Build and improve data pipelines, integrations, and automation.
Transparent Hiring is recruiting for a fast-growing reinsurance company operating across Germany and the United States. The environment is collaborative and driven by a strong “build and ship” mindset.
Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines.
Architect and implement robust data infrastructure capable of handling high-volume data ingestion and processing.
Implement automated data quality checks, validation rules, and monitoring frameworks.
ShyftLabs is a data product company founded in early 2020 that works with Fortune 500 companies. They deliver digital solutions to help accelerate the growth of businesses across various industries through innovation; they also value strong business awareness.
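A monitoring framework like the one described can start as threshold checks over per-run metrics; the thresholds and metric names below are invented for illustration.

```python
# Sketch of pipeline-run monitoring: compare a run's metrics against
# thresholds and return alerts instead of silently passing bad data.
THRESHOLDS = {"row_count_min": 1000, "null_rate_max": 0.05}

def check_run(metrics):
    alerts = []
    if metrics["row_count"] < THRESHOLDS["row_count_min"]:
        alerts.append("row_count below minimum")
    if metrics["null_rate"] > THRESHOLDS["null_rate_max"]:
        alerts.append("null_rate above maximum")
    return alerts

alerts = check_run({"row_count": 800, "null_rate": 0.01})
```

In a real system the returned alerts would be routed to an incident-response channel rather than inspected inline.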
Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning
Partner cross-functionally with analytics, product, engineering and operations to deliver high-quality data solutions that drive measurable business impact
Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response
Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.
Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
Increase the robustness of existing production pipelines, identify bottlenecks, and set up monitoring, testing processes, and documentation templates.
Build custom applications and integrations that automate manual customer-operations tasks, helping Product Operations / Support / SRE in their day-to-day activities.
Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.
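One common way to increase the robustness of a production pipeline is retrying flaky steps with exponential backoff. In this sketch the failing step is simulated; a real one might push a file to S3 or load into Snowflake.

```python
import time

# Retry a pipeline step with exponential backoff instead of failing the
# whole run on a transient error. Delays here are tiny for demonstration.
def with_retries(step, attempts=3, base_delay=0.01):
    for i in range(attempts):
        try:
            return step()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** i)  # 0.01s, 0.02s, ...

calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:           # simulate two transient failures
        raise RuntimeError("transient failure")
    return "loaded"

result = with_retries(flaky_load)
```

Wrapping each load step this way, plus the monitoring described above, is a typical first pass at hardening an existing pipeline.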
Create innovative solutions for handling petabytes of data with billions of rows and joins.
Create real-time and offline feature-generation pipelines, keeping our data infrastructure reliable and fast!
Develop and productionize data pipelines for our ML models in both bare-metal and cloud environments.
Kayzen is a mobile demand-side platform (DSP) dedicated to democratizing programmatic advertising. They enable leading apps, agencies, media buyers, and brands to run programmatic customer acquisition, retargeting, and brand performance campaigns through their self-serve and managed service options.
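One standard way to keep real-time and offline feature generation consistent is to put the feature logic in a single pure function that both paths call. The feature below (click-through rate) is an invented example.

```python
# One shared feature definition, used by both the offline backfill and the
# online scoring path, so the two can never drift apart.
def ctr(clicks, impressions):
    """Click-through rate, safe against zero impressions."""
    return clicks / impressions if impressions else 0.0

# Offline: backfill over historical rows.
history = [(3, 100), (0, 0), (5, 50)]
offline_features = [ctr(c, i) for c, i in history]

# Online: the same function scores a live event.
live_feature = ctr(1, 20)
```

Avoiding two divergent implementations of the same feature is a common source of train/serve skew in ML pipelines.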
Experience with the integration of data from multiple data sources.
Experience with various database technologies such as SQL Server, Redshift, Postgres, and RDS.
Experience designing, building, and maintaining data pipelines.
Bluelight Consulting is a leading software consultancy dedicated to designing and developing innovative technology that enhances users' lives. With a presence across the United States and Central/South America, Bluelight is in an exciting phase of expansion, continually seeking exceptional talent to join its dynamic and diverse community.
Handle technical and business requests from analysts related to our core tools
Participate in code reviews of analysts' work and identify suboptimal processes
Monitor load and alerts from our services
P2P.org is the largest institutional staking provider with a TVL of over $10B and a market share exceeding 20% in restaking. They unite talented individuals globally, sharing a passion for decentralized finance and shaping the future of finance with code, learning, and connection.
Lead the migration of legacy workflows from Alteryx to dbt.
Manage and monitor data pipelines using Dagster or Airflow.
Translate complex “data speak” into clear insights.
Turnitin is a recognized innovator in global education, partnering with educators and institutions to develop learning integrity solutions. Turnitin is a global organization with team members in over 35 countries and a remote-first culture.
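At its core, an orchestrator like Dagster or Airflow runs tasks in dependency order, which the standard library can illustrate directly. The task names below are invented stand-ins for migrated Alteryx workflows.

```python
from graphlib import TopologicalSorter

# What an orchestrator does at its core: resolve task dependencies and
# execute in a valid order. Task names are hypothetical.
deps = {
    "staging": set(),          # no upstream dependencies
    "marts": {"staging"},      # marts depend on staging
    "report": {"marts"},       # report depends on marts
}
order = list(TopologicalSorter(deps).static_order())
```

Real orchestrators add scheduling, retries, and observability on top of this dependency resolution, which is what "manage and monitor" means in practice.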
Responsible for building core infrastructure software (pipelines, APIs, data modelling) as part of our client's data platform team.
Coach & mentor other engineers to support the growth of their technical expertise.
Implement the appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.
YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.
Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
SmartAsset is an online destination for consumer-focused financial information and advice, whose mission is helping people make smart financial decisions, reaching an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.
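A complex API integration of the kind mentioned above usually involves cursor-based pagination. In this sketch, `fetch_page` is a stub standing in for a real HTTP call (e.g. `requests.get` with a cursor parameter); the page data is fabricated for illustration.

```python
# Stubbed vendor API: maps a cursor to (batch, next_cursor).
# A real integration would make an HTTP request here instead.
PAGES = {None: (["a", "b"], "p2"), "p2": (["c"], None)}

def fetch_page(cursor):
    return PAGES[cursor]

def ingest_all():
    """Follow pagination cursors until the vendor signals the last page."""
    items, cursor = [], None
    while True:
        batch, cursor = fetch_page(cursor)
        items.extend(batch)
        if cursor is None:
            return items

records = ingest_all()
```

Separating the fetch from the pagination loop keeps the integration testable without hitting the vendor's live API.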