Build CDC pipelines and real-time streaming (Kafka/Flink)
Design and maintain data models (raw to staging to core)
Implement observability, data transformations, and quality checks
Jobgether is a platform that connects job seekers with companies. They use an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly.
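The CDC work described above amounts to consuming a stream of change events and applying them to a target state. A minimal sketch in plain Python (the event shape and field names are illustrative assumptions, not a specific Kafka/Flink or Debezium schema):

```python
# Minimal sketch of applying CDC (change data capture) events to a target table.
# The event shape ("op", "key", "row") is an illustrative assumption; a real
# pipeline would consume these from Kafka and apply them in Flink or similar.

def apply_cdc_events(table: dict, events: list) -> dict:
    """Apply insert/update/delete events to an in-memory 'table' keyed by id."""
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("insert", "update"):
            table[key] = event["row"]        # upsert the new row image
        elif op == "delete":
            table.pop(key, None)             # tombstone: remove the row
        else:
            raise ValueError(f"unknown op: {op}")
    return table

events = [
    {"op": "insert", "key": 1, "row": {"id": 1, "status": "new"}},
    {"op": "update", "key": 1, "row": {"id": 1, "status": "shipped"}},
    {"op": "insert", "key": 2, "row": {"id": 2, "status": "new"}},
    {"op": "delete", "key": 2, "row": None},
]
state = apply_cdc_events({}, events)
print(state)  # {1: {'id': 1, 'status': 'shipped'}}
```

The same upsert/delete semantics carry over regardless of transport; only the event envelope changes per connector.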
Own the delivery of scalable internal data solutions.
Translate business needs into clear technical designs and working systems.
Build and improve data pipelines, integrations, and automation.
Transparent Hiring is recruiting for a fast-growing reinsurance company operating across Germany and the United States. The environment is collaborative and driven by a strong “build and ship” mindset.
Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning
Partner cross-functionally with analytics, product, engineering and operations to deliver high-quality data solutions that drive measurable business impact
Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response
Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.
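The ingest/process/transform loop in the role above follows the classic extract-transform-load shape. A small self-contained sketch (field names and the in-memory "warehouse" are assumptions; a production job on AWS would typically read from S3 and load Redshift):

```python
# Minimal ETL sketch: extract raw records, transform them into a usable shape,
# and "load" them into a list standing in for a warehouse table.
import csv
import io

RAW = """order_id,amount,currency
1001,25.50,usd
1002,,usd
1003,9.99,eur
"""

def extract(text: str) -> list:
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list) -> list:
    out = []
    for r in rows:
        if not r["amount"]:          # drop rows with missing amounts
            continue
        out.append({
            "order_id": int(r["order_id"]),
            "amount": float(r["amount"]),
            "currency": r["currency"].upper(),
        })
    return out

def load(rows: list, target: list) -> None:
    target.extend(rows)

warehouse = []
load(transform(extract(RAW)), warehouse)
print(len(warehouse))  # 2 rows survive the quality filter
```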
Architect, build, and operate data infrastructure that powers Tebra’s intelligent features.
Translate business requirements into software solutions that accelerate our ability to deploy AI.
Monitor data pipelines, detect anomalies, and implement automated recovery systems.
Tebra unites Kareo and PatientPop, providing a digital backbone for practice well-being, supporting both products with a shared vision for modernized care. Over 100,000 providers trust Tebra to elevate patient experience and grow their practice, building the future of well-being with compassion and humanity.
Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
SmartAsset is an online destination for consumer-focused financial information and advice, whose mission is helping people make smart financial decisions, reaching an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.
Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
Implement data quality checks, monitoring, and validation processes.
Automate manual processes into centralized and scalable solutions.
Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2000 colleagues worldwide and traded on Nasdaq as part of Informa PLC.
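The quality-check responsibility above typically means automated tests on not-null, uniqueness, and accepted-value constraints, in the spirit of dbt-style tests. A minimal sketch (check names, columns, and thresholds are illustrative assumptions):

```python
# Sketch of simple data quality checks run against a batch of rows.
# Column names and allowed values are assumptions for illustration.

def check_not_null(rows, column):
    return all(r.get(column) is not None for r in rows)

def check_unique(rows, column):
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def check_accepted_values(rows, column, allowed):
    return all(r[column] in allowed for r in rows)

rows = [
    {"id": 1, "status": "active"},
    {"id": 2, "status": "churned"},
    {"id": 3, "status": "active"},
]

results = {
    "id_not_null": check_not_null(rows, "id"),
    "id_unique": check_unique(rows, "id"),
    "status_values": check_accepted_values(rows, "status", {"active", "churned"}),
}
print(results)  # all True for this batch
```

In practice such checks gate a pipeline stage: a failing check blocks the load and raises an alert rather than silently propagating bad data.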
Design and evolve scalable data pipelines and architectures.
Act as the primary anchor for data ingestion, transformation, and storage solutions.
Ensure mission-critical data is accessible and reliable.
CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.
Build and maintain scalable data pipelines from ingestion through transformation and delivery.
Design, build, and maintain our data warehouse and data marts.
Partner with stakeholders to translate business needs into clean data models.
Gurobi Optimization focuses on mathematical optimization. They empower customers to expand their use of mathematical optimization technology in order to make smarter decisions and solve some of the world's toughest and most impactful business problems.
Design, implement, and maintain robust, scalable data pipelines to support AI, analytics, and operational reporting
Own and evolve the data warehouse architecture, ensuring it meets performance, flexibility, and governance needs
Ensure data integrity, availability, lineage, and observability across complex pipelines
Remote People is building the infrastructure to power borderless teams. Their technology handles global payroll, benefits, taxes, and compliance, enabling businesses to compliantly hire anyone anywhere at the push of a button. They are a growing, international family.
Architect, develop, and deploy robust, scalable data solutions using Azure tools.
Design and optimize ETL/ELT data pipelines using Python, PySpark, and SQL.
Build and manage modern data architectures, including data lakes and warehouses.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. The system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design and implement scalable, high-performing data pipelines and optimize our data architecture.
Build and deploy cloud-native solutions leveraging Azure Data Services, Databricks and other Big Data technologies.
Collaborate across teams to understand and support their data needs while ensuring the data architecture supports ongoing and future initiatives.
Redcare Pharmacy is Europe’s No.1 e-pharmacy, powered by passionate teams and cutting-edge innovation. They strive to create a healthy collaborative work environment where every employee feels valued and inspired to contribute to their vision “Until every human has their health”.
Build and maintain production quality data pipelines between operational systems and BigQuery.
Implement data quality and freshness checks and monitoring processes to ensure data accuracy and consistency.
Maintain and contribute to our ingestion framework that leverages various purpose-built data load tool (dlt) connectors.
Grafana Labs is a remote-first, open-source powerhouse that provides visualization tools. They help companies manage their observability strategies with the Grafana LGTM Stack and have a global collaborative culture with a passion for meaningful work.
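The freshness checks mentioned above reduce to comparing the newest loaded timestamp against an SLA threshold. A sketch in plain Python (the table and column names are illustrative; in practice this would query BigQuery rather than an in-memory list):

```python
# Sketch of a data freshness check: fail if the newest loaded record is older
# than an allowed maximum age.
from datetime import datetime, timedelta, timezone

def is_fresh(rows, timestamp_col, max_age, now=None):
    """Return True if the most recent row is within max_age of now."""
    now = now or datetime.now(timezone.utc)
    latest = max(r[timestamp_col] for r in rows)
    return (now - latest) <= max_age

now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
rows = [
    {"loaded_at": now - timedelta(hours=3)},
    {"loaded_at": now - timedelta(minutes=30)},
]
print(is_fresh(rows, "loaded_at", timedelta(hours=1), now=now))      # True
print(is_fresh(rows, "loaded_at", timedelta(minutes=10), now=now))   # False
```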
Build, optimize, and maintain data pipelines that power our business
Define and build out abstracted reusable data sets to be used for Business Intelligence, Marketing, and Data Science Research
Design, build, and evangelize a federated data validation framework to be used to monitor potential data inconsistencies
Garner Health strives to transform the healthcare economy, delivering accessible, high-quality healthcare. They are a fast-growing healthcare technology company dedicated to making a meaningful impact on healthcare at scale with a team of talented, mission-driven individuals.
Manage and mentor a high-performing team, fostering a culture of technical excellence.
Define the Data Engineering team vision, balancing immediate business needs with a long-term shift towards a self-service data mesh architecture.
Oversee the development of core data pipelines and platform tools, ensuring high performance for ingestion services.
UW provides utilities all in one place with one bill for energy, broadband, mobile and insurance, targeting savings for customers. They are aiming to double in size and are looking for people to help them achieve this goal through innovation and impact.
Design, build, and optimize scalable data architectures that power marketing analytics and survey measurement initiatives.
Deliver automated, high-impact data solutions and insights that enhance decision-making across teams.
Build robust pipelines, dashboards, and analytical frameworks in fast-paced environments.
ItD is a consulting and software development company that blends diversity, innovation, and integrity with real business results. They reject any strong hierarchy, empowering teams to deliver excellent results in a woman- and minority-led firm.
Assemble and manage large, complex datasets that meet both non-functional and functional business requirements.
Identify, design, and implement internal process improvements to enhance scalability, optimize data delivery, and automate manual processes.
Build and maintain optimal data pipeline architecture for efficient extraction, transformation, and loading of data from various sources.
Sanford Health is one of the largest and fastest-growing not-for-profit health systems in the United States, dedicated to the work of health and healing. The organization has 53,000 employees and serves over 2 million patients across the upper Midwest.
Design, develop, and maintain scalable data pipelines using cloud data services.
Serve as a technical leader, defining data engineering standards and best practices.
Lead the design and implementation of optimized data models in our cloud data warehouse.
Constant Contact empowers people by giving them the help and tools they need to grow online. They are energized by new challenges and possibilities, and they celebrate diversity and inclusion with programs in place to bring people together.
Design, build, and maintain data pipelines connecting internal systems to the organisation’s Delta Lake environment.
Develop and optimise SQL-based data transformations and relational data models to support analytics and reporting.
Integrate new data sources and systems into the data platform as the organisation expands its technology landscape.
Smart Working believes your job should not only look right on paper but also feel right every day. It connects skilled professionals with outstanding global teams and products for full-time, long-term roles, offering a genuine community that values growth and well-being.
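The SQL-based transformation work described above usually means rolling raw tables up into reporting models inside the warehouse. A self-contained sketch using sqlite3 as a stand-in engine (table and column names are illustrative; the real target would be a Delta Lake or warehouse environment):

```python
# Sketch of a SQL transformation: raw events aggregated into a per-customer
# reporting table, with sqlite3 standing in for the warehouse engine.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INT, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "acme", 10.0), (2, "acme", 15.0), (3, "globex", 7.5)],
)

# Transformation step: aggregate raw rows into a reporting model.
conn.execute("""
    CREATE TABLE rpt_customer_totals AS
    SELECT customer, COUNT(*) AS order_count, SUM(amount) AS total_amount
    FROM raw_orders
    GROUP BY customer
""")

rows = conn.execute(
    "SELECT customer, order_count, total_amount "
    "FROM rpt_customer_totals ORDER BY customer"
).fetchall()
print(rows)  # [('acme', 2, 25.0), ('globex', 1, 7.5)]
```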