Design, build, and optimize scalable data pipelines using Databricks, Apache Spark, Delta Lake, and Unity Catalog.
Develop ingestion frameworks for structured and semi-structured data from multiple enterprise sources.
Implement data governance, data quality, and security controls across the data lifecycle.
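Data quality and governance controls like those above usually start with per-record validation at ingestion. As a hedged sketch only — the schema and rules below are invented for illustration, not taken from any listing — a minimal quality gate might look like:

```python
# Minimal data-quality gate for semi-structured records (illustrative only;
# the required fields and rules below are hypothetical).

REQUIRED_FIELDS = {"id", "source", "payload"}

def validate_record(record: dict) -> list:
    """Return a list of quality-rule violations for one ingested record."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing fields: %s" % sorted(missing))
    if "id" in record and not str(record["id"]).strip():
        errors.append("id is empty")
    return errors

def partition_batch(records):
    """Split a batch into (clean, quarantined) for downstream processing."""
    clean, quarantined = [], []
    for rec in records:
        (quarantined if validate_record(rec) else clean).append(rec)
    return clean, quarantined
```

In practice the same split drives governance reporting: quarantined records land in an audit table rather than being silently dropped.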
Bridgenext is a digital consulting services leader that helps clients innovate with intention and realize their digital aspirations by creating digital products, experiences, and solutions around what real people need. Our global consulting and delivery teams facilitate highly strategic digital initiatives through digital product engineering, automation, data engineering, and infrastructure modernization services.
Design, develop, and maintain data pipelines using Azure Databricks
Build and optimize data transformations using PySpark and SQL in Databricks
Implement and maintain Lakehouse architectures using Delta Lake
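Maintaining a Delta Lake Lakehouse typically centers on MERGE (upsert) semantics: match incoming rows on a key, update matches, insert the rest. The toy below models only those semantics in pure Python — it is not the delta-spark `DeltaTable.merge` API, and the key/row layout is hypothetical:

```python
# Pure-Python sketch of Delta MERGE semantics: WHEN MATCHED THEN UPDATE,
# WHEN NOT MATCHED THEN INSERT. Real pipelines would use delta-spark's
# DeltaTable.merge; this dict-as-table model is for illustration only.

def merge_upsert(target: dict, updates: list, key: str = "id") -> dict:
    """Return a new 'table version' with updates applied by key."""
    merged = dict(target)  # leave the prior version untouched, like a new snapshot
    for row in updates:
        merged[row[key]] = row  # matched keys overwritten, new keys inserted
    return merged
```

Returning a new dict rather than mutating mirrors how Delta writes produce a new table version instead of editing files in place.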
Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation, supporting digital transformation for some of the world's largest enterprises. They employ nearly 1,000 full-time professionals, and their annual growth rate exceeds 25%.
Design and implement scalable data ingestion and transformation pipelines using Databricks and cloud platforms
Lead architecture decisions for modern data platforms, including Medallion Architecture and Lakehouse patterns
Build and maintain ETL/ELT pipelines using Python and SQL, following engineering best practices
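Medallion Architecture, mentioned above, layers tables as bronze (raw), silver (cleaned and deduplicated), and gold (aggregated for consumption). A hedged pure-Python sketch of that flow — the field names are invented, and real implementations would run these steps as Spark jobs over Delta tables:

```python
# Hypothetical Medallion-style layering: bronze rows are raw events, silver
# cleans and deduplicates them, gold aggregates for consumers.

def to_silver(bronze_rows):
    """Drop malformed rows and deduplicate by event id (last write wins)."""
    seen = {}
    for row in bronze_rows:
        if row.get("event_id") and row.get("amount") is not None:
            seen[row["event_id"]] = row
    return list(seen.values())

def to_gold(silver_rows):
    """Aggregate silver rows into a per-customer revenue summary."""
    totals = {}
    for row in silver_rows:
        totals[row["customer"]] = totals.get(row["customer"], 0) + row["amount"]
    return totals
```

The point of the layering is that each stage is independently testable and replayable from the one below it.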
AOT Technologies helps enterprises and governments bring their ideas to life. As a boutique consulting firm, they partner with enterprises, startups, and governments to solve complex, mission-critical challenges. Their teams are collaborative and their leadership is transparent.
Design and implement scalable data architectures to support business needs.
Build and optimize data pipelines, ensuring data accessibility and security.
Develop and maintain data models, databases, and data lakes, with robust data governance.
Terawatt Infrastructure delivers large-scale, turnkey charging solutions for companies rapidly deploying AV and EV fleets. With a growing portfolio of sites across the US, Terawatt is building the permanent transportation and logistics infrastructure of tomorrow through capital, real estate, development, and site operations solutions.
Implement robust data infrastructure in AWS, using Spark with Scala
Evolve core data pipelines to efficiently scale for our massive growth
Store data in optimal engines and formats
tvScientific is a CTV advertising platform purpose-built for performance marketers. They leverage data and science to automate and optimize TV advertising to drive business outcomes. tvScientific was built by industry leaders with a history in programmatic advertising.
Build core infrastructure software (pipelines, APIs, data modeling) as part of our client's data platform team.
Coach and mentor other engineers to support the growth of their technical expertise.
Implement appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.
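"Soft real-time" consumption is commonly implemented as micro-batching: buffer a stream and flush when a batch fills or a time budget expires. A minimal sketch, with arbitrary illustrative sizes and timeouts:

```python
# Micro-batching sketch for soft real-time consumption: flush on whichever
# comes first, batch size or elapsed wall-clock time. Defaults are arbitrary.

import time

def micro_batches(stream, max_size=3, max_wait_s=0.5, clock=time.monotonic):
    """Yield batches bounded by both item count and elapsed time."""
    batch, deadline = [], clock() + max_wait_s
    for item in stream:
        batch.append(item)
        if len(batch) >= max_size or clock() >= deadline:
            yield batch
            batch, deadline = [], clock() + max_wait_s
    if batch:  # flush the final partial batch
        yield batch
```

Injecting `clock` keeps the time-based flush path unit-testable without real waiting.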
YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.
Design and implement scalable, high-performing data pipelines and optimize our data architecture.
Build and deploy cloud-native solutions leveraging Azure Data Services, Databricks and other Big Data technologies.
Collaborate across teams to understand and support their data needs while ensuring the data architecture supports ongoing and future initiatives.
Redcare Pharmacy is Europe’s No.1 e-pharmacy, powered by passionate teams and cutting-edge innovation. They strive to create a healthy collaborative work environment where every employee feels valued and inspired to contribute to their vision “Until every human has their health”.
Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimize data models, and enable self-serve analytics and AI.
End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.
Trustonic makes smartphones affordable, enabling global access to devices and digital finance through secure smartphone locking technology. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. The company celebrates its diversity and is looking to do the right thing: for each other, the community and the planet.
Identify structural weaknesses and eliminate operational fragility.
Define clear ingestion, validation, and testing standards across the platform.
Drive ambiguous initiatives from concept to production-ready outcomes.
Life360's mission is to keep people close to the ones they love. By continuing to innovate and deliver for our customers, they have become a household name and the must-have mobile-based membership for families. Life360 has more than 500 (and growing!) remote-first employees.
Build and lead a team of 4-5 data engineers focused on reusable product artifacts
Own the product data engineering backlog in partnership with product management
Define and enforce technical standards for notebooks, pipelines, QC modules, and documentation
Qualified Health is redefining what’s possible with Generative AI in healthcare. They provide the guardrails for safe AI governance, healthcare-specific agent creation, and real-time algorithm monitoring, working alongside leading health systems to drive real change. They are a fast-growing company backed by premier investors.
Architect, implement, and maintain scalable data architectures.
Develop, optimize, and maintain ETL processes.
Optimize data processing and query performance.
Blueprint Technologies is a technology solutions firm that helps organizations unlock value from existing assets by leveraging cutting-edge technology. Their teams have unique perspectives and years of experience across multiple industries. They believe in unique perspectives and build teams of people with diverse skillsets and backgrounds.
Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
Implement data quality checks, monitoring, and validation processes.
Automate manual processes into centralized and scalable solutions.
Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2000 colleagues worldwide and traded on Nasdaq as part of Informa PLC.
Architect, develop, and deploy robust, scalable data solutions using Azure tools.
Design and optimize ETL/ELT data pipelines using Python, PySpark, and SQL.
Build and manage modern data architectures, including data lakes and warehouses.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. The system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design and evolve scalable data pipelines and architectures.
Act as the primary anchor for data ingestion, transformation, and storage solutions.
Ensure mission-critical data is accessible and reliable.
CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis
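Efficient retrieval from S3-backed lakes usually hinges on a Hive-style partitioned key layout (`year=/month=/day=`), which lets engines such as Redshift Spectrum, Glue, or Athena prune scans. A hedged sketch of building such keys — the dataset and file names are placeholders:

```python
# Hive-style partitioned S3 key layout so query engines can prune by date.
# Dataset and file names below are hypothetical placeholders.

from datetime import date

def partitioned_key(dataset: str, day: date, filename: str) -> str:
    """Build an S3 object key with date partitions for partition pruning."""
    return (
        "%s/year=%d/month=%02d/day=%02d/%s"
        % (dataset, day.year, day.month, day.day, filename)
    )
```

Zero-padding month and day keeps lexicographic key order aligned with chronological order, which matters for prefix listing.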
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.
Design, build, and maintain robust and scalable data pipelines.
Leverage the Microsoft Fabric ecosystem to unify data storage and analytics.
Architect and maintain automated CI/CD pipelines using Azure DevOps.
66degrees is an end-to-end AI transformation partner that guides enterprises from complex business challenges to clear, quantifiable outcomes. They are a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology.
Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning
Partner cross-functionally with analytics, product, engineering, and operations to deliver high-quality data solutions that drive measurable business impact
Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response
Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.
Lead the development of robust data pipelines and optimize data architecture.
Translate complex requirements into scalable data solutions.
JBS is an equal opportunity employer that values its employees. They are committed to hiring individuals authorized for employment in the United States on a W-2 basis.
Design and implement scalable, high-throughput data ingestion systems.
Build and evolve a centralized data lake using Apache Iceberg.
Provide technical leadership through mentorship, code reviews, and design discussions.
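A centralized lake on Apache Iceberg, as described above, gets its safety properties from immutable table snapshots, which enable time travel and concurrent writers. The toy model below illustrates only that idea — it is not the pyiceberg API, and the snapshot structure is invented:

```python
# Toy model of Iceberg-style snapshot isolation: every commit produces a new
# immutable snapshot, and reads can pin to any past snapshot (time travel).
# Not the real pyiceberg API; for illustration only.

class ToyIcebergTable:
    """Append-only list of snapshots; reads resolve against a snapshot id."""

    def __init__(self):
        self.snapshots = [frozenset()]  # snapshot 0: empty table

    def commit(self, added_files):
        """Create a new snapshot containing the prior files plus added ones."""
        new = self.snapshots[-1] | frozenset(added_files)
        self.snapshots.append(new)
        return len(self.snapshots) - 1  # id of the new snapshot

    def scan(self, snapshot_id=-1):
        """Read the file set as of a given snapshot; default is latest."""
        return set(self.snapshots[snapshot_id])
```

Because past snapshots are never mutated, a reader pinned to an old id sees a consistent table no matter how many commits land meanwhile.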
Coupa provides a total spend management platform for businesses, which uses community-generated AI to multiply margins. They have a collaborative culture driven by transparency, openness, and a shared commitment to excellence, and are expanding their impact across the globe.