Architect our AWS-based data warehouse and ingestion pipelines.
Transform high-volume simulation outputs into clean, trusted datasets.
Establish schema standards and data contracts with engineering.
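Data contracts like those described above are often enforced as a lightweight schema check at ingestion time. A minimal illustrative sketch in Python; all field names and types here are hypothetical, not an actual Onebrief schema:

```python
# Minimal data-contract check: a record must satisfy the agreed schema
# before it is loaded into the warehouse. Field names are hypothetical.
REQUIRED_FIELDS = {"run_id": str, "timestamp": str, "metric": float}

def validate_record(record: dict) -> list[str]:
    """Return a list of contract violations (an empty list means the record passes)."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"wrong type for {field}: {type(record[field]).__name__}")
    return errors
```

In practice a check like this would sit at the boundary between engineering's producers and the warehouse, rejecting or quarantining records that violate the contract.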
Onebrief provides collaboration and AI-powered workflow software for military staffs, making them faster, smarter, and more efficient. Founded in 2019, the company values ownership and excellence, with a team spanning veterans and technologists; it has raised $320M+ from investors and is valued at $2.15B.
Design and build robust, highly scalable data pipelines and lakehouse infrastructure with PySpark, Databricks, and Airflow on AWS.
Improve the data platform development experience for Engineering, Data Science, and Product by creating intuitive abstractions, self‑service tooling, and clear documentation.
Own and maintain core data pipelines and models that power internal dashboards, ML models, and customer-facing products.
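"Intuitive abstractions" for pipeline authors often take the shape of a small step-registration layer that hides orchestration details. A purely hypothetical sketch of the idea; the `Pipeline` class and step names are illustrative, not a real internal API:

```python
# Hypothetical self-service pipeline abstraction: authors register named
# transformation steps, and the runner threads data through them in order.
from typing import Callable

class Pipeline:
    def __init__(self, name: str):
        self.name = name
        self.steps: list[tuple[str, Callable]] = []

    def step(self, name: str):
        """Decorator that registers a transformation step under a readable name."""
        def register(fn: Callable) -> Callable:
            self.steps.append((name, fn))
            return fn
        return register

    def run(self, data):
        for _name, fn in self.steps:
            data = fn(data)  # each step consumes the previous step's output
        return data

events = Pipeline("events")

@events.step("drop_nulls")
def drop_nulls(rows):
    return [r for r in rows if all(v is not None for v in r.values())]

@events.step("tag_source")
def tag_source(rows):
    return [{**r, "source": "ingest"} for r in rows]
```

A layer like this lets analytics and product engineers add steps without touching the scheduler; in a real stack the runner would delegate to Airflow or Databricks jobs rather than a plain loop.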
Parafin aims to grow small businesses by providing them with the financial tools they need through the platforms they already sell on. They are a Series C company backed by prominent venture capitalists, with a tight-knit team of innovators from companies like Stripe, Square, and Coinbase.
Identify structural weaknesses and eliminate operational fragility.
Define clear ingestion, validation, and testing standards across the platform.
Drive ambiguous initiatives from concept to production-ready outcomes.
Life360's mission is to keep people close to the ones they love. By continuing to innovate and deliver for their customers, they have become a household name and the must-have mobile-based membership for families. Life360 has more than 500 (and growing!) remote-first employees.
Design, develop, and maintain scalable data pipelines using cloud data services.
Serve as a technical leader, defining data engineering standards and best practices.
Lead the design and implementation of optimized data models in our cloud data warehouse.
Constant Contact empowers people by giving them the help and tools they need to grow online. They are energized by new challenges and possibilities, and they celebrate diversity and inclusion with programs in place to bring people together.
Build and operate data services driving our applications and APIs.
Collaborate with team members and across Engineering to iteratively prototype and develop new functionality.
Partner with product managers and other Zusers.
Zus is a shared health data platform designed to accelerate healthcare data interoperability by providing easy-to-use patient data via API, embedded components, and direct EHR integrations. Founded in 2021, it partners with HIEs and other data networks to aggregate patient clinical history and then translates that history into user-friendly information at the point of care.
Architect and sustain self-healing pipelines using Astronomer/Airflow to ensure 24/7 data availability.
Design and optimize event-driven API ingestion frameworks leveraging AWS Lambda and DLT (Data Load Tool).
Manage high-performance data modeling in Amazon Redshift, using dbt to transform raw transactional data into high-fidelity business intelligence.
Odisea helps close the opportunity gap between Colombia and the United States by redefining nearshoring. They are building a passionate team of professionals committed to this purpose.
Architect, build, and maintain highly scalable batch and streaming pipelines on the Snowflake Data Platform.
Architect and deliver ML/GenAI solutions using managed cloud services.
Implement modern data modeling and architecture patterns; establish and enforce standards for data quality.
FUJIFILM Biotechnologies focuses on making the next vaccine, cure, or gene therapy in partnership with some of the most innovative biopharma companies across the globe. They cultivate a culture that will fuel your passion, energy, and drive.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, Amazon EMR, Amazon S3, and AWS Lambda, to enable efficient data retrieval and analysis
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.
Design and implement modern data architectures and lakehouse solutions.
Provide technical leadership throughout the project lifecycle and collaborate with business stakeholders.
Create effort estimations and provide technical expertise during the pre-sales phase.
Hiflylabs is a Budapest-based company that provides effective solutions to business problems. They have 250+ employees, and their application development and business intelligence teams work across industries such as finance, telecommunications, and energy.
Design, build, maintain, and operate scalable streaming and batch data pipelines.
Work with AWS services, including Redshift, EMR, and ECS, to support data processing and analytics workloads.
Develop and maintain data workflows using Python and SQL.
Southworks helps companies with software development and digital transformation. They focus on solving complex problems and delivering innovative solutions.
Lead the development of robust data pipelines and optimize data architecture.
Translate complex requirements into scalable data solutions.
JBS is an equal opportunity employer that values its employees. They are committed to hiring individuals authorized for employment in the United States on a W2 basis.
Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimize data models, and enable self-serve analytics and AI.
End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.
Trustonic provides smartphone locking technology, enabling global access to devices and digital finance. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. They celebrate diversity and aim to do the right thing for each other, the community, and the planet.
Manage and mentor a high-performing team, fostering a culture of technical excellence.
Define the Data Engineering team vision, balancing immediate business needs with a long-term shift towards a self-service data mesh architecture.
Oversee the development of core data pipelines and platform tools, ensuring high performance for ingestion services.
UW bundles utilities into one place with a single bill for energy, broadband, mobile, and insurance, helping customers save. They are aiming to double in size and are looking for people to help them achieve this goal through innovation and impact.
Design and implement scalable, high-performing data pipelines and optimize our data architecture.
Build and deploy cloud-native solutions leveraging Azure Data Services, Databricks, and other big data technologies.
Collaborate across teams to understand and support their data needs while ensuring the data architecture supports ongoing and future initiatives.
Redcare Pharmacy is Europe’s No.1 e-pharmacy, powered by passionate teams and cutting-edge innovation. They strive to create a healthy collaborative work environment where every employee feels valued and inspired to contribute to their vision “Until every human has their health”.
Play a key role in designing, developing, and delivering modern data solutions that drive business insight and innovation.
Implement scalable, high-performing cloud architectures that support analytics, AI, and operational excellence.
Be responsible for technical delivery, authoring solution documentation, and ensuring data pipelines and models meet enterprise standards for performance, reliability, and cost efficiency.
3Cloud is a company where people aren’t afraid to experiment or fail. They hire people who care about the collective growth and success of the company and challenge each other to live by 3Cloud’s core values, resulting in amazing experiences and solutions for clients and each other.
Design and evolve scalable data pipelines and architectures.
Act as the primary anchor for data ingestion, transformation, and storage solutions.
Ensure mission-critical data is accessible and reliable.
CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.
Design, build, and maintain reliable software within a defined problem space.
Focus on strong execution, sound technical decision making, and delivering high quality software.
Support modernization and quality improvements through hands-on development and technical leadership.
Reveleer offers solutions in the healthcare technology sector. They value strong technical skills, code quality, and delivering high-quality software efficiently.
Build and maintain Azure Data Factory pipelines for data ingestion.
Write Python code in Databricks for data cleaning and transformation.
Monitor daily jobs and troubleshoot pipeline failures to ensure reliability.
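Cleaning and transformation steps of the kind described above typically normalize types and skip malformed rows rather than failing the whole job. A minimal stand-alone sketch; the column names (`customer`, `amount`) are hypothetical:

```python
def clean_rows(rows):
    """Normalize text fields, coerce amounts to float, and drop malformed rows."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({
                "customer": row["customer"].strip().lower(),
                "amount": float(row["amount"]),
            })
        except (KeyError, ValueError, AttributeError):
            continue  # malformed row: skip it rather than fail the whole batch
    return cleaned
```

In a Databricks notebook the same logic would usually run on a Spark DataFrame; skipped rows would normally be counted and surfaced to the monitoring mentioned above rather than silently dropped.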
Jobgether is a platform that helps candidates find relevant jobs through AI-powered matching. The company ensures applications are reviewed quickly, objectively, and fairly against the role's core requirements.
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional / non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and redesigning infrastructure for data-intensive applications and greater scalability.
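Redesigning delivery for scalability often comes down to processing large data sets incrementally instead of all at once; a minimal chunking helper sketches the idea (a toy illustration, not a specific internal tool):

```python
def chunked(iterable, size: int):
    """Yield fixed-size chunks so a large data set can be processed incrementally."""
    chunk = []
    for item in iterable:
        chunk.append(item)
        if len(chunk) == size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk  # emit the final, possibly short, chunk
```

Because it is a generator, only one chunk is held in memory at a time, which is the property that lets a pipeline scale past what fits in RAM.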
We are a leading software provider of Item Chain Management solutions to consumer brand, retail, and industrial enterprises around the globe. We also provide development services and support to third-party customers worldwide.