Architect, design, implement, and operate end-to-end data engineering solutions.
Develop and manage robust data integrations with external vendors.
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.
SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. Reaching over 59 million people each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.
Partner with clients and implementation teams to understand data distribution requirements.
Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, their bold, curious, and collaborative team is tackling big challenges in an industry that's ready for change.
Build and evolve our semantic layer, design, document, and optimize dbt models.
Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.
Design, build, and maintain scalable data pipelines.
Develop and implement data models for analytical use cases.
Implement data quality checks and governance practices.
MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.
Design, develop, and maintain reliable end-to-end data pipelines that connect internal and external systems.
Contribute to the performance, scalability, and reliability of our entire data ecosystem.
Work with analysts to engineer data structures and orchestrate workflows that encode core business logic.
Roo is on a mission to empower animal healthcare professionals with opportunities to earn more and achieve greater flexibility in their careers and personal lives. Powered by groundbreaking technology, Roo has built the industry-leading veterinary staffing platform, connecting Veterinarians, Technicians, and Assistants with animal hospitals for relief work and hiring opportunities.
Design, build, and maintain robust data pipelines.
Own and scale ETL/ELT processes using tools like dbt, BigQuery, and Python.
Build modular data models that power analytics, product features, and LLM agents.
Jobgether is a platform that uses AI to match candidates with jobs. They aim to review applications quickly and fairly, ensuring the top-fitting candidates are identified and shared with hiring companies.
Partner with our customer teams to develop engineering plans for implementations with our health system partners.
Build and support robust batch and streaming pipelines.
Evolve the maturity of our monitoring systems and processes to improve visibility and failure detection in our infrastructure
Paradigm is rebuilding the clinical research ecosystem by enabling equitable access to trials for all patients. Incubated by ARCH Venture Partners and backed by leading healthcare and life sciences investors, Paradigm's seamless infrastructure, implemented at healthcare provider organizations, will bring potentially life-saving therapies to patients faster.
Design, implement, and optimize robust and scalable data pipelines using SQL, Python, and cloud-based ETL tools such as Databricks.
Enhance our overarching data architecture strategy, assisting in decisions related to data storage, consumption, integration, and management within cloud environments.
Partner with data scientists, BI teams, and other engineering teams to understand and translate complex data requirements into actionable engineering solutions.
The New York Blood Center is a nonprofit blood collection and medical research organization. They are looking for a Senior Data Engineer to join their team.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis
ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries, and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Design, build, and maintain scalable data pipelines and workflows in Snowflake.
Integrate and ingest data from multiple systems into Snowflake.
Develop and optimize SQL queries, views, and materialized datasets.
GTX Solutions is a consulting firm specializing in modern data architecture, Customer Data Platforms (CDPs), and marketing technology enablement. They work with enterprise clients across industries including Retail, Travel, Hospitality, and Financial Services to design and implement scalable data ecosystems.
Design and develop data integration pipelines across cloud and legacy systems.
Lead and support data engineering implementation efforts.
Apply advanced analytics techniques to support business insights and decision-making.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company, with the final decision managed by the internal team.
Collaborate with business leaders, engineers, and product managers to understand data needs.
Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
NBCUniversal is one of the world's leading media and entertainment companies, creating world-class content that they distribute across their portfolio of film, television, and streaming, and bring to life through their global theme park destinations, consumer products, and experiences. They champion an inclusive culture and strive to attract and develop a talented workforce to create and deliver a wide range of content reflecting the world.
Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data.
Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Maintain, configure, and optimize the existing data warehouse platform and pipelines.
Design and implement incremental data integration solutions prioritizing data quality, performance, and cost-efficiency.
Drive innovation by experimenting with new technologies and recommending platform improvements.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements.
Design, develop, and optimize data pipelines and ETL processes to ensure high-quality data is available for analysis.
Analyze complex datasets to identify trends, patterns, and actionable insights that drive business performance.
Implement data quality checks and governance best practices to ensure data accuracy and reliability.
Modeling Data Solutions is seeking an experienced data analytics engineer to join its personal lines property team. This is an exciting opportunity to join the US Data Science Infrastructure department, helping to create cutting-edge pricing programs.
Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark.
Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
Collaborate with cross-functional teams to deliver impactful data solutions.
Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. They connect top-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.
Oversee the design, development, implementation, and maintenance of all data-related systems.
Build and maintain scalable and reliable data pipelines, ensuring data quality and consistency across all systems.
Collaborate with data scientists, business analysts, and other stakeholders to identify data-related needs and requirements.
MoneyHero Group is a market-leading financial products platform in Greater Southeast Asia, reaching millions of monthly unique users and working with hundreds of commercial partners. They have a team of over 350 talented individuals and are backed by world-class organizations and companies.
Creating and maintaining optimal data pipeline architecture.
Assembling large, complex data sets that meet functional & non-functional business requirements.
Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using relevant technologies.
Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.
Design, build, and optimize robust and scalable data pipelines into our production BigQuery data warehouse.
Mentor other engineers, lead complex projects, and set high standards for data quality and engineering excellence.
Empower our BI tools, reporting, Marketing, and Data Science initiatives by ensuring a highly reliable and performant data ecosystem.
Peerspace is the leading online marketplace for venue rentals for meetings, productions, and events, opening doors to inspiring spaces worldwide. They have facilitated over $500M in transactions and are backed by investors like GV (Google Ventures) and Foundation Capital.