Design, build, and maintain scalable data pipelines.
Develop and optimize ETL/ELT processes using cloud data technologies.
Partner with teams to understand data requirements and improve data capture strategies.
Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams of people with diverse skill sets and backgrounds.
Design, develop, and maintain scalable data pipelines using cloud data services.
Serve as a technical leader, defining data engineering standards and best practices.
Lead the design and implementation of optimized data models in our cloud data warehouse.
Constant Contact empowers people by giving them the help and tools they need to grow online. They are energized by new challenges and possibilities, and they celebrate diversity and inclusion with programs in place to bring people together.
Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
Implement data quality checks, monitoring, and validation processes.
Automate manual processes into centralized and scalable solutions.
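In practice, a bullet like "implement data quality checks, monitoring, and validation processes" often starts as small as the sketch below. All field names, rule names, and sample records are illustrative assumptions, not any employer's actual stack:

```python
# Minimal sketch of row-level data quality checks (illustrative only).

def check_not_null(rows, field):
    """Return rows where `field` is missing or None."""
    return [r for r in rows if r.get(field) is None]

def check_unique(rows, field):
    """Return values of `field` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(field)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

def run_checks(rows):
    """Run a small battery of checks and summarize failures."""
    return {
        "null_ids": len(check_not_null(rows, "id")),
        "duplicate_ids": check_unique(rows, "id"),
    }

records = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 12.5},   # duplicate id
    {"id": None, "amount": 3.0}, # missing id
]
print(run_checks(records))  # {'null_ids': 1, 'duplicate_ids': [1]}
```

Centralizing checks like these is usually the first step toward the "centralized and scalable solutions" these roles describe.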
Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2,000 colleagues worldwide, and the company is traded on Nasdaq as part of Informa PLC.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis.
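The "ingest, process, and transform" step in roles like this can be sketched in a few lines of plain Python. In a real pipeline the raw input would come from a source like S3 and the output would land in a warehouse such as Redshift; both ends are omitted here, and the column names and sample data are assumptions for illustration:

```python
import csv
import io

# Minimal sketch of an ETL transform step: parse raw CSV, drop
# incomplete rows, and cast fields into analysis-ready types.
RAW = """order_id,region,amount
1,us-east,100.50
2,eu-west,
3,us-east,42.00
"""

def transform(raw_csv):
    """Drop rows with missing amounts and cast types."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    out = []
    for row in reader:
        if not row["amount"]:
            continue  # skip incomplete records
        out.append({
            "order_id": int(row["order_id"]),
            "region": row["region"],
            "amount": float(row["amount"]),
        })
    return out

print(transform(RAW))
```

The same shape (read, validate, reshape, write) recurs whether the runtime is a Lambda function, a Glue job, or an EMR task.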
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.
Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
SmartAsset is an online destination for consumer-focused financial information and advice, with a mission of helping people make smart financial decisions; it reaches an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.
Build and maintain data pipelines and transform raw data into reliable models.
Develop Tableau dashboards that put insights in front of clients.
Work directly with clients and shape how their platform evolves.
DataDrive is a fast-growing managed analytics service provider. They support ongoing training, adoption, and growth of their clients’ data cultures and offer a unique team-oriented environment.
Experience integrating data from multiple data sources.
Experience with database technologies such as SQL Server, Redshift, Postgres, and RDS.
Experience designing, building, and maintaining data pipelines.
Bluelight Consulting is a leading software consultancy dedicated to designing and developing innovative technology that enhances users' lives. With a presence across the United States and Central/South America, Bluelight is in an exciting phase of expansion, continually seeking exceptional talent to join its dynamic and diverse community.
Own the delivery of scalable internal data solutions.
Translate business needs into clear technical designs and working systems.
Build and improve data pipelines, integrations, and automation.
Transparent Hiring is recruiting for a fast-growing reinsurance company operating across Germany and the United States. The environment is collaborative and driven by a strong “build and ship” mindset.
Design and implement scalable data architectures to support business needs.
Build and optimize data pipelines, ensuring data accessibility and security.
Develop and maintain data models, databases, and data lakes, with robust data governance.
Terawatt Infrastructure delivers large-scale, turnkey charging solutions for companies rapidly deploying AV and EV fleets. With a growing portfolio of sites across the US, Terawatt is building the permanent transportation and logistics infrastructure of tomorrow through capital, real estate, development, and site operations solutions.
Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning.
Partner cross-functionally with analytics, product, engineering, and operations to deliver high-quality data solutions that drive measurable business impact.
Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response.
Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.
Design, implement, and maintain robust, scalable data pipelines to support AI, analytics, and operational reporting.
Own and evolve the data warehouse architecture, ensuring it meets performance, flexibility, and governance needs.
Ensure data integrity, availability, lineage, and observability across complex pipelines.
Remote People is building the infrastructure to power borderless teams. Their technology handles global payroll, benefits, taxes, and compliance, enabling businesses to compliantly hire anyone anywhere at the push of a button. They are a growing, international family.
Design and evolve scalable data pipelines and architectures.
Act as the primary anchor for data ingestion, transformation, and storage solutions.
Ensure mission-critical data is accessible and reliable.
CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.
Build, optimize, and maintain data pipelines that power our business.
Define and build abstracted, reusable datasets for Business Intelligence, Marketing, and Data Science research.
Design, build, and evangelize a federated data validation framework to monitor potential data inconsistencies.
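A "federated" validation framework usually means a shared runner that individual teams plug their own rules into. A minimal sketch of that pattern, where the rule names, record fields, and decorator-based registry are all illustrative assumptions:

```python
# Minimal sketch of a federated validation framework: teams register
# their own rules against a shared runner via a decorator.

RULES = {}

def rule(name):
    """Decorator that registers a validation rule by name."""
    def wrap(fn):
        RULES[name] = fn
        return fn
    return wrap

@rule("non_negative_amount")
def non_negative_amount(record):
    return record.get("amount", 0) >= 0

@rule("has_customer_id")
def has_customer_id(record):
    return bool(record.get("customer_id"))

def validate(record):
    """Run every registered rule; return names of failed rules."""
    return [name for name, fn in RULES.items() if not fn(record)]

print(validate({"amount": -5}))  # ['non_negative_amount', 'has_customer_id']
```

The appeal of the federated design is that the central team owns only `rule` and `validate`, while each domain team owns the rules for its own data.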
Garner Health strives to transform the healthcare economy, delivering accessible, high-quality healthcare. They are a fast-growing healthcare technology company dedicated to making a meaningful impact on healthcare at scale with a team of talented, mission-driven individuals.
Assist in building and maintaining ETL/ELT pipelines for healthcare datasets.
Support the development of data models and data transformations aligned with healthcare standards.
Contribute to data quality checks, validation rules, and documentation for healthcare data assets.
Curana Health is committed to radically improving the health, happiness, and dignity of older adults. They are a national leader in value-based care, serving 200,000+ seniors in 1,500+ communities across 32 states with over 1,000 clinicians and other professionals.
QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.
Design, build, and maintain data pipelines connecting internal systems to the organization's Delta Lake environment.
Develop and optimize SQL-based data transformations and relational data models to support analytics and reporting.
Integrate new data sources and systems into the data platform as the organization expands its technology landscape.
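A SQL-based transformation like the one described above typically aggregates raw records into a reporting table. In the sketch below, SQLite stands in for the Delta Lake / warehouse engine, and the table and column names are assumptions for illustration:

```python
import sqlite3

# Minimal sketch of a SQL-based transformation: aggregate raw events
# into a reporting table. SQLite is used here only so the example is
# self-contained; the real target would be a lakehouse or warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (user_id TEXT, amount REAL);
    INSERT INTO raw_events VALUES ('a', 10.0), ('a', 5.0), ('b', 7.5);

    -- Transformation: one row per user with total spend.
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(amount) AS total
    FROM raw_events
    GROUP BY user_id;
""")
totals = dict(conn.execute("SELECT user_id, total FROM user_totals ORDER BY user_id"))
print(totals)  # {'a': 15.0, 'b': 7.5}
```

The `CREATE TABLE ... AS SELECT` shape is the same one most ELT tooling generates under the hood for derived models.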
Smart Working believes your job should not only look right on paper but also feel right every day. It connects skilled professionals with outstanding global teams and products for full-time, long-term roles, offering a genuine community that values growth and well-being.
Build core infrastructure software (pipelines, APIs, data modeling) as part of our client's data platform team.
Coach and mentor other engineers to support the growth of their technical expertise.
Implement the appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.
YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.
Assemble and manage large, complex datasets that meet both non-functional and functional business requirements.
Identify, design, and implement internal process improvements to enhance scalability, optimize data delivery, and automate manual processes.
Build and maintain optimal data pipeline architecture for efficient extraction, transformation, and loading of data from various sources.
Sanford Health is one of the largest and fastest-growing not-for-profit health systems in the United States, dedicated to the work of health and healing. The organization has 53,000 employees and serves over 2 million patients across the upper Midwest.
Lead the architecture and evolution of scalable, distributed data pipelines, ensuring high availability and performance at scale.
Build and maintain distributed web scraping systems using tools such as Playwright, Selenium, and BeautifulSoup.
Integrate AI and LLMs into engineering workflows for code generation, automation, and optimization.
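The parsing half of a scraping system like the one named above can be sketched without any third-party dependency. Python's stdlib `html.parser` stands in for BeautifulSoup here so the example is self-contained, and the sample page is an illustrative assumption:

```python
from html.parser import HTMLParser

# Minimal sketch of the parse step in a scraping pipeline: walk the
# HTML and collect the href of every anchor tag.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

PAGE = '<html><body><a href="/jobs/1">Data Engineer</a><a href="/jobs/2">ETL Developer</a></body></html>'
parser = LinkExtractor()
parser.feed(PAGE)
print(parser.links)  # ['/jobs/1', '/jobs/2']
```

In production, Playwright or Selenium would render the page first and a library like BeautifulSoup would replace the hand-rolled parser; the extract-links-then-fetch loop stays the same.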
MercatorAI is building scalable data infrastructure to power high-quality, data-driven decision making at scale. As an early-stage company, the team is focused on creating robust, future-ready systems that can handle complex data ingestion, transformation, and delivery across a growing national footprint.
Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines.
Architect and implement robust data infrastructure capable of handling high-volume data ingestion and processing.
Implement automated data quality checks, validation rules, and monitoring frameworks.
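"Automated data quality checks and validation rules" often begins with schema validation at the ingestion boundary. A minimal sketch, where the expected schema, field names, and sample record are all assumptions for illustration:

```python
# Minimal sketch of schema validation for ingested records.
SCHEMA = {"event_id": int, "source": str, "value": float}

def validate_schema(record):
    """Return a list of (field, problem) tuples for schema violations."""
    problems = []
    for field, expected in SCHEMA.items():
        if field not in record:
            problems.append((field, "missing"))
        elif not isinstance(record[field], expected):
            problems.append((field, "wrong type"))
    return problems

print(validate_schema({"event_id": "7", "source": "api"}))
# [('event_id', 'wrong type'), ('value', 'missing')]
```

Wiring a check like this into a monitoring framework is then a matter of counting violations per batch and alerting when the count crosses a threshold.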
ShyftLabs is a data product company founded in early 2020 that works with Fortune 500 companies. They deliver digital solutions to help accelerate the growth of businesses across various industries through innovation; they also value strong business awareness.