The Sr. Data Engineer, DevX creates the best developer experience for data and application engineers at Basis. They design, implement, and maintain deployment and ETL pipelines for data products, and integrate diverse data sources and vendor products, including databases, APIs, and third-party services.
Basis Technologies empowers agencies and brands with cutting-edge software that automates digital media operations, offering flexible work options across the U.S.
As a key member of our Data Engineering team, you will: Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives. Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture. Collaborate with the team to meet performance, scalability, and reliability goals.
PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.
Optimize SQL queries to maximize system performance.
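The kind of SQL tuning referred to above can be illustrated with a minimal, self-contained sketch (the table, column, and index names are hypothetical; the posting names no specific schema). Adding an index on a filtered column lets the planner replace a full table scan with an index search:

```python
import sqlite3

# Hypothetical schema: an events table filtered frequently by user_id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO events (user_id, payload) VALUES (?, ?)",
    [(i % 100, "x") for i in range(1000)],
)

query = "SELECT COUNT(*) FROM events WHERE user_id = ?"

# Without an index, the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# An index on the filtered column lets SQLite search instead of scan.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# Plan detail strings vary by SQLite version, so print rather than assert exact text.
print(plan_before[-1][-1])
print(plan_after[-1][-1])
```

The same approach generalizes to any engine: read the query plan, find the scans on filtered or joined columns, and add (or rewrite toward) indexes the planner can actually use.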
RefinedScience is dedicated to delivering high-quality emerging-tech solutions, and the role values innovation and collaboration.
Looking for young talent ready to go all in. Offering significant equity to people who want to build something that matters. Define the future of AI in influencer marketing.
Influur is redefining how advertising works through creators, data, and AI, aiming to make influencer marketing as measurable, predictable, and scalable as paid ads.
As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
Own the design, build, and optimization of end-to-end data pipelines. Establish and enforce best practices in data modeling, orchestration, and system reliability. Collaborate with stakeholders to translate requirements into robust, scalable data solutions.
YipitData is the leading market research and analytics firm for the disruptive economy and most recently raised $475M from The Carlyle Group at a valuation of over $1B.
Lead the creation and maintenance of the company's data infrastructure.
Build reporting capabilities for the business team.
Develop test infrastructure to maintain data integrity.
Real is a fast-growing global real estate brokerage powered by technology and driven by people, reimagining the residential real estate experience since 2014.
Design, build, and maintain scalable and reliable data pipelines.
Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.
Curinos empowers financial institutions to make better, faster and more profitable decisions through industry-leading proprietary data, technologies and insights.
Architect and maintain robust data pipelines to transform diverse data inputs.
Integrate data from various sources into a unified platform.
Build APIs with AI assistance to enable secure access to consolidated insights.
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
Design, build, and oversee the deployment of technology for managing structured and unstructured data.
Develop tools leveraging AI, ML, and big data to cleanse, organize, and transform data.
Design and maintain CI/CD pipelines using GitHub Actions to automate deployment, testing, and monitoring.
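As an illustration of the GitHub Actions setup described above, a minimal workflow might look like the following sketch. The job layout, Python version, `requirements.txt`, and `deploy.sh` step are all assumptions for illustration, not details from the posting:

```yaml
# .github/workflows/ci.yml (illustrative sketch)
name: ci
on:
  push:
    branches: [main]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt
      - run: pytest

  deploy:
    needs: test                              # deploy only after tests pass
    if: github.ref == 'refs/heads/main'      # and only on the main branch
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./deploy.sh   # placeholder for the team's actual deployment step
```

Gating `deploy` on `needs: test` plus a branch condition is the common pattern for combining automated testing and deployment in one workflow.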
NBCUniversal is one of the world's leading media and entertainment companies creating world-class content across film, television, streaming, theme parks, and more.
Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks. Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows. Lead large-scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.
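An ETL/ELT pipeline of the kind described above can be sketched in miniature with Python and SQL. The schema, fee calculation, and in-memory SQLite target are illustrative assumptions; a production pipeline would run the same three stages under an orchestration framework against a real warehouse:

```python
import csv
import io
import sqlite3

# Illustrative raw source; real pipelines would read from files, APIs, or queues.
RAW = "order_id,amount\n1,19.99\n2,5.00\n3,42.50\n"

def extract(source: str):
    """Extract: parse raw CSV into row dicts."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    """Transform: cast types and derive a fee column (3% is a made-up rule)."""
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "fee": round(float(r["amount"]) * 0.03, 2)}
        for r in rows
    ]

def load(rows, conn):
    """Load: write transformed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, fee REAL)")
    conn.executemany("INSERT INTO orders VALUES (:order_id, :amount, :fee)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))  # 67.49
```

Keeping extract, transform, and load as separate functions is what lets an orchestrator schedule, retry, and monitor each stage independently.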
This position is posted by Jobgether on behalf of a partner company.
Design and build scalable data pipelines to ensure seamless data flow from multiple sources. Automate data collection, transformation, and delivery processes to support real-time and batch processing requirements. Work with stakeholders to define and enforce data governance policies and standards.
Goods & Services is looking for a Data Governance Engineer to design, build, and maintain the data collection, storage, and analysis infrastructure.
The Senior Data Analytics Engineer designs, implements, and optimizes data analytics infrastructure. The role requires both technical expertise and strong analytical skills. The engineer develops and maintains data models, dashboards, and reports to support business analytics and decision-making.
Uniswap Labs builds products that help millions of people access DeFi simply and securely, from the Uniswap Web App and Wallet to crypto infrastructure.
Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large datasets.
Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Collaborate with Product, Design, Operations, and Security to deliver well-architected, scalable compute solutions.
Participate in technical architecture discussions with a focus on query engines, storage systems, and distributed database design.
Champion a culture of technical excellence and innovation, influencing engineering direction across multiple teams or domains.
dbt Labs is the pioneer of analytics engineering, helping data teams transform raw data into reliable, actionable insights. They have grown into the leading analytics engineering platform since 2016 and now serve over 5,400 dbt Cloud customers.