Lead complex implementation projects that transform how organizations handle and process their data at scale. You will partner closely with Product and Engineering organizations to grow the Atlan Platform and connector ecosystem. You will also guide customers through complex implementations, providing thought leadership on data engineering best practices and optimization.
Remote Data Jobs
718 results
Shape and evolve the backbone of Blip’s real-time data processing layer. Design and implement streaming and batch data systems that move and transform information reliably and efficiently across Blip’s platform. Champion data modeling excellence, ensuring high-quality, discoverable data for real-time consumption. Key responsibilities include data engineering & pipeline architecture, modeling & storage, and reliability, quality & observability.
- Manage and optimize database systems hosted on Amazon Web Services (AWS).
- Collaborate with the DevOps team to implement infrastructure as code using Terraform.
- Ensure the performance, reliability, and security of all database environments.
Join a dynamic Managed Services team as a Data Architect, delivering architecture, support, and optimization for client data platforms. In this role, you will design and implement advanced data solutions, including data lakes, warehouses, and analytics pipelines. The role demands both deep technical expertise and strong consulting skills to drive client success, working in a fast-paced, remote-friendly environment alongside highly skilled professionals.
Lead the team’s fraud monitoring and detection efforts, owning fraud escalations as Affirm scales internationally. Work with cross-functional partners to develop fraud detection and decisioning systems. The successful candidate will:
- Develop strategies for monitoring and preventing fraud.
- Leverage data analytics to optimize fraud management strategies.
- Report on fraud trends and loss drivers.
- Assist with the development of new features to improve fraud detection capabilities.
Apply expertise in text mining, natural language processing, and data analysis as part of Amplity's innovative Intel team, delivering impactful data products for healthcare clients in this fully remote role. The ideal candidate is detail-oriented, self-motivated, and thrives in a fast-paced environment.
Responsible for designing, building, and maintaining scalable data infrastructure that empowers the organization to access, process, and analyze information efficiently. Works closely with data science, IT, and business teams to optimize cloud-based solutions, automate data pipelines, and ensure real-time data availability. The role emphasizes hands-on experience with modern data technologies and focuses on data quality, efficiency, and reliability.
Shape and evolve a modern, cloud-based data platform. Design and implement domain-oriented data products and optimize ETL/ELT pipelines. Build a robust data lake and support near real-time reporting and analytics.
Design, implement, and maintain robust data pipelines and transformation workflows. Work with cloud data warehouses, orchestration tools, and modern ETL frameworks. Collaborate with cross-functional stakeholders to standardize best practices.
You’ll be joining a small, high-impact data engineering team within Federato’s AI/ML organization, where the focus is on building the infrastructure and internal frameworks that empower machine learning engineers. You’ll ensure data and tooling are reliable, scalable, and aligned with the needs of our AI-native platform, enabling machine learning engineers to develop, deploy, and iterate on AI-powered features.