The Senior Data Engineer is a key contributor to the Data Engineering team, interfacing directly with nearly every other team, and helps guide the creation and maintenance of the world’s most inspiring Virtual Power Plant. You’ll work with engineers, product, support, and the business to control millions of devices and manage multiple gigawatts of power. Your skills in mentorship, optimization, triage, and prioritization will help make our collective work lives more rewarding and enjoyable.
As an experienced Data Engineer with hands-on experience designing and developing data platforms, you will own the design, development, and maintenance of TextNow's data platform, enabling effective data-informed decisions. You will be part of cross-functional efforts to build scalable and reliable frameworks that support all of TextNow's business and data products. In this role, you will interact with different functional areas within the business and influence decision-making at a fast-growing mobile communications start-up.
As a Machine Learning Data Scientist on Extend’s Risk & Fraud Machine Learning Team, you will develop and deploy cutting-edge machine learning models to detect and prevent fraud, enhance decision-making, and drive business value. You’ll work closely with product, engineering, and operations teams to build scalable, production-ready machine learning applications that support Extend’s post-purchase products.
As a Manager, Data Science, you'll collaborate closely with business leaders to address our clients' diverse business challenges. You will lead and mentor junior data scientists, helping them achieve their goals and ensuring the team's success. This role requires strong business acumen to understand and solve complex problems, ensuring the successful implementation of data-driven solutions.
As an AI Architect, you’ll be the company’s go-to expert for all things artificial intelligence — defining our vision, leading architectural strategy, and ensuring AI is seamlessly integrated into every product and platform we build. You’ll work at the intersection of innovation and execution, partnering closely with Product, Engineering, Data, and Security teams to turn advanced AI technologies into business value. From developing architecture blueprints to guiding implementation and governance, this role is both hands-on and strategic.
Join the team as a Data Engineer and help improve the lives of people with diabetes through data-driven innovation. As a senior member of the Data Engineering & Analytics team, you’ll lead big data engineering and cloud-based analysis to support real-time decision-making, define scalable data strategies, and build secure, efficient pipelines. You’ll operationalize machine learning models and uncover insights across key domains.
You will design and maintain high-throughput, low-latency backend systems that process and serve massive amounts of data, and collaborate across teams to deliver scalable, reliable, and high-performance solutions used by thousands of enterprise customers worldwide. You'll solve complex technical challenges, from optimizing concurrency and synchronization to debugging distributed behaviors and improving system reliability.
We are looking for a Data Engineering Intern eager to gain hands-on experience in database management, data pipeline development, and ETL processes. This internship provides an excellent opportunity to work alongside experienced data engineers, building foundational skills in data integration and data quality management. Ideal candidates should have some experience in scripting (Python preferred) and a curiosity about creating effective data solutions to support business operations.
Wavicle Data Solutions is hiring a Databricks Solution Architect who will be responsible for leading the design and implementation of scalable, optimized solutions that leverage the latest Databricks features. This individual will work closely with customers to understand their needs and business drivers and help them adopt and optimize Databricks for their analytics, data science, and AI/ML workloads.