Remote Data Jobs

Job listings

We are looking for a Data Engineering Intern eager to gain hands-on experience in database management, data pipeline development, and ETL processes. This internship provides an excellent opportunity to work alongside experienced data engineers, building foundational skills in data integration and data quality management. Ideal candidates should have some experience in scripting (Python preferred) and a curiosity about creating effective data solutions to support business operations.

This is a 12-week internship within the Data Science team at Ever.ag, assisting in the continued development and deployment of leading data science applications in the food production supply chain space. Potential projects include cleaning and analyzing large, complex data sets for commodity food processors; translating customer problems into application development and data science requirements; assessing cutting-edge machine learning techniques; and developing data science forecasting techniques.

Focus on project work, debugging performance degradations, audits, and health checks. Design and help implement new MongoDB and Cassandra deployments. Evaluate existing clusters and provide recommendations on best practices. Debug high-priority issues in mission-critical production environments. Engage with the MongoDB and Cassandra communities. Contribute to rapid brainstorming, design, and development of prototypes. Automate operational procedures and document them.

As a Senior Data Product Manager, you will own the vision, roadmap, and delivery of key data initiatives that enable the entire organization to make better decisions, faster. Your scope will include data infrastructure, internal analytics tools, and data compliance topics. You will define and drive the data platform and pipeline roadmap, ensuring scalability, reliability, and accessibility of data across products and teams.

Design, maintain, and scale streaming ETL pipelines for blockchain data. Build and optimize ClickHouse data models and materialized views for high-performance analytics. Develop and maintain data exporters using orchestration tools. Implement data transformations and decoding logic. Establish and improve testing, monitoring, automation, and migration processes for pipelines. Ensure timely delivery of new data features in alignment with product goals.

India Unlimited PTO

Combine and transform diverse data sources using SQL. Produce report-ready data tables by integrating various data sources, complex logic, and business rules. Develop a deep understanding of Acquia’s internal systems, with a focus on data structure and the corresponding processes that impact the data. Collaborate with stakeholders globally to develop automated reporting solutions. Drive data initiatives by refining processes and data structures to enable scalable reporting solutions.

As a Data Scientist, you will turn into reality the business requirements related to our digital transformation, in which cloud technology is a key enabler. You will work on the implementation of data MVPs as part of a Digital Builders Organization squad. You will collaborate with Business Units to develop a deep understanding of the business problems that AI and machine learning may solve, and provide technical expertise in the execution of use cases.

$90,000–$145,000/yr
Unlimited PTO

The Data & AI Senior Consultant is responsible for partnering with and mentoring team members, and for directly enabling the growth of the Data & AI Services team at MCA. The Sr. Consultant is expected to lead project implementations, drive innovation, upskill and mentor junior teammates, and partner with leadership to drive continuous improvement. Within a project framework, the Data & AI Senior Consultant is responsible for building client relationships.

US Unlimited PTO

Wavicle Data Solutions is hiring a Databricks Solution Architect who will be responsible for leading the design and implementation of scalable, optimized solutions that leverage the latest Databricks features. This individual will work closely with customers, understanding their needs and business drivers and helping them adopt and optimize Databricks for their analytics, data science, and AI/ML workloads.