Remote Data Jobs · SQL

Job listings

Looking for candidates who are passionate about technology, proficient in English, and excited to collaborate remotely with a worldwide team. Must have 5+ years of professional experience as a Python Developer, strong hands-on experience with Apache Airflow, experience working with Snowflake, and solid knowledge of SQL. The ability to design, build, and maintain data pipelines and integrations is required, as is the ability to work independently and troubleshoot effectively.

Support the Claims organization by developing advanced reporting and dashboard solutions, leveraging expertise in PowerBI, DAX, and associated tools to evaluate data and identify opportunities to improve efficiency and effectiveness in our programs. Use business intelligence tools to identify and project significant shifts in key business metrics, and communicate observations to business owners and technical partners.

$130,000–$140,000/yr

Looking for a Data Engineer II to help design, build, and continuously improve the GoGuardian Analytics and AI/ML ecosystem. This position sits on the Data Engineering team, a group responsible for building and maintaining the core data platform that powers analytics, product insights, and machine learning across the company. Collaborate closely with Data Science, Business Intelligence, and other teams to enable the next generation of data-driven products and AI capabilities.

Build and be responsible for PlanetScale’s data and analytics efforts from zero to one, and play a leading role in shaping decision-making through data. You will be the linchpin connecting the business with data, informing how we are executing against our company goals and where to direct our resources. Your scope will be broad and cover GTM Analytics, Product Analytics, and Analytics Engineering.

$145,600–$149,760/yr

Design and build data warehouses and data lakes. Use your expertise working on large-scale data warehousing applications and databases such as Oracle, Netezza, and SQL Server. Use public cloud-based data platforms, especially Snowflake and AWS. Bring experience in the design and development of complex data pipelines. Be an expert in SQL and at least one scripting language, adept at tracing and resolving data integrity issues.

The contractor will focus on developing machine learning algorithms, analyzing data, and optimizing data pipelines. The ideal contractor has strong expertise in machine learning and natural language processing (NLP), along with experience working with large-scale data systems. They will develop and implement machine learning algorithms to solve complex problems, design and maintain data pipelines (including transformations and preprocessing), and conduct data analysis to identify trends and insights.

You will research, prototype, and productionize generative AI models, developing scalable GenAI pipelines that generate high-quality content such as product descriptions, reviews, and titles, while defining best practices for model monitoring and collaborating cross-functionally with product, design, and engineering to integrate models into user-facing applications.

In this role, you will be part of the Recommender Systems group, tasked with delivering personalised product ranking and recommendations to users across the entire Viator product. You will design, code, experiment with, and implement models and algorithms to enhance customer satisfaction, increase supplier value, optimize business results, and ensure infrastructure efficiency. Collaborate with product managers and various business stakeholders to deliver top-quality outcomes that meet internal objectives.

Data Engineer

BHFT

The Data Engineering team is responsible for designing, building, and maintaining the Data Lake infrastructure, including ingestion pipelines, storage systems, and internal tooling for reliable, scalable access to market data. Implement and tune data storage for petabyte‑scale analytics and collaborate with Data Science, Quant Research, Backend and DevOps to translate requirements into platform capabilities.