Remote Data Jobs · Python
419 results

Impact the lives of everyday people and help them move from surviving to thriving with digital personal finance solutions. You'll lead product and advanced-analytics requirements gathering, and work with other teams to automate data analysis and visualization. You'll use technology to solve business problems and drive positive outcomes, build a scalable technology platform to support a growing business, and consistently deliver high-quality code to production.
Collect and clean data from various sources to ensure accuracy and consistency. Perform exploratory data analysis to identify trends, patterns, and anomalies. Develop and maintain dashboards and reports to track key performance indicators (KPIs) and other relevant metrics. Lead the analysis during the experimentation phase of new initiatives to determine viability and success of hypotheses.
The Senior Data Engineer plays an integral role in creating and maintaining Business Wire’s data pipelines, APIs, and services. As a member of our Engineering team, this role partners with other engineers, business analysts, and project managers to create applications for news processing, syndication, and metrics reporting systems.
As a Data Analyst, you will be part of a team responsible for owning and simplifying the flow of data used by production trading algorithms at Voleon. The primary responsibility of our team is to ensure continuous operation of production-critical systems by leveraging an on-call rotation. When not taking part in this rotation, your work will focus on improvements to our data-consumption pipelines.
This is a remote Data Analyst position with our Columbus, OH client. They offer competitive salaries and benefits, as well as a great work-life balance. The position is remote, but you will occasionally go in for meetings or celebrations. The role emphasizes understanding customers' and consumers' challenges and providing the right resolutions to solve them.
We are seeking a Power BI Report Developer with strong experience in Databricks to design, build, and optimize high-impact analytics solutions. In this role, you will develop interactive dashboards and data visualizations that help transform complex data into actionable insights for business decision-makers. You’ll collaborate closely with cross-functional teams, including data engineering, analytics, and operations, to ensure data accuracy, performance, and scalability.
Looking for candidates who are passionate about technology, proficient in English, and excited to engage in remote collaboration for a worldwide presence. Must have 5+ years of professional experience as a Python Developer, strong hands-on experience with Apache Airflow, experience working with Snowflake, and solid knowledge of SQL. The ability to design, build, and maintain data pipelines and integrations is required, as is the ability to work independently and troubleshoot effectively.
Looking for a Data Engineer II to help design, build, and continuously improve the GoGuardian Analytics and AI/ML ecosystem. This position sits on the Data Engineering team, a group responsible for building and maintaining the core data platform that powers analytics, product insights, and machine learning across the company. Collaborate closely with Data Science, Business Intelligence, and other teams to enable the next generation of data-driven products and AI capabilities.
Build and be responsible for PlanetScale's data and analytics efforts from zero to one, and play a leading role in shaping decision making through data. You will be the linchpin connecting the business with data to inform how we are executing against our company goals and where to direct our resources. Your scope will be broad, covering GTM Analytics, Product Analytics, and Analytics Engineering.
Design and build data warehouses and data lakes. Apply your expertise in large-scale data warehousing applications and databases such as Oracle, Netezza, and SQL Server. Use public cloud-based data platforms, especially Snowflake and AWS. Experience designing and developing complex data pipelines is required, along with expert-level SQL, proficiency in at least one scripting language, and the ability to trace and resolve data integrity issues.