In this role, you will analyze sales data, create insightful reports, and provide strategic recommendations, driving data-driven insights and initiatives that optimize sales performance and efficiency. This role involves interpreting complex data sets, identifying trends, and collaborating with cross-functional teams to optimize processes. Strong analytical skills and the ability to communicate findings effectively are essential.
Remote Data Jobs · Python
Peek is seeking a data-driven Data Analyst to turn data into valuable insights, experiments, and product enhancements. In this position, you'll collaborate with Product, Engineering, Operations, and Leadership to understand customer behaviors, pinpoint friction in product flows, and shape Peek's analytics ecosystem. The role suits someone who enjoys SQL, knows BI tooling well, and is enthusiastic about AI in data work.
We are looking for a DP-100–certified Azure Data Scientist passionate about applied Machine Learning and delivering measurable improvements for clients. As part of a growing innovation and technology team, you’ll contribute to the design, training, and deployment of AI and ML models within the Azure ecosystem, integrating ML models into real business processes that drive meaningful outcomes.
The Data Platform Engineering team is building the infrastructure to support a best-in-class decision engine, serving internal clients by maintaining infrastructure and tooling. The role requires excellent communication skills to collaborate with Machine Learning Engineers, Data Scientists, and Analytics Engineers, and the team advises backend and frontend teams on adopting the Event-Driven Data Mesh paradigm.
We are looking for a skilled Machine Learning Engineer with significant experience developing machine learning models. You will join a small, innovative team and lead efforts to advance our capabilities, drive model development, and support our vision for a future where Grass is transformative in the evolution of the internet.
Responsible for designing, building, and optimizing large-scale data ingestion and transformation pipelines on Databricks and other modern cloud-based data platforms. A key contributor to implementing modern DataOps practices, ensuring data reliability, scalability, and alignment with business requirements through the integration of data contracts and automated quality checks.
Join the pricing and underwriting domain to bridge the gap between machine learning/data science and engineering. You will help build, publish, and maintain our complex data products and pipelines, key elements that have a significant impact on the company’s growth. Shape the architecture of data products designed for data analytics and data science.
We are looking for a passionate Senior Security Data Analyst/Python Developer to help us parse, transform, and analyze dirty data. The ideal candidate has a thorough understanding of Python, data analysis techniques, AWS, ETL patterns, and automation. You will parse and transform structured and unstructured datasets, build Python-based automation for the parsing platform, and bring order to dirty and unstructured data.
In this high-impact role, you’ll take ownership of designing, building, and scaling the data pipelines and infrastructure that power our ML and AI models. You’ll work across ingestion, transformation, modeling, and orchestration, ensuring our data is reliable, accessible, and primed for analytics and machine-learning use cases.
Staff Data Engineers act as interpersonal force multipliers across the organization, augmenting our internal capabilities to manage and leverage client data effectively. They work closely with functional leadership and data solutions team members to understand common client business needs and to identify and develop data engineering solutions that meet them.