Remote Data Jobs · Data Integration

Job listings

Apply and develop advanced artificial intelligence and machine learning methods to analyze imaging and multi-omics data across diverse biomedical research areas, including cancer, aging, longevity, and neurodegenerative disorders. Collaborate closely with research faculty and Scientific Services to advance JAX’s strategic priorities. Process and harmonize multi-omics and imaging datasets, analyze them with existing bioinformatics and AI/ML tools, and adapt methods as needed.

$100,000–$115,000/yr

In this role, you will join a fast-paced data environment where reporting innovation drives meaningful business decisions. You will play a central part in modernizing reporting infrastructure, improving data processes, and partnering with product, business, and vendor teams to ensure accuracy, scalability, and efficiency. This position combines analytical depth with operational ownership, offering the opportunity to influence reporting strategy and enhance data-driven capabilities across the organization.

This position is ideal for someone who thrives in a fast-paced, dynamic environment where everyone's opinions and efforts are valued, contributing to challenging, meaningful projects and developing high-quality applications that stand out in the market. Bluelight values continuous learning, personal growth, and hard work, offering a collaborative environment that promotes professional development.

You will work in an international, collaborative environment. Your main responsibilities will be to design and develop advanced KNIME workflows for data collection, transformation, and enrichment; develop KNIME Data Apps (interactive applications) for dashboards and self-service analytics tools; and deploy, administer, and optimize solutions on KNIME Business Hub (scheduling, versioning, governance, collaboration).
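For a sense of the scripting that can sit inside such workflows, here is a minimal sketch of an enrichment step as it might appear in a Python Script node (recent KNIME Analytics Platform scripting API); the column names are illustrative assumptions, not details from the listing.

```python
# Hedged sketch of an enrichment step inside a KNIME Python Script node.
# Assumes the recent knime.scripting.io API; column names are placeholders.
import knime.scripting.io as knio

# Read the node's first input table as a pandas DataFrame.
df = knio.input_tables[0].to_pandas()

# Example enrichment: derive a normalized join key and a simple flag column.
df["customer_key"] = df["customer_id"].astype(str).str.strip().str.upper()
df["is_high_value"] = df["order_total"] > 1000

# Hand the enriched table back to the workflow.
knio.output_tables[0] = knio.Table.from_pandas(df)
```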

This role is responsible for data ingestion, analysis, and migration, with a focus on Business Intelligence and data engineering. You will develop, optimize, and maintain ETL/ELT pipelines to migrate data from various sources using PostgreSQL, and analyze business requirements, turning insights into action.
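As a rough illustration of the kind of pipeline described, here is a minimal ELT sketch in Python, assuming a CSV export as the source and PostgreSQL as the target; the file path, connection string, and table names are placeholders, not details from the listing.

```python
# Minimal ELT sketch: extract a CSV export, lightly clean it, load it into a
# PostgreSQL staging table. All names and credentials below are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

def load_orders(csv_path: str, pg_url: str) -> int:
    """Extract rows from a CSV export and load them into PostgreSQL."""
    df = pd.read_csv(csv_path)

    # Light cleanup before load: normalize column names, parse dates, drop dupes.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df["order_date"] = pd.to_datetime(df["order_date"])
    df = df.drop_duplicates(subset=["order_id"])

    engine = create_engine(pg_url)  # e.g. "postgresql+psycopg2://user:pwd@host/db"
    with engine.begin() as conn:
        # Load into a staging table; downstream SQL does the heavier modeling
        # (the "T" in ELT happens in the warehouse).
        df.to_sql("stg_orders", conn, if_exists="replace", index=False)
    return len(df)

if __name__ == "__main__":
    rows = load_orders("orders_export.csv",
                       "postgresql+psycopg2://user:pwd@localhost:5432/dwh")
    print(f"Loaded {rows} rows into stg_orders")
```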

You'll be joining a strategic data platform integration project to consolidate data from multiple acquired companies into one unified Azure-based data warehouse. You will integrate disparate source systems (NetSuite, SAP, Dynamics NAV, Business Central, Oracle), build a modern Azure data platform that scales globally, and enable executive decision-making with unified financial and operational reporting.
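To make the consolidation task concrete, here is a hedged sketch of the harmonization step: mapping differently shaped ERP extracts onto one common schema before loading into the warehouse. The field names (e.g. a NetSuite tranId, SAP BELNR/BUKRS) are illustrative assumptions, not a specification from the project.

```python
# Hedged sketch: normalize invoice records from two ERP extracts (flat dicts)
# into one unified schema prior to warehouse load. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class UnifiedInvoice:
    source_system: str
    invoice_id: str
    company_code: str
    amount: float
    currency: str

def from_netsuite(rec: dict) -> UnifiedInvoice:
    # Assumed NetSuite-style field names.
    return UnifiedInvoice("netsuite", rec["tranId"], rec["subsidiary"],
                          float(rec["total"]), rec["currency"])

def from_sap(rec: dict) -> UnifiedInvoice:
    # Assumed SAP-style field names (document number, company code, amount, currency).
    return UnifiedInvoice("sap", rec["BELNR"], rec["BUKRS"],
                          float(rec["WRBTR"]), rec["WAERS"])

if __name__ == "__main__":
    sample = from_netsuite({"tranId": "INV-1001", "subsidiary": "US01",
                            "total": "1250.00", "currency": "USD"})
    print(sample)
```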

Europe · 5 weeks PTO

Analyze and enable business requests for advanced data analytics and insights generation. Build, integrate, and sustainably deploy data pipelines. Develop, maintain, and optimize data structures in a data lake platform (data preparation, data ingestion, resource development, data quality). Conduct data integration across on-prem, cloud, and hybrid infrastructure. Derive data models and architecture. Implement data security and privacy guidelines based on Bosch standards. Develop and implement a data provisioning strategy (APIs, exports, and others).
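As one possible shape for the API side of such a provisioning strategy, here is a minimal sketch of a read-only endpoint over a curated data-lake extract, assuming FastAPI and a parquet file; the endpoint path and file location are hypothetical.

```python
# Minimal provisioning-API sketch: expose a bounded slice of a curated
# data-lake dataset as JSON. Path and dataset name are hypothetical.
import pandas as pd
from fastapi import FastAPI, HTTPException

app = FastAPI()
CURATED_PATH = "lake/curated/sensor_daily.parquet"  # hypothetical curated extract

@app.get("/datasets/sensor-daily")
def sensor_daily(limit: int = 100):
    try:
        df = pd.read_parquet(CURATED_PATH)
    except FileNotFoundError:
        raise HTTPException(status_code=404, detail="dataset not available")
    # Return a bounded slice as JSON records so consumers can self-serve.
    return df.head(limit).to_dict(orient="records")
```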