Build out tests and observability tooling (logs, metrics, alerts).
Develop and integrate system/integration tests into the automated testing pipeline.
Develop new data plotting and file writing utilities for data output.
LeoLabs is building a living map of activity in space through its global radar network and AI platform, collecting millions of measurements daily on over 25,000 objects in low Earth orbit. They are a trusted partner for space domain awareness, traffic management, and satellite operations for top-tier space operators and allied defense organizations.
Manage GIS data collection, storage, and version control across team members.
Living Carbon PBC's mission is to fight climate change by transforming marginal land into high-value environmental assets. They specialize in restoring abandoned mineland and degraded agricultural land into diverse, thriving forests, and they are backed by prominent investors.
Automate and streamline secure data transfer between cloud and on-prem systems.
Enhance laboratory testing programs to import and export data in a machine-readable format.
Design and implement Python-based data ingestion pipelines to support scalable and efficient data processing
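As a hedged illustration of the Python-based ingestion work described above (a minimal extract/transform/load sketch; all field names and the CSV shape are hypothetical, not from the posting):

```python
import csv
import io

def extract(csv_text):
    """Parse raw CSV text into a list of dicts (one per row)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Normalize field names and coerce numeric values, skipping bad rows."""
    out = []
    for row in rows:
        try:
            out.append({
                "sensor_id": row["sensor_id"].strip(),
                "value": float(row["value"]),
            })
        except (KeyError, ValueError):
            continue  # drop malformed rows rather than failing the whole batch
    return out

def load(records, sink):
    """Append validated records to an in-memory sink (stand-in for a database)."""
    sink.extend(records)
    return len(records)

# Example run against a small inline batch: one malformed row is dropped.
raw = "sensor_id,value\nA1,3.5\nA2,not-a-number\nA3,7.0\n"
sink = []
loaded = load(transform(extract(raw)), sink)
```

Separating the three stages keeps each one independently testable, which is what makes a pipeline like this "scalable and efficient" to maintain.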
Sika is a specialty chemicals company with a globally leading position in the development and production of systems and products for bonding, sealing, damping, reinforcing, and protection in the building sector and industry. In 2024, Sika’s roughly 33,000 employees generated annual sales of CHF 11.76 billion.
Remotely provision and configure edge compute nodes immediately following physical installation.
Activate and integrate newly deployed systems into a distributed global network.
Monitor system health and serve as the escalation point for diagnosing and resolving technical issues.
Sitreps is building a next-generation maritime intelligence platform that transforms global fleets into a persistent sensing network. They combine edge computing, advanced sensors, and satellite connectivity, operating in remote environments to deliver critical insights.
Process Dataflow Management Requests (DMRs) and implement efficient dataflow solutions.
Create and manage dataflows using Apache NiFi.
Manage and support operational systems that process and maintain mission data.
Peraton is a next-generation national security company that drives missions of consequence spanning the globe and extending to the farthest reaches of the galaxy. As the world’s leading mission capability integrator and transformative enterprise IT provider, they deliver trusted, highly differentiated solutions and technologies to protect our nation and allies.
Partner with Sales and Field Engineering to design and architect complex, enterprise-grade solutions tailored to customer needs.
Lead the implementation of custom solutions within customer environments across multi-cloud and hybrid architectures.
Optimize solutions for performance, scalability, and reliability in production environments.
Striim is a unified data integration and streaming platform that connects clouds, data, and applications. They believe in their employees and expect everyone to operate as one team, with unlimited potential and dignity.
Work across application, system, and runtime layers.
Deliver reliable, maintainable, and production-ready solutions.
Ontrac Solutions is a strategic consulting and technology partner helping organizations innovate, create, and elevate through modern digital solutions. They support their clients with product development, cloud consulting, automation, data solutions, and staff augmentation by connecting them with high-quality talent ready to make an immediate impact.
Design, build, and maintain databases that power Hologram's operations.
Build and maintain ETL pipelines that move and transform data reliably.
Audit existing pipelines and data models, identify complexity, and refactor bad decisions.
Hologram is building the future of IoT connectivity, delivering internet access to millions of connected devices worldwide. They process over 5 billion transactions per month across their global infrastructure and value a fun, upbeat, and remote-first team united by their mission.
Maintain and augment the automation and services ultimately responsible for the operation of Planet’s satellites.
Specify and implement new HTTP APIs and help improve existing ones.
Contribute to a team that values open and honest communication, collaboration, self-learning, and the initiative to solve problems big and small.
Planet designs, builds, and operates the largest constellation of imaging satellites in history. Planet is a global company with employees working remotely worldwide and joining from offices in San Francisco, Washington DC, Germany, Austria, Slovenia, and the Netherlands.
Build an ingestion pipeline to extract, transform, and load data from legacy systems.
Standardize and normalize data across multiple heterogeneous sources to ensure consistency and usability.
Develop and maintain a configuration engine using Python and Ansible to manage network parameters.
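A minimal sketch of the Python side of such a configuration engine, assuming it renders device network parameters into an Ansible INI-style static inventory that Ansible then consumes (the group names, hosts, and the `vlan_id` variable are all illustrative, not from the posting):

```python
def render_inventory(devices):
    """Render device network parameters as an Ansible INI-style inventory.

    `devices` maps a group name to a list of dicts carrying the host name,
    management IP, and VLAN id (all hypothetical parameter names).
    """
    lines = []
    for group, hosts in sorted(devices.items()):
        lines.append(f"[{group}]")
        for h in hosts:
            lines.append(
                f"{h['host']} ansible_host={h['ip']} vlan_id={h['vlan']}"
            )
        lines.append("")  # blank line between groups
    return "\n".join(lines)

devices = {
    "routers": [{"host": "r1", "ip": "10.0.0.1", "vlan": 10}],
    "switches": [{"host": "s1", "ip": "10.0.1.1", "vlan": 20}],
}
inventory = render_inventory(devices)
```

Generating the inventory from one Python data structure keeps the network parameters in a single source of truth instead of hand-edited files.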
Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation, supporting digital transformation for some of the world's largest enterprises. They retain nearly 1,000 full-time professionals, experience an annual growth rate exceeding 25%, and foster a values-driven organization with a culture of Relentless Performance.
Research adversary tradecraft and translate threat intelligence into detection logic.
Tune and optimize existing detections to reduce alert fatigue while maintaining detection fidelity.
Document detection logic, response guidance, and follow-on analysis to support the SOC and incident responders.
Fidelity National Financial (FNF) is seeking a Detection Engineer to join their Information Security Office (ISO). They are an Equal Opportunity employer.
Implement the Core Aggregation Node: Lead the on-site architecture, deployment, and configuration of our central data hub within a highly secure, distributed network environment.
Master Edge-to-Core Synchronization: Architect and troubleshoot complex, peer-to-peer data synchronization across edge devices utilizing CRDTs to manage state in decentralized environments.
Optimize Multi-Protocol Routing: Ensure reliable data flow from the tactical edge to the core by configuring and optimizing routing across heterogeneous and constrained network transports, like BLE, local Wi-Fi, and tactical radios.
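The CRDT synchronization mentioned above can be illustrated with the classic grow-only counter — a sketch of the general technique, not Ditto's implementation; the node ids are hypothetical:

```python
class GCounter:
    """Grow-only counter CRDT: each node increments only its own slot,
    and merge takes the element-wise maximum, so replicas converge to
    the same state regardless of message order, duplication, or delay."""

    def __init__(self, node_id):
        self.node_id = node_id
        self.counts = {}

    def increment(self, n=1):
        self.counts[self.node_id] = self.counts.get(self.node_id, 0) + n

    def merge(self, other):
        # Element-wise max is commutative, associative, and idempotent,
        # which is what makes peer-to-peer sync safe without coordination.
        for node, c in other.counts.items():
            self.counts[node] = max(self.counts.get(node, 0), c)

    def value(self):
        return sum(self.counts.values())

# Two edge nodes diverge offline, then sync peer-to-peer in either order:
a, b = GCounter("edge-a"), GCounter("edge-b")
a.increment(3)
b.increment(2)
a.merge(b)
b.merge(a)
```

Because merge needs no central coordinator, state stays consistent even over intermittent transports like BLE or tactical radios.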
Ditto is redefining how data moves at the edge, enabling developers to build resilient, real-time applications regardless of network conditions. It's a globally distributed, fast-growing startup committed to a diverse and inclusive team.
Drive the full product lifecycle from prototype through scalable production release.
Develop databases and backend APIs that are scalable and maintainable.
Integrate and manage internal and third-party APIs for seamless data flow.
INFUSE unites forward-thinkers who anticipate how technology shapes business, and delivers high-impact projects with real potential.
Prepare and manage pre-stage files for backbook conversion activities.
Support and execute data ingestion tasks in alignment with scheduled project events, including key mock events.
Monitor and ensure data ingestion completion within defined SLA windows.
Kunai builds full-stack technology solutions for banks, credit and payment networks, infrastructure providers, and their customers. They help their clients modernize, capitalize on emerging trends, and evolve their business for the coming decades by remaining tech-agnostic and human-centered.
Collaborate with hardware and data science teams to develop software applications.
Design, implement, and maintain robust hardware-software interfaces.
Create and maintain data infrastructure on the cloud and ensure efficient server operations.
iLoF is building a photonics and AI-powered platform transforming precision medicine, democratizing access for millions with complex diseases. Recognized by CB Insights and Financial Times, they're expanding their team to achieve this mission.
Design and research new methods for data collection.
Perform data analysis on internally and externally derived datasets.
Ideate new product features and enhancements based on the latest security trends.
Bitsight is a cyber risk management leader transforming how companies manage exposure, performance, and risk for themselves and their third parties. They have over 750 teammates dispersed throughout Boston, Raleigh, New York, Lisbon, Singapore, and remote.
Design, build, and maintain data products that support R&D, analytics, Lab, and scientific workflows.
Build and maintain data pipelines for large and complex datasets ensuring high data quality.
Partner with scientists and engineers to translate research needs into reusable data assets.
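A hedged sketch of the kind of lightweight quality gate a pipeline might apply before publishing a dataset (the field names `sample_id` and `concentration` are invented for illustration):

```python
def validate_rows(rows, required, numeric):
    """Split rows into (good, bad) by presence of required fields and
    numeric parseability — a simple pre-publication data-quality check."""
    good, bad = [], []
    for row in rows:
        ok = all(row.get(f) not in (None, "") for f in required)
        for f in numeric:
            try:
                float(row[f])
            except (KeyError, TypeError, ValueError):
                ok = False
        (good if ok else bad).append(row)
    return good, bad

rows = [
    {"sample_id": "S1", "concentration": "0.8"},
    {"sample_id": "", "concentration": "1.2"},      # missing required field
    {"sample_id": "S3", "concentration": "n/a"},    # non-numeric value
]
good, bad = validate_rows(rows, required=["sample_id"], numeric=["concentration"])
```

Routing failures to a quarantine list instead of raising lets the pipeline keep high data quality without blocking the whole batch.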
Natera is a global leader in cell-free DNA (cfDNA) testing, dedicated to oncology, women’s health, and organ health. They aim to make personalized genetic testing and diagnostics part of the standard of care to protect health and enable earlier and more targeted interventions that lead to longer, healthier lives.
Design and implement software systems that perform de-identification rules at high scale and throughput.
Generate and execute quality assurance plans to validate de-identification processes.
Run de-identification pipelines in health system cloud environments, optimizing for error rates and processing efficiency.
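The rule-based de-identification described above can be sketched as pattern substitution with per-rule hit counts, so a QA pass can compare counts against expectations — a toy example with two illustrative patterns only; a real rule set would be far broader:

```python
import re

# Illustrative patterns only — real rules cover many more identifier types.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def deidentify(text):
    """Replace matches with typed placeholders; return the scrubbed text
    plus a per-rule hit count for quality-assurance validation."""
    counts = {}
    for name, pat in PATTERNS.items():
        text, n = pat.subn(f"[{name}]", text)
        counts[name] = n
    return text, counts

clean, counts = deidentify("Patient 123-45-6789, contact jane@example.org")
```

Emitting counts alongside the output is one simple way to "generate and execute quality assurance plans": expected versus observed hit rates surface both missed identifiers and over-aggressive rules.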
Dandelion Health, founded in 2020, is building the world’s largest AI training and clinical development platform, making data access easy for AI developers, pharma, and medical device companies. Their culture emphasizes learning from data to help clients improve health through AI.
Build and enhance our video ingestion, encoding, packaging, and validation systems.
Own features and services that manage the content lifecycle.
Collaborate with other engineering teams to optimize compatibility and performance across diverse platforms.
VERSANT is a leading force in news, sports, and entertainment, fostering iconic brands. They are an independent, publicly traded company, fueled by innovation and entrepreneurial spirit, committed to delivering exceptional experiences across every screen and service.
Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
Increase the robustness of existing production pipelines, identify bottlenecks, and set up robust monitoring, testing processes, and documentation templates.
Build custom applications and integrations to automate manual customer-operations tasks, helping Product Operations, Support, and SRE teams in their day-to-day activities.
Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.