Remote Software Engineering Jobs · Helm

Job listings

  • Design and implement AI inference and model training cloud products optimized for Kubernetes.
  • Write clean, efficient, and maintainable Go code to power Kubernetes controllers, operators, and custom resources supporting AI workloads.
  • Build APIs, CLIs, and developer tools that simplify the deployment, lifecycle management, and monitoring of AI applications.
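Kubernetes controllers and operators like the ones described above all follow the same reconcile pattern: repeatedly compare the desired state declared in a resource's spec against the observed state, and emit whatever actions close the gap. A minimal, framework-free Python sketch of that loop (state shapes and action names are hypothetical; real controllers use client libraries such as client-go):

```python
# Toy illustration of the Kubernetes reconcile pattern: compute the
# actions needed to drive observed state toward desired state.
# The dict-of-specs representation is a stand-in for real resources.

def reconcile(desired: dict, actual: dict) -> list:
    """Return the actions needed to make `actual` match `desired`."""
    actions = []
    for name, spec in desired.items():
        if name not in actual:
            actions.append(f"create {name}")
        elif actual[name] != spec:
            actions.append(f"update {name}")
    for name in actual:
        if name not in desired:
            actions.append(f"delete {name}")
    return actions

desired = {"inference-server": {"replicas": 3}}
actual = {"inference-server": {"replicas": 1}, "stale-job": {"replicas": 1}}
print(reconcile(desired, actual))  # ['update inference-server', 'delete stale-job']
```

A real controller runs this comparison on every watch event and requeues on failure, so each pass is idempotent: applying the same actions twice converges to the same state.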

Gcore provides infrastructure and software solutions for AI, cloud, network, and security, powering everything from real-time communication and streaming to enterprise AI and secure web applications. Gcore operates 210+ edge locations, 50+ cloud regions, and thousands of GPUs, with a global team of 550+ professionals.

  • Contribute to the development of the Everywhere Inference platform, a Kubernetes-based solution.
  • Design and implement APIs and developer tools to simplify deployment, management, and monitoring of AI applications.
  • Focus on packaging and integrating new ML models into the platform, using Python and common ML frameworks.
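Packaging a model for a serving platform typically means wrapping it behind a uniform predict interface that the platform can serialize, ship, and load. A minimal standard-library sketch of that idea (the `ModelWrapper` interface and `package`/`load` helpers are hypothetical; production platforms use frameworks and proper model registries rather than raw pickle):

```python
import pickle
import tempfile

# Hypothetical minimal packaging contract: any object exposing
# predict(inputs) -> outputs can be serialized and later served.
class ModelWrapper:
    def __init__(self, weights):
        self.weights = weights  # stand-in for real model parameters

    def predict(self, inputs):
        # Toy "model": scale each input by a single learned weight.
        return [x * self.weights for x in inputs]

def package(model, path):
    """Serialize the wrapped model to disk."""
    with open(path, "wb") as f:
        pickle.dump(model, f)

def load(path):
    """Restore a packaged model for serving."""
    with open(path, "rb") as f:
        return pickle.load(f)

tmp = tempfile.NamedTemporaryFile(suffix=".pkl", delete=False)
tmp.close()
package(ModelWrapper(weights=2), tmp.name)
served = load(tmp.name)
print(served.predict([1, 2, 3]))  # [2, 4, 6]
```

The value of the uniform interface is that the serving layer never needs to know which ML framework produced the weights, only that the artifact answers `predict`.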

Gcore provides infrastructure and software solutions for AI, cloud, network, and security. They power everything from real-time communication and streaming to enterprise AI and secure web applications, with over 550 professionals globally and partnerships with technology leaders.

Canada · 3 weeks PTO

  • Own model serving: Design, build, and maintain low-latency, highly available serving stacks for in-house ML models, and integrate with LLM serving partners.
  • Automate training pipelines: Orchestrate data prep, training, evaluation, and registry workflows on Kubernetes with solid MLOps practices.
  • Optimize at scale: Profile and tune throughput, memory, and cost; introduce caching, sharding, batching, and GPU/CPU autoscaling where it pays off.
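Of the optimizations listed above, batching is the most mechanical: grouping pending inference requests amortizes per-call overhead and keeps accelerators busy. A simplified greedy sketch (the size limit is a hypothetical parameter; real servers also enforce a latency budget before flushing a partial batch):

```python
def batch_requests(pending, max_batch_size=8):
    """Greedily split pending requests into batches of at most max_batch_size."""
    return [pending[i:i + max_batch_size]
            for i in range(0, len(pending), max_batch_size)]

requests = list(range(20))  # stand-ins for queued inference requests
batches = batch_requests(requests, max_batch_size=8)
print([len(b) for b in batches])  # [8, 8, 4]
```

The tuning trade-off is throughput versus tail latency: larger batches raise GPU utilization but make the first request in a batch wait longer, which is why production servers pair a size cap with a timeout.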

Cresta aims to turn every customer conversation into a competitive advantage by unlocking the true potential of the contact center. Their platform combines AI and human intelligence to help contact centers discover customer insights and automate conversations.

  • Contribute to the development of the Everywhere Inference platform, a Kubernetes-based solution.
  • Design and implement APIs and developer tools to simplify deployment, management, and monitoring of AI applications.
  • Optimize serverless container workflows for AI workloads, ensuring performance, scalability, and seamless autoscaling.
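The autoscaling mentioned above is usually proportional control: Kubernetes' Horizontal Pod Autoscaler computes desiredReplicas = ceil(currentReplicas × currentMetric / targetMetric), clamped to configured bounds. A sketch of that rule (function name and replica bounds are illustrative):

```python
import math

def desired_replicas(current_replicas, current_metric, target_metric,
                     min_replicas=1, max_replicas=100):
    """HPA-style proportional scaling: grow or shrink the replica count
    in proportion to observed load versus the per-replica target."""
    desired = math.ceil(current_replicas * current_metric / target_metric)
    return max(min_replicas, min(max_replicas, desired))

# 4 replicas averaging 150 req/s each against a 100 req/s target -> scale to 6.
print(desired_replicas(4, 150, 100))  # 6
```

The same formula handles scale-down: `desired_replicas(4, 50, 100)` yields 2, since half the target load per replica means half the replicas suffice.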

Gcore provides infrastructure and software solutions for AI, cloud, network, and security. They have 550+ professionals globally and collaborate with technology partners such as Intel, NVIDIA, Dell, and Equinix.