Presented by:

Rudraksh Karpe

from the openSUSE Project (open source contributor)

Rudraksh is an AI Engineer at ZS Associates, where he builds enterprise-grade Generative AI solutions with a strong emphasis on privacy, security, and scalability. He is an active open-source contributor, having participated twice in Google Summer of Code with the openSUSE Project, focusing on AI/ML workloads, containerization, and edge–cloud orchestration.

He has presented internationally at leading conferences, including the openSUSE Conference 2025 in Nuremberg, the Early Adopter Tech Summit in Florida, OpenSearchCon Korea, PyCon US 2025, PyCon Japan 2025, and the openSUSE Asia Summit 2024 in Tokyo. His talks often highlight the intersection of GenAI, open-source innovation, and cloud-native technologies, reflecting his commitment to advancing the global developer community.

Satyam Soni

from GSoC'24

Satyam Soni is a Software Engineer at Devtron.ai, working at the intersection of AI and cloud-native technologies. He contributes to open-source projects and has been part of the Kubernetes release team since v1.30, most recently supporting release engineering. He is a former Google Summer of Code mentee at openSUSE.

He has spoken at KubeCon Europe, KubeCon China, and the openSUSE Asia Summit 2024 and 2025, focusing on MLOps, Kubernetes, and community building. He also leads Asia’s largest Cloud Native community in New Delhi and is an organizer of KCD New Delhi.

Deploying containerized AI/ML workloads across edge, core, and cloud infrastructures is crucial for optimizing data processing, enhancing performance, and managing complex computations efficiently. This talk highlights our practical experience from the Analytics Edge Ecosystem project during Google Summer of Code 2024, demonstrating how openSUSE Leap can be leveraged to achieve these goals. By utilizing lightweight Kubernetes distributions such as K3s on edge devices, we reduced latency and minimized bandwidth usage, while hybrid and multi-cloud deployments provide scalability and flexibility.
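
To make the orchestration step concrete, here is a minimal sketch, using the official Kubernetes Python client, of deploying a containerized inference workload to a K3s cluster and pinning it to labelled edge nodes. The namespace, image name, labels, and node-selector key are illustrative assumptions for this sketch, not artifacts of the GSoC project.

```python
# Minimal sketch: deploy a containerized inference workload to a K3s cluster
# with the official Kubernetes Python client. Image, labels, and the node
# selector key are placeholders chosen for illustration.
from kubernetes import client, config


def deploy_edge_inference(namespace: str = "default") -> None:
    # K3s writes its kubeconfig to /etc/rancher/k3s/k3s.yaml; point KUBECONFIG
    # at it (or copy it to ~/.kube/config) before running this.
    config.load_kube_config()

    container = client.V1Container(
        name="inference",
        image="registry.example.org/edge/inference:latest",  # placeholder image
        ports=[client.V1ContainerPort(container_port=8080)],
        resources=client.V1ResourceRequirements(
            requests={"cpu": "250m", "memory": "256Mi"},
            limits={"cpu": "1", "memory": "1Gi"},
        ),
    )

    pod_spec = client.V1PodSpec(
        containers=[container],
        # Schedule only onto nodes labelled as edge devices.
        node_selector={"node-role.example.org/edge": "true"},
    )

    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="edge-inference"),
        spec=client.V1DeploymentSpec(
            replicas=1,
            selector=client.V1LabelSelector(match_labels={"app": "edge-inference"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "edge-inference"}),
                spec=pod_spec,
            ),
        ),
    )

    client.AppsV1Api().create_namespaced_deployment(namespace=namespace, body=deployment)


if __name__ == "__main__":
    deploy_edge_inference()
```

The same manifest could equally be applied with kubectl or managed through Rancher; the Python client is used here only to keep the example self-contained.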

In this session, we will explore the technical aspects of deploying various AI/ML workloads at the edge using openSUSE Leap.

Key topics:

  • Using openSUSE Leap as the base layer for KVM-based deployments
  • Containerizing workloads with Podman/Docker
  • Orchestrating and managing Kubernetes clusters with Rancher by SUSE
  • Implementing a data pipeline that seamlessly transfers data from edge devices
  • Using AI/ML models deployed at the edge to provide real-time insights (see the sketch after this list)
  • Optimizing costs by using edge devices for local processing
  • Ensuring data security by processing sensitive information locally at the edge
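
The data-pipeline, real-time-insight, and local-processing items above can be illustrated with a small, hypothetical edge loop: run inference locally with ONNX Runtime so raw readings never leave the device, then forward only the derived insight to a core/cloud collector. The model path, endpoint URL, device ID, and payload shape are assumptions made for this sketch.

```python
# Hypothetical sketch of the edge side of the data pipeline: infer locally
# with ONNX Runtime, then ship only the derived insight (not the raw data)
# upstream. Model path, endpoint, and payload are illustrative assumptions,
# not the project's actual interfaces.
import time

import numpy as np
import onnxruntime as ort
import requests

MODEL_PATH = "/opt/models/anomaly.onnx"               # placeholder model
CORE_ENDPOINT = "https://core.example.org/insights"   # placeholder collector


def read_sensor_window() -> np.ndarray:
    # Stand-in for reading a window of local sensor readings.
    return np.random.rand(1, 64).astype(np.float32)


def main() -> None:
    session = ort.InferenceSession(MODEL_PATH, providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name

    while True:
        window = read_sensor_window()
        # Inference happens on the edge device; raw readings stay local.
        outputs = session.run(None, {input_name: window})
        score = outputs[0]  # assumes a single-output model
        insight = {
            "device_id": "edge-node-01",
            "timestamp": time.time(),
            "anomaly_score": float(np.asarray(score).ravel()[0]),
        }
        # Only the compact insight is sent upstream, saving bandwidth and
        # keeping sensitive raw data on the device.
        requests.post(CORE_ENDPOINT, json=insight, timeout=5)
        time.sleep(10)


if __name__ == "__main__":
    main()
```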

Date:
2024 November 3 - 11:00
Duration:
40 min
Room:
Room C
Language:
en
Track:
openSUSE
Difficulty:
Medium

Happening at the same time:

  1. Overview of Package Management in openSUSE MicroOS
     Start Time: 2024 November 3 11:00
     Room: Room A

  2. Status of CJK language support and activities in LibreOffice 2024
     Start Time: 2024 November 3 11:00
     Room: Room B