2026 Registration is Now Open! Register on Eventbrite ↗

Build Your Own Private ChatGPT: Local AI with Docker, Ollama, Open WebUI, and ZImage Turbo

1:30 pm in Workshop Track

Lwin & Min Maung

Everyone is experimenting with AI — but most organizations are sending sensitive data to third-party cloud services without fully understanding the tradeoffs.

What if you could run your own private ChatGPT-style system locally (and even hook up additional agents and management systems like OpenClaw)?

In this hands-on, live-build session, we’ll create a fully functional private chat AI assistant — one that writes code, generates images, and talks to you through chat and voice interfaces — built with Docker, Ollama, Open WebUI, and ZImage Turbo.

Using only a laptop and open-source tools, we’ll build a system that runs entirely on your own hardware.

No enterprise AI budget required. No cloud lock-in. No sending your data to the internet.
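For a feel of how small the core stack is, here is a minimal Docker Compose sketch using the stock `ollama/ollama` and `ghcr.io/open-webui/open-webui` images — an illustration only, not necessarily the exact configuration used in the session:

```yaml
# docker-compose.yml — minimal local-AI stack sketch
# Ollama serves models on port 11434; Open WebUI provides the chat interface.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama        # persist downloaded models
    ports:
      - "11434:11434"
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"                 # chat UI at http://localhost:3000
    depends_on:
      - ollama
volumes:
  ollama:
```

With this file in place, `docker compose up -d` brings the stack up and the chat UI appears at http://localhost:3000.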

By the end of this session, you’ll understand how local AI actually works — and how you can deploy it yourself or inside your organization.

This is not theory. We will build it live.

Workshop Prerequisites:

A working computer with Docker installed (the tools above run in containers).

The Maungs will run their basic-setup workshop demos on an Intel NUC (10th-gen Core i5, 16 GB DDR4).
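On a CPU-only box like that, a small quantized model is the realistic choice. A sketch of the first-run steps, assuming Ollama is running in a container named `ollama` — the model name here is a stand-in, not necessarily what the presenters will use:

```shell
# Pull a small model into the running Ollama container
# (model choice is illustrative; a ~3B quantized model fits 16 GB RAM comfortably)
docker exec -it ollama ollama pull llama3.2:3b

# Quick smoke test from the host against Ollama's REST API
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.2:3b", "prompt": "Say hello.", "stream": false}'
```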

They will also demonstrate, step by step, how to enable image creation with ZImage Turbo and add that feature to Open WebUI. This will be demoed on a MacBook Pro M3 with 128GB of unified memory, and/or an NVIDIA DGX Spark (128GB of vRAM), and/or an Ubuntu server with undisclosed NVIDIA GPUs. ;)
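The exact ZImage Turbo wiring will be shown live; as a rough sketch only, assuming the image model ends up served behind an OpenAI-compatible endpoint (the URL, port, and model name below are hypothetical placeholders, not confirmed by the presenters):

```shell
# HYPOTHETICAL endpoint and model name — the real values depend on how
# ZImage Turbo is served; shown only to illustrate the request shape.
curl http://localhost:8000/v1/images/generations \
  -H "Content-Type: application/json" \
  -d '{"model": "zimage-turbo", "prompt": "a watercolor fox", "size": "1024x1024"}'
```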