Hello homelabbers,

I want to share the story of building a home cluster for AI and the software I wrote for it. I would appreciate your feedback.

Over time, I have acquired several computers: two personal ones for work and gaming, one under the TV for gamepad gaming and watching videos, an old PC repurposed as a server, and a collection of laptops. I’ve run Ethernet cables and use UniFi gear for networking.

With so many GPUs sitting around, I figured I could put them to work on various projects, such as running AI models for image generation. However, it wasn’t obvious how to use machines that run different OSes and are frequently busy with gaming or work.

Over time, I tinkered and built a project that solves my needs. It gives me remote access to the cluster, letting me run Docker containers, inspect them, and more. It runs on all three major OSes and has an auto-update feature to simplify cluster management. Apologies for linking to my project’s web page; I haven’t found a better way to illustrate the idea.
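To make the remote-Docker part more concrete, here’s a rough sketch of the kind of workflow it covers. This uses the plain docker-py SDK against a remote daemon, not my tool’s actual API, and the hostname and image are just placeholders:

```python
# Sketch only: what "run and inspect containers on another box" looks like
# with the standard Docker SDK for Python. Hostname, user, and image below
# are hypothetical placeholders.
import docker

# Connect to the Docker daemon on a remote cluster node (assumes the daemon
# is reachable, e.g. over SSH).
client = docker.DockerClient(base_url="ssh://user@gaming-pc.local")

# Inspect what is already running on that node.
for container in client.containers.list():
    print(container.name, container.status, container.image.tags)

# Launch a GPU workload remotely, detached from the local terminal.
client.containers.run(
    "nvcr.io/nvidia/pytorch:24.01-py3",  # placeholder image
    command="python generate.py",
    device_requests=[docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])],
    detach=True,
)
```

My tool wraps this sort of thing behind a single client install and a dashboard, so you don’t manage daemon endpoints by hand.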

Has anyone else built something similar? Do you think a tool like this would be useful for people running home labs?

Does building a product out of this software make sense? My idea is to use the spare computational capacity of home or office computers to power your projects and save money on cloud services. The goal is a service that simplifies cluster setup and management: no installing a custom Linux distro, configuring networking and NAT traversal, or setting up dashboard software - just install the client on your machines, and that’s it. Here’s a demo featuring Stable Diffusion running on my home machine.