Here is an amazing interactive tool by Brendan Bycroft that I found on X/Twitter; it helps you understand how GPT-style LLMs work.
LLM Visualization
A visualization and walkthrough of the LLM algorithm that backs OpenAI’s ChatGPT. Explore the algorithm down to every add & multiply, seeing the whole process in action.
LLM Visualization Github
This project displays a 3D model of a working implementation of a GPT-style network, i.e. the network topology used in OpenAI's GPT-2 and GPT-3 (and maybe GPT-4).
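To make that topology concrete before exploring it in 3D, here is a minimal PyTorch sketch of the kind of decoder block these models stack: causal self-attention followed by an MLP, each wrapped in a residual connection with pre-LayerNorm. The class names and hyperparameters below are illustrative assumptions, not code taken from the visualization or from minGPT.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    """Multi-head self-attention with a causal mask, GPT-2 style (sketch)."""
    def __init__(self, d_model, n_heads, block_size):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)   # fused Q, K, V projection
        self.proj = nn.Linear(d_model, d_model)      # output projection
        # lower-triangular mask: each position may only attend to earlier ones
        mask = torch.tril(torch.ones(block_size, block_size))
        self.register_buffer("mask", mask.view(1, 1, block_size, block_size))

    def forward(self, x):
        B, T, C = x.shape
        q, k, v = self.qkv(x).split(C, dim=2)
        q = q.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
        k = k.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
        v = v.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
        att = (q @ k.transpose(-2, -1)) / math.sqrt(k.size(-1))
        att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        y = (att @ v).transpose(1, 2).contiguous().view(B, T, C)
        return self.proj(y)

class Block(nn.Module):
    """One transformer block: pre-LayerNorm attention + MLP, each with a residual."""
    def __init__(self, d_model, n_heads, block_size):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = CausalSelfAttention(d_model, n_heads, block_size)
        self.ln2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x):
        x = x + self.attn(self.ln1(x))
        x = x + self.mlp(self.ln2(x))
        return x

# quick shape check with made-up sizes
block = Block(d_model=48, n_heads=3, block_size=11)
print(block(torch.randn(2, 11, 48)).shape)  # torch.Size([2, 11, 48])
```

A full GPT simply embeds the tokens, stacks a number of these blocks, and projects the final hidden states back onto the vocabulary; the visualization walks through exactly that pipeline add-by-add and multiply-by-multiply.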
The first network displayed with working weights is a tiny one that sorts a small list of the letters A, B, and C. It is the demo example model from Andrej Karpathy's minGPT implementation.
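For context on what that tiny model is actually trained to do, here is a hedged sketch of how such a sort task is commonly framed as next-token prediction, in the style of minGPT's sort demo; the sequence length and the `-1` ignore-index convention are assumptions, not details confirmed by the visualization:

```python
import random
import torch

# The model sees a scrambled sequence of A/B/C tokens followed by its sorted
# version, and is trained to predict only the sorted half.
VOCAB = ["A", "B", "C"]
stoi = {ch: i for i, ch in enumerate(VOCAB)}

def make_example(length=6):
    letters = [random.choice(VOCAB) for _ in range(length)]
    seq = letters + sorted(letters)        # e.g. C A B A B C -> ... A A B B C C
    ids = torch.tensor([stoi[ch] for ch in seq])
    x = ids[:-1]                           # input: everything but the last token
    y = ids[1:].clone()                    # target: sequence shifted left by one
    y[: length - 1] = -1                   # mask out targets in the scrambled prefix
    return x, y

x, y = make_example()
print(x.tolist(), y.tolist())
```

During training, the cross-entropy loss would be computed only on the sorted half (the `-1` targets skipped via an ignore index), so the model learns to emit the sorted sequence after reading the scrambled one.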
The renderer also supports visualizing networks of arbitrary size and works with the smaller GPT-2 model, although the weights aren't downloaded (they run to hundreds of MB).