30,000 AI models
Too many, really. But from what I read in conversations and posts here, I notice one thing: you all try out models all the time, which is fine, but I haven’t yet seen anyone say they habitually stick with one model over the others. It seems like people use one model for a few days and then move on to a new one. Don’t you have a favorite? Which one is it?
What’s stopping us from building a mesh of web crawlers and creating a distributed database that anyone can host, adding to the total pool of indexers/servers? How long would it take to build a quality dataset by deploying bots that crawl outward from the most popular and trusted sites for particular knowledge domains, then compress and dump what they find in a training-ready format onto said global p2p mesh? If we got a couple of thousand nerds on Reddit to contribute compute and storage capacity to this network, we might be able to build it relatively fast. Just sayin…
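For what it’s worth, here’s a minimal sketch of what one crawler “worker” node might look like, assuming Python with the common requests and beautifulsoup4 packages. The seed URLs, depth limit, page budget, and gzipped-JSONL shard format are all illustrative assumptions, not any kind of spec for the mesh itself (which would still need coordination, dedup, robots.txt handling, and a way to share shards p2p).

```python
import gzip
import json
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

SEED_URLS = ["https://en.wikipedia.org/wiki/Machine_learning"]  # hypothetical trusted seed
MAX_PAGES = 100   # per-worker crawl budget
MAX_DEPTH = 2     # how far "out" from the seeds to wander

def crawl(seeds, out_path="shard-0000.jsonl.gz"):
    """Crawl outward from trusted seeds and dump page text as a compressed shard."""
    seen, queue = set(seeds), deque((u, 0) for u in seeds)
    with gzip.open(out_path, "wt", encoding="utf-8") as out:
        pages = 0
        while queue and pages < MAX_PAGES:
            url, depth = queue.popleft()
            try:
                resp = requests.get(url, timeout=10)
                resp.raise_for_status()
            except requests.RequestException:
                continue  # skip unreachable pages; a real node would log and retry
            soup = BeautifulSoup(resp.text, "html.parser")
            text = " ".join(soup.get_text(separator=" ").split())
            # One record per page; shards like this could be pooled over the mesh.
            out.write(json.dumps({"url": url, "text": text}) + "\n")
            pages += 1
            if depth < MAX_DEPTH:
                for a in soup.find_all("a", href=True):
                    link = urljoin(url, a["href"])
                    if urlparse(link).scheme in ("http", "https") and link not in seen:
                        seen.add(link)
                        queue.append((link, depth + 1))

if __name__ == "__main__":
    crawl(SEED_URLS)
```

Run a few thousand of these with different seed lists and you get a pile of shards; the hard part is the mesh layer that dedupes, filters, and serves them back out, which the sketch above doesn’t touch.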