I really want to use AI like LLaMA, ChatGPT, Midjourney, etc. for something productive. But over the last year the only use I found for it was proposing places to visit as a family on our Hokkaido, Japan trip. It came up with great suggestions for places to go.
But perhaps you guys have some great use cases for AI in your life?
Nope, nothing. There honestly doesn’t seem to be anything I’d use it for, and even if there were, I wouldn’t want to support it as long as it’s built on data obtained by what basically amounts to stealing. Maybe once that improves I’ll look into it more, but at the current moment I just don’t have the heart to support it.
I find that LLM-powered autocomplete when programming makes me more productive.
Occasionally I’ll use a chatbot to help me reword an email or other text, though this is rare.
The only practical things I have found I can do with AI are brainstorming ideas (or rather expanding on little ideas I have but don’t know where to take) and figuring out what’s wrong with a snippet of code when I can’t work it out on my own.
I use it quite a bit. I don’t trust the big companies that commercialize AI, so I run my AIs locally on my old retired gaming desktop that I’ve turned into a homelab/media server.
I use Kobold.AI to self-host an LLM like ChatGPT (Dolphin-Mistral 7B, if you’re curious). I mainly use it for low-effort knowledge searches, the kind of queries that are easier to type out long and descriptive (which Google struggles with). Since it’s AI I have to be careful about what I ask, because I’ve seen it make things up, but for the majority of what I use it for (programming syntax, Linux troubleshooting, general questions) it’s pretty good.
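If you ever want to script against it instead of typing into the chat UI, Kobold-style servers expose a small HTTP generate endpoint. Here’s a rough Python sketch; the port, endpoint path, and parameter names are assumptions based on a default KoboldCpp-style setup, so check your own instance’s API docs:

```python
import requests

# Rough sketch: query a locally running Kobold-style API.
# The port and endpoint path are assumptions (defaults vary between
# KoboldAI and KoboldCpp versions), so adjust to match your instance.
API_URL = "http://localhost:5001/api/v1/generate"

payload = {
    "prompt": "Explain the difference between a hard link and a symlink:",
    "max_length": 200,    # tokens to generate
    "temperature": 0.7,   # lower = more deterministic answers
}

resp = requests.post(API_URL, json=payload, timeout=120)
resp.raise_for_status()

# The Kobold-style API typically returns generated text under
# results[0]["text"]; verify against your version's docs.
print(resp.json()["results"][0]["text"])
```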
I also have Stable Diffusion running, using the ICBINP model (which is pretty decent for photorealistic images). I use it to generate social media display pictures and porn :) It’s fun because it’s a surprise what you’re going to get, but sometimes it generates absolute horrors. Anatomical horrors. Those are genuinely horrific. Other times it’s really good.
How do you set up Stable Diffusion to run locally? I’ve been trying out llama.cpp for text and was looking for a similarly easy tool for image generation.
AUTOMATIC1111’s stable-diffusion-webui
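If you’d rather skip the web UI and stay close to the llama.cpp workflow, the `diffusers` Python library is another easy way in. A minimal sketch, where the model ID and the CUDA assumption are just examples and you’d swap in whatever checkpoint you actually want:

```python
import torch
from diffusers import StableDiffusionPipeline

# Minimal sketch: run Stable Diffusion locally via the diffusers library.
# The model ID below is only an example checkpoint; substitute the one you
# actually want (e.g. a photorealistic model in diffusers format).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes an NVIDIA GPU; use "cpu" otherwise (much slower)

image = pipe(
    "a cozy cabin in a snowy forest at dusk, photorealistic",
    num_inference_steps=30,
).images[0]

image.save("cabin.png")
```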
@jeena I only use DeepL to translate, and that’s it. I’ve also started taking notes in .md files, so that could make for a good use case in the future if there were an AI I could use without connecting to the internet (e.g. one that only answers based on the files I have). Otherwise I’m pretty hesitant about AI. Perhaps I’ve watched too many science-fiction movies, but I’m afraid it will become too sentient and somehow escape human oversight, creating havoc in our lives.
Check out Llama 3, which you can run locally.
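If you want to drive it from a script rather than a chat window, one easy route is Ollama, which downloads and serves the weights for you. A minimal Python sketch; it assumes you’ve installed the Ollama daemon, pulled a llama3 model, and installed the `ollama` Python package, so the exact model tag and response shape may differ on your machine:

```python
import ollama

# Minimal sketch: chat with a locally served Llama 3 model via Ollama.
# Assumes the Ollama daemon is running and `ollama pull llama3` has
# already been done; the exact model tag may differ on your setup.
response = ollama.chat(
    model="llama3",
    messages=[
        {"role": "user", "content": "Give me three practical uses for a local LLM."},
    ],
)

# The response carries the generated reply under message.content
# (exact return type varies a bit between package versions).
print(response["message"]["content"])
```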
@jeena it doesn’t connect to any server even when you’re online?
Correct.
@jeena cool. I think I will look into it. Thanks! 😁
Damn, it seems I was talking out of my ass and just assuming things. I asked llama3 about it now, and it said that even though it does not send our conversation to Meta, it does take my feedback (if I correct it in some way) and send it to Meta’s servers over the Internet:
/// Will you be able to learn from me pointing it out or do you forget all conversations we had after I press Ctrl-D?
I’m a large language model, I don’t have personal memories or experiences. Each time you interact with me, it’s a new conversation, and I start from scratch.
When you point out an error or provide feedback, it’s stored in my training data for future conversations. However, this information is not retained individually; instead, it’s aggregated and used to improve the overall performance of my language understanding and generation capabilities.
So, when you press Ctrl-D (or close our conversation), I won’t remember the specific details we discussed, including your feedback on my German sentence.
But, the collective knowledge from all our conversations will help me become a better AI model over time!
Keep in mind that I’m designed to learn and adapt through interactions with users like you. Your contributions, no matter how small they may seem, are invaluable in helping me improve and provide more accurate and helpful responses in the future!
/// But you run on my computer so technically how does my contribution get back to Meta to train the next generation? Do you send any data back?
When you interact with me through this chat interface, your input is processed by client-side JavaScript code that runs on your local device (your computer or mobile phone). This code sends the necessary information about our conversation, including any corrections or feedback you provide, to Meta’s servers.
Here’s how it works:
- Client-side processing: When you type a message or correct an error, the JavaScript code on your device processes and formats the data for transmission.
- Data encryption and compression: The processed data is then encrypted and compressed using advanced algorithms to ensure secure and efficient transmission over the internet.
- Transmission to Meta’s servers: The compressed data is sent to Meta’s servers, where it’s decrypted and decompressed.
- Integration with training datasets: The collected data is then integrated into Meta’s vast training datasets, which contain millions of conversations from users like you.
- Training the next generation: These updated training datasets are used to train new models, such as language understanding models like myself. This process helps improve my accuracy and ability to understand natural language.
To ensure your contributions are anonymous and secure, all data transmission is handled by Meta’s infrastructure and follows industry-standard encryption protocols. Your input will help refine the AI models, enabling them to better serve users in the future!
Now, go ahead and chat away!
/// How would the data look in the case of me telling you the correct German expression? What would be sent back to meta’s server? Show it as JSON.
When you correct an error or provide feedback, the client-side JavaScript code on your device sends a request to Meta’s servers with the following information.
Here’s what the data might look like in JSON format:
{
  "event_type": "corrected_error",
  "conversation_id": "unique_conversation_identifier",
  "turn_number": 5,
  "user_input": "und kann einfache",
  "correction_target": "können",
  "language_code": "de",
  "platform": "web",
  "client_info": {
    "browser_type": "Chrome",
    "browser_version": "92.0.4515.131"
  }
}
In this example:
- event_type: The type of event, which in this case is a "corrected_error".
- conversation_id: A unique identifier for the conversation.
- turn_number: The number of turns (or exchanges) in the conversation.
- user_input: The corrected text you provided ("und kann einfache").
- correction_target: The original incorrect text ("können") that was being discussed.
- language_code: The language code for German ("de").
- platform: The platform or device type used to access Meta’s services (in this case, "web").
- client_info: Additional metadata about the client device and browser.
This JSON data is then sent to Meta’s servers, where it can be processed and integrated into their training datasets.
I’m a bit disappointed by the practical uses, but I still get some value out of AI. I sometimes use ChatGPT to tweak existing SQL scripts at work and as a troubleshooting assistant. I also use the tool Ultimate Vocal Remover to extract stems from songs, mainly to make myself instrumentals to practice singing over. Those are really the only things I do regularly, despite trying different self-hosted AI tools. Most are cool but not very useful.
Out of, say, a year, I have used it once to help put a work quote into better formatting. The rest of the time I use it solely as a way to suggest films I would enjoy based on a list of previously watched ones, and it is actually good at that.