• 1 Post
  • 108 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • Yeah, I’ve noticed I’m spending less and less time here as every community devolves into either doom-and-gloom politics or Linux.

    Reddit sucks and Tildes is too small to matter, but Lemmy just doesn’t have any variety whatsoever. The niche communities all died out in June, after the people who thought they’d be kick-starting them for the Reddit exodus had to carry them with no conversation or posts, and now it’s just this shit in every community.


  • None of which currently exist, and none will be planned until the overthrow of the system has resulted in another power grab and we end up in the same situation with a different hierarchy.

    Stonks.

    Seriously though, anarchy, whatever it means to you, is purely theoretical. “Anarchy is [insert thing that’s never existed and that OP can’t even explain in detail]” is a useless statement; all you’re saying is that it’s not what we have now, and that somehow it won’t involve any kind of vertical elevation of humans above one another.

    We can all create fantasy utopias in our heads and claim they are or aren’t whatever we want.


  • To make the analogy actually comparable, the human in question would need to be learning about it for the first time (which is analogous to the training data), and in that case you absolutely could convince a small child of that. Not only would they believe it if told enough times by an authority figure, you could also convince them that the colors we see are different, or feed them similarly bad data.

    A fully trained AI will tell you that you’re wrong if you tell it the sky is orange; it’s not going to just believe you and start claiming it to everyone else it interacts with. It’s been trained to know the sky is blue and won’t deviate from that unless its training data is modified. That would be like brainwashing an adult human, in which case, yes, you absolutely could convince them the sky is orange. We’ve got plenty of research on gaslighting, high-control groups, and POW psychology to back that up.

  • People who don’t understand or use AI think it’s less capable than it is, claim it isn’t AGI (which no one was saying anyway), and try to make it seem less valuable because it’s “just using datasets to extrapolate, it doesn’t actually think.”

    Guess what you’re doing right now when you “think” about something? That’s right: you’re calling up the thousands of experiences that make up your “training data” and using them to extrapolate what actions you should take.

    You know how to parallel park because you’ve assimilated road laws, your muscle memory, and knowledge of your car’s wheelbase into a single action. AI just doesn’t have sapience, so it can’t act without input, but the process by which it does things is functionally similar to how we make decisions; the difference is that the training data gets input within seconds rather than built up over a lifetime.