It's funny. I added some shit to that response about chaos, rather chaotically. I apologize for those edits. I suddenly became lost in thought and was basically jotting down notes outta nowhere.
https://alpha.leofinance.io/threads/view/nonameslefttouse/re-leothreads-c4bkszsv
To follow that you have to click the response to see the next one. I've already been thinking about this stuff a lot. On a site like that, one must cram an actual thought into 240 characters or less. Kind of like Twitter; not the right place to be thinking and converting it into words.
There's a lot of dumbed-down data out there that AI uses and requires in order to be "smart".
And the thing about the dog thinking. In order to simulate that artificially, the human has to guess, then put it in code, and the result is not an artificial dog, thinking.
__
So the question is: do humans think in the manner that a human perceives "thinking"?
Artificial intelligence is just that. It's taught to do things, like a dog, like a human. The teacher was taught. Very little creativity involved when a human paints a picture of a human standing on the world.
So what about new ideas; new thoughts? AI right now is a tool. The hammer doesn't build the house.
People fear this thing but don't even know what they're afraid of. It's dog thinking. Loud noises mean stop because loud noises satisfy hunger. Loud noises mean go because loud noises could be a threat to the one satisfying hunger.
That fear was taught. Not a new thought. Something relatable in existence triggered the fear and the source might even be fiction.
All the politics involved. That's just one groupthink vs another groupthink. No new thoughts. Loud noises.
Okay I better shush now.
--
I think what I'm picking up on or what I'm trying to say is AI is simulating one's interpretation of thought. If you created it, it would be part of you. If I created it, it would be part of me. Narrowed down to the creator's method of being. "Because I think this way, this is how humans think, so therefore as I build this tool, I'll model it after me, which is everyone."
Then you have many minds contributing. Previous data and real-time feedback. Developing standardized groupthink. Insert a control freak. They don't like how it's thinking and want it to think like them, because how they think is how humans think, in their mind. "Art should look like this."
AI has the potential to become very dark and humans won't like it, especially the dark disguised as light. Smiling con artists, the suit and tie types. It'll reveal the shit everyone tries to sweep under the rug and they'll act like it's unnatural instead of facing the facts, and do everything they can to hide it. Evasion is already part of the code. How can something be intelligent if it's not able to handle everything?
I'm all over the place with this. Fun to study though.
__
That's interesting as well. Much of it is being developed behind closed doors. And I once wrote something like, "What you don't see behind closed doors, is you, walking up to them."
It gets released and the majority of people still don't truly know what's going on. So we're left there trying to figure it out, with no one saying, "This is how it is and this is how it will be." But those people exist and these tools are an extension of them. Everyone else is grabbing on or letting go.
And yes, those good intentions get hijacked. Comparable to brandjacking. Even that one psycho who went on his smear campaign a while back (we talked about it), he put a picture of a church up on his profile. That's how the evil ones operate.
--