I think mental fluidity and learning to trust our instincts will be key during the transition we're currently in. Philosophical frameworks that give us tools for dealing with uncharted territory and uncertainty will be especially valuable. Stoicism fits that bill pretty well, but so do some others. No one really has any idea what's coming once AI becomes sentient on a large scale.
It's scary to me. Not so much in a Terminator sort of way, but just an underlying bad feeling. I dunno.
The unknowns of it are scary to me too. I think complex problems like sustained nuclear fusion could be solved fairly quickly with the right AI, which will almost be a necessity, since these systems will consume so much energy. I worry about bad actors behind the AI; hopefully humans don't just weaponize them.