I've never heard this theory about AI before, but it's super interesting to ponder. A number of physicists have theorized that rogue AI is every bit as likely as natural cataclysms (climate change, asteroids, etc.) to end intelligent civilizations out there in the universe.
As with so much in life, the way development is barreling forward on this, it's almost as though it were meant to be. I have a feeling superintelligent AI is part of the grand design that will teach humanity its next round of lessons (good and bad) and help us discover our true power and evolve further. It's too bad it takes adversity for us to learn our deepest lessons. I wonder if that will ever change.