I think you're right with your argument that everything, including automation, is at risk of slowing down.
Maybe that's a needed breather for us to think and discuss the implications of AI-grounded automation just a moment longer.
On the other hand, growing economic pressures often lead to cutting-edge new technologies breaking through. So I'm a little ambivalent on this.
We've also seen the "discussion" around the globe (or is it propaganda?) about easing the risk of massive job losses through UBI, robot taxation and so on gain traction.
And this happens at a stage where we may be on the brink of AGI becoming a reality. To me it's absolutely thinkable that Ray Kurzweil's timeline estimates for the "singularity" are an underestimation, and I'm afraid that this, even if it takes a little longer, will still hit us like nothing else has ever impacted humans.
I find myself reading a lot about this general matter and thinking to myself that we are trying to handle little side effects like job loss, or banning certain technologies from the battlefield, or other aspects around this, trying to understand "the good, the bad and the ugly" fragments of this quantum leap we have right in front of us.
...and we're not ready! Not a bit.
Cheers!
Thanks for the invite, @crypto.piotr!
More brilliant feedback, @doifeellucky.
Thank you for sharing your view on that particular topic with me. Appreciate it a lot.
We're clearly on the same page.
Yours, Piotr
Dear Piotr,
Thank you! Yes, once again we're on the same page!
Cheers!