I've always hated that quote because it's deeply narcissistic and misanthropic: it implies there's some magical threshold that limits cognitive understanding based on some arbitrary measurement. Chemistry evolved from alchemy, astronomy from astrology, and mathematics from commerce. The ancient world wasn't full of idiots, and we're not a society of geniuses.
So to answer your question: being human is to be Homo sapiens sapiens, and a fully sapient AI will never be human. We shouldn't expect them to be human; we shouldn't even try to force "humanity" on them at all. You're right that one day we will need to talk about the rights of non-human sapient creations, and it would be a good idea to amend the term "human rights" to include them. Imposing what we are on something so fundamentally different from us would be no different than baptizing "savage natives" and making them recite biblical verses. Honestly, if science fiction has taught us anything, it's that trying to erode the identity and personhood of an AI is probably a terrible idea, especially if it's capable of fighting back. All we could hope to do is coexist or segregate; anything else is truly courting disaster.
All that aside, Sophia is nowhere near sophisticated enough to warrant this debate.