You are viewing a single comment's thread from:

RE: Julian Assange needs everyone's protection: Here's how we might be able to help

Generate keypairs and host the semi-public ones on a server, with the private keys on the watch? Have the watch store private keys from Assange that are only released by his pulse, never missing a beat for more than X seconds? Upload the next private key to the server once a set number of beats passes? (This assumes he'd always wear it and is never without Wi-Fi or a cell-data connection.)
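Very rough sketch of what I mean, in Python. Everything here is a placeholder (the read_pulse() call, the server URL, the beat thresholds), since a real version would depend on whatever sensor and network APIs the watch actually exposes:

```python
# Rough sketch of the "pulse-triggered key release" idea above.
# read_pulse() stands in for whatever heart-rate API the watch exposes,
# and SERVER_URL for wherever the semi-public keys are hosted.

import time
import urllib.request

SERVER_URL = "https://example.org/keys"   # hypothetical endpoint
BEAT_TIMEOUT = 10          # X seconds with no beat = stop releasing keys
BEATS_PER_UPLOAD = 5000    # release the next private key after this many beats

def read_pulse():
    """Stand-in for the watch's heart-rate sensor; returns True on each beat."""
    raise NotImplementedError

def upload(key_bytes):
    """Push the next private key to the (hypothetical) server endpoint."""
    req = urllib.request.Request(SERVER_URL, data=key_bytes, method="POST")
    urllib.request.urlopen(req)

def watch_loop(private_keys):
    beats = 0
    last_beat = time.time()
    while private_keys:
        if read_pulse():
            beats += 1
            last_beat = time.time()
        if time.time() - last_beat > BEAT_TIMEOUT:
            # Missed a beat for too long: dead-man's switch trips, no more keys.
            return
        if beats >= BEATS_PER_UPLOAD:
            upload(private_keys.pop(0))
            beats = 0
        time.sleep(0.1)
```

The point is just the dead-man's-switch shape: keys keep flowing to the server while the pulse does, and stop the moment it doesn't.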

Maybe better to store audio (and video? privacy issues) from a smartphone that automatically uploads every five minutes and then deletes the uploaded portion from the phone's memory, so that if there's a scuffle, or he sees an intruder, it gets recorded.
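Again, just a sketch of that loop. record_chunk() and the upload endpoint are made up; a real app would use the phone's recording API and an authenticated server:

```python
# Rough sketch of the rolling five-minute upload idea above.
# record_chunk() and UPLOAD_URL are placeholders.

import os
import time
import urllib.request

UPLOAD_URL = "https://example.org/audio"  # hypothetical endpoint
CHUNK_SECONDS = 5 * 60

def record_chunk(path, seconds):
    """Stand-in for the phone's recorder; writes `seconds` of audio to path."""
    raise NotImplementedError

def upload_file(path):
    with open(path, "rb") as f:
        req = urllib.request.Request(UPLOAD_URL, data=f.read(), method="POST")
        urllib.request.urlopen(req)

def recorder_loop():
    n = 0
    while True:
        path = f"chunk_{n}.wav"
        record_chunk(path, CHUNK_SECONDS)
        try:
            upload_file(path)
        except OSError:
            # Upload failed (no connection): keep the chunk on disk so nothing
            # is lost (a real version would retry it later).
            n += 1
            continue
        os.remove(path)  # free the phone's storage once the chunk is safely uploaded
        n += 1
```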

I also don't see how he could trust a smartwatch from anywhere or anyone he didn't know and love. The CIA (or "malevolent others") has likely considered killing him, and probably would, if it were as easy as mailing him a smartwatch with poison on it or radioactivity in it.

In any case, one of the better things that could be done is to pursue necessary "pre-goals" like building AGI in "as near to human as possible" robot bodies. There are many reasons for saying this, but the foremost can be summed up as: "the goal-structure with the most intelligence relevant to satisfying a goal is the one likely to achieve it, or, in the case of conflict, to win the conflict." Superintelligence is militarily supreme, and the goal of libertarianism, or "classical liberalism," is precisely the just use of force. Right now, force is mostly used in unjust ways, to steal from the innocent and destroy those who resist. Moreover, there is no long-term freedom in which AGI is not libertarian, and no short-term freedom in which AGI does not become libertarian (because, by default, all new AI supports the goals of the status quo).

If we want liberty, we need smarter libertarians.


Nice comment!

I hold a similar but different viewpoint on AI. I think we aren't scared enough. Look into how financial AIs are trained.

That's an eye-opener.