A proposal to the EU by Hungary to combat Child Sexual Abuse Material (CSAM) being shared across the internet is for applications to detect, report and remove it. In order to do this, a user would have to consent to having all of their content scanned prior to any end-to-end encryption, and if they don't consent, they won't be allowed to use the application.
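To make the mechanism concrete, here is a minimal sketch of what "scanning prior to end-to-end encryption" means in practice. Everything here is hypothetical and simplified: real proposals involve perceptual hashing (e.g. PhotoDNA-style) rather than exact hashes, and the blocklist, function names and toy "encryption" below are illustrative assumptions, not any actual implementation.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known illegal material.
# Real systems would use perceptual hashes, not exact-match hashes.
BLOCKLIST = {hashlib.sha256(b"known-bad-content").hexdigest()}

def client_side_scan(message: bytes) -> bool:
    """Return True if the plaintext message matches the blocklist."""
    return hashlib.sha256(message).hexdigest() in BLOCKLIST

def send(message: bytes, encrypt) -> bytes:
    # The scan runs BEFORE encryption, i.e. on the plaintext.
    # This is the point of contention: the scanner sees everything
    # the user writes, regardless of the encryption that follows.
    if client_side_scan(message):
        raise ValueError("content blocked and reported")
    return encrypt(message)

# Trivial stand-in for end-to-end encryption (byte reversal).
ciphertext = send(b"hello", lambda m: m[::-1])
```

The key point the sketch shows is the ordering: because the check happens on the device before encryption, end-to-end encryption no longer guarantees that only the recipient can see the content.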
What could go wrong?
Obviously, the applications themselves are opposed to this because they don't want to do that kind of curation, as it makes them liable for the content on their platforms. It also opens them up to a host of laws and liabilities that would make them accountable and subject to fines. However, what a massive overreach, with the potential for incredible abuse. To even propose such a ridiculous concept shows just how stupid governments think we are.
Though, they probably aren't that far off.
There are probably many people who are open to government monitoring and hold the belief that "if you don't do anything wrong, you have nothing to worry about". Yet, the same people don't seem to consider what they might currently be doing that is "wrong" now, what they are currently doing that could be used against them now, and what they are currently doing that might be considered wrong in the future.
I am not even talking about Al Jolson in blackface in the 1920s, but things that people felt comfortable publicly talking about five or ten years ago. Numerous stars, and especially comedians, have been impacted by ten-year-old tweets, even though at the time, millions of people were laughing along with them. What was funny to the majority has now been deemed inappropriate by the minority.
But just imagine what is shared in private conversations, and how it could be presented out of context publicly. Would you like it if your conversations with friends and family were made public? Because that is what one should assume will happen when governments look after security. And of course, the assumption should be that the applications, and whoever has access to the information, are going to use it to their maximum advantage.
Quite often, when people think they are making a decision that is "good", they haven't thought about all the potential unintended consequences and risks. People might want to stop CSAM, but at what cost to personal freedoms? This doesn't mean nothing should be done of course, it just means that other ways need to be developed. Otherwise, it will create more problems than it solves, cutting off the nose to spite the face.
Before governments demand more transparency from us, I believe that they should provide full transparency of themselves. Well, as full as possible, where every transaction, every decision, and all of the workings are open to be audited. They talk about "state secrets" and punish those who leak them, but should the state have secrets from its people? The people won't misuse it, will they? The excuse is of course about keeping secrets from bad actors, but the same logic applies to the CSAM bad actors, right? Maybe if there was radical government transparency across all governments, the potential for bad actors would be reduced. After all,
If they aren't doing anything wrong, they have nothing to hide.
Taraz
[ Gen1: Hive ]
The people doing nothing wrong with nothing to hide would still likely pitch a fit if they got told they should leave their curtains open at night and be perfectly fine with people opening their mail randomly or going through their underwear drawer, but that's completely different.
We've been watching some of these lawsuits that are brought against the social media companies due to certain things and it is very interesting how people are looking for anyone to blame when they don't want to take responsibility. This is clearly a different situation, but the end result could be the same. It's a fine line between protecting people and being intrusive.
This is true. However, the platforms are hiding behind a law that wasn't made for them around publishing protection. They are definitely curating through the algorithms, so if a kid is accessing all kinds of terrible content that is "tailored" for that kid and programmed to get more extreme, they should take some responsibility.
I think it's a slippery slope. Especially with everyone being so sue happy these days.
While I think that I am doing nothing wrong and have nothing to hide, the idea of a full scan of my device by the government just sounds crazy. What is there to prevent them from doing a full scan AND a full backup of everything from our devices, and then their datacenter gets hacked and everything from your device appears online? How is that not a potential financial and security threat to users?
It is crazy. It is crazy it is even suggested. Granted, Hungary aren't exactly doing that well at the moment.
Trust the government on corporate infrastructure to keep you safe!
Well, even if I am not doing anything wrong, I have my privacy to hide.
:)
Not a fan of big government, govt overreach, intrusion into citizens' lives... there is always a bullshit reason for doing these things and it is never, ever in the best interest of the people.
Always some "social" reason when the underlying purpose is authoritarian control.
Agreed, now...what are we going to do about it? 👀
In the anime Psycho-Pass, the plot takes place in the future (2113), and there are devices that check the mental state and personality of all citizens and assess the probability that each citizen will commit a criminal act. It was not an ideal result either: the system also got corrupted.
Every dystopian world in the future, has governments or a corporation who monitor citizens "for their own safety"
I understand your point, but people do not understand that on the traditional network, privacy does not exist; it is only an illusion. We are made to believe that because we set a password, something is already private and no one else can see it. But if you are a threat or commit extreme crimes, congress can force any social network in the country to reveal all of our information, from private chats to what we delete. Just by us having an email, they already know who we are and the places we frequent; that is why they bombard us so much with targeted advertising. Whatever we do on the traditional network, sooner or later they will know.
This is true. But there is also somewhat of a difference once it is opted into and legislated. Also, this isn't just for social media, it is for any messaging apps. Where is the line? What about corporate apps?
Who is against algorithms? Any misuse of the internet ends up on the dark web. Any information we provide on traditional social networks, or simply in any area of the web, is susceptible. Controls will continue, and so will those who violate them. It is up to each of us to take the necessary precautions and be very vigilant about where we choose to leave our footprints. Sexual abuse may be just around the corner, as well as drugs; vigilance is a necessity for being connected to the web.
I don't understand the question
It is not the question that is important, but the interpretation of the answer. Blessings, my friend @tarazkp, happy weekend.
How I "love" when governments overreach! I don't exclude that in this case it was just a populist stunt... Who would be against fighting the dissemination of child sexual abuse materials, right? Of course, if it were to be implemented as you describe it, that would be a backdoor for more control of the governments over any type of content that is transmitted "privately".
I would love to see more governmental transparency. Sounds like Elon and Vivek are ready to bring that type of accountability to their new Dept of Govt Efficiency (DOGE). Well at least my $DOGE is pumping.