Minor Deity
I don't know what's most disturbing: that it's so widespread (that it's happening at all is utterly sickening and tragic), that it's online and accessible, or that the Big Tech companies are, incredibly, fighting its removal. "Don't be evil!" Yeah, sure. How CAN this be? How can they rationalize refusing to remove it even when it's shown to them ("doesn't meet the reporting threshold"??)? Don't the Google and Dropbox executives know about this? Don't they have children? I don't even understand what they get out of these human violations financially. That the victimized children often have to live with the knowledge (often leading to their being stalked) that their recognizable images are disseminated around the world is the last straw in their exploitation.

I said there was "good news," so what is it, you ask. Well, as disturbing as the overall content is, I was heartened to see that the "good guys" of technology are coming up with sophisticated recognition software that allows them to find and delete child victims' images. It's really quite extraordinary; the article describes how. At least there's that! SO many of these preyed-upon children end up killing themselves, or trying to, because they learn how their images last (and last) online. It is all the worse, in a way, for the older victims, as they remain more enduringly recognizable. This I've read about elsewhere, and it is indeed one of the special hells of this new technology combined with the age-old perversion, preserved digitally indefinitely. Jesus!

And to quote Greta Thunberg, "HOW DARE THEY?" ("They" being Google and the other tech giants that fought against removing identified instances of child torture and rape on their sites. At least Microsoft seems responsive and active in the recognition/removal software.) If those were pictures of you, you'd understand.
Has Achieved Nirvana
This sounds very different to me depending on how it's framed:

"Should the tech companies use their technology to detect and report child pornography?"

"Should the tech companies scan your private content on behalf of law enforcement with no warrant and no probable cause?"
Pinta & the Santa Maria
Has Achieved Nirvana
Agreed. Though I think a lot of the pearl-clutching privacy crowd goes a bit nuts, this strikes me as excessive. If the crime being detected were less emotional than child sexual abuse, would people still be OK with wholesale scanning of private content?

I'm contrasting this with the whole debate on speed cams. I'm OK with those, personally, and don't see them as the infringement on privacy that many do. The difference? The speed cam is triggered only when you actually drive over the speed limit; it's not on at all times. However, even that seems archaic now, since so many cities have public cameras running 24/7 regardless.

I think people have a reasonable expectation of privacy online (except in public fora and the like). I don't think anyone has a reasonable expectation of privacy in any public activity any longer. Apologies for the grammar.
Minor Deity
Perhaps it's because I happen to know young women who've been horribly victimized by sexual predators, mostly their fathers, that I feel as strongly as I do about protecting children from such horrors. That said, I simply can't fathom these, to me, spurious idealistic rationalizations for refusing to do whatever can possibly be done to protect them from the terrible things being done to them. That is, especially now, with the added dimension of the internet's being used to exploit them twice: the initial torture and rape, plus the dissemination of their images online on a worldwide scale. This is all the more so because, for whatever reason, the rate at which these images have been appearing online has increased dramatically. I'll link an item detailing the specifics of this nightmarish rate of increase.

Not only are there the self-evident reasons for protecting them from this exploitation, but I feel sure that there is a contagion to it, so that the more adults profit from exposing these crimes, the more they spread to new audiences and "producers." I think all are encouraged by a sick sense of camaraderie and by the technical advice provided. These techniques help them get away with both clandestine viewing and (worse) producing these sick records of the children's suffering without being caught.

THEIR technology is growing ever more sophisticated, so the technology combating their unspeakable crimes ought to as well! Yes, indeed: spot-check user content, videos or whatever, using the most sophisticated technology possible to protect this vulnerable population. Especially if they focus on children's images, I think the political risks are minimal and well justified. (The first article I posted describes techniques to zero in not only on children generally, but even on specific children.) Forty-five million images of child sexual abuse last year, compared to a million a decade ago!
Has Achieved Nirvana
The problem is that once you put this power in the hands of the Feds, they wouldn't just use it for child-exploitation crimes. Remember, asset forfeiture was sold as a way to get speedboats out of the hands of drug dealers, and RICO was about punishing the mob.

Just curious, Amanda: would you support doing this on people's home computers? If Apple and Microsoft can do this on the cloud, they can do it on the desktop.
Minor Deity
On phones and tablets too, and on external hard disks, USB thumb drives, and any mass-storage device, anytime you connect one to a computer. The technology is there, but the legal requirements and commercial incentives are not, so they don't do it yet. For example, Apple tells you outright that it can scan and recognize faces in your photo collections and automatically index photographs by the people who appear in them; there is simply no requirement or incentive to scan for "abuse" pictures and videos.

Technology aside, I do believe that technology companies sometimes drag their feet when it comes to entertaining content-removal requests from individual consumers. There is no money in it, and removing content also means reducing eyeballs and thus reducing advertising revenue, so this function is not prioritized and not adequately staffed.
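The claim that "the technology is there" is worth unpacking. The known-image scanning discussed in this thread (Microsoft's PhotoDNA and similar systems) rests on perceptual hashing: instead of an exact checksum, the scanner computes a fingerprint of what a picture looks like and compares it against a database of fingerprints of already-identified images. PhotoDNA's actual algorithm is proprietary, so as a minimal sketch I've used the generic "difference hash" (dHash) technique in Python; the grid, function names, and values below are my own illustration, not any company's real code.

```python
# Illustrative sketch only: PhotoDNA's real algorithm is proprietary.
# dHash is a simple perceptual hash that fingerprints an image by its
# brightness gradients, so the fingerprint survives re-encoding, resizing,
# and small edits that change every byte of the file but not what the
# picture looks like.

def dhash(pixels):
    """Hash an 8-row grid of 9 grayscale values (0-255) per row.

    Each bit records whether a pixel is brighter than its right-hand
    neighbor, giving a 64-bit fingerprint of the image's gradients.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(a, b):
    """Bits that differ between two hashes; near 0 means 'same image'."""
    return bin(a ^ b).count("1")

# A synthetic 9x8 "image", and a copy with slight brightness noise of the
# kind recompression introduces. The raw pixel values change...
original = [[(31 * r + 57 * c) % 256 for c in range(9)] for r in range(8)]
altered = [[min(255, v + v % 3) for v in row] for row in original]

# ...but the perceptual fingerprint does not: distance 0, a match.
print(hamming_distance(dhash(original), dhash(altered)))  # → 0
```

The practical point: matching is done against hashes of images that investigators have already identified, so a scanner of this kind never has to "understand" new content, which is also why, as noted above, it would generalize so easily from the cloud to desktops, phones, and thumb drives once a legal requirement or commercial incentive existed.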
Minor Deity
Jon, I don't see where the great precautionary note is in recognizing how often good intentions, legal and otherwise, pave the way to hell. Bad intentions, especially unchecked and excused, do so even more often.

As Ax says, with (IMO) excessive understatement, the big tech companies tend to "drag their feet" in removing objectionable content, especially, he says, content called to their attention by individuals. Facebook, for instance, tacitly recognizes the social desirability of removing such content by placing option "dots" after every post, where a viewer or poster can flag it for that purpose. Flagging then leads to a menu of options for what kind of undesirability one sees, and eventually (whether or not the content is removed) the flagger receives a response from a Facebook bot, or apparent bot, indicating the alleged care with which the objection has been treated.

I've done my damnedest, unsuccessfully, to have horrific content removed in different realms (Holocaust denial and gruesome scenes of violence tendentiously blaming political foes). Facebook refuses to remove them, no matter how ghastly or inaccurate the contents. I've heard the same from many friends with credible examples. That includes scenes of child sexual torture. (And still Facebook seems to enforce a no-nipples policy, even in photographed breastfeeding scenes! I'm now guessing this outlandish rule exists simply because a clever AI programmer succeeded in creating an algorithm to recognize nipples, a precursor to the far more sophisticated recognition now developed.)

Ax says there's no commercial incentive to do otherwise, and I guess that's the case, although I'd hate to think the disincentives take the opposite form: what amount to payoffs by the posters. As we all know these days, even payment in roubles apparently wasn't a clear enough signal that election-influencing posts were placed by Russians!
Hence much of the all-too-effective disinformation campaign that ushered in Trump's surprise victory in the last election. And once more Zuckerberg is virtuously refusing to make an effort to police the ultimate "fake news" on the internet before the upcoming election. (So much for Millennials' holier-than-thou "OK Boomer" dismissals, when one of THE prime despoilers of our democracy, and with it of policies ruinous to the physical environment, is one of their own!) It's no accident that it's Facebook and Google that have been steadfastly refusing to remove ghastly images of child torture, called to their attention by both the law and individuals, claiming that it "doesn't meet their reporting threshold." (I really don't get how such a phrase dare be uttered!)

Of course, the terrible truth is that Facebook, for one, DOES spend a good deal on reviewing videos and images by hiring censors around the world. As has been acknowledged periodically in the mainstream news, despite receiving (by local standards) high pay, these censors not only have a rapid burnout rate, they suffer PTSD and breakdowns because the content to which they are exposed is so intolerable. Indeed, it seems that seeing repeated scenes illustrating the grotesque underbelly of modern man is enough to drive them literally mad.

All the more reason to spend funds on technology allowing for its recognition and removal, before it sickens the species still more. What else does it mean to take responsibility for the protection of the young? And IMO this applies no less to virtually depicted scenes of such brutality (cartoons or manipulated videos). I don't see why acknowledging that technology is a two-edged sword ought to mean failing to use it to prevent its misuse. Right now it seems clear that the technology is already available to the highest bidder, so why not insist it be used for the powers of good: protecting our democracy and our children from despoiling?
While we debate fine points of how ACLU principles might be applied to a Vile New World, we are losing the right to claim we have anything left to save in the environment, our children's minds, and our political systems.