The Digital Drop Piece
You don't have to be a Law & Order fan to know what a drop piece is. It's a fixture in just about every police detective story ever told. But in case you've been a literary Rip Van Winkle, I'll remind you: a drop piece is a small, unregistered handgun that police detectives usually carry strapped to an ankle. Its serial number has been filed off to make it untraceable, and it stays hidden under the pants leg until it needs to be used.
The gun is not there for self-defense or for any predatory purpose. In fact, while it is loaded with live ammunition, the gun is rarely, if ever, fired. That's because the sole purpose of a drop piece is to plant evidence on a suspect at the scene of a crime. If a bust becomes questionable or lacks evidence, a detective can claim that "this small, illegal handgun was found at the scene," ostensibly escalating the event into a life-threatening circumstance and justifying the cop's next-level behavior.
The fraudulent use of a drop piece is, in every way, completely illegal; it provides false evidence that, at trial, is tantamount to perjured testimony.
And yet, drop pieces have been used plenty of times, the word of law enforcement going unquestioned given the context of the circumstances. After all, it's far easier for juries and judges to believe the bad guy had a gun than it is for them to accuse the police of framing an innocent citizen. If you happen to be arrested by mistake, this forms the perfect storm for a wrongful conviction.
Enter Tim Cook, the CEO of Apple.
In case you haven't heard, Apple now intends to scan every single image on every single iPhone for "images of child pornography," or for images it suspects could be such. If that doesn't scare you, it should, for a few reasons:
First, it completes big tech's intrusion into, and the total destruction of, any illusions of privacy you may have had about your digital data. While we all know that big tech has been sharing our sales and contact information for decades, this completes the loop to include visual data.
Second, it assigns Apple and others the dubious task of determining what is objectionable and who should be reported to the authorities. Since there is no discernible standard against which these images are judged, the exact same image could be viewed as innocent for one person but criminally suspect for another. The entire program is arbitrary at best and capricious at worst.
Third, these scans can and will be used as digital drop pieces, where "undesirables" (as arbitrarily defined by big tech) can and will suddenly be found in possession of said images, prompting reports to authorities and subsequent prosecution, opening a whole new world of political harassment and persecution to those that big tech simply doesn't like.
Think it can't happen to you? Think again. Do you know anyone whose credit card hasn't been compromised? How many people have been the victim of a persistent wrong number, or of something as harmless as a misdirected e-mail or spam? The FBI already has a huge arsenal of digital weaponry, listening to and watching everyone from Muslim terrorists to anti-vaccine activists, waiting for one wrong step.
Now think about big tech generating its own political enemies list, matched up to its image-scanning weapon, and the picture for you -- even unscanned -- gets pretty ugly.
Be careful out there.