
T-Room

The Best in Alternative News


August 6, 2021 at 6:30 pm

Fury At Apple’s Plan to Scan iPhones for Child Abuse Images and Reporting ‘Flagged’ Owners to Police…


by Chris Ciaccia at The Daily Mail

Data privacy campaigners are raging today over Apple’s plans to automatically scan iPhones and cloud storage for child abuse images and report ‘flagged’ owners to the police once a company employee has reviewed their photos.

The new safety tools will also be used to examine photos sent by text message to protect children from ‘sexting’, automatically blurring images that Apple’s algorithms detect as child sexual abuse material (CSAM).

But campaigners have accused the tech giant of opening a new back door to accessing personal data and ‘appeasing’ governments who could harness it to snoop on citizens.

While the measures are initially only being rolled out in the US, Apple plans for the technology to soon be available in the UK and other countries worldwide.

But the controversial plans have already been blasted as a ‘huge and regressive step for individual privacy’ over fears the system could easily be adapted to spot other material and is open to abuse.

Greg Nojeim of the Center for Democracy and Technology in Washington DC said that ‘Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship.’

The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user’s photo album.

Instead, the system will look for matches, securely on the device, based on a database of ‘hashes’ – a type of digital fingerprint – of known CSAM images provided by child safety organizations.
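The matching flow described above can be sketched in a few lines. This is an illustrative sketch only: Apple’s actual system uses a perceptual ‘NeuralHash’ computed on-device, not a cryptographic hash, and the function and database names here are hypothetical placeholders.

```python
import hashlib

# Placeholder database of 'hashes' of known CSAM images, as supplied by
# child safety organizations in the scheme the article describes.
KNOWN_IMAGE_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash ('digital fingerprint').

    A real perceptual hash tolerates resizing and re-encoding; a plain
    SHA-256 digest is used here only to illustrate the matching step.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def scan_photos(photos: list[bytes]) -> list[int]:
    """Return the indices of photos whose fingerprint matches the database.

    Matching happens locally against the hash list, so the scanner never
    needs to inspect the photo album itself -- the point Apple stresses.
    """
    return [i for i, img in enumerate(photos)
            if fingerprint(img) in KNOWN_IMAGE_HASHES]
```

Only photos whose fingerprints collide with an entry in the database are flagged; everything else never leaves the device in this design.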

As well as looking for photos on the phone, in cloud storage and in messages, Apple’s personal assistant Siri will be taught to ‘intervene’ when users try to search for topics related to child sexual abuse.

The technology will allow Apple to:…

This website lives off the kindness of your donations. If you would like to support The T-Room please visit our PayPal.


Any publication posted at The T-Room and/or opinions expressed therein do not necessarily reflect the views of The T-Room. Such publications and all information within the publications (e.g. titles, dates, statistics, conclusions, sources, opinions, etc) are solely the responsibility of the author of the article, not The T-Room.



Copyright © 2025 T-Room

Site by Creative Visual Design