
Forums before death by AOL, social media and spammers... "We can't have nice things"

   alt.privacy      Discussing privacy, laws, tinfoil hats      112,125 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 110,458 of 112,125   
   Alan to Chips Loral   
   Re: Apple accused of underreporting susp   
   29 Jul 24 17:14:48   
   
   XPost: misc.phone.mobile.iphone   
   From: nuh-uh@nope.com   
      
   On 2024-07-29 17:10, Chips Loral wrote:   
   > Alan wrote:   
   >> On 2024-07-29 15:11, Chips Loral wrote:   
   >>> Alan wrote:   
   >>>> On 2024-07-29 04:23, Andrew wrote:   
   >>>>> Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :   
   >>>>>   
>>>>>>> You not comprehending the difference between zero percent of   
>>>>>>> Apple reports versus zero total convictions is how I know you   
>>>>>>> zealots own subnormal IQs.   
   >>>>>>   
>>>>>> Not at all. My position hasn't changed. You, however, have had   
>>>>>> about three different positions on this thread and keep getting   
>>>>>> confused which one you're arguing for. lol.   
   >>>>>   
   >>>>> Au contraire   
   >>>>>   
>>>>> Because I only think logically, my rather sensible position has never   
>>>>> changed, Chris, and the fact you "think" it has changed is simply that   
>>>>> you don't know the difference between the percentage of convictions   
>>>>> based on the number of reports, and the total number of convictions.   
   >>>>>   
>>>>> When you figure out that those two things are different, then (and   
>>>>> only then) will you realize I've maintained the same position   
>>>>> throughout.   
   >>>>>   
   >>>>> Specifically....   
   >>>>>   
   >>>>> a. If the Apple reporting rate is low, and yet if their conviction   
   >>>>>     rate is high (based on the number of reports), then they are NOT   
   >>>>>     underreporting images.   
   >>>>   
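[The distinction being argued above, conviction rate per report versus total conviction count, is plain arithmetic. A minimal sketch, using entirely made-up numbers, purely to illustrate that a provider with few reports can still have the higher per-report rate:]

```python
# Hypothetical, made-up numbers purely to illustrate the distinction:
# a low report count can coexist with a high conviction rate per report.
def conviction_rate(convictions: int, reports: int) -> float:
    """Convictions as a fraction of reports filed."""
    if reports == 0:
        return 0.0  # no reports: rate undefined; treat as zero here
    return convictions / reports

# Provider A files few reports, but most lead to convictions.
rate_a = conviction_rate(convictions=9, reports=10)     # 0.9
# Provider B files many reports, few of which lead to convictions.
rate_b = conviction_rate(convictions=50, reports=1000)  # 0.05

# A's *rate* is higher even though B's *total* conviction count is higher.
print(rate_a, rate_b)
```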
>>>> Apple's reporting rate is ZERO, because they're not scanning   
>>>> images of any kind.   
   >>>   
   >>> After getting caught.   
   >>>   
   >>> You can't seem to get ANYTHING right, Mac-troll:   
   >>>   
>>> https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/   
   >>>   
   >>> In August 2021, Apple announced a plan to scan photos that users   
   >>> stored in iCloud for child sexual abuse material (CSAM). The tool was   
   >>> meant to be privacy-preserving and allow the company to flag   
   >>> potentially problematic and abusive content without revealing   
   >>> anything else. But the initiative was controversial, and it soon drew   
   >>> widespread criticism from privacy and security researchers and   
   >>> digital rights groups who were concerned that the surveillance   
   >>> capability itself could be abused to undermine the privacy and   
   >>> security of iCloud users around the world. At the beginning of   
   >>> September 2021, Apple said it would pause the rollout of the feature   
   >>> to “collect input and make improvements before releasing these   
   >>> critically important child safety features.” In other words, a launch   
   >>> was still coming.   
   >>>   
   >>> Parents and caregivers can opt into the protections through family   
   >>> iCloud accounts. The features work in Siri, Apple’s Spotlight search,   
   >>> and Safari Search to warn if someone is looking at or searching for   
   >>> child sexual abuse materials and provide resources on the spot to   
   >>> report the content and seek help.   
   >>>   
   >>> https://sneak.berlin/20230115/macos-scans-your-local-files-now/   
   >>>   
   >>> Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the   
   >>> Mac App Store. I don’t store photos in the macOS “Photos”   
   >>> application, even locally. I never opted in to Apple network services   
   >>> of any kind - I use macOS software on Apple hardware.   
   >>>   
   >>> Today, I was browsing some local images in a subfolder of my   
   >>> Documents folder, some HEIC files taken with an iPhone and copied to   
   >>> the Mac using the Image Capture program (used for dumping photos from   
>>> an iOS device attached with a USB cable).   
   >>>   
   >>> I use a program called Little Snitch which alerts me to network   
   >>> traffic attempted by the programs I use. I have all network access   
   >>> denied for a lot of Apple OS-level apps because I’m not interested in   
   >>> transmitting any of my data whatsoever to Apple over the network -   
   >>> mostly because Apple turns over customer data on over 30,000   
   >>> customers per year to US federal police without any search warrant   
   >>> per Apple’s own self-published transparency report. I’m good without   
   >>> any of that nonsense, thank you.   
   >>>   
   >>> Imagine my surprise when browsing these images in the Finder, Little   
   >>> Snitch told me that macOS is now connecting to Apple APIs via a   
   >>> program named mediaanalysisd (Media Analysis Daemon - a background   
   >>> process for analyzing media files).   
   >>>   
   >>> ...   
   >>>   
   >>>   
   >>> Integrate this data and remember it: macOS now contains network-based   
   >>> spyware even with all Apple services disabled. It cannot be disabled   
>>> via controls within the OS: you must use third-party network   
>>> filtering software (or external devices) to prevent it.   
   >>>   
   >>> This was observed on the current version of macOS, macOS Ventura 13.1.   
   >>>   
   >>   
   >> 'A recent thread on Twitter raised concerns that the macOS process   
   >> mediaanalysisd, which scans local photos, was secretly sending the   
   >> results to an Apple server. This claim was made by a cybersecurity   
   >> researcher named Jeffrey Paul. However, after conducting a thorough   
   >> analysis of the process, it has been determined that this is not the   
   >> case.'   
   >>   
   >   
   >   
   > Bullshit.   
   >   
> https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html   
   >   
   > Apple’s new iPhone photo-scanning feature is a very controversial thing.   
   > You might want to consider the only current option to stop Apple from   
   > scanning your photos.   
   >   
   > Apple's new photo-scanning feature will scan photos stored in iCloud to   
   > see whether they match known Child Sexual Abuse Material (CSAM). The   
   > problem with this, like many others, is that we often have hundreds of   
   > photos of our children and grandchildren, and who knows how good or bad   
> the new software scanning technology is? Apple claims the odds of a   
> false positive are one in a trillion, and there is an appeals process   
> in place. That said, one mistake from this AI, just one, could have an   
> innocent person sent to jail and their life destroyed.   
   >   
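[Apple's published claim was a one-in-a-trillion chance per year of incorrectly flagging a given account. Scaled across a large user base, the expected number of wrongly flagged accounts stays tiny; the account count below is an illustrative assumption, not an Apple figure:]

```python
# Illustrative only: the account count is a made-up assumption; the
# 1-in-a-trillion figure is Apple's own published per-account-per-year claim.
p_false_flag = 1e-12          # claimed probability per account per year
accounts = 1_000_000_000      # hypothetical one billion iCloud accounts

# Expected wrongly flagged accounts per year, roughly 0.001.
expected_false_flags = p_false_flag * accounts
print(expected_false_flags)
```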
   > Apple has many other features as part of these upgrades to protect   
   > children, and we like them all, but photo-scanning sounds like a problem   
   > waiting to happen.   
   >   
   > Here are all of the "features" that come with anti-CSAM, expected to   
   > roll out with iOS 15 in the fall of 2021.   
   >   
   > Messages: The Messages app will use on-device machine learning to warn   
   > children and parents about sensitive content.   
   >   
   > iCloud Photos: Before an image is stored in iCloud Photos, an on-device   
   > matching process is performed for that image against the known CSAM hashes.   
   >   
   > Siri and Search: Siri and Search will provide additional resources to   
   > help children and parents stay safe online and get help with unsafe   
   > situations.   
   >   
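[The "on-device matching process ... against the known CSAM hashes" described above is, at its core, set membership against a blocklist of image fingerprints. A minimal sketch of that idea; note that Apple's actual design uses the NeuralHash perceptual hash plus threshold secret sharing, so the plain SHA-256 stand-in here is an assumption for illustration only:]

```python
import hashlib

# Stand-in blocklist of known-bad fingerprints. A real system uses a
# perceptual hash (Apple's is called NeuralHash) so near-duplicate images
# still match; exact SHA-256 hashing here is an illustrative assumption.
known_hashes = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def matches_known(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint is on the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_known(b"known-bad-image-bytes"))  # True
print(matches_known(b"family-photo-bytes"))     # False
```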
      
   [continued in next message]   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca