This is a cache of https://yro.slashdot.org/story/24/04/05/2014205/academics-probe-apples-privacy-settings-and-get-lost-and-confused. It is a snapshot of the page at 2024-04-06T01:12:54.236+0000.
Academics Probe Apple's Privacy Settings and Get Lost and Confused - Slashdot


Academics Probe Apple's Privacy Settings and Get Lost and Confused (theregister.com) 11

Matthew Connatser reports via The Register: A study has concluded that Apple's privacy practices aren't particularly effective, because default apps on the iPhone and Mac have limited privacy settings and confusing configuration options. The research was conducted by Amel Bourdoucen and Janne Lindqvist of Aalto University in Finland. The pair noted that while many studies have examined privacy issues with third-party apps for Apple devices, very little literature investigates the issue in first-party apps -- like Safari and Siri. The aims of the study [pdf] were to investigate how much data Apple's own apps collect and where it's sent, and to see if users could figure out how to navigate the landscape of Apple's privacy settings.

The lengths to which Apple goes to secure its ecosystem -- as described in its Platform Security Guide [pdf] -- have earned it kudos from the information security world. Cupertino uses its hard-earned reputation as a selling point and as a bludgeon against Google. Bourdoucen and Lindqvist don't dispute Apple's technical prowess, but argue that it is undermined by confusing user interfaces. "Our work shows that users may disable default apps, only to discover later that the settings do not match their initial preference," the paper states. "Our results demonstrate users are not correctly able to configure the desired privacy settings of default apps. In addition, we discovered that some default app configurations can even reduce trust in family relationships."

The researchers criticize data collection by Apple apps like Safari and Siri, where that data is sent, how users can (and can't) disable that data tracking, and how Apple presents privacy options to users. The paper illustrates these issues in a discussion of Apple's Siri voice assistant. While users can ostensibly choose not to enable Siri in the initial setup on macOS-powered devices, it still collects data from other apps to provide suggestions. To fully disable Siri, Apple users must find privacy-related options across five different submenus in the Settings app. Apple's own documentation for how its privacy settings work isn't good either. It doesn't mention every privacy option, explain what is done with user data, or highlight whether settings are enabled or disabled. Also, it's written in legalese, which almost guarantees no normal user will ever read it. "We discovered that the features are not clearly documented," the paper concludes. "Specifically, we discovered that steps required to disable features of default apps are largely undocumented and the data handling practices are not completely disclosed."


Comments Filter:
  • It's pretty obvious. All lawyers know how to do is make things more complicated for everyone else... so that you'll have to pay them lots of money to sort it out. What a racket.

  • It should be illegal for software and hardware manufacturers to use pinned certificates. It should ALWAYS be possible for the owner of a device to load their own certificate into any device they own, allowing them to see any and all traffic to and from their devices. This does NOT compromise security from the owner's perspective. Yes, it is a man-in-the-middle attack, but no different than one used by corporations with proxies. Any device owner should be allowed to proxy their traffic so that they

    • by tlhIngan ( 30335 )

      And the fact that it's a MITM attack is exactly why certificates get pinned: people were using them in attack scenarios.

      The common case is a normal user doing no proxying of any kind, nor any network packet inspection. This is the 99% case, covering what 99% of people are doing. The 1% will be the researchers doing the analysis.

      In this case, a user going to Google or Apple or whatever should have their session shut down if the browser detects the certificate has been changed, because that's most likely what's happening

      • This is pure FUD. Not allowing pinned certificates does NOT enable any old person to mount MITM attacks. The person would still have to have access to the device. Loading your own certificates into a web browser on Windows, Linux, or Mac can already be done with absolutely no loss of security, as long as you control the endpoint devices. Pinned certificates do absolutely nothing except allow companies to weaponize devices and applications against the owner
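For readers unfamiliar with the mechanism this thread is arguing over: certificate pinning means the client ships with a digest of the expected server key and refuses to talk to anything presenting a different one, including a user-installed proxy CA. A minimal sketch in Python of HPKP-style pin checking (the key byte strings are placeholders; real pins are SHA-256 digests of the certificate's DER-encoded SubjectPublicKeyInfo):

```python
import base64
import hashlib

def spki_pin(der_bytes):
    """Return the base64-encoded SHA-256 digest used as a pin."""
    return base64.b64encode(hashlib.sha256(der_bytes).digest()).decode("ascii")

def pin_matches(der_bytes, pinned_set):
    """Accept a presented key only if its digest is in the pinned set."""
    return spki_pin(der_bytes) in pinned_set

# Placeholder keys: the app ships pinned to the server's key, so a
# proxy's substituted key fails the check and the connection is refused,
# which is exactly the behavior the commenters above are debating.
server_key = b"original server public key (placeholder)"
proxy_key = b"corporate proxy public key (placeholder)"
pins = {spki_pin(server_key)}
```

With this scheme, `pin_matches(server_key, pins)` succeeds while `pin_matches(proxy_key, pins)` fails, regardless of whether the proxy's CA certificate is trusted by the device's root store. That is both the security benefit and the owner-control complaint in one.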

  • They use lots of responsible-sounding words to say, essentially, "We'll do with your data whatever we want to do with it, whenever we want to do it." Using confusing language is the actual point. If it were clear and precise, they would then be limited in what they can do with your data, and that would go against the real aims of the company.

    For example, when they say they will share your data "only" with their "affiliates," what isn't obvious in the text is that any company they do business with is an "affiliate."

  • by felixrising ( 1135205 ) on Friday April 05, 2024 @06:53PM (#64373672)
    Whatever you do, don't look at Screen Time settings and parental controls. That is a HOT MESS on Apple, and it's buggy to the point that the kids regularly end up with apps that just keep playing indefinitely after "one more minute". It's Clayton's screen time controls.
  • "...You're reading it wrong!"
