Academics Probe Apple's Privacy Settings and Get Lost and Confused (theregister.com)
Matthew Connatser reports via The Register: A study has concluded that Apple's privacy practices aren't particularly effective, because default apps on the iPhone and Mac have limited privacy settings and confusing configuration options. The research was conducted by Amel Bourdoucen and Janne Lindqvist of Aalto University in Finland. The pair noted that while many studies had examined privacy issues with third-party apps for Apple devices, very little literature investigates the issue in first-party apps -- like Safari and Siri. The aims of the study [PDF] were to investigate how much data Apple's own apps collect and where it's sent, and to see if users could figure out how to navigate the landscape of Apple's privacy settings.
The lengths to which Apple goes to secure its ecosystem -- as described in its Platform Security Guide [PDF] -- have earned it kudos from the information security world. Cupertino uses its hard-earned reputation as a selling point and as a bludgeon against Google. Bourdoucen and Lindqvist don't dispute Apple's technical prowess, but argue that it is undermined by confusing user interfaces. "Our work shows that users may disable default apps, only to discover later that the settings do not match their initial preference," the paper states. "Our results demonstrate users are not correctly able to configure the desired privacy settings of default apps. In addition, we discovered that some default app configurations can even reduce trust in family relationships."
The researchers criticize data collection by Apple apps like Safari and Siri, where that data is sent, how users can (and can't) disable that data tracking, and how Apple presents privacy options to users. The paper illustrates these issues in a discussion of Apple's Siri voice assistant. While users can ostensibly choose not to enable Siri in the initial setup on macOS-powered devices, it still collects data from other apps to provide suggestions. To fully disable Siri, Apple users must find privacy-related options across five different submenus in the Settings app. Apple's own documentation for how its privacy settings work isn't good either. It doesn't mention every privacy option, explain what is done with user data, or highlight whether settings are enabled or disabled. Also, it's written in legalese, which almost guarantees no normal user will ever read it. "We discovered that the features are not clearly documented," the paper concludes. "Specifically, we discovered that steps required to disable features of default apps are largely undocumented and the data handling practices are not completely disclosed."
Re: (Score:2)
iOS is no better than Android. And vice versa.
Re: (Score:2)
KaiOS on a flip phone?
TFA didn't mention that Apple keeps turning Bluetooth on at any excuse.
Re: (Score:1)
But infinitely better than using an OS directly from a fucking advertising company!
No sympathy for anyone who complains a single word about privacy, but then uses any Google (or Facebook or Microsoft) related device.
Nice thing about android is that it is open source. I only buy phones I am able to load third party OS images from sources I trust. Can't imagine ever owning a mobile device without a full suite of application and network access controls.
Because lawyers (Score:1)
It's pretty obvious. All lawyers know how to do is make things more complicated for everyone else... so that you'll have to pay them lots of money to sort it out. What a racket.
Pinned certificates should be outlawed (Score:2)
It should be illegal for software and hardware manufacturers to use pinned certificates. It should ALWAYS be possible for the owner of a device to load their own certificate into any device they own, allowing them to see any and all traffic to and from their devices. This does NOT compromise security from the owner's perspective. Yes, it is a man-in-the-middle attack, but no different from the ones used by corporations with proxies. Any device owner should be allowed to proxy their traffic so that they
Re: (Score:2)
And the fact it's a MITM attack is the reason why they get pinned, because people were using them in attack scenarios.
The common case is a normal user who is not doing any proxying or network packet inspection of any kind. That is the 99% case, covering what 99% of people are doing. The 1% is the researchers doing the analysis.
In this case, a user going to Google or Apple or whatever should have their session shut down if the client detects the certificate has been changed, because that's most likely what's happening
Re: (Score:2)
This is pure FUD. Not allowing pinned certificates does NOT enable any random person to do MITM attacks. The person would still have to have access to the device. Loading certificates in a web browser on Windows, Linux, or Mac can already be done with absolutely no loss of security, as long as you control the endpoint devices. Pinned certificates do absolutely nothing except allow companies to weaponize devices and applications against the owner
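For readers unfamiliar with the mechanism being argued over: certificate pinning, at its core, is just the client comparing a fingerprint of the certificate the server presents against a value baked into the app, and refusing the connection on mismatch. A minimal sketch of that comparison, assuming a SHA-256 pin over the raw certificate bytes (real implementations typically pin the SubjectPublicKeyInfo rather than the whole certificate, and the byte strings below are stand-ins, not real DER data):

```python
import base64
import hashlib


def pin_matches(cert_der: bytes, expected_pin_b64: str) -> bool:
    """Return True if the presented certificate matches the stored pin.

    The pin is a base64-encoded SHA-256 digest baked into the client.
    A proxy substituting its own certificate produces a different
    digest, so the comparison fails and the client can refuse to talk.
    """
    digest = hashlib.sha256(cert_der).digest()
    return base64.b64encode(digest).decode("ascii") == expected_pin_b64


# Demo with stand-in bytes in place of a real DER-encoded certificate.
legit_cert = b"-----stand-in server certificate bytes-----"
pin = base64.b64encode(hashlib.sha256(legit_cert).digest()).decode("ascii")

print(pin_matches(legit_cert, pin))          # genuine cert: accepted
print(pin_matches(b"proxy-issued cert", pin))  # substituted cert: rejected
```

This is exactly why an owner-installed proxy CA fails against a pinned app: the proxy's certificate hashes to a different value than the baked-in pin, and the app drops the connection regardless of what root certificates the owner has installed on the device.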
Privacy policies have only one purpose (Score:1)
They use lots of responsible-sounding words to say, essentially, "We'll do with your data whatever we want to do with it, whenever we want to do it." Using confusing language is the actual point. If it were clear and precise, they would then be limited in what they can do with your data, and that would go against the real aims of the company.
For example, when they say they will share your data "only" with their "affiliates," what isn't obvious in the text is that any company they do business with is an "affiliate."
One more minute (Score:3)
Steve Jobs: (Score:1)
"...You're reading it wrong!"