Shiny Expensive Things: The Global Problem of Mobile Phone Theft

I was kindly invited down to Bournemouth University the other day by Shamal Faily, to give a talk as part of their Cyber Seminar series. I decided to talk about a hot topic that I’m very familiar with: mobile phone theft. The slides are updated from an earlier talk, and cover some of the political involvement in 2012/13 as well as recent industry action and what should happen next.

Manufacturers, Developers and Device Privacy

I’m involved in the IAPP’s privacy event this afternoon, talking in the session: “Is There an App for That? Privacy in Social, Local and Mobile Services” with a view from mobile manufacturers and developers. Here is my talk and some ideas about how some of the current problems can be solved. I’d be interested in your views:

“Privacy isn’t something that mobile manufacturers have had to get involved with. Beyond a basic device PIN lock, the furthest some manufacturers got ten years ago was to put PIN protection on mailboxes.”
 
These days, it is often a question of what the manufacturer actually owns. The hardware? The access control of the device? There are a vast number of stakeholders in the mobile industry and it is difficult to see who has responsibility, so when something goes wrong the blame gets spread all over the place. The manufacturer often doesn’t control the operating system these days, but they do control security in the hardware, including features such as trusted secure storage and trusted execution, which can be opened up via APIs (interfaces) to the operating system and the applications above that. This means that privacy-sensitive information such as credentials can be stored in what is, in effect, a safe on the device.

Other features such as full-device encryption give peace of mind if a device is stolen, but more fundamental things remain unfixed on some devices, such as locking down the USB port while the key-lock is in use. Often this comes down to individual engineers, and it is important to note that privacy does not feature in software engineering syllabuses; there is still a problem in educating future engineers, including a lack of mandatory security components in their courses.
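As a rough illustration of what “a safe on the device” looks like from an application developer’s point of view, here is a minimal Kotlin sketch using Android’s hardware-backed Keystore, one example of the kind of trusted secure storage API described above. The key alias, the function names and the user-authentication policy are my own illustrative choices, not anything mandated by a particular manufacturer:

```kotlin
import java.security.KeyStore
import javax.crypto.Cipher
import javax.crypto.KeyGenerator
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties

// Generate an AES key that lives in the hardware-backed Android Keystore,
// so the raw key material never leaves the secure hardware (where available).
fun createHardwareBackedKey(alias: String) {
    val keyGenerator = KeyGenerator.getInstance(
        KeyProperties.KEY_ALGORITHM_AES, "AndroidKeyStore"
    )
    val spec = KeyGenParameterSpec.Builder(
        alias,
        KeyProperties.PURPOSE_ENCRYPT or KeyProperties.PURPOSE_DECRYPT
    )
        .setBlockModes(KeyProperties.BLOCK_MODE_GCM)
        .setEncryptionPaddings(KeyProperties.ENCRYPTION_PADDING_NONE)
        // One possible policy: the key can only be used after the user has
        // authenticated with the device lock screen.
        .setUserAuthenticationRequired(true)
        .build()
    keyGenerator.init(spec)
    keyGenerator.generateKey()
}

// Encrypt a credential with the keystore-held key. Only the ciphertext and IV
// ever touch ordinary app storage; the key itself stays in the "safe".
fun encryptCredential(alias: String, secret: ByteArray): Pair<ByteArray, ByteArray> {
    val keyStore = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
    val key = keyStore.getKey(alias, null)
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, key)
    return Pair(cipher.iv, cipher.doFinal(secret))
}
```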
 
Among manufacturers, information sharing and the disclosure of security vulnerabilities, particularly where there are privacy implications, should be encouraged and improved. This is an area in which industry is still lacking.
 
The device is our life-diary. We must all acknowledge that there are situations where the police need to intervene and lawfully get access to data on devices, whether the owner is the perpetrator of a crime or a victim. The evidential aspect of mobile phones is incredibly important, and the discipline of mobile device forensics is still emerging and developing. These needs clearly run counter to the needs of everyday security and privacy, and this highlights the complexity of context: for a user who becomes a victim, the need for privacy turns into a need to disclose.
 
Developers
“Just because you can, doesn’t mean you should” is probably the most important point when it comes to developing new services that involve the user. We now have the technical capability to do almost anything. Proportionate and responsible use of that capability is a moral responsibility for companies, and one that is sometimes overridden by the desire to make money. This is something that self-regulation is never going to be able to solve. In the majority of cases, it is public exposure, or the risk of public exposure, by hacktivists or the media that seems to drive the protection of privacy, rather than a genuine desire to be responsible.
 
Users don’t necessarily realise that their data is being misused, because they can’t see it happening, for example through profiling tools. When these things are publicly exposed, as with the Carrier IQ issue in 2011, users immediately reject the service in the most extreme ways, without really understanding what is going on or whether the service did in fact breach their privacy. Equally, some developers don’t know that the services they include in their apps (advertising libraries, for example) breach their users’ privacy.
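One small, practical step on the developer side is to audit exactly which permissions the final application package ends up requesting, including those merged in from bundled advertising or analytics SDKs. A minimal sketch, assuming an Android app and using only standard platform APIs (the function name and log tag are mine):

```kotlin
import android.content.Context
import android.content.pm.PackageManager
import android.util.Log

// List every permission the installed package actually requests, including
// any merged in from third-party SDK manifests at build time.
fun logRequestedPermissions(context: Context) {
    val info = context.packageManager.getPackageInfo(
        context.packageName, PackageManager.GET_PERMISSIONS
    )
    info.requestedPermissions?.forEach { permission ->
        Log.i("PrivacyAudit", "App requests: $permission")
    }
}
```

Anything in that list that the developer cannot explain to a user is a prompt to go back and question the libraries that asked for it.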
 
Some short points now on problems and solutions for manufacturers, developers and users of mobile devices around privacy:

Problems with privacy

·     Technically, we don’t have the screen real estate on mobile devices to display privacy policies and besides, no-one ever reads them anyway. This is a huge issue that has not been adequately addressed (the proposed Mozilla privacy icons are interesting).
·     The user experience is mostly “accept these privacy settings (or permissions) or don’t use the application”. This is not really acceptable.
·     Human behaviour: your user wants their own privacy protected but is quite happy to breach others’.
·     Privacy is contextual and often the privacy need only emerges after the event. Here are some very brief (but extreme) examples:

1.      A user who is very open and has no privacy concerns has their social media settings set such that all their photos are available. They are murdered in unrelated events. Media across the country descend on the open site and use the images in reports, to the extreme distress of the family of the victim.
2.      A newspaper finds out that a woman has slept with a well-known celebrity. They leverage the woman through her connections on a social networking site and essentially force her to “tell her side of the story”.
3.      Employees working for a company are involved in a labour dispute. There is a division between union members and “loyalist” staff. Friends become enemies overnight without realising it. The context of privacy has changed significantly. Postings that were previously made in a private environment are printed off and taken to management. The company takes advantage of the situation and goes further, even to the extent that they search for profile updates and public data on social media sites to identify “troublemakers” and discipline them.
4.      A child is befriended by another child through a social application because they both like the same band. Location data and lots of private information, including pictures, are happily shared, but only privately. The second child is in fact an adult who has initially used the public information about the child’s interests in order to groom them.
 
Some solutions for operating system vendors and developers

·     The architecture of device operating systems needs to change – current mechanisms are more advanced than before (e.g. view privileges) but need to go to the next level.
·     One possibility is to create the ability to “negotiate” in APIs – e.g. “I won’t give you fine-grained location but you can have the town I’m in” (an existing parallel: protocol negotiation in computer systems).
·     More fine-grained mechanisms for revoking permissions – “I don’t trust this any more” or “I no longer want to share location”.
·     Support in APIs for saying “the user does not allow you to do this” – this allows developers to fall back gracefully to something else without the app breaking (see the sketch after this list).
·     Remember that human behaviour means that people will do whatever they can to get over hurdles, i.e. the “Dancing Pigs” problem.
·     The user must always be in control (this is not the case now).
·     Advanced permissions architectures that allow delegation to a third party that the user trusts (e.g. children’s charities, Which?, etc.)
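To make the “negotiate” and graceful-fallback ideas above a little more concrete, here is a minimal Kotlin sketch of an Android app degrading from fine-grained location to coarse (“the town I’m in”) or to nothing at all, depending on what the user has actually granted. LocationLevel and resolveLocationLevel are hypothetical names of my own, not platform APIs:

```kotlin
import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import androidx.core.content.ContextCompat

// Hypothetical app-level abstraction for how precise a location the app may use.
enum class LocationLevel { FINE, COARSE, NONE }

// Ask for the most precise data the user has actually allowed,
// and fall back gracefully rather than refusing to run.
fun resolveLocationLevel(context: Context): LocationLevel {
    val fineGranted = ContextCompat.checkSelfPermission(
        context, Manifest.permission.ACCESS_FINE_LOCATION
    ) == PackageManager.PERMISSION_GRANTED
    val coarseGranted = ContextCompat.checkSelfPermission(
        context, Manifest.permission.ACCESS_COARSE_LOCATION
    ) == PackageManager.PERMISSION_GRANTED

    return when {
        fineGranted -> LocationLevel.FINE      // full precision allowed
        coarseGranted -> LocationLevel.COARSE  // town-level only
        else -> LocationLevel.NONE             // carry on without location, don't break
    }
}
```

The point is not the specific API but the pattern: the application keeps working, at a reduced level of service, instead of presenting the user with an all-or-nothing choice.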