
Perhaps some of you may be wondering where TigerText, Wickr, and various other options are in the above comparison. Well, if they can’t be bothered to release any source code, fail to provide even basic protocol documentation, and have not posted a threat model analysis, then they are not worthy of your time or your attention.


The tech world has finally woken up to the safety and privacy risks in the app economy. Path’s silent address-book mining sparked widespread outrage, and has had ripples for many web services and mobile apps that had, until now, treated the long-term retention and mining of customers’ contacts as “industry best practice” (UPDATE: more on Big Tech’s repeated privacy stumbles – including privacy-trashing kids’ apps – from the New York Times’ Nick Bilton). In the report I led for WITNESS last year, Cameras Everywhere, we pinpointed these practices as a massive potential vulnerability for human rights activists, and for citizens and consumers more broadly (see p.27, Recommendation 2). We specifically suggested that technology companies should take this stewardship seriously, and:

Follow the principle of privacy by design, and for products already in circulation, privacy by default. This is particularly important for products, apps and services that share this data with third parties that may not exercise the same diligence.

This might include revising the terms under which app developers can participate in App Stores for different platforms – for example, by issuing new guidelines requiring that third-party developers follow best-in-class privacy practices – or it could even involve providing app developers with off-the-shelf privacy solutions directly. (WITNESS itself is a partner in ObscuraCam and InformaCam, Android apps that demonstrate privacy- and human rights-sensitive ways to handle data, particularly visual data, generated by mobile phones.) Many developers creating iOS, Android or other apps are small shops with few staff and no legal or privacy counsel to help them navigate tricky waters. What’s more, they are scattered across many jurisdictions with extremely varied data protection laws and requirements. Frankly, it’s a no-brainer that they need help and guidance. (Update: I want to publicly thank Jules Polonetsky of the Future of Privacy Forum for pointing us along this path of inquiry during a research interview for Cameras Everywhere. It is very exciting to see that he is involved in driving forward better industry-wide practices with the Application Privacy Summit in April 2012.)

We have made the argument for greater privacy protections in the app economy publicly and in private – to the major technology companies, as well as to app developers, VCs and policy-makers – because we felt it is an issue central and intimate not just to activists, but to any and all users. We didn’t get much traction; we’re not technologists, and maybe the solutions we outline are inelegant or technically problematic, but that doesn’t mean the problem is a phantom one.

I hope that this recent upsurge in attention and scrutiny provides a window for companies like Apple, Google, Amazon, Twitter and BlackBerry to realise that the concern is a real one (just as Apple did with mobile tracking data, for example), and to re-examine how their app ecosystems work. Ultimately, they need to take more responsibility for their app users’ privacy and safety, even if those apps are designed and built by third parties elsewhere in the world – after all, only they really have the leverage, authority and know-how to make the app economy a safer place for us all.