Tag Archives: apps

The US government’s National Telecommunications and Information Administration (NTIA) today issued the first draft of a code of conduct for mobile apps, intended to better protect consumers and their privacy. If finalized, the code would require publishers to provide consumers with “short-form” notices, in multiple languages, informing them of how their data is being used. The NTIA drafted the code after soliciting feedback from privacy, civil liberties and consumer advocates, along with app developers, publishers and mobile ecosystem representatives, with the aim of making mobile apps more transparent to their end users.
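For illustration only, here is a minimal sketch of what such a “short-form” notice might look like if expressed as machine-readable data; the field names and categories are hypothetical, not drawn from the NTIA draft text:

```python
import json

# Hypothetical machine-readable "short-form" privacy notice.
# All field names and data categories below are illustrative only;
# they are not taken from the NTIA draft.
short_form_notice = {
    "app": "ExampleApp",
    "languages": ["en", "es"],
    "data_collected": ["location", "contacts", "browsing_history"],
    "shared_with": ["ad_networks", "analytics_providers"],
    "notice_url": "https://example.com/privacy/short-form",
}

# Render the notice for display or transmission.
print(json.dumps(short_form_notice, indent=2))
```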

Mobile applications raise many interesting and diverse legal questions for our clients, from accessibility to content protection to consumer privacy and data security, so it’s an exciting time to be doing this type of work. Some of the most popular mobile applications transcend the traditional regulatory silos – videos can be streamed to a mobile telephone, and IP telephone services may be used on a mobile telephone – and regulators have had difficulty anticipating some of the issues raised. But it’s clear that the FCC, the NTIA (which is working with industry stakeholders to develop a voluntary code of conduct for handling personal data) and the FTC are paying very close attention to the issues, especially with respect to privacy.

In the worst case, it’s possible to envisage geolocation and data aggregation apps being designed to facilitate the identification and elimination of some ethnic or class enemy, not only by making it easy for users to track them down, but by making it easy for users to identify each other and form ad-hoc lynch mobs. (Hence my reference to the Rwandan Genocide earlier. Think it couldn’t happen? Look at Iran and imagine an app written for the Basij to make it easy to identify dissidents and form ad-hoc goon squads to proactively hunt them down. Or any other organization in the post-networked world that has a social role corresponding to the Red Guards.)

But as I said earlier, the app is not the problem. The problem is the deployment by profit-oriented corporations of behavioural psychology techniques to induce people to over-share information which can then be aggregated and disclosed to third parties for targeted marketing purposes.

Not an April Fool – Charlie’s Diary – about the frankly idiotic and dangerous app Girls Around Me. (Via Esther Dyson)

The tech world has finally woken up to the safety and privacy risks in the app economy. Path’s silent address-book mining sparked widespread outrage, with ripples for many web services and mobile apps that had, until now, treated the long-term retention and mining of customers’ contacts as “industry best practice” (UPDATE: more on Big Tech’s repeated privacy stumbles – including privacy-trashing kids’ apps – from the New York Times’ Nick Bilton). In the report I led for WITNESS last year, Cameras Everywhere, we pinpointed these practices as a massive potential vulnerability for human rights activists, and for citizens and consumers more broadly (see p. 27, Recommendation 2). We specifically suggested that technology companies should take this stewardship seriously, and:

Follow the principle of privacy by design, and for products already in circulation, privacy by default. This is particularly important for products, apps and services that share this data with third parties that may not exercise the same diligence.
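To make that recommendation concrete, here is a minimal sketch of “privacy by default” applied to the address-book problem above, assuming a hypothetical app backend; the function names and salting scheme are illustrative, not drawn from any real product:

```python
import hashlib

def hash_contact(identifier: str, salt: str) -> str:
    """Return a salted SHA-256 digest of a contact identifier.

    Hashing reduces, but does not eliminate, re-identification risk:
    phone numbers have a small keyspace and can be brute-forced.
    """
    return hashlib.sha256((salt + identifier).encode("utf-8")).hexdigest()

def prepare_contacts_for_upload(contacts, user_opted_in: bool, salt: str):
    """Privacy by default: transmit nothing unless the user explicitly
    opts in, and never send raw identifiers, only salted hashes."""
    if not user_opted_in:
        return []  # the default is no collection at all
    return [hash_contact(c, salt) for c in contacts]

# Hypothetical usage: nothing leaves the device until the user says yes.
contacts = ["+15551234567", "+15557654321"]
print(prepare_contacts_for_upload(contacts, user_opted_in=False, salt="app-salt"))
print(prepare_contacts_for_upload(contacts, user_opted_in=True, salt="app-salt"))
```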

Following that principle might include revising the terms under which app developers can participate in app stores for different platforms – for example, by issuing new guidelines requiring that third-party developers follow industry-standard privacy practices – or it could even involve providing app developers with off-the-shelf privacy solutions directly. (WITNESS itself is a partner in ObscuraCam and InformaCam, Android apps that demonstrate privacy- and human rights-sensitive ways to handle data, particularly visual data, generated by mobile phones.) Many developers creating iOS, Android or other apps are small shops with few staff and no legal or privacy counsel to help them navigate tricky waters. What’s more, they are scattered across many jurisdictions with extremely varied data protection laws and requirements. Frankly, it’s a no-brainer that they need help and guidance.

(Update: I want to publicly thank Jules Polonetsky of the Future of Privacy Forum for pointing us along this path of inquiry during a research interview for Cameras Everywhere. Very exciting to see that he is involved in driving forward better industry-wide practices with the Application Privacy Summit in April 2012.)
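To give a flavour of the visual-data handling that ObscuraCam-style tools address, here is a minimal sketch, assuming the Pillow imaging library; it re-saves a photo with pixel data only, dropping embedded metadata such as GPS coordinates. This is one small piece of what a real privacy tool would do – it does not blur faces or remove other identifying content in the image itself:

```python
from PIL import Image  # Pillow imaging library

def strip_metadata(src_path: str, dst_path: str) -> None:
    """Re-save an image with pixel data only, discarding EXIF metadata
    such as GPS coordinates, camera serial numbers and timestamps."""
    with Image.open(src_path) as img:
        rgb = img.convert("RGB")            # normalise palette/alpha modes
        clean = Image.new("RGB", rgb.size)  # fresh image carries no metadata
        clean.putdata(list(rgb.getdata()))
        clean.save(dst_path)

# Hypothetical usage with placeholder filenames:
# strip_metadata("photo_with_gps.jpg", "photo_clean.jpg")
```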

We have made the argument for greater privacy protections in the app economy both publicly and in private to the major technology companies, as well as to app developers, VCs and policy-makers, because we feel it is a central and intimate issue not just for activists but for any and all users. We didn’t get much traction – we’re not technologists, and maybe the solutions we outline are inelegant or technically problematic – but that doesn’t mean the problem is a phantom one.

I hope that this recent upsurge in attention and scrutiny provides a window for companies like Apple, Google, Amazon, Twitter and BlackBerry to realise that the concern is a real one (just as Apple did with mobile tracking data, for example), and to re-examine how their app ecosystems work. Ultimately, they need to take more responsibility for their app users’ privacy and safety, even if those apps are designed and built by third parties elsewhere in the world – after all, only they really have the leverage, authority and know-how to make the app economy a safer place for us all.