
When I worked at WITNESS, we debated hotly how to celebrate the 60th anniversary of the Universal Declaration of Human Rights in 2008. We wanted to do something that felt contemporary, that felt open as a campaign, and that anyone – anyone – would have a response to and could run with. What we came up with, and what ended up catching the imagination of quite a few people, was a simple question:

What image opened your eyes to human rights?

To kick things off, I recorded a load of interviews with interesting activists, researchers, journalists and filmmakers when I was at the GFMD conference in Athens. I’ve just put a playlist of these short, sometimes spine-tingling interviews on YouTube. Here, as a taster, is Mary Robinson’s answer:

The tech world has finally woken up to the safety and privacy risks in the app economy. Path’s silent address-book mining sparked widespread outrage, and has had ripples for the many web services and mobile apps that had, until now, treated the long-term retention and mining of customers’ contacts as “industry best practice” (UPDATE: more on Big Tech’s repeated privacy stumbles – including privacy-trashing kids’ apps – from the New York Times’ Nick Bilton). In the report I led for WITNESS last year, Cameras Everywhere, we pinpointed these practices as a massive potential vulnerability for human rights activists, and for citizens and consumers more broadly (see p.27, Recommendation 2). We specifically suggested that technology companies should take this stewardship seriously, and:

Follow the principle of privacy by design, and for products already in circulation, privacy by default. This is particularly important for products, apps and services that share this data with third parties that may not exercise the same diligence.

This might include revising the terms under which app developers can participate in App Stores for different platforms – for example, by issuing new guidelines requiring that third-party developers follow industry-standard privacy practices – or it could even involve providing app developers with off-the-shelf privacy solutions directly. (WITNESS itself is a partner in ObscuraCam and InformaCam, Android apps that demonstrate privacy- and human rights-sensitive ways to handle data, particularly visual data, generated by mobile phones.) Many developers creating iOS, Android or other apps are small shops with few staff and no legal or privacy counsel to help them navigate tricky waters. What’s more, they are scattered across many jurisdictions with extremely varied data protection laws and requirements. Frankly, it’s a no-brainer that they need help and guidance. (Update: I want to publicly thank Jules Polonetsky of the Future of Privacy Forum for pointing us along this path of inquiry during a research interview for Cameras Everywhere. Very exciting to see that he is involved in driving forward better industry-wide practices with the Application Privacy Summit in April 2012.)

We have made the argument for greater privacy protections in the app economy, both publicly and in private, to the major technology companies, as well as to app developers, VCs and policy-makers – we feel it’s a central and intimate issue not just for activists, but for any and all users. We didn’t get much traction. We’re not technologists, and maybe the solutions we outline are inelegant or technically problematic, but that doesn’t mean the problem is a phantom one.

I hope that this recent upsurge in attention and scrutiny provides a window for companies like Apple, Google, Amazon, Twitter and Blackberry to realise that the concern is a real one (just as Apple did with mobile tracking data, for example), and to re-examine how their app ecosystems work. Ultimately, they need to take more responsibility for their app users’ privacy and safety, even if those apps are designed and built by third parties elsewhere in the world – after all, only they really have the leverage, authority and know-how to make the app economy a safer place for us all.

I forgot to cross-post this, which I wrote in December for the UNA-USA’s The Interdependent:

How we communicate and connect, how we see and document the world around us, how we express ourselves—all have been transformed over the past decade. Hundreds of millions of us on every continent experience this directly in our daily lives, from receiving a text message or making a mobile call to video-chatting with relatives or colleagues around the world.

As 2011 made so pointedly clear, communication technologies and networks of this kind are now so intrinsic to how many of us live, work, and interact that they are influencing how we think about, claim, and advocate for human rights. As the UN celebrated International Human Rights Day on December 10, for instance, it chose to highlight how “social media helped activists organize peaceful protest movements in cities across the globe—from Tunis to Madrid, from Cairo to New York—at times in the face of violent repression.”

This new reality is something that advocates and activists need to face head-on, urgently and collectively. The more domesticated, indispensable and close-to-home the technologies we use become, the more human rights concerns sit at their heart. But what does this mean in practical terms?


Wikipedia goes dark in protest at SOPA and PIPA

A few years back, before all this internet/smartphone/ubiquitous stuff, I worked for a media development NGO, helping to strengthen public-interest media in the developing world, as a critical part of public debate and social change. One of the ways we used to articulate why it was important to support these independent, public and community media was “imagine a world without media”… Unthinkable.

Now, with the space for individual communication and agency expanding and touching so many facets of our lives, a flotilla of sites “going dark” is a critical action: it demonstrates where we might all end up if this kind of legislation – which seeks to protect archaic modes of production and value creation at the behest of entrenched lobbies and interests – is not stopped in its tracks. SOPA and PIPA must be stopped.

[And if laws such as these pass in the US, these flawed and failed legal standards will be exported to other nations, with drastic results for free speech and for the creation of value – cultural, economic, and network – worldwide.]

In two weeks’ time, I’ll be moderating a workshop at the Silicon Valley Human Rights Conference, on a topic dear to my heart:

Visual content and human rights – Visual content has changed our world – how do we manage its impact on society, governance, and privacy?

Panelists:
Sam Gregory, Program Director, WITNESS
Thor Halvorssen, Founder, Oslo Freedom Forum
Victoria Grand, Director, Global Communications and Policy, YouTube
Hans Eriksson, CEO, Bambuser

I’ll draw in part on Cameras Everywhere, but what topics and issues would you like me to raise with these panelists? Let me know either via a comment below, or tweet me.