Such intermediaries are effectively gatekeepers – those who enable, and control, our access to that information – and this raises profound issues of principle about the role of intermediary gatekeepers in the structure of free speech […]. At present, such intermediary gatekeepers are all private entities, operating by their own rules, and it is not at all clear how they can be made accountable to their users or the wider public for their private actions. Given the practical, social and legal issues that arise in policing content in such a quasi-public sphere [see below for link], it has been argued that search engines and other intermediaries should have public interest obligations, perhaps by analogy with the common law duties that govern public utilities [see below for link]. In particular, free speech norms should not only protect speakers against a heavy-handed state but also protect speakers and readers against heavy-handed intermediary gatekeepers. This debate is now being played out online and on the op-ed pages of US newspapers.
First, in the online world, where most of us access the internet through a range of intermediaries, government censorship does not need to target the disfavoured speech itself; it need only target the intermediaries. Very few US companies would feel able to decline a request like that from the White House, and Google are to be commended for standing firm in those circumstances. Second, these intermediaries now have a great deal of practical power over online expression: not only can they be co-opted by government as agents of state censorship, but they also have the capacity to act as censors in their own right, as Google did in their unilateral action to block access in the Middle East.
In recent years, platforms have seen significant growth in use. This study takes a critical media approach to exploring how the platform YouTube is an articulation of Web 2.0’s celebratory account of the individual in the digital and networked public sphere. The thesis follows a platform-based approach to examine the socio-political structure of YouTube. Applying Michel Foucault’s concept of governmentality to the form and structure of the platform, ‘community’ is reconstituted as a ‘population’ that co-determines the wealth of the platform. The concept of platform mosaics is developed as an aesthetic metaphor to explain how the user is managed as a collective of individuals. The symbiotic interrelation between user and machine suggests that social relations in contemporary digital networked information society have not become more democratic, but that the individual has become homogenized. The mosaic is symbolic of the platform-user symbiosis and emerges as an unprecedented form of individualization mediated by the platform as a host for content distribution. The case studies of Eric Whitacre’s Virtual Choir and Natalie Bookchin’s Mass Ornament function as an immanent critique of the platform, grounding how the individual is governed as a collective of separate, homologous subjects.
As citizens continue to play a critical role in supplying news and human rights footage from around the world, YouTube is committed to creating even better tools to help them. According to the international human rights organization WITNESS’ Cameras Everywhere report, “No video-sharing site or hardware manufacturer currently offers users the option to blur faces or protect identity.”
YouTube is excited to be among the first.
Today we’re launching face blurring – a new tool that allows you to obscure faces within videos with the click of a button.
Advocacy in any arena generally takes a long, long time. In this context we’re talking about pressuring key Silicon Valley companies that have gone, in under a decade, from being simple technology providers to being an integral part of everyday human activity across much of the planet.
That one line quoted above was something we’d been talking to YouTube/Google about for 4 years (and that’s more than half of YouTube’s own existence). Those who can make seemingly simple changes like this happen are busy people operating within multiple sets of interlocking wheels of law and policy, and myriad competing internal demands. The conversations with these people started before I got to WITNESS, and they continued after I left in mid-2010 (and continue to this day) – and as the Cameras Everywhere report shows, there’s still plenty to discuss in the future.
Here are my personal recollections and reflections on how the conversations with YouTube that I was involved in developed – with the accent strongly on “personal”. Since I left WITNESS 2 years ago, I’m not party to the latest conversations between YouTube and WITNESS – but I do know where the seeds came from and how they took root. Over at the WITNESS blog Sam Gregory explains the human rights dimension of this move by YouTube.
I am sharing this admittedly partial account in the hope that reading a little about our experience will give succour to other activists and researchers running into what seem like brick walls right now. Keep talking, keep trusting, and keep pushing… and embrace serendipity.
[Thurs 19 July - I've slightly clarified some of the written-at-1.30am-language...]
[Sun 22 July - further clarification, including of when I left WITNESS.]
Today in San Francisco, I’m moderating a panel at the Silicon Valley Human Rights Conference. I’ll be joined by Steve Grove (formerly of YouTube, now of Google+), Sam Gregory of WITNESS, Hans Eriksson of Bambuser, and Thor Halvorssen of the Human Rights Foundation and Oslo Freedom Forum.
You can watch the video live here, or follow the tireless Katherine Maher’s liveblog here. And we’ll try to take questions via Twitter for about 20 minutes after the panel ends at the hashtag #rightscon.
(After the panel, I’ll add any videos or resources we bring up or show into this page.)