Title Let's first get things done! On division of labor and techno-political practices of delegation in times of crisis Authors Miriyam Aouragh is an anthropologist and Leverhulme fellow at the Communication And Media Research Institute of Westminster University, UK. Seda Gürses is a computer scientist working as a Post-Doctoral Research Fellow at the Media, Culture and Communications Department at NYU, USA. Jara Rocha is a cultural mediator and a core member of GReDiTS/Objetologías research group at Bau School of Design in Barcelona, Spain. Femke Snelting is an artist and designer, member of the association for arts and media Constant in Brussels, Belgium. This paper is the product of an ongoing collaboration among the authors following a workshop based on anecdotes that took place as part of the Thinking Together Symposium (http://www.osthang-project.org/projekte/thinking-together/?lang=en) in August, 2014 at the Osthang Architecture Summer School, Darmstadt, Germany. Abstract During particular historical junctures, characterised by crisis, deepening exploitation and popular revolt, hegemonic hierarchies are simultaneously challenged and reinvented, and, in the process of their reconfiguration, subtly reproduced. It is towards such “sneaky moments”, in which the ongoing divide between those engaged in struggles for social justice and those struggling for just technologies has been reshaped, that we want to lend our attention. The paradoxical consequences of the divide between these communities in the context of “the Internet” have been baffling: (radical) activists organize and sustain themselves using "free" technical services provided by Fortune 500 companies. At the same time, "alternative tech practices", like the Free Software community, are sustained by a select few, some of whom propose crypto with 9-lives as the minimum infrastructure for any political undertaking, and dismiss the rest as naive or unsophisticated in their technical practices.
We argue that even when there is a great desire to bridge this divide, the way in which the delegation of technological matters to the “progressive techies” is organised reconfirms hegemonic divisions of labor, and can be as pertinent to this gap as political and philosophical differences. If we believe the mantra that our tools inform our practices and our practices inform our tools, we may want to radically reconfigure these divisions of labour between “activists” and “techies". But where do we start?  Introduction Only much later would it occur to us that our Darmstadt crew was brainstorming in a manner eerily similar to the constant dilemma within the politically motivated tech-activist scenes that we were trying to understand and challenge. Juggling between pre-emptive questioning, pausing, immediate experimenting and occasionally going with the flow, the trade-off between doing and thinking is never-ending. Without this juggling we are likely to reproduce the very hegemonic rationales one wishes did not exist. Our own occasional uttering of things like ‘oh, let's first get things done!’ made us realise we were each projecting from particular – professional, political, personal – contexts. We came to realise how we mimicked the same logic of what we identified and critiqued as a normative western male-dominated approach. Through this reflexive confrontation we realised how attractive hegemonic mind-sets can be, and especially how the ability to see through them is jammed by our own set of urgencies. This often comes in the guise of efficient divisions of labour. In other words, our conditions are shaped, if not determined, by the historical pace we live in. Time is therefore not an objective metaphor or standard empirical guideline. Our sense of time is, in essence, formed differently relative to that which we are struggling with in a historical moment.
The meanings ascribed to time therefore differ according to long moments of stalemate, sudden moments of suffocating urgency, fresh moments of new possibilities, and commitments to re-making that are likely to span more than a lifetime. The greater historical juncture we find ourselves in is one of deep economic crisis, an ecological disaster in the making, deepening class exploitation and, of course, popular revolts and resistance. These realities cut through many segments of society and social movements. It is as if we are all placed within a matrix categorised by time/history, setting/locality and technology, and each constellation brings another understanding of efficiency and urgency. Our personal and collective histories in this matrix each come with their own distinct narrative: the deep sense of trauma over a hopeful uprising being hijacked by geopolitical interests and mass state-violence, resulting in its activists being exiled, rounded up or killed; the incredible challenges of staying close to one's principles in small-scale informal organisations infused with personal relations and loyalties in free software projects; the puzzling crossroads of an Indignados movement mobilised by the motivation to radically change the system yet asked to keep parts of the system intact and perform an electoral role; the bizarreness of a mass revolt in the making being juxtaposed with silly penguins, which then become re-co-opted as the very emblem of a dissident narrative around Gezi Park.  What defined our project is a shared concern and curiosity about the way these interactions in the pursuit of fundamental change, taking place on different political levels and ‘times’, are all injected with techno-inventions. As such, they demand that we simultaneously deal with the material and ideological implications of technology.
It is this double-layered ‘condition’ which we have gradually come to understand as mediations of the kind of choices manifested in our daily theories and practices. It is not a surprise that we are deliberating these dilemmas at this moment, because hegemonic hierarchies are simultaneously challenged and reinvented during such junctures. Sometimes they are aggressively imposed, like the ‘shock doctrine’ interventions so cuttingly demonstrated by Naomi Klein (2007).1 But often they are subtly reproduced, at what we have come to refer to as “sneaky moments”. In these contexts hierarchies are reconfigured and wrapped in deliberate technological interjections. We argue that the divide between those engaged in politics of technology and those participating in struggles for social justice is reshaped precisely at those “sneaky moments” and that this reconfiguration requires reflection. But, where do we start? What are ways to resist the consequences emanating from sneaky moments that impose on us a pragmatic submission to "specialization of work" or delegation to experts? The kind of well-meant efficiency that reproduces hegemonic divisions of gender, race, class and age, and that reinforces ideological differences between activists for social justice and activists for ‘just’ technologies (Dunbar-Hester, 2010)? We propose to begin identifying the problem through studying concrete examples of mediation, by investigating web-based campaign sites that claim or try to bridge the existing gap between social justice activists and progressive techies [+++]. Specifically, we look at sites that appeared or gained prominence at the onset of the public revelations about (mass-)surveillance programs, as confirmed by the documents leaked by whistleblower Edward Snowden. These campaign sites speak to the general public but especially encourage the use of secure communication tools by activists and journalists.
But before we move on to this empirical analysis, we first outline the critical reflections on activist use of media that have surfaced over the last years. We situate the Darmstadt delegation within this context. Next, we move on to provide the setting of our analysis, the Mise-en-Scène, where we introduce the actors we see as actively participating in the assemblage of activists and technology. Based on the objects of our analysis (campaign sites that promote secure communication tools) we discuss how matters of delegation and division of labor are configured. Finally, we move on to rethink and express our ideas concerning possible ways through which to contest these naturalized modes of operation. [+++] We are not impartial towards these projects or their ambitions. The Darmstadt Delegation came together on the basis of a shared experience of troubling differences in the politics, values and practices of “activists for social justice" heavily using networked technology for their struggles, and of "tech-activists" who struggle to develop progressive and alternative technologies. In conversation with numerous initiatives that have aligned around backbone409 [6], interference [7], transhackfeminist! [8], noisy square [9], and the internet ungovernance forum [10], we are concerned that, due to pragmatic decisions in times of urgency and lack of resources, these struggles may, more often than not, subscribe to divisions of labor that reproduce existing hierarchies and dominant discourses. But let's first look at the way our specific “sneaky moment” of interest was staged.
Delegation to Platforms and Delegation to Tech-Activists [the next few paragraphs are about delegating to platforms] The process of what we call delegation of technical matters to commercial platforms run by companies like Facebook, Twitter or Google has attracted the attention of academics and practitioners alike, and is part of larger critical debates, especially in communication and media studies. The growing critique of the use of these platforms by social justice movements has various components, some of which are relevant to our analysis. It is argued that Fortune 500 web companies are designed to maximize the possibility to communicate more, which, when used in the context of progressive or radical change, leads to an integration of counter-hegemonic political movements into the grids of communicative capitalism. The platforms serve to capture attempts at resistance through the seamless integration of political projects into the communication-entertainment complex (Dean, 2009). Moreover, these platforms are sites for commodifying social labor, privatising social spaces, and subjecting dissenters to surveillance (Mejias, 2012; Fuchs, 2009; Trottier, 2014; Lyon, 2008). For example, the elementary functions delegated to these platforms, such as the management of content, make activist groups susceptible to practices of censorship and an algorithmic organization of content guided by a logic of profit. Profit on these platforms is a function of the number of users as well as the "amount" of interaction, which seems to require the curation of a conflict-free social zone. As people leveraged these technologies to establish new alliances for projects of (revolutionary) change, the platforms run by these multinational corporations were elevated in mainstream media to the status of "liberation technologies" and the companies to the position of gatekeepers of "internet freedom".
Framing these corporate platforms as "liberation technologies", however, has served to drown the critique of the profit agendas embedded in these infrastructures and to displace the inequalities that are intensified by the monopsonies they have become. In the case of the Arab world, these kinds of projections have contributed to cyber-orientalism, a phenomenon whereby non-modern/traditional people are imagined as liberated by modern technologies, itself tied to a much longer history of essentialist representations of ‘the orient’ (Aouragh, 2015). Furthermore, given the entanglement between user interaction, profit and conflict, the blocking of users and content that may cause tension (whether among other users or in host countries) becomes a central feature of these platforms. Yet, conflict management does not lend itself well to automation and requires expensive human labor. In order to contain the costs, "conflict management", disguised under the title "content moderation", is outsourced to underpaid workers in Morocco or the Philippines (Chen, 2014). As a consequence, through the act of delegating their technology matters to these platforms, counter-hegemonic activist groups become complicit with the labour conditions and logics of profit inherent to these platforms, which at the same time work to diminish their autonomy in organizing their communications and actions. Mindful of the necessity of alternative socio-technical visions, various communities have worked over the past decades to make other forms of technological organization possible. Some of these activities have culminated in what has become known as the Free Software movement. As early as 2008 (Franklin, 2008) the Free Software community responded to the threat of increased surveillance and loss of privacy.
In their view, the use of centralized network services had grave consequences for how the Internet, their most important working terrain, was developing away from "Software Freedom" and freedom in general (Stallman year). Other groups, such as the Cypherpunks, focused on the development and dissemination of encryption-based tools, which turned into a project of coding software that could 'make the networks safer for privacy' and of ensuring these technologies are available to the public (Hughes, 1993). [this talks back to the introduction and the delegation question: proposal: reduce this to a sentence which is a footnote at [***]] As the "social" became increasingly "networked", several free software projects budded with the ambition to address (some of) the critiques of the way in which dominant companies began to shape what in the vernacular became known as "social networks". These contestations on sites like RiseUp [1], Mayfirst [2], or Lorea [3] are manifested through modes of software production and design proposals that are expected to enable novel performances of notions like politics, transparency, privacy, security, freedom, the networked social, and infrastructure autonomy. [who we are interested in, and the explanation of the different actors are described] Most important to our project is a small fraction of those progressive tech developers and advocates [***]. Specifically, we focus on those groups that have devoted themselves to putting in place infrastructures to protect activists from engaging in insecure communications. The activities of these groups gained momentum as news about internet surveillance and resulting government crack-downs became popular knowledge.
It is at this juncture that the necessity and desire for a convergence between those 'groups that wish to use the media instrumentally to draw attention to their political efforts versus those who wish to change the media system itself' (Carroll and Hackett, 2006) became a matter of urgency. In response, a number of secure and private communication campaigns were launched or revamped, which also served to re-shape the delegation relationship between activists and this select group of technologists. [THIS SHOULD END THIS SECTION WITH A SENTENCE THAT BRIDGES TO THE MISE-EN-SCENE] In the next section we explore a vocabulary to discuss the different ways in which divisions of labor between social justice and tech activists are organized in the context of private and secure communication campaigns, and use this to launch a study of the campaign websites which have been developed to mediate between these two groups. Mise en scène [STARTS WITH THE MOMENT] Two concurrent events come together in our sneaky moment of interest: the rise of consciousness about internet-based surveillance in the context of the post-revolutionary MENA region, and the revelations about (mostly US and UK based) surveillance programs that fueled it. In this sneaky moment, the relationship between social justice and tech activists was aligned and reconfigured. This was the moment when many activists realized that the very technologies that had given them the upper hand in mobilizing masses during their uprisings rendered them vulnerable to state surveillance. The Snowden revelations confirmed that numerous Fortune 500 web companies had contributed to the establishment of a global surveillant assemblage (Haggerty and Ericson, 2000). In other words, it helped re-order the relationship between activists and technology. It was the urgency of the moment around the Snowden revelations which led tech activists to respond with a design proposal to reinstate privacy, a call to "cryptographic" arms.
According to this techno-legal consensus, the revelations make evident that the US government no longer limits its intelligence activities to military and diplomatic targets, but has exploited the 9/11 attacks on the World Trade Center and the following War on Terror as a justification for engaging in mass surveillance across the globe. These efforts in surveillance have scaled so well thanks to advances in networked technologies and the dropping costs of processing and storage. Mass surveillance is thus both an economically informed enterprise and one based on an artificial distinction between mass and targeted surveillance. It is unacceptable, the consensus holds, because it inverts the presumption of innocence and violates people's privacy on the way to catching the “bad guys”. However, if mass surveillance could be made inconvenient, intelligence agencies would have to return to relying solely on methods of targeted surveillance. One way to make mass surveillance inconvenient is to make it costly. The idea is that if everybody, individuals and institutions alike, used encryption, the cost of surveillance could be increased, stifling mass surveillance. [11] The immediacy of the problem, as well as the apparent absence of any effective transnational laws that could be leveraged against these surveillance programs, combined with the call to "cryptographic arms" to stifle (mass) surveillance, culminated in the different campaigns to encrypt "everything" on the Internet. Despite the fact that "targeted surveillance" was pushed out of scope, encryption tools were seen as relevant for activists and journalists who were likely targets of intelligence gathering efforts. These developments contributed to the momentum that saw numerous digital rights and internet freedom initiatives seizing the moment to propose new communication methods for activists (and everyday citizens) that are strengthened through encryption.
This was paralleled by numerous conferences, blog posts and press articles, online and offline (internet-based and physical) protest actions, and, central to our analysis, the development and promotion of encryption tools that "enhance privacy". This also coincided with a somewhat ironic acceleration in the flow of funds from the US Department of State, Silicon Valley and philanthropic foundations to digital rights and informational self-defense projects. [12] [THE ACTORS HAVE BEEN LOST HERE AND MAYBE FOUND] [THIS PARAGRAPH NEEDS TO BRIDGE BETWEEN THE DICHOTOMY IN THE INTRO AND MIRIYAM'S INTRODUCTION OF NUANCES, BUT ALSO SHOW THAT WE ARE TRYING TO DRAW A TENSION] A clearer understanding of activism in the context of technology, and therefore also a better (progressive, critical) assessment of technology in the context of political change, requires some effort to conceptualize the terms and labels we often use. For instance, earlier we referred to two camps that have been labelled tech justice and social justice activists. While the emphasis on two camps may suggest they are mutually exclusive, research about the social implications of the internet for political mobilization in Lebanon and Palestine showed that these are rather two overlapping groups: activists for/by technology and activists with/via technology (Aouragh, 2012). Certain people do their activism with political change (e.g. equality) as the objective and technology as the tool. For others, politics and justice are their context, but a certain improvement in the tool itself is the objective. These overlapping identities and positions often shift one way or the other. They can also be part of parallel lives, i.e. some respondents in our cases consciously separate the techno-engagements for which they get paid from other political work requiring techno-expertise, which they do for activist or ideological reasons.
Surely, changes in technology and technology governance, public space, labor conditions, and novel forms of organizing also affect these divides and overlaps (De Angelis, 2015, http://ema.revues.org/3398), only a subset of which we turn our attention to later. Tech justice activists hold a great variety of political beliefs. These beliefs include attitudes towards design and how these matters translate to their activist practice. When it comes to gender, race, age, class and geography, however, diversity among individual tech activists is less noticeable. This lack of diversity has been insistently criticized from within and outside of the community. Many of the civil society projects that produce campaigns have a more diverse representation in gender, race, age and geographical origin, e.g., AccessNow or EFF. Still, those who are made prominent by virtue of a politics of representation that relies on viewership tend to be white (Euro-American) and male. [THIS CAN BE A LITTLE BIT MORE NARRATIVE] Given the focus of our interest in the mediation of secure communication tools to activists, we are specifically interested in those tech justice activists who subscribe to a permutation of different elements of the following ideals: First, privacy is one of the fundamental elements of an open society and of struggles for political and social change. Second, access to encryption and anonymity on distributed architectures without centralized control is the basis of providing this privacy in a networked world. Third, code is the preferred form of regulating privacy and openness, especially given that political or policy approaches are weak due to existing power imbalances and the inefficiencies of bureaucracies. Rising to the sneaky moment of surveillance Focusing mainly on the secure communications aspect, civil society initiatives stepped up to respond to this urgency [4], some with activist interests, others informed by the U.S.
Department of State's agenda around "internet freedom”. [5] The campaigns helped mainstream secure communication tools and sparked popular discussions about the complexity and usability of these technologies. Little consideration was given, however, to matters related to class and social status. With the Snowden revelations reaching their second year and the MENA revolutions at their fourth painful anniversary, it is now time to reflect on how the relationship between tech and social justice activists is being re-shaped. In order to understand and critique this shift, we studied several secure communication campaigns and analysed how they reveal their politics of mediation. While these campaigns exist alongside and in conjunction with training projects, we felt it was important to pay attention to the way these artefacts frame the relationship between tech activists and activists for social justice through language, selection and design. From the large number of campaigns currently in active development, we selected three sites and deliberately included a non-anglophone example: [TURN THESE DESCRIPTIONS INTO NOTES] *Surveillance self-defense: Tips, Tools and How-tos for Safer Online Communications *https://ssd.eff.org * *Surveillance Self-Defense is a project by the Electronic Frontier Foundation. The aim of Surveillance Self-Defense is to teach people how to think about online privacy and security so that they can adapt their choice of tools and practices in an ever-changing environment. The project is framed as a defence against the abilities of modern technologies to eavesdrop on innocent people. Some of the proposed tools are developed by the EFF themselves. EFF is based in the United States and funded by individual donors, NGOs and some corporate support.
* * *The Guardian project: Mobile Apps and Code You Can Trust *https://guardianproject.info/ * *The Guardian Project is developed by an international collective of software developers embedded in the Free Software community. They observe that mobile technologies are important for communication and collaboration, but problematic when it comes to personal security, anonymity and privacy. In response, the Guardians actively develop software applications, software libraries, customized mobile devices and tutorials. The project is funded through donations from NGOs working on human rights issues, such as Free Press Unlimited and the Tibet Action Institute, as well as through the US Government's funding schemes for human rights projects that are channelled through the Department of State and Radio Free Asia. The Guardian Project also receives support from software-related companies such as Google, and from philanthropic foundations. * * *Kem Gözlere Şiş: “Bilgiyi şifrelemek, şifresini çözmekten daha kolaydır.” — Julian Assange (“Encrypting information is easier than decrypting information.” — Julian Assange) *https://kemgozleresis.org.tr/tr/ * *Kem Gözlere Şiş (skewers to evil eyes) is a project developed by members of Alternatif Bilişim (The Alternative Informatics Association) in Istanbul, Turkey. Resisting the 'evil eye' of surveillance, the project addresses users in Turkey to prevent them from putting their security and privacy in danger through careless use of communication devices. Kem Gözlere Şiş offers a software selection and related manuals in varying degrees of detail. Kem Gözlere Şiş is a volunteer project organized by members, and the activities of Alternatif Bilişim receive event-based funding from various local and international NGOs. While in some projects serious translation efforts are being made, we were interested to see if a more situated initiative would provide design elements that mediated between spaces of action in a different way.
[13] From full-fledged software 'solutions' to PDF pamphlets, from authoritative manuals to quick guides for people in a hurry, each of these projects addresses the sneaky moment of surveillance in a different way. Often recycling similar arguments, references, methods and software choices, awareness of surveillance is raised on these sites through an ever-expanding ecosystem of activities. Before starting with a close reading and comparison of the way in which the content of the sites is communicated, we provide a description of the naturalized divisions of labor as enacted through the design, vocabulary and other modes of address used in the campaign sites. Simultaneously, we look for conceptions of time and how the "urgency of the moment" is being mediated through these sites. We later use this close reading for a broader understanding of the performance of these sites. [WE SHOULD SET UP THE UPCOMING SECTION MORE SPECIFICALLY, ONCE WE ARE DONE WITH THE SUBSECTIONS] INTERLUDE (some tacky music) Devising latent structures The three campaign websites that we take a closer look at in the next chapter are at once cultural artefacts and convivial spaces in which varied agencies, including the activist communities that are mediated, co-habit with tools, discourses and languages. In order to understand the dynamics of this complex scenography, we looked at the different gestures of delegation within these sites of mediation. By doing so, we started to identify some of the "latent structures" that help us reflect on the situations in which these platforms function. We argue that inattention to the way the implicated agents (e.g., the users, the developers, the tools, language and design elements) are mise-en-scène results in a division of labor that follows "traditional scripts", and shows a perhaps unintended hierarchy based on traditional models of production.
As a result, their performance can take a tragic and unintended shape in which tasks and articulations of labor tend to echo fixed behaviours, a hegemonically-scripted play. Such a script implies a strict management of expectations from all implied agents. It is often the result of top-down hierarchies, naturalized to the extent of rendering divisions of labor invisible. This is why we refer to them as "latent structures": while they are certainly present on these mediation sites, they merge with the background and move out of our frame of attention. This merging with the background is a product of, but also facilitates, the path dependencies that determine both tool development and use; as a result, all the agents pass through it without interpreting it as a structure. In other words: its smoothness is a tricky materialisation of a long period of hierarchical organisation through a hegemonic performativity. This smoothness also has to do with a cultural paradigm of "naturalizing the available" (Zapata, 2014). What this means is that agents perform within a pre-disposed framework whose limits, shapes and shadows are no longer questioned, or may simply have found consensus – in our case, for example, the assumption that a tool can provide security or enable anonymity. The reliance on available gestures is precisely the paradigm that helps merge latent structures into the background in our present (cultural, political) time. When our cultural-political ecosystems intensify or enter moments of agitation, our relation to tools tends to fall into the paradigm of affordances. No matter how radical the political struggle, people may easily succumb to working with the available. Dependency on the available plot of technological design is precisely what produces the conditions for a sneaky moment, at the risk of discarding very basic political, ethical and aesthetic sensibilities.
Designing the divide between providers and users Assuming that the three campaign sites we selected are intended to mediate between the worlds of tech activists and social justice activists, we are interested in how they use language, design and tool selection to bridge distances in knowledge, trust and geography. If these projects are explicitly developed to communicate between agents that are not physically in the same space, how is a relationship of trust established? What do tech activists do to convince activists for social justice that they are on their side, and that the information and technologies provided are worth their trouble? And in the course of these relevant bridging and translation attempts, how do activists for social change find out if the provided tools are appropriate and safe for their situation? The three projects show similarities but also differences in how “us” and “you” are imagined. A first thing to note is that both The Guardian Project (TGP) and Surveillance Self-Defense (SSD) establish a clear separation of roles between those that provide these secure communication tools and those that should consider using them: *'How to keep you and your communications safe wherever your campaigning takes you' [13] *'Whether you are an average citizen looking to affirm your rights or an activist, journalist or humanitarian organization looking to safeguard your work in this age of perilous global communication, we can help address the threats you face.' [14] This seemingly simple construction of address effectively sets up a narrative where tech activists in-the-know are reaching out to others, activists for social justice, who might need their help. Interestingly, Kem Gözlere Şiş (KGS) is more ambiguous about its perspective, and will mix instances of “us” and “we/you”: 'The objective of this project is to provide practical information about how to protect us from the "evil eyes" that are watching us.' [15].
But we also read: 'As your knowledge of the issues and your skills increase, you will see that you can better protect your personal data and your privacy, you will feel more powerful.' [16] Given the precarity with which KGS was created, these shifts in mode of address could be due to a lack of professional copy-editing or the inherent elasticity of the Turkish language. However, their consistent ambiguity does suggest that the KGS tech activists might sometimes get out on the streets themselves. The site continues to invite readers to become part of a community: 'This project aims to help users who need privacy, to provide tips on how to protect oneself from “the evil eyes” and to create a growing community engaged in these issues.' [17] SSD, in turn, imprints its signature on every page by mentioning that this is “a project of the Electronic Frontier Foundation”, linking to the EFF project page. The connection with EFF lends authority and perspective, allowing the site to skip any mention of “us” on the About page. Rather, the About page of the SSD campaign starts with explaining the type of users this guide is meant for, before moving on to the project's goals and limitations. In a secondary menu we find “credits” and a list of individuals that have contributed to SSD. While credits are given for specific contributions, nothing is said about the institutional or political affiliations of these contributors. A similar lack of attention can be found at the KGS site. Some limitations of the recommended technologies are mentioned but there is a certain vagueness about the identity of those who stand behind the site. TGP, on the contrary, has an extensive “About us” section that explains 'How Guardian Helps', lists all individual team members with “anonymised” pictures and provides extensive details about funding and affiliations, including a note explaining that certain U.S. government related funding has not co-opted their work.
[18] In sum, in all of these sites, through omission or over-exposure of contributors, the stage is set for the rest of the information to come. The sheer existence of the campaign sites builds on the assumption that many activists for social change are not already familiar with the proposed methods and tools, that this weakens them, and that mediation is needed to change that situation. A first concern, then, is usability: within this community, as well as among researchers of security tools, security and usability are seen as complementary, if not competing, aspects of secure communication tools. According to experts, usability means that "it is easy for the users [using the tools] to do the right thing, hard to do the wrong thing, and easy to recover when the wrong thing happens anyway” (Sasse and Palmer, 2014). Given that usability of security tools is tightly coupled with doing the "right thing" and, in moments of urgency, the "right thing" is ambiguous for activists, it seems like a reasonable choice that SSD and TGP both employ 'scenarios' as the design method to communicate secure communication tools to their imagined or intended users. Through "playlists" (SSD) or 'use-cases' (TGP) they categorize types of users according to their perceived security needs. In comparison, KGS does not provide explicit scenarios, but prefers to channel users' attention based on their devices: Mobile or Desktop. Scenarios in SSD are "Activist or protester", "Human rights defender", "Journalism student", "Journalist on the move", "Mac user", "Online security veteran", or someone who "wants a security starter pack". [NOTE SSD Landing page] TGP on the other hand focuses on "undercover human rights researchers", "tech savvy citizen journalists" and "activists in the streets", "community organizations reporting on election issues", "emerging online citizen journalists organizations", and "mobile journalists for major news organizations".
[19] In addition, the three projects insist on communicating that the proposed technologies are easy-to-use. SSD literally offers a “security starter pack” that in fact is a manual which, of course, starts with threat modelling. TGP makes an effort with seven friendly icons ('So you got an Android phone?') each linking to an up-beat interactive explanation, consistently starting with 'Easy'. Both sites use design styles and language that mimic commercial on-line services. KGS evokes the “security pack” with big red buttons suggesting one-click-install. In usability design, a scenario is a description of a person's interaction with a system. It is believed that scenarios help focus design efforts on the user's requirements (Nielsen, 1993). The scenarios here though seem to directly map existing technological solutions onto supposed real-life experiences; they seem to be organised around assumed groups of solutions or technologies rather than in response to actual problems. An indication might be that most of the images and icons used on the three sites depict devices, rather than situations. Another issue is that none of the scenarios brings up situations that might ask for solutions not based on technological tools. The mantra of 'simplicity' then hides the complicated and situated nature of using the promoted technologies in real-life situations, and designs away the human efforts that need to be made to put the advertised technologies into action. From the awkward connections between technology needs and skills, causes and effects, devices and situations we started to wonder who was involved in the establishment of those categories. If these tech activists have already established relationships with activist groups on the street or in human rights initiatives to think through the required technologies and to test and develop them together, then the effort to communicate via a website is in fact redundant.
On the other hand, if the projects are set up to actively reach out to unknown activists for social change, supporting a dialogue between groups is essential. SSD has a 'feedback' button on each page, but omits any possibility to ask for a return on the feedback. A secondary menu offers a standard contact form which is protected, but does not allow for any further secure forms of communication, e.g., through email. The 'help us' that is repeated on most pages at KGS includes the following rather conversational statement: 'We ask you to inform us about mistakes in any of the documents we provide. If you have suggestions for better solutions, please let us know. You can also contact us with questions about use.' Users are invited to send comments and suggestions to KGS by twitter, indy.im, and diaspora. When we first tried out the site, their email address had become invalid, which probably explains why the FAQ, which has great potential for two-way communication, has remained empty. TGP states on their contact page 'If you’d like to learn more about Guardian from the team directly or have a proposal for us, please let us know using one of the methods below'. The Guardian-Dev Discussion List is at first sight potentially the most interesting channel for user-provider exchange, as it invites developers as well as power users or "just anyone interested in getting involved in the development side of things". But even if this list may be read as the most inclusive of all three campaign sites, the listing of profiles centers again on technological development and negates any space for a collective exchange beyond tech-development. By speaking to the activist audience rather than inviting them into a community of participation, the campaign sites in fact unnecessarily amplify the condition where activist communities are not expected to take part in the definition of the relationship they will have with the technologies they apparently need to depend on.
Participation in determining their relationship to these secure communication tools is already hard, but will be even less appealing (or simply unknown to users) if not explicitly solicited. If problems appear, users will drop tools that do not fit their context well. In moments of urgency, the drop rate may depend even more so on the urgency and needs of the activists rather than the positivist claims made on these sites.

Connecting technologies to situations

Activists may need to develop their own scripts of the possible ways in which tools can fill in roles during different moments of internet activism. Currently, sites like Surveillance Self-Defense by the EFF, as is also typical at many cryptoparties, start by asking users to do "threat modelling". As phrased on the website, 'To become more secure, you must determine what you need to protect, and whom you need to protect it from.' While threat modelling is a tool in itself that can be useful for activists, its military and industrial origins seep through the recommendations associated with it. First, in professional settings, like in a company or a military setting, threat modelling is assumed to be part of a number of activities conducted by a large team of developers, e.g., there is a bigger system development or maintenance project where security is one aspect among many. The language of “assets” assumes that the owner of the system has (information) assets that need to be protected, and it is the security team's duty to make sure those assets are secured. This monolithic vision of a system to be defended is, however, outdated and criticized within security engineering. Even a corporate system is likely to have users and associates with conflicting interests, meaning that there may be numerous competing threat models, for different situations, the priority of which depends on the bigger project that is to be achieved.
Addressing the social and political complexity of threat models would allow the SSD site to show that security is a negotiated process, instead of a clear state of affairs which experts can discover depending on their adversaries’ capabilities. Further, all the examples in SSD focus on information assets and relevant data, which may or may not be of primary importance for activists. Hence, the campaign site sees threat modelling through the perspective of the tools on offer: the tools are there to secure information, hence the activists should focus on securing information. This is very different from focusing on people, situations or political goals that may matter to the activists, which then have to be translated into "information assets" afterwards. While both SSD and TGP are involved in developing software themselves, the first layer of each of the campaigns seems to focus on the curation of useful technologies, bringing them together in easily digestible "playlists". Since all three projects, at least on the surface, aim at the same type of users, under pressure of similar threats, it surprised us how little overlap there is between the suggested technologies. Besides ChatSecure and Tor plus related software for phones, Orbot and Orweb, there is hardly any consensus between the projects. What is also surprising, given that these sites are about security awareness, is that only SSD provides dates for the information that is presented. Campaigns may have starting dates, or the websites may contain a reference to the year when the site was last updated; however, they lack prominent time stamps that indicate whether the information is fresh or outdated. Given that security vulnerabilities are disclosed every day, this leaves users with the duty of checking whether these tools are still valid for use. The "weatherrepo" federated app store is an initiative of the Guardian Project that hopes to provide an app store with vetted tools.
While this is a timely project, it is ironic that the information page itself says that Yahoo! transfers mails in the clear, which is no longer the case. [20] Finally, most "users" will have encountered security mainly in situations where they are marshalled into the project of "securing the assets of a company". From passwords to tokens to captchas, the main mechanisms through which users experience security have been predominantly deployed to protect the assets of the services they are using. Never mind that for most users these things are a burden, if not an annoyance. It is then a challenge to re-contextualize related mechanisms as standing for their protection. A language of “threat modelling” and “information assets” is unlikely to make this re-orientation an easy one.

Setting up divisions of labour

One form of division of labor can happen through the distribution of roles according to expertise, in other words based on specialization of work. The most typical of such divisions is the distinction drawn between developers and users. This division inevitably comes with expectations about what the roles entail and their capacities. It frames a dependency relationship and situates the expertise inevitably with the developer; in other words, the developer has the necessary and probably sufficient skills to develop a given technology (Suchman, 1994). Especially in matters such as security, where threats and vulnerabilities to the underlying protocols require extensive technical skill, this seems like a plausible delegation. But is it? The conditions of activists across parts of the Arab world are increasingly being shaped by encroaching danger and enclosure. Rather than a widening of activist networks, maintaining their physical, social and political momentum has become the priority. What an activist expects when organising or mobilizing via the internet (and mostly social media) is that it works.
Our questions about the long-term risks involved in relying on Fortune 500 platforms, and about whether activists would themselves engage in co-designing alternative infrastructures, are at times met with bored sighs, pitying smiles or confused stares. During a political tipping point, such as an uprising or the eve of a massive public occupation, users want efficient and ready-to-hand tools. Activists do not have the time or the patience to become designers, too. On some level, this is what “services” are expected to offer, an expectation that naturalizes the delegation of numerous matters to developers, confirming the latent structures that cement the power asymmetries between users and service providers. On the other hand, there is always some friction that is not captured by the developers, and here the activists hope that another division of labour within the movement will solve these frictions: tech savvy activists can volunteer their time to making sure there is connectivity and that the tools work while others storm the streets. During tipping points time is so valuable that it is not very wise to raise any question about pragmatic delegation decisions based on specialization of work. For example, many of our experiences from the Arab world, Turkey and Spain suggest that the stage (or timing) of certain actions defines how relevant a tool is and what its potential role in offering secure communication may entail. A helpful way to think through these contradictions is to imagine a distinction between various revolutionary stages: pre-revolution (preparation and mobilization); the moment of revolution (the actual tipping points); and post-revolution (successful continuation or dangerous counter-revolution). It is useful then to juxtapose these historical timing-related factors with the kind of usage (sometimes as a space and at other times as a tool). This then suggests that technology is not always dominant but surely is a factor of change.
This could also foster a better understanding of technology designs and of which group – the activists for technology or the activists with technology – is best suited or should be more present at a given moment. Depending on these different phases, it may be more informative to rethink how the use of technologies may be decisive (for early mass mobilization), just mentionable (for class struggle) or virtually irrelevant (in military battles). There are numerous issues here. We argued earlier that if tools determine practices, then these activists are opening their practices to change according to the tools that they use. In commercial platforms this may come at the price of the pressure to adhere to real name policies, accounts that get blocked, many hours recreating the unexportable friends list of a blocked account, or waiting for language features to be implemented, e.g. Arabic hash-tags in Twitter. [21] An engagement with alternative tech activists sensitive to their needs has the potential to reverse some of these dependencies; however, the campaign sites that we studied suggest that this is not guaranteed. While many of the secure communication tools are revolutionary in their protocol design, the campaign sites indicate that the same tools rely on very traditional framings of users and ways of relating to developers and technology. Expectations of a seamless service combined with inattentiveness to these matters from the campaign developers folds activist users and progressive developers into the available forms of delegation. This sets up the "users" to oscillate between deliverance to developer decisions and disappointment with unsatisfiable expectations. While the campaign sites had a special opportunity to transform this delegation relationship, we find that they instead amplify the user-developer opposition in order to make complex encryption tools accessible to users in a format that they would recognize from commercial platforms.
However, making tools easy to download pushes the task of rethinking information rituals within a community, once these tools are installed, outside the scope of the campaigns. It takes a great amount of labor to integrate any new tool into the social tapestry that activists find themselves in, and in the pursuit of usable tools, this labor goes unrecognised. With every tool comes the laborious activity of configuring it to local needs, maintaining it on the variety of available devices, and developing trust toward the tool developers through mechanisms like "user support". Given that these laborious activities are critical to secure communications, it is a pity that almost none of the campaign sites attend to these matters, and generally they do not consider how people can move into the secure communication space collectively. In a sense, there is a commodification logic inherent in the choices made on these campaign sites that aspire to bridge between tech developers and activist users. The objective of these sites is to depict security tools as "completed products". The lack of salience given to timestamping, to the continued validity of the security and availability of the tools, as well as to the modes of production in which these tools are developed, is in harmony with a project of "design from nowhere" (Suchman, 1994). According to Suchman, this is an ideal in which 'the goal is to construe technical systems as commodities that can be stabilized and cut loose from the sites of their production long enough to be exported en masse to the sites of their use'. Subjecting secure communication tools to this logic demands that security be a binary – you can download and be secure – an unachievable goal if not a marketing gimmick.
This expectation pressures developers of secure communication tools to either come up with gross security claims or disclaimers that some of these tools may not work, confirming the illusion that a universe in which security exists as a binary is possible. This is a significant step away from the culture of secure communications. Practitioners participating in what can be called security design collectives will agree that "it takes a village to keep a tool secure" and that security is a continuous cat and mouse game. But this culture is lost on the campaign sites. With the exception of the empty FAQ on the Kem Gözlere Şiş site and the developers channel of The Guardian Project, there is little on any of these sites that gestures at the idea of creating an activist security community that could think along. This mode of mediation sets up the individual user "to be the weakest link" instead of playing for the "community to be the strongest link" in achieving a security aware activist culture.

Design from nowhere in service of activists

The objective of developing tools that can function across contexts is part of the universalist ideal typical of design collectives. Here, in the absence of faraway users under threat, designers can invoke them at will and imagine their needs. With the urgency to build secure communication tools that are easy to install and use independent of context, this practice becomes further normalized. It is hence not unusual to hear among activist techies that the idea is to get funding for a tool through Internet Freedom initiatives that are focused on "dissidents in repressive countries", a tool which can then also be used by activists in the UK, a "democratic country".
This is also what makes it possible for campaign sites to reach out to users "across the globe", while the devices that are required to install the proposed tools suggest that they presume a user who has access to some of the latest in mobile technologies and infrastructure. Secure communication design collectives also have the additional objective to make sure that the security guarantees of the tools that they develop are 'exogenous, homogeneous, predictable, and stable, performing as intended and designed across time and place' (Orlikowski, 2007). These tools are however entangled in very intricate political and social realities which technologists can only "design around" to a limited degree. For these designers the ambition to develop tools with universal security properties is seen as an ideal and is pertinent to modes of thinking that allow engineers to abstract away from situated knowledge of a specific context and to shift real world problems into the technical solution space. This means that developers can pursue goals like building an anonymous communications service like Tor, which provides anonymity or the ability to circumvent censorship regardless of contextual constraints. Yet, while the design of anonymous communication networks is a challenge in itself and validates the need to abstract messy realities away, user community input is almost as pertinent. The Tor community has been very aware that Tor can only work if they take situated reporting into account. When the act of using Tor, which can be identified by ISPs, can be sufficient to put a person under suspicion, it becomes evident that contextual realities matter and the design of tools has to be rethought. [22] The ability to find technical solutions to situated challenges is valuable; nevertheless, activists already under suspicion may sometimes be better off staying away from these technologies.
This was expressed loudly by Anne Roth, when her whole family was put under heavy surveillance in Berlin, Germany on unsubstantiated terrorism charges, and is the case for many members of the Muslim community in the US and western European countries, who do not necessarily have the luxury of securing their communications, as this would trigger greater surveillance and suspicion. Technology design requires focusing on the security of the interactions of activists on the network, but the real world may creep up in unexpected ways, requiring these tech activists to continuously revise their social, political as well as technical assumptions. This makes it very challenging, if not undesirable, to rely solely on technical experts to develop technologies for activists. The heavy reliance on tech activists also conceals an effective international division of labor which all of these projects affirm. What Mike Hales put into words in 1994 about corporate engineers still proves to be true in the context of progressive tech developers: 'Our times present us with a de facto economic and cultural separation between production and use. In our work world, producers are professionally (i.e., culturally) specialized; to a large extent, system-production is located in specialized and distinct sectors and/or geographical locations within an international division of labor.' (Hales, 1994) While most tech activists develop universal technologies with a "design from nowhere", in reality much of the development work occurs in the Western hemisphere, where market values of efficiency and the resulting coding practices are the rule. As more and more tech activists succumb to the pressure to develop ready-to-hand tools, this also means they are expected to replicate designs whose success is based on market parameters.
This is the point at which most of the campaign sites and tool developers evaluate their success by the number of downloads or the number of individual users rather than the efficacy of the tools for the projected activist communities. Once subject to these parameters of market efficiency and its correlated principles of design, questioning hegemonic divisions of labor can only be regarded as counter-productive and inefficient. A related anxiety among activists is that technologies tend to shape their environment towards increasing individualism (one of the features of social-networking) and that this hinders collective action. A telling example was given at the Fourth Arab Bloggers Meeting in Amman (January 2014). This challenge was voiced during a heated debate about ways to bridge the gap between knowledge and practices of activism, more precisely how blogging can shift from an individual act to a more comprehensive collective performance. The internet motivates micro-celebrities and social media stars, and this tendency is further triggered by traditional media where some bloggers and activists are put in the spotlight, treated as if they were spokespersons for the entire movement (Angelis and Della Ratta, 2014). In an interesting reflection during the conference by a Yemeni activist, we are reminded that the dominant or expected hierarchies of priorities cannot be universalized. The person in question mentioned that the surge of social media trainings often does not take into account the needs of local societies, especially in terms of anti-surveillance programs: 'Circumvention is not a big issue here; yet we heavily invest in training on such tools in every single event, conference, and gathering held in this region. Rather than the currently very hyped issues of cyber security/anti-surveillance, finding a safe offline space to meet and plan is of a much greater concern' (quoted in: Angelis and Della Ratta, 2014).
The dichotomy of mass versus targeted surveillance, moreover, creates considerable confusion across all of these websites: those who are targeted are advised not to use the tools, while journalists are told that they should. Had the campaign developers talked with activists more, the untenability of this mass/targeted dichotomy would have become apparent, or at least much harder to publish in its current form.

Let's first get things done: modes of operation for sneaky moments

One way to better situate some of the secure communication development activities is to move from an attitude of "design collectives" at the service of (individual) users, to "designing for activist collectives". This could be bootstrapped by avoiding inscribing users into the language of threat modelling, and instead inviting security engineers to step into the language of collective action within a political project. For example, by framing the matter at hand in terms of the role of technology for activists in the context of the aforementioned Pre/During/Post categorisation of the political moment, it may be valuable to start by distinguishing what effect we expect technology to generate for political activists or politically motivated techies. Technologies may tip the scales of power when they help expand existing networks and thus become vital for the emergence of movements and campaigns. This can be achieved by interpreting the online/offline divide as a reflection of the space and tool separation, and this in turn as part of the overall political strategies and tactics, without excluding any of the pre- or non-digital technological tools or spaces. Reframing the project of communications security not as a precondition to political activism but as a constituent part of it, however, may still be orthogonal to whether we break away from the user-developer dichotomy and all the associated baggage. A more radical proposal may be to shift this relationship by recognizing that ultimately what is desirable is to do "collective design".
Here it may be useful to think of the way in which Lorea, one of the alternative social networks, proposed to frame the relationship between activists and technology:

*'These networks are self-managed because Lorea is a non-profit, independent, open, and self-sufficient project. We don't talk of “users” but rather of “inhabitants” because we prefer a conscious coexistence instead of a simple, passive client relationship. Lorea inhabitants actively participate in the design, development, and maintenance of the network, working to implement the federation protocols, develop code, maintain safe servers, hunt down bugs, translate the interfaces into various languages, test user friendliness, document its development, and to undertake dissemination, help, or welcome activities for new inhabitants. There is thus no institution or formalized association behind Lorea, but rather a community of inhabitants.' (N1Crew/SpiderAlex)

What is beautiful about the proposition of "inhabitants" here is that it recognizes the labor it takes to make a “community tool”. While Lorea probably had its shortcomings, which may explain why it has ceased to exist, the project did reflect on the cost of this labor and how to sustain it over time. For many tech activists, although not all, the dependency on wage-labor is something that they can free themselves from, at least temporarily (Soderberg, 2014). This ability to sustain oneself is, however, gendered, raced and geographically specific, if not also specific to the IT sector. A focus on divisions of labor that defines roles based on their relationship to the software artefact leaves out the fact that the production, maintenance and use of the technology can only exist with the necessary sustenance of life, such as the production of food and shelter, as well as physical reproduction. In many political collectives with their own space, this is also known as the problem of “who pays the rent” and “who cleans the toilets”.
In fact, a lack of attention to matters of sustenance of life has been the breaking point of many alternative projects, if not the point at which corporate and government funding has found entry into alternative technology projects. Circling back to the argument by tech activists who proposed an economic solution to end surveillance by raising the cost of monitoring through the use of encryption, we may sadly find that they forgot to add to their equation the cost to activist communities of integrating secure communication tools into their political projects. In fact, the urgency of our post-sneaky moment may be to think along the lines of class, regional differences, as well as too easily assumed divisions of labor, if we want secure activist communication projects that truly scale.

As revelations about surveillance programs and related crackdowns on activists emerged, tech activists grasped, remolded and redefined this occasion. They effectively translated what are considered surveillance problems into ones concerned with privacy and cryptographic self-defense. We attempted to develop a vocabulary and offer a snapshot that could help us attend to the naturalized divisions of labor and technology delegation practices that manifest themselves between activists for social justice and activists for just technologies during such sneaky moments. In order to situate our discussion, we focused on the numerous initiatives that emerged to provide activists with secure communication tools, seizing the momentum created by the anxiety about surveillance. Retrospectively, this turn happened almost naturally and invisibly, sneakily extending fringe secure communication tools to activist communities across the globe. The subtext in such initiatives invokes the notion of a universal user and by extension that of a universal activist.
This is where it started to sound familiar: the hegemonic division of labor tied to universalist ideas about technology came to conceal situated politics, interests and contestations. These projects make a lot of sense in a world where governments and their services cannot be trusted. The proposed alternatives promise to protect activists from uninvited eavesdroppers and resulting vulnerabilities. In doing so, the campaign sites channel the vacuum created by the revelations about surveillance programs into building ‘trust’ towards encryption technologies. They vet small tech activist initiatives for secure communications and vouch for their trustworthiness so that they can scale globally. This is not to say that these campaigns are all the same, but rather that their attentiveness to latent structures, as well as their selection of tools, plays an important role in their political alignments. Hence the organizations that propose the campaigns have in a sense positioned themselves as hubs for delivering public trust towards the Internet. This is not to call them corporate fronts, but, from an albeit cynical perspective, the rise of these campaigns also suits Fortune 500 Internet companies and governments that are scrambling to rebuild confidence in the Internet and associated markets (Wisniowski, 2012). We illustrated how these campaign sites transform tools developed by tech activists by delivering consensus around using the available structures for reframing secure communications technologies, e.g., by depicting them as usable apps that are one click away. Furthermore, all sites suggest that they are part of the same project to provide instructions for self-defense, but the mutually exclusive bundles of tools possibly point to territory claims between the two US organisations, while Kem Gözlere Şiş signals a strict commitment to free software tools that use open standards. 
In contrast to this careful articulation of politics through a diversity of tools, we found little attention given to the delegation relationship that is constructed between the invoked activists and the tech developers. The user-developer opposition reiterated on these sites gives insights into how specialization of work and scarcity of resources can easily lead to divisions of labor, expressed across fault-lines of race, gender, class, age and geography that are themselves already a consequence of neoliberal power. As a result, tech-activist communities and social justice activist communities, ideally a natural match, come to oppose each other in these "sneaky moments". Our critique comes with its own risks: given the positive valence associated with these campaigns - putting secure communication technologies at the service of activists during an obvious increase in state(-intelligence) deployment of the internet with the intention to crack down on activists in different regions of the world - and the still largely marginal position of counter-surveillance, critique is difficult to articulate. Yet, we argue, it is quite necessary. References Aouragh, Miriyam. 'Social Media, Mediation and the Arab Revolutions', Triple C: Journal for a Global Sustainable Information Society 10, no. 2 (2012) Aouragh, Miriyam. 'Revolutions, the Internet and Orientalist Reminiscence', in Reem Abu Fadel (Ed.), Visions of Tahrir: Connecting Domestic and International Spheres in Revolutionary Egypt (Routledge, 2015) Carroll, William and Hackett, Robert. 'Democratic Media Activism through the Lens of Social Movement Theory', Media, Culture and Society 28, no. 1 (2006), cited in Dunbar-Hester, Christina. 'Drawing and Effacing Boundaries in Contemporary Media Democracy Work', in Media and Social Justice, Sue Curry Jansen, Jefferson Pooley and Lora Taub-Pervizpour (Eds.) (Palgrave Macmillan, 2011) Chen, Adrian. 
'The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed', Wired, 23 October (2014), http://www.wired.com/2014/10/content-moderation/ Dean, Jodi. Democracy and Other Neoliberal Fantasies: Communicative Capitalism and Left Politics (Durham, NC: Duke University Press, 2009) De Angelis, Enrico and Della Ratta, Donatella. 'Mind the Gap: Bridging Knowledge and Practices of Activism at the Fourth Arab Bloggers Meeting', Jadaliyya, June 7 (2014) http://www.jadaliyya.com/pages/index/18040/mind-the-gap_bridging-knowledge-and-practices-of-a Dunbar-Hester, Christina. 'Beyond “Dudecore”? Challenging Gendered and “Raced” Technologies through Media Activism', Journal of Broadcasting & Electronic Media: Special Issue on Race, Class, and Gender 54 (2010) Franklin Street Statement on Freedom and Network Services (2008) http://autonomo.us/2008/07/14/franklin-street-statement/ Haggerty, Kevin D. and Ericson, Richard. 'The Surveillant Assemblage', The British Journal of Sociology 51, no. 4 (2000) Hales, Mike. 'Where are Designers? Styles of Design Practice, Objects of Design and Views of Users in CSCW', in D. Rosenberg et al. (Eds.), Design Issues in CSCW (Springer, 1994) Hughes, Eric. 'A Cypherpunk's Manifesto', 9 March (1993), http://www.activism.net/cypherpunk/manifesto.html Mejias, Ulises A. 'Liberation Technology and the Arab Spring: From Utopia to Atopia and Beyond', The Fibreculture Journal, Issue 20: Networked Utopias and Speculative Futures (2012) http://twenty.fibreculturejournal.org/2012/06/20/fcj-147-liberation-technology-and-the-arab-spring-from-utopia-to-atopia-and-beyond/ N1Crew/SpiderAlex. Reclaim the Networks: Technological Sovereignty for Social Networks, https://n-1.cc/blog/view/76157/reclaim-the-networks-technological-sovereignty-for-social-networks Nielsen, Jakob. Usability Engineering (Morgan Kaufmann, 1993) Orlikowski, Wanda J. 'Sociomaterial Practices: Exploring Technology at Work', Organization Studies 28 (2007) Sasse, M. 
Angela and Palmer, Charles C. 'Protecting You', IEEE Computer and Reliability Societies, January/February (2014) Soderberg, Johan. 'Reproducing Wealth Without Money, One 3D Printer at a Time: The Cunning of Instrumental Reason', in Stefan Meretz (Ed.), Book of Peer Production, Special Issue of Journal of Peer Production (2014) Stallman, Richard. 'Who Does That Server Really Serve?', https://www.gnu.org/philosophy/who-does-that-server-really-serve.html Suchman, Lucy. 'Working Relations of Technology Production and Use', Computer Supported Cooperative Work (CSCW) 2 (1994): 21-39 Wisniowski, Matthew. Engineers for Change: Competing Visions of Technology in 1960s America (MIT Press, 2012) Zapata, Guillermo. 'Ni el copyright ni el copyleft te va a dar de comer' ['Neither copyright nor copyleft is going to feed you'], El Diario, November 13 (2014) http://www.eldiario.es/interferencias/copyright-copyleft-va-dar-comer_6_324127601.html End Notes [1] Rise Up https://help.riseup.net [2] MayFirst People’s Link https://mayfirst.org [3] Lorea http://p2pfoundation.net/Lorea [4] See for example the many organisations listed under 'Civic Organizations with Information and Communication Focus' http://www.publicsphereproject.org/civic_organizations [5] Internet Freedom, U.S. Department of State http://www.state.gov/e/eb/cip/netfreedom/index.htm [6] Backbone409, Calafou http://backbone409.calafou.org [7] Interference, Amsterdam https://interference.io/ [8] TransHackFeminist, Calafou http://transhackfeminist.noblogs.org/ [9] NoisySquare, OHM https://noisysquare.com [10] Internet Ungovernance Forum, Istanbul https://iuf.alternatifbilisim.org/ [11] In this argument, targeted surveillance is bounced between either being legitimate, and hence not worthy of further discussion, or out of scope, since technical solutions cannot withstand the methods used by a keen "nation state adversary" targeting an individual, community or country. 
[12] A sample of reports on the funding that has been flowing into digital security projects: DRL Internet Freedom Annual Program Statement for Internet Freedom Technology, http://www.state.gov/j/drl/p/207061.htm; Portfolio Assessment of Department of State Internet Freedom Program: An Annotated Briefing, http://cryptome.org/2014/09/rand-internet-freedom-attack.pdf; Digital Defenders, https://digitaldefenders.org; Knight News Challenge on Strengthening the Internet, http://www.knightfoundation.org/blogs/knightblog/2014/6/23/19-projects-win-knight-news-challenge-strengthening-internet/ [13] An overview of the campaign sites that we considered can be found here: Media:Theatreofsurveillance.jpg [14] Security Self-defense, Landing page [15] The Guardian Project, Landing page https://guardianproject.info/home/partners/ [16] Kem Gözlere Şiş, Landing page https://kemgozleresis.org.tr/tr/kemgozler/ [17] 'Bilgiyi şifrelemek, şifresini çözmekten daha kolaydır' ('Encrypting information is easier than decrypting it'). Kem Gözlere Şiş, About this project https://kemgozleresis.org.tr/tr/kemgozler/ [18] Note: this project has received small grants and sub-contract work from organizations (such as the Open Technology Fund) and research projects (such as Tor) that receive funding from the U.S. Government and other governments around the world. None of this funding has modified or shaped our development plans, and we would never, ever put any sort of backdoor or compromised component into our software based on this funding. 
https://guardianproject.info/home/partners/ [19] How Guardian helps https://guardianproject.info/home/use-cases/ [20] WeatherRepo https://guardianproject.info/code/weatherrepo/ [21] Twitter is now available in Arabic, Farsi, Hebrew and Urdu https://blog.twitter.com/2012/twitter-now-available-in-arabic-farsi-hebrew-and-urdu [22] See the project on Tor Pluggable Transports, developed in response to the increased use of DPI by Internet Service Providers to detect Tor users https://www.torproject.org/docs/pluggable-transports.html.en ======================================================================== Sami Ben Gharbia's earlier damning critique of internet freedom projects practically anticipates our argument: http://nawaat.org/portail/2010/09/17/the-internet-freedom-fallacy-and-the-arab-digital-activism/ ---- Further to the forms of temporality mentioned earlier, two types of temporality are relevant to our analysis. First, there are the moments of urgency themselves, which we call "sneaky moments": temporalities that, as we experienced and assume, have a particular impact in the political arena. When a temporality is rendered urgent, activists tend to organize upon a sense of immediacy, which is followed by a suspension of judgement. These can be understood as exceptional states in which the spectrum of critique becomes more constrained. These are the moments where things first have to get done! The scarcity of time makes it more likely that divisions of labor are settled pragmatically, determined by availability and efficiency. In our cases, this could be the tipping point of an uprising, the moment of urgency to produce a technical tool for immediate use, or the launch of a political campaign with an immediate deadline. 
Second, and very much an outcome of the first, are moments of calm with a possibility for reflection. When urgent moments pass and communities reach a state of calm in their long-term struggles, actors return to the question of maintaining infrastructure, and allow some form of reflection on the decisions about delegation and divisions of labor made earlier during moments of urgency. ---- Another way of conceptualizing our terms and labels is by deconstructing them, a similar but not identical exercise. For instance, with the Arab uprisings we learned that a proper assessment of the political implications of the internet depends on two different characteristics of technology: as a tool for activists (operational, e.g. coding or designing promotion material) and as a space for activists (mobilization, e.g. expanding networks, archiving) (Aouragh, 2012). Such deconstruction of terms is pivotal to our argument and borrows from diverse fields of study that share our concerns about civic technology development and social movements. ---- Given their explicit and implicit mission, we may be able to discover some of the greater issues that underlie the divide between these groups by closely studying these campaign sites. If we also assume that the moment of political urgency brought about by the Snowden revelations and the MENA uprisings is behind us, and yet the interest in secure activist practices persists, this is an opportune time to take a fresh look at these efforts. It is a moment to deliberate the ways in which activists' delegation of tech matters is being reconfigured: from delegation to Fortune-500 platforms to delegation to tech-activists with their hearts in the right place. 
Hegemony is used here in its vernacular meaning, referring to dominance, but it is also strongly linked to Antonio Gramsci's development and application of the term. As Peter Thomas discusses in what can be considered the most in-depth analysis of Gramsci, hegemony is one of the terms emanating from radical Marxist philosophy that has been disseminated through the social sciences more prominently than any other critical notion (Thomas 133). Hegemony in Gramsci's understanding counters the false dichotomy between consent and coercion, and for good reasons in light of the role played by the security initiatives discussed in this paper. For Gramsci, consent must be understood in its dialectical distinction from coercion. In constant interplay, hegemony functions as the social basis of the dominant class, which in turn reinforces initiatives in (civil) society; they figure as moments within each other, 'theoretically distinct but really united as moments of a political hegemonic project' (167). So hegemony is complemented by coercion (though not necessarily) and in the process legitimates the state. Here both our understanding of power and dominance and our notion of sneaky moments fold into the critique of hegemony.