Peer-to-Peer: Toward the Collection Conservation of Net Art

Anna Mladentseva
Electronic Media Review, Volume Seven: 2021-2022

This article examines the conservation of net art that has been shaped by user-generated data, suggesting that its participatory nature engenders new possibilities for conservation. I propose a “peer-to-peer” framework that offers a distributed approach, one that sanctions acts of collective conservation by identities that have been excluded from mainstream economies of information exchange, namely the hacker and the spammer. This proposal emerged out of my own efforts to document net art as an independent researcher and the obstacles I have faced, such as automatically issued bans. These obstacles suggest that online activities which may be productive for conservation also tend to be associated with “hacking” or “spamming”—an association that is likely tied to the contemporary landscape of the web. By considering a post-Marxist interpretation of the term peer-to-peer, practitioners can acknowledge the potential struggles and exploitations of the communities that they may want to integrate into their collective conservation workflows. In this way, we can start to reconsider which aspects of net art require documentation and reevaluate who gains permission to enter what Annet Dekker calls networks of care.

Introduction

The ideas presented in this article are based on my interactions with a work of net art by the artist Annie Abrahams, Violences (2006), which give insight into some of the tensions that exist with regard to the type of identities that gain permission to perform conservation-related activities. Bram.org is a website consisting entirely of Annie Abrahams’ net art with projects dating all the way back to 1997. Somewhat similar to the seminal work by the Dutch artist Martine Neddam, Mouchette.org (1996), Abrahams’ expansive website features multiple pages that are, at the same time, independent projects and part of a larger artistic narrative. It is therefore expected that some of its parts will become obsolete sooner than others, suggesting that it requires a decentralized approach to its maintenance and conservation. In particular, I focus on a participatory, forum-based page titled Violences (2006), which the user is redirected to at the end of another page: Karaoke (2006).

Karaoke consists of an animation of a woman curled up defensively on the floor next to a scrolling banner of lyrics and a sound piece, which invites the viewer to sing along to snippets of what appears to be a domestic argument. At the center of the page is a pronounced, orange “(close)” button that, when clicked, leads the user to the Violences forum, where they are asked “What to do with violence? What, when is violence?” These questions suggest that by leaving the Karaoke page, the viewer is engaging in a form of violence, as they are abandoning a woman who clearly appears to be in danger. The forum allows the user to propose an answer as well as modify existing answers by replacing one word with another. The website visitor has access to a small selection of the most recent forum contributions, as well as a log of all contributions between April 2 and November 25, 2006.

The participatory nature of Violences poses a challenge to conservators: If user-generated content makes up most of the work’s composition, then the “object” of conservation is being constantly regenerated by incoming data. However, what if we were to view the emancipated position of the user who generates data not only in terms of the work’s contingent identity, something to “conserve,” but also as in itself a strategy of conservation? In response to this question, I attempt to develop a framework of collective conservation that benefits from what the user does best—generating data and behaving in an unanticipated manner.

In Search of a Community: Reviewing Existing Conservation Efforts

Web archiving is one of the most accessible ways for the general Internet public to preserve and document net art. It is a method for preserving content on the web that usually involves a process, either automated or human driven, of capture and subsequent replay, using a browser extension or desktop software. Webrecorder is perhaps one of the most widespread web archiving tools. Between 2016 and 2020, the developers of Webrecorder partnered with the New York-based organization Rhizome to develop their tool in the context of preserving Rhizome’s born-digital collections (Kreymer and McKeehan 2016). However, in 2020, Webrecorder announced that it would revert to being an independent project, whereas Rhizome emerged with a rebranded tool called Conifer that uses Webrecorder as its prototype (Espenschied 2020).

Even though Webrecorder and its offshoot tools are certainly user oriented, they are not necessarily community oriented. Webrecorder offers high-fidelity, symmetrical web archiving where the user captures web page content by merely interacting with it—hence the claim that it is human driven or user oriented (Kreymer and McKeehan 2016). It allows for the creation of replayable web archives that are not mere facsimiles and are responsive to a learned set of interactions. The more interactions are performed during the capture process, the more holistic a given web archive will be. Both Webrecorder and Conifer have attempted to assist the archiver in the capture process by introducing the “autopilot” feature, which performs trivial actions such as scrolling, clicking, and opening links (Espenschied 2019). Of course, this expansion of documented material, called patching, lacks the capacity to perform more “human” behavior such as inputting data or any other form of decision making. In theory, a single web page could be archived by many different users on separate occasions. Yet, in practice, this has little utility or significance, as the user does not have the opportunity to aggregate these masses of records. Patching is currently only offered at an individual, private level; however, it is worth mentioning that frameworks for aggregating web archives, both public and private, have been proposed in the field of library and information science (Kelly, Nelson, and Weigle 2018). A user may decide to publish their web archive to make it public, and several artists have done so, including Abrahams, who left a link to the web archived version of Karaoke at the original site. However, without an infrastructure that indexes these web archives, discoverability remains difficult. These documentation efforts remain somewhat isolated, and users who decide to document the same artworks cannot feel each other’s presence or form communities.
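One way to picture the kind of aggregation framework proposed by Kelly, Nelson, and Weigle is as a merge of per-user capture indexes. The Python sketch below is a minimal illustration of that idea only; it is not their actual framework, and the record shapes, URLs, and archive identifiers are all invented for the example rather than taken from Webrecorder’s real index format.

```python
from collections import defaultdict

def aggregate_indexes(user_indexes):
    """Merge per-user capture records into a shared index keyed by URL."""
    merged = defaultdict(list)
    for records in user_indexes:
        for url, timestamp, archive_id in records:
            merged[url].append((timestamp, archive_id))
    # Sort each URL's captures chronologically so overlapping efforts
    # by different users become visible side by side.
    return {url: sorted(captures) for url, captures in merged.items()}

# Hypothetical capture records from two users documenting the same page:
user_a = [("bram.org/karaoke", "20210121", "archive-a")]
user_b = [("bram.org/karaoke", "20210305", "archive-b"),
          ("bram.org/violences", "20210305", "archive-b")]

index = aggregate_indexes([user_a, user_b])
```

In a shared repository built along these lines, looking up a URL would surface every user who has captured it, which is precisely the mutual visibility that current individual, private patching lacks.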

At the core of this decision to leave out a centralized repository, or a framework for aggregating individual archives, is security. Although this is largely a practical problem, it articulates an interesting tension between isolation, with its promise of security, and community-oriented conservation. Because these personal web archiving tools aim to capture the web “as it is”—which sometimes means past the log-in page—the resulting web archives may pose serious security risks if shared or made public (Kelly, Nelson, and Weigle 2018, 273). This is significant because net art is not the only web content being archived using Webrecorder or Conifer: many users document their social media pages, where authentication processes mean that sensitive credentials are captured. By interacting with other users’ private archives, individuals with malicious intentions could obtain privileged access, allowing them to steal these credentials or create unsanctioned paths to local computer files. It is, of course, important for a web archiving tool to be safe for its users. Nonetheless, this engenders inward-looking web archives that cannot be discovered or aggregated, privileging the interests of the individual user over any possible collective or community—a problem that this article returns to in later sections.

Given these practical limitations, the closest example of truly community-driven stewardship has been outlined by Annet Dekker with regard to Mouchette.org: an expansive net art project revolving around Mouchette, a fictional character after whom the site is named. This soon-to-be-13-year-old girl from Amsterdam uses the website as her personal diary, sometimes sharing her dark and suicidal tendencies. Mouchette.org spans many participatory and forum-based pages, maintaining an evident sense of community. In her book on collecting and conserving net art, Dekker (2018) argues:

A networked, community-driven conservation strategy is not unlikely to happen for mouchette.org. For instance, a situation presented itself on 23 July 2002. A few months after Neddam launched a quiz comparing characters from the film Mouchette with the website, she received a summons from Bresson’s widow to remove any reference to the film. Shortly afterwards, Neddam posted the letter on her website and through her e-mail lists. In response, several independent organizations took it upon themselves to mirror the project on other websites. (89)

One of these mirrors remains accessible to this day on Computer Fine Arts’ website collection of net art. This shows that, unlike in web archiving where one cannot sense the presence of other users, the community of users surrounding Mouchette.org can be felt throughout its forums and mailing lists—the same users who may, one day, become its caretakers. In this context, Dekker (2018) introduces the term network of care, which is “based on a transdisciplinary attitude and a combination of professionals and non-experts who manage or work on a shared project” (91).

Nonetheless, the concept of “network of care” leaves several questions unanswered, which collective conservation aims to address. In the past, Emma Waterton and Laurajane Smith have criticized uncritical approaches toward community in heritage, reminding us that “communities are run through with divergent interests . . . and a range of either motivating or disruptive energies” (Waterton and Smith 2010, 8). Their article serves as a reminder that heritage professionals tend to expect communities to behave in predictable and somewhat homogeneous ways. It is therefore essential that we ask ourselves who gains permission to enter these “networks of care” and what types of activities are expected to occur within them. By asking these questions, I propose that conservation activities be delegated not only to partner organizations, such as the Computer Fine Arts collection in the case of Mouchette.org, but also to an expanded, online community where identities are often anonymous and slip between both motivating and disruptive energies—energies which, I argue, manifest in the identities of the hacker and the spammer.

“Spammers are Leeches”: Getting Banned from Bram.org

This section details the situation that inspired this research, which occurred while I was trying to document user contributions from the Violences forum. Because the latter part of a URL indicates a path that leads to a resource hosted on the site’s server, it can be used to reverse engineer routes to resources that are not displayed on the site. This, at times accidental, redirection to covert website resources seems to have been more common in the early days of the Internet, when security concerns were not as widespread. Because the primary way of accessing sites used to be memorizing and typing out the URL by hand, many users would accidentally mistype and get redirected to unwanted pages. Some net artists would even replace the “404 not found” error with custom pages in a form of play with the user. Frédéric Madre, in his 1994 net artwork No Fun, redirects the user to a pornographic image with the caption “you are not supposed to be here,” playing on the stereotype of the nosy “hacker” who stumbles across information they are not meant to stumble upon. Inspired by Madre’s suggestive commentary, I embarked on a URL guessing game in the forum of Violences as an exercise in harmless “hacking.” Based on the names of other history log files, I was able to discover a self-updating text file (fig. 1) created by the artist that contains all user-generated data ever entered into the forum’s field, including my own contribution from January 21, 2021, where I entered “violence is misusing the URL?”

Fig. 1. Self-updating text file containing all user input for the Violences forum. Please note user input other than the author’s has been blurred for privacy.
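This kind of URL guessing game can be sketched as a small script that recombines naming patterns observed in visible log files into candidate paths. The sketch below is a hypothetical reconstruction: the stems, years, and extensions are invented rather than the actual filenames on Bram.org, and the network request that would probe each candidate is deliberately omitted.

```python
from itertools import product

def guess_log_urls(base, stems, years, exts):
    """Recombine naming patterns seen in known log files into
    candidate URLs for unlinked resources on the same server."""
    return [f"{base}/{stem}{year}.{ext}"
            for stem, year, ext in product(stems, years, exts)]

# All names below are invented for illustration; the real filenames
# on bram.org follow a different pattern.
candidates = guess_log_urls(
    "https://www.bram.org/violence",
    stems=["log", "histo"],
    years=["2006", "2021"],
    exts=["txt", "html"],
)
# In practice, each candidate would then be checked with an HTTP HEAD
# request (omitted here to keep the sketch offline).
```

Even this toy version makes the ethical stakes visible: systematically probing a server for unlinked resources is exactly the behavior that anti-abuse tooling is built to detect.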

Interestingly, I ended up getting banned from the web page by its automatic anti-spamming program. At some point, when I tried returning to the Violences forum, I kept being redirected to the site “monkeys.com/spammers-are-leeches” (fig. 2). In an email thread, Abrahams confirmed to me that in 2014 she commissioned a programmer to write an anti-spam script that recognizes repetitive entries in her forums and bans users based on their IP address. The ban was therefore issued because I had repeatedly submitted inputs to figure out how the forum reacts and to test the log file. In this way, my own conservation efforts became entangled with the undesirable identities of the hacker, who accesses information below the surface, and the spammer, who is often an anonymous actor that generates heaps of repetitive data before getting banned and never returning.

Fig. 2. Getting banned from Bram.org.
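Although I have not seen the commissioned script, a minimal sketch of this kind of heuristic might look as follows. The rule assumed here—ban an IP address after it submits the same text too many times, then redirect it on every later visit—is my own guess at the logic, and the threshold and the exact notion of “repetitive” are invented for the example.

```python
from collections import defaultdict

class ForumSpamFilter:
    """A guessed reconstruction of a repetitive-entry ban heuristic:
    an IP is banned once it submits identical text too many times."""

    def __init__(self, max_repeats=3):
        self.max_repeats = max_repeats
        self.history = defaultdict(list)  # ip -> list of past entries
        self.banned = set()

    def submit(self, ip, entry):
        if ip in self.banned:
            # Banned visitors are sent elsewhere, as happened to the author.
            return "redirect: monkeys.com/spammers-are-leeches"
        self.history[ip].append(entry)
        # "Repetitive" is read here as the same text submitted repeatedly.
        if self.history[ip].count(entry) >= self.max_repeats:
            self.banned.add(ip)
            return "banned"
        return "accepted"
```

Under this rule, a researcher re-entering similar test inputs to observe the forum’s behavior is indistinguishable from a spam bot, which is precisely how conservation activity collapses into the category of spamming.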

Peer-to-Peer: Theorizing the Contemporary Web

The preceding section suggests that activities which may be productive for conservation tend to reside within slippery categories with origins in the political fabric of the web. After all, I would not consider myself a hacker or a spammer, despite being placed into these categories by an automatic banning mechanism. This entanglement seems to suggest that a distributed form of conservation is, in part, a political agenda. For conservation to be truly “collective,” we have to “peer” it and understand the struggles of actors on the web through a post-Marxist lens.

Arguably, the term peer-to-peer has become synonymous with the decentralizing potential of the Internet, particularly in terms of its telecommunications infrastructure; however, the World Wide Web that sits on top of the Internet has turned that potential into an unattainable fiction. In peer-to-peer networks, peers hold equal privilege. The Internet, by allowing a peer to theoretically connect to any other device in the network, expresses this equipotency (Kleiner 2010, 14). However, Dmytri Kleiner reminds us that our quotidian experience of the Internet is hindered by the veneer of the World Wide Web, suggesting that the web’s client-server technology “[kills] off the decentralised potential of peer-to-peer technology” (Kleiner 2010, 15). The asymmetric condition of client-server applications concentrates all of a website’s resources into servers that serve content to the requesting client-consumer. Moreover, the participatory nature of Web 2.0 creates a myth of the dispersion of power. Although production tasks are partitioned between users, the publisher of the website still owns the content (Kleiner 2010, 15). The technology that enables this extraction of labor is the “client-server capitalist state,” profiting from content with a near-zero production cost (Kleiner 2010, 9). Given that the net artworks mentioned in this study—or, perhaps more accurately, web artworks—have emerged from this client-server technology, a truly peer-to-peer conservation strategy would be completely fictional. Therefore, to avoid reproducing this decentralizing myth, we ought to reappraise the meaning of the term peer-to-peer using an additional framework that acknowledges more recent tendencies such as cloud computing.

While Kleiner critiques the uneven distribution of power between the producers and consumers of content on the web, Tung-Hui Hu brings in an additional actor for problematizing the contemporary web. The emergence of cloud computing has further centralized the web by concentrating most of its services, files, and hardware infrastructures into the same monolithic, cloud-based providers, such as Amazon Web Services (Hu 2015, 7). This new generation of data centers indicates an increasing abstraction of users not only from their content, as Kleiner suggested, but also from their infrastructure. As a result of sharing these computational resources, it became very important for service providers to ensure that these environments are isolated. This was partly influenced by corporate politics, where competition meant that organizations did not want to share infrastructure with their competitors, who might find ways to leak their sensitive information (Portnoy 2012, 4). These concerns trickled into the realm of the “individual” user, who lives in the fantasy of personal computing, where all resources appear to be exclusively theirs and personalized. Even the concept of the username and password, which we now see as something intimate to be kept safely away from hackers, was initially introduced to track computer usage (Hu 2015, 46). This gave way to an interesting paradox: Although we, as users, share many computer resources through the cloud, including data pipes, servers, and applications, we cannot feel each other’s presence within this system. This mutually distrustful system in which we operate is somewhat absurd, as we trust companies with our confidential data more than we do our Internet neighbors—our peers.

The cloud’s soft form of control and its ideology of individuation have driven peers apart and fostered a collective fear of identities such as the spammer and the hacker, who threaten the security of the individual user. A peer-to-peer strategy would perhaps be able to break through this isolation by allowing users to feel each other’s presence, thereby facilitating forms of direct sharing and exchange. In a way, excluded identities such as the hacker and the spammer ought to be brought back as essential actors in the pursuit of peering the web and, in turn, acts of conserving net art.

By bringing in these actors, it is important to consider their political struggles. McKenzie Wark offers a post-Marxist reading of the hacker, who becomes part of a productive class that struggles against the ruling vectoral class, arguing that information became the new commodity whose exchange value is controlled by privatization: the creation of intellectual property. Karl Marx’s acclaimed analysis of societies with capitalist modes of production establishes the bourgeoisie as the ruling class who, using the worker class, transform commodities from their “simplest, almost imperceptible outline, [into their] dazzling money-form” (Marx 1887, 34). That is to say, a commodity transgresses its elementary expression of value to become defined by a wealthier exchange value (Marx 1887, 35). This transgression requires an abstraction, or miscalculation, of actual human labor. For Wark (2004), modes of exchanging information within contemporary digital economies engendered a novel “hacker class” (14). In this vein, the productive class expanded to virtually anyone in the online community who produces content, whereas the commodity has become dematerialized. It became information that the ruling, vectoral class profits from by turning it into “intellectual property” (Wark 2013, 91). Wark (2004) names the ruling class vectoral after the “vectors of communication” through which it reproduces and accumulates value for this new commodity (146).

An identification of the struggles of the “hacker class” and, by extension, the “spammer class”—as both have been excluded from contemporary economies of information exchange—allows us to consider the ethics of delegating acts of conserving net art to the broader Internet community. The primary struggle is based on the inability of hackers to collectivize their property (information), as the vectoral class is continually turning it into private property instead (Wark 2004, 92). The aim therefore becomes the “free availability of information rather than in an exclusive right” and, by extension, its “free circulation” (Wark 2004, 48). For conservation, this means collaborating with the expanded online community to collectively produce documentation of net art, which is then made publicly available in a common repository for further interrogation or aggregation of records. Every party becomes the so-called “hacker-spammer-conservator”: contributing information on the artwork by interacting unexpectedly with—or hacking—it and generating raw data out of which further hacking is possible.

Toward Collective Conservation

The notion of collective conservation feeds into a larger consideration of what aspects of net art require documentation. Documentation is incredibly important in fostering an artwork’s continued existence; however, its purpose has mainly been defined within institutional frameworks, where it is used to, for example, record basic intake information as part of the acquisition process (Engel and Phillips 2019, 192). Yet, the field has already started to recognize that these traditional notions and functions of “the document” have lost their relevance in works with “mutable ecologies” (Black 2021). In a documentation workshop hosted by LIMA in 2020, a question was raised as to whether audience-generated contributions—for example, the forums in Mouchette.org—are in themselves a form of documentation. In the Conserving Computer-Based Art initiative’s guidelines on analyzing source code, Deena Engel and Glenn Wharton suggest documenting the data sets that the artist used in testing their work (Engel and Wharton 2015, 98). Based on my encounter with Violences, perhaps we can consider expanding this recommendation to documenting user-generated data more broadly—the same data that I used in testing the behavior of the forums, which in turn relegated me to the status of a spammer.

By working collaboratively with artists and inscribing acts of ethical hacking and spamming into the practice of collective conservation, the community can work together to retrieve culturally valuable facets of net artworks that may otherwise be lost. These initiatives would be particularly useful for websites such as Bram.org and Mouchette.org, which are expansive and contain many broken links, causing pages or files not to be properly indexed and therefore retrievable only if the user has a direct URL pointing toward them. In the case of Mouchette.org, Neddam (2020) recognizes that she struggles to acknowledge and remedy these broken links. Sanctioning these acts of net art archaeology—manifested through hackathons and other crowdsourcing initiatives—will make producing community-driven documentation of the website far more manageable than granting every visitor access to resources stored on the server. Moreover, these acts do not necessarily have to be implemented by an institution. By fostering an anti-isolationist system of information exchange where a peer can feel another peer’s presence, a natural or instinctive formation of caretaker groups is embraced.

Conclusions

Collective conservation is a community-driven strategy through which we can reimagine the treatment of net artworks that contain large amounts of user-generated data. Such strategies are important because existing tools that involve the general public in conservation activities, such as web archiving, lack any expression of community. Departing from the ideas presented by Dekker in her concept of “networks of care,” this work aimed to start answering some of the questions regarding who gains permission to enter these caretaker groups and what activities are permitted to take place. The theoretical analysis presented in this article demonstrates that net art is not immune to the political fabric of the web; thus, the conservation of these artifacts ought to acknowledge the struggles of the online communities that practitioners are aiming to involve, so as to establish an ethical, peer-to-peer economy of information exchange. Based on my own experiences with archiving a work of net art, this research concludes that the identities of the hacker and spammer are closely entangled with that of the conservator, suggesting that the field should integrate these identities by reconsidering approaches to documentation.

ACKNOWLEDGMENTS

I would like to thank the Foundation for Advancement in Conservation/Samuel H. Kress Foundation for supporting my participation in the 50th Annual Meeting of the American Institute for Conservation in Los Angeles, California, where this article was initially presented. I am indebted to the support and thoughtful comments on early versions of this research from my undergraduate and (soon) PhD supervisor, Dr Hélia Marçal. Finally, I am thankful to the artist Annie Abrahams for being receptive to my research.

REFERENCES

Black, Patricia. 2021. “Essay: Can Mouchette Be Preserved as an Identity?” About Mouchette. https://about.mouchette.org/case-study-li-ma/.

Dekker, Annet. 2018. Collecting and Conserving Net Art. New York: Routledge.

Engel, Deena, and Glenn Wharton. 2015. “Source Code Analysis as Technical Art History.” Journal of the American Institute for Conservation 54 (2): 91–101. 

Engel, Deena, and Joanna Phillips. 2019. “Applying Conservation Ethics to the Examination and Treatment of Software- and Computer-Based Art.” Journal of the American Institute for Conservation 58 (3): 180–95. 

Espenschied, Dragan. 2019. “Introducing Webrecorder Pilot.” Rhizome, New York. https://blog.conifer.rhizome.org/2019/08/14/autopilot.html.

Espenschied, Dragan. 2020. “Introducing Conifer and the Future of Web Archiving at Rhizome.” Rhizome, New York. https://rhizome.org/editorial/2020/jun/11/introducing-conifer.

Hu, Tung-Hui. 2015. A Prehistory of the Cloud. Cambridge, MA: MIT Press.

Kelly, Mat, Michael L. Nelson, and Michele C. Weigle. 2018. “A Framework for Aggregating Private and Public Web Archives.” In JCDL ’18: Proceedings of the 18th ACM/IEEE Joint Conference on Digital Libraries, 273–82. New York: Association for Computing Machinery.

Kleiner, Dmytri. 2010. The Telekommunist Manifesto. Amsterdam: Institute of Network Cultures.

Kreymer, Ilya, and Morgan McKeehan. 2016. “Symmetrical Web Archiving with Webrecorder, a Browser-Based Tool for Digital Social Memory: An Interview with Ilya Kreymer.” The National Digital Stewardship Residency. https://ndsr.nycdigital.org/symmetrical-web-archiving-with-webrecorder-a-browser-based-tool-for-digital-social-memory-an-interview-with-ilya-kreymer/.

Marx, Karl. 1887. Capital: A Critique of Political Economy, edited by Frederick Engels, translated by Samuel Moore and Edward Aveling. London: Swan Sonnenschein, Lowrey & Co. 

Neddam, Martine, and LIMA. 2020. “Documentation Questions to Martine Neddam.” About Mouchette. http://about.mouchette.org/li-ma-questions/.

Portnoy, Matthew. 2012. Virtualization Essentials. Hoboken, NJ: Wiley.

Waterton, Emma, and Laurajane Smith. 2010. “The Recognition and Misrecognition of Community Heritage.” International Journal of Heritage Studies 16 (1): 4–15. 

Wark, McKenzie. 2004. A Hacker Manifesto. Cambridge, MA: Harvard University Press. 

Wark, McKenzie. 2013. “Considerations on a Hacker Manifesto.” In Digital Labor: The Internet as Playground and Factory, edited by Trebor Scholz. New York: Routledge. 69–76.

AUTHOR

Anna Mladentseva
MSc Digital Humanities Student
Department of Information Studies, University College London
London, England
anna.mladentseva.18@ucl.ac.uk