ECPN Interviews: Electronic Media Conservation with Brian Castriota

To promote awareness and a clearer understanding of different pathways into specializations that require particular training, the Emerging Conservation Professionals Network (ECPN) is conducting a series of interviews with conservation professionals in these specialties. We kicked off the series with Chinese and Japanese painting conservation, and now we are focusing on practitioners in AIC’s Electronic Media Group (EMG). These conservators work with time-based media, which can include moving components, performance, light or sound elements, film and video, and analog or born-digital materials. We’ve asked our interviewees to share some thoughts about their career paths, which we hope will inspire new conservation professionals and provide valuable insight into these areas of our professional field.

Previous posts in ECPN’s EMG blog series include interviews with Yasmin Dessem, Alex Nichols, and Nick Kaplan. In this installment we hear from Brian Castriota, a conservator specializing in time-based media and contemporary art. Brian holds a Master’s degree in Art History and a Certificate in Conservation from the Institute of Fine Arts at NYU, where he graduated in 2014. He worked as a contract conservator for time-based media artworks at the Smithsonian American Art Museum and was a Samuel H. Kress Fellow in Time-Based Media Conservation at the Solomon R. Guggenheim Museum in New York. He is currently a Research Fellow in the Conservation of Contemporary Art at the Scottish National Gallery of Modern Art and is pursuing a doctoral degree at the University of Glasgow within the research program “New Approaches in the Conservation of Contemporary Art” (NACCA) – a Marie Skłodowska-Curie Innovative Training Network.


Brian Castriota documenting the artist-modified turntables used in Susan Philipsz’s installation “Seven Tears” (2016) at the Scottish National Gallery of Modern Art. [Photo: B. Castriota]
ECPN: Please tell us a little bit about yourself.

Brian Castriota (BC): My name is Brian Castriota, I’m a conservator of time-based media and contemporary art. I’m currently working on a Ph.D. at the University of Glasgow within the EU-funded research initiative “New Approaches in the Conservation of Contemporary Art” (NACCA).

ECPN: How were you first introduced to conservation, and why did you decide to pursue conservation?

BC: Both of my parents are art historians, and my mother worked as a museum curator and director for many years, first at Duke University and later at Amherst College. I spent a lot of my childhood backstage in museum storage, around artworks and artifacts from all periods, which I think was probably a very formative experience for me. Something resonated with me in the kinds of interactions I observed conservators have with museum objects: their unique expertise about the material fabric and production history of these objects, as well as their profound sense of responsibility for ensuring those objects’ continuity.

Backing up and verifying the file integrity of audiovisual material on artist-supplied carriers in the collection of the Scottish National Gallery of Modern Art. [Photo: B. Castriota]
ECPN: Of all specializations, what contributed to your decision to pursue electronic media conservation?

BC: I think I have always felt a draw towards “obsolete” equipment, media, and technologies; I was an avid record collector in my adolescence, studied color darkroom photography in college, and I have a small collection of vintage analog synthesizers. I first became aware of electronic media conservation as a sub-specialism of art conservation after starting in the conservation master’s program at NYU’s Institute of Fine Arts. Christine Frohnert was of course a big inspiration for me – her enthusiasm and passion for time-based media conservation were absolutely contagious, and she really gave me the confidence to pursue this pathway and specialization. Joanna Phillips was also instrumental in providing me with the practical training to become a time-based media conservator during my fourth-year internship and subsequent fellowship at the Solomon R. Guggenheim Museum.

ECPN: What has been your training pathway? Please list any universities, apprenticeships, technical experience, and any related jobs or hobbies.

BC: My training pathway has by no means been straight and narrow. I concentrated in studio arts at Sarah Lawrence College, where I did my Bachelor’s degree. During my junior year abroad in Florence, Italy, I took a year-long course on painting conservation which confirmed my interest in pursuing master’s-level training in conservation. Upon returning to New York I interned in the Photographs Conservation department of the Met for a summer. After I graduated from Sarah Lawrence in 2009 I worked for a paper conservator in private practice for a year while I completed the rest of my lab science requirements for grad school. During my time at the IFA I specialized in the conservation of objects and archaeological materials. I took every opportunity to work on their affiliated excavations, including three consecutive summers with the Harvard-Cornell Archaeological Exploration of Sardis, and NYU’s excavations at Selinunte and Abydos.

After taking Christine Frohnert’s seminar “Art With a Plug” in my third year, I devoted my thesis research to examining the significance of CRT video projectors in Diana Thater’s early video installations. I then split my fourth-year internship between the Artefacts Conservation section of the National Galleries of Scotland and the Time-Based Media Conservation Lab at the Solomon R. Guggenheim Museum in New York. Upon graduating I was fortunate to work for a few months at the Smithsonian American Art Museum on their time-based media art collection, and afterwards returned to the Guggenheim for a Samuel H. Kress Fellowship in Time-Based Media Conservation before I started my Ph.D. at the University of Glasgow.

ECPN: Are there any particular skills that you feel are important or unique to your discipline?

BC: I find that time-based media art conservation requires a unique combination of skills: a sound knowledge of modern and contemporary art history and conservation theory, a sensitivity to contemporary artistic working practices, a broad technical knowledge of historic and current audiovisual technologies, a knack for interfacing with many groups of people with diverse skillsets and backgrounds, and an ability to think critically and reflectively.

One of seven synchronized record players used in Susan Philipsz’s installation “Seven Tears” (2016), installed at the Scottish National Gallery of Modern Art. [Photo: B. Castriota]
ECPN: What are some of your current projects, research, or interests?

BC: In my doctoral research I am taking a critical look at how contemporary conservation theory and practice grapple with works of art whose authenticity does not inhere in a fixed or finite physical assemblage, or even in a fixed set of rules, parameters, conditions, or properties. There are works whose creation continues after they are acquired by a museum, and works whose rules or conditions change over time or are seen as variable among stakeholders. This in turn raises questions about how the continuity of such a work’s authenticity can be ensured. I am developing a framework and language to characterize these phenomena and account for them in our practical workflows and protocols.

In conjunction with my doctoral research I am working part-time at the Scottish National Gallery of Modern Art, where I wear two hats. As a researcher, I’ve been examining some of the theoretical and practical challenges posed by particular artworks in and entering the collection. Right now I’m working on an exhibition that includes a number of Susan Philipsz’s complex sound installations involving custom equipment and wireless transmission, which are serving as case studies. I’ve also been lending my expertise as a time-based media conservator to help review collection care practices around the Gallery’s growing time-based media art collection. Following an initial collection survey and risk assessment, we have begun backing up and condition-assessing audiovisual material in the collection, as well as revising and expanding documentation records and acquisition protocols for time-based media artworks.
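Backing up and verifying file integrity in this way is typically built on checksums: compute a digest of the artist-supplied original, then confirm that every copy produces the same digest. A minimal sketch of that pattern in Python (the file names and folder layout are hypothetical illustrations, not the Gallery's actual workflow):

```python
import hashlib
from pathlib import Path

def sha256_checksum(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 checksum, reading in 1 MB chunks so that
    large video files never need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(original: Path, backup: Path) -> bool:
    """Confirm that a backed-up file is bit-identical to the
    artist-supplied original by comparing checksums."""
    return sha256_checksum(original) == sha256_checksum(backup)

if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    master = Path("artist_master.mov")
    copy = Path("backup/artist_master.mov")
    print("fixity OK" if verify_backup(master, copy) else "MISMATCH")
```

Stored alongside the media, checksums like these also allow periodic re-verification, so that silent corruption on a carrier can be detected long after the initial backup.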

ECPN: In your opinion, what is an important research area or need in your specialization?

BC: I think one that deserves a bit of focus is terminology. There are a lot of terms that we use in our field, not always with much consensus about what we mean: emulation, replica, copy, version, authenticity, fidelity, iteration, just to name a few. Some of these terms are borrowed from or have particular lineages within academic discourses in philosophy, ethnography, performance studies, or computer science. In some cases these terms may also have particular meanings in particular industries. These terms also have colloquial usage and connotations. And these are just the English terms. Our field is so international, and there are many terms in other languages that do not have direct translations in English. I have joked for a while that we need to have a “Term Focus” conference – perhaps there will be one on the horizon!

Receiver aerial mounted on the roof of the Scottish National Gallery of Modern Art, relaying Susan Philipsz’s wirelessly-transmitted sound installation “You Are Not Alone” (2009/2017). [Photo: B. Castriota]
ECPN: Do you have any advice for prospective emerging conservators who would like to pursue this specialization?

BC: Do it, because the need is certainly there. If you are pre-program, the Institute of Fine Arts has developed the first dedicated stream in time-based media conservation training in North America. Also be on the lookout for short Mellon-funded courses and workshops geared towards established conservators wishing to pursue greater specialization in time-based media. Attend digital archiving conferences and workshops, join the AMIA listserv, make use of online resources like Codecademy to learn some programming languages, and get a Raspberry Pi or a kit for building a little synth or a guitar pedal. The best way to understand the technical underpinnings of time-based and electronic media is to play around with some yourself. Make something!

44th Annual Meeting – Electronic Media Session – “Recovering the Eyebeam Collection following Superstorm Sandy” by Karen Van Malssen

This presentation highlighted the risks to important collections located outside of traditional museum or library environments. Eyebeam, a non-profit multimedia art space, was among the buildings inundated by flood waters in Manhattan’s West Chelsea neighborhood during Superstorm Sandy. Eyebeam is a collaborative workspace rather than a museum, but like many alternative arts spaces and contemporary art galleries without a formal permanent collection, it maintains a collection of work created by former fellowship recipients (something that looks a lot like a permanent collection).
Like many people on the East Coast who attempted to prepare for the storm, the art center’s staff had underestimated the magnitude of Sandy’s storm surge, since the storm had been downgraded from the lowest level of hurricane strength on the Saffir-Simpson Scale. The staff worked diligently to raise equipment off the floors and to cover furniture and equipment with plastic sheeting. Unfortunately, three feet of water flooded the interior of the building, causing the loss of 1,500 media items and $250,000 worth of equipment. The presenter showed a video demonstrating the extent of the damage to the media archive, which was contaminated with foul, polluted floodwater. Salvage primarily involved rinsing in clean water, but recovery required more than just the treatment process.

The presenter provided a convenient, numbered list of lessons learned:
Lesson 1. Know Your Context: Assess known risks and anticipate the worst-case scenario. Eyebeam was located near the water, but the staff had not anticipated catastrophic damage affecting the entire region.
Lesson 2. Maintain Contacts with Local Responders: Assembling a network of contacts in advance of a disaster greatly improves response time; plan a well-designed, scalable system for working with responders.
Lesson 3. Train ALL Staff for Recovery: You never know who will be available in an emergency; be prepared to break all procedures into simple steps for training. The two biggest risks during recovery were dissociation (separation of related parts, or of labels and other identifying markings) and mishandling (outside expertise in video preservation may be scarce).
Lesson 4. Label Everything: This makes it possible to reunite parts that were separated during recovery.
Lesson 5. Make Hard Decisions in Advance: Maintain records of collection salvage priorities so resources are not wasted on low-value materials.
Lesson 6. Know What Roles You Will Need: Do not allow people to multi-task; each person needs a clearly defined scope of responsibility.
Lesson 7. Keep Critical Supplies on Hand: Regional disasters cause shortages of supplies that would be plentiful at retail under normal circumstances.
Lesson 8. Adrenaline Wears Off: Schedule breaks from work, and assign someone to provide food, water, etc.
Lesson 9. Integrate Preparedness into Institutional Culture.
Lesson 10. Strive to Avoid Negative Press: Many anonymous critics on social media complained that Eyebeam should not have maintained an archive of analog videos or hard copies of digital content, and that all of the content should have been duplicated on a cloud server unaffected by the storm.
Since the disaster recovery, Eyebeam has relocated to Brooklyn.

43rd Annual Meeting – Electronic Media Session, May 16, "Tackling obsolescence through virtualization: facing challenges and finding potentials” by Patricia Falcao, Annet Dekker, and Pip Laurenson

The presenters began by explaining that they had changed the title to reflect the emphasis of the presentation. The new title became "An exploration of significance and dependency in the conservation of software-based artwork."

Based upon their research, the presenters decided to focus on dependencies rather than obsolescence per se. The project was related to PERICLES, a pan-European risk assessment project for preserving digital content. PERICLES was a four-year collaboration that included systems engineers and other specialists, modeling systems to predict change.

The presenters used two case studies from the Tate to examine the key concepts of dependencies and significant properties. Significant properties were described as values defined by the artist. A dependency is a connection between different elements in a system, defined by the function of those elements, such as the speed of a processor. The research focused on works of art in which software is the essential component. The presenters explained that there are four categories of software-based artwork: contained, networked, user-dependent, and generative. The featured case studies were examples of contained and networked artworks. These categories were defined not only in terms of behavior, but also in terms of dependencies.

Michael Craig-Martin's Becoming was a contained artwork. Its changing composition consists of animations of the artist’s drawings, displayed on an LCD screen using proprietary software. Playback speed is an example of a significant property that could inadvertently change if, for example, the hardware were changed in the future.

Jose Carlos Martinat Mendoza's Brutalism: Stereo Reality Environment 3 was the second case study discussed by the presenters. This work of art is organized around a visual pun, evoking the Brutalist architecture of the Peruvian “Pentagonito,” a government Ministry of Defense office associated with the human rights abuses of a brutal regime. Both the overall physical form of the installation, when viewed merely as sculpture, and the photographic image of the original structure reinforce the architectural message. A printer integrated into the installation conveys textual messages gleaned from internet searches on brutality. While the networked connection permitted a degree of randomness and spontaneity in the information flowing from the printer, there was a backup MySQL database to provide content in the event of an interruption in the internet connection.
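The redundancy described here follows a common fallback pattern: try the live network source first, and fall back to locally cached content if the connection fails. A minimal sketch of that logic in Python (the URL, the schema, and the use of SQLite in place of the work's MySQL database are illustrative assumptions, not Martinat's actual code):

```python
import sqlite3
import urllib.request

def build_fallback_db() -> sqlite3.Connection:
    """Stand-in for the work's backup MySQL database of phrases;
    SQLite keeps this sketch self-contained."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE phrases (text TEXT)")
    conn.executemany("INSERT INTO phrases VALUES (?)",
                     [("cached search result one",),
                      ("cached search result two",)])
    return conn

def fetch_live_phrase(url: str) -> str:
    """Pull a fresh text fragment from the network; raises OSError
    (or a subclass) if the connection is down."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return resp.read(200).decode("utf-8", errors="replace")

def next_phrase_to_print(conn: sqlite3.Connection, url: str) -> str:
    """Prefer live internet content; fall back to the local database
    if the connection is interrupted, as described in the talk."""
    try:
        return fetch_live_phrase(url)
    except OSError:
        row = conn.execute(
            "SELECT text FROM phrases ORDER BY RANDOM() LIMIT 1"
        ).fetchone()
        return row[0]
```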

The presenters emphasized that the dependencies of software-based art are built around aesthetic considerations of function. A diagram was used to illustrate the connections among artwork-level dependencies: with "artwork" in the center, three spokes radiated outward toward knowledge, interface, and computation. An example of knowledge might be a password required for administrative rights to access or modify the work. A joystick or a game controller would be an example of an interface; in Brutalism, the printer is an interface. Computation refers to the capacity and processor speed of the computer itself.
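One way to make such a diagram actionable is to record each artwork's dependencies as structured data under the diagram's three spokes. A hypothetical sketch in Python (the entries shown are illustrative, not the Tate's actual records):

```python
# Artwork-level dependencies grouped under the diagram's three spokes.
# All entries are illustrative examples, not actual Tate records.
brutalism_dependencies = {
    "knowledge": [
        "administrator password needed to access or modify the work",
    ],
    "interface": [
        "integrated printer that outputs the search-result texts",
    ],
    "computation": [
        "processor capacity and speed of the host computer",
        "MySQL database as offline fallback for search content",
    ],
}

# Print a simple dependency report for documentation purposes.
for category, items in brutalism_dependencies.items():
    print(f"{category}:")
    for item in items:
        print(f"  - {item}")
```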

Virtualization has been offered as an approach to preserving these essential relationships. It separates the software from the hardware, creating a single file out of many. It can act as a diagnostic tool and as a preservation strategy that mitigates hardware failure. The drawbacks are that it can mean copying unnecessary or undesirable files, and that the virtual machine (and the x86 virtualization architecture) could itself become obsolete. Another concern is that virtualization may not capture all of the significant properties that give the artwork its unique character. A major advantage of virtualization is that it permits the testing of dependencies such as processor speed. It also facilitates version control and comparison of different versions. The presenters did not really explain the difference between emulation and virtualization, perhaps assuming that the audience already knew it: emulation uses software to replicate the original hardware environment in order to run different operating systems, whereas virtualization runs guest operating systems directly on the existing underlying hardware. The extra hardware-emulation step decreases performance.

The presenters then explained the process used at the Tate. They create a copy of the hardware and software, and a copy is kept on the Tate servers. Collections are maintained in a High Value Digital Asset Repository. The presenters also described the relationship of the artist's installation requirements to the dependencies and significant properties. For example, Becoming requires a monitor with a clean black frame of specific dimensions and aspect ratio. The software controls the timing and speed of image rotation and the randomness of image changes, as well as traditional artistic elements of color and scale. With Brutalism, the language (Spanish to English) is another essential factor, along with the "liveness" of the search.

During the question and answer period, the presenters explained that they were using VMware because it was practical and readily available. An audience member asked an interesting question about the limitations of virtualization for the GPU (graphics processing unit). The current methodology at the Tate works for the CPU (central processing unit) only, not the graphics unit. The presenters indicated that they anticipated future support for the GPU.

This presentation emphasized the importance of the curatorship of significant properties and the documentation of dependencies in conserving software-based art. It was important to understand the artist's intent and to capture the essence of the artwork as it was meant to be presented, while recognizing that the artist’s hardware, operating system, applications, and hardware drivers could all become obsolete. It was clear from the presentation that a few unanswered questions remain, but virtualization appears to be a viable preservation strategy.

41st Annual Meeting – Electronic Media Session, May 31, "Technical Documentation of Source Code at the Museum of Modern Art" by Deena Engel and Glenn Wharton

Glenn Wharton began with an overview of the conservation of electronic media at the Museum of Modern Art (MoMA). When he set up the Media Conservation program at MoMA in 2005, there were over 2,000 media objects, mostly analog video, and only 20 software objects. The main focus of the program was digitizing analog video and audio tapes. Wharton was a strong advocate for the involvement of IT experts from the very beginning of the process. Over time, they developed a working group representing all 7 curatorial departments, collaborating with IT and artists to assess, document, and manage electronic media collections.
Wharton described the risk assessment approach that MoMA has developed for stewardship of its collections, which includes evaluation of software dependency and operating system dependency for digital objects. They have increased the involvement of technical experts, and they have collaborated with Howard Besser and moving image archivists.
The presenters chose to focus on project design and objectives; they plan to publish their findings in the near future. Glenn Wharton described the three case-study artworks: Thinking Machine 4, Shadow Monsters, and 33 Questions per Minute. He explained how he collaborated with NYU computer science professor Deena Engel to enlist a group of undergraduate students to conduct basic research into source code documentation. Thinking Machine 4 and Shadow Monsters were both written in Processing, an open-source programming language based on Java. 33 Questions per Minute, on the other hand, was written in Delphi, a language derived from Pascal; Delphi is not very popular in the US, so the students were challenged to learn an unfamiliar language.
Engel explained that source code can be understood by anyone who knows the language, just as one might read and comprehend a foreign language. She discussed the need for software maintenance, which is common across industries and not unique to software-based art projects. Software maintenance is needed when the hardware is altered, the operating system is changed, or the programming language is updated. She also explained four types of code documentation: annotation (comments) in the source code, narratives, visuals, and Unified Modeling Language (UML) diagrams.
Engel discussed the ways that source code affects the output or the user experience, and the need to capture the essential elements of presentation, a concern unique to artistic software. In 33 Questions per Minute, the system configuration includes a language setting with options for English, German, or Spanish. Some functions were operating-system-specific, such as the Mac-Unix scripts that allow the interactive artwork Shadow Monsters to reboot if overloaded by a rambunctious school group flooding the gallery with lots of moving shadows. Source code specified aesthetic components such as color, speed, and randomization for all of the case study artworks.
One interesting discovery was the amount of code that was “commented out.” Similar to studies, underdrawings, or early states of a print, these were areas of code that had been deactivated without being deleted, and they could be examined as evidence of the artist’s working methods.
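To picture what the students were examining, here is a purely hypothetical fragment, written in Python rather than Processing or Delphi, showing aesthetic parameters set directly in source code alongside a deactivated earlier variant; it is not code from any of the case-study works:

```python
import random

# Aesthetic parameters specified directly in source code
# (illustrative values, not from any of the case-study works).
PALETTE = ["#1a1a1a", "#e63946", "#f1faee"]  # color
FRAME_DELAY_MS = 40                          # speed
JITTER_MS = 10                               # randomization

def next_frame_delay() -> int:
    """Delay before the next frame: a fixed speed plus a small
    random jitter so the motion never feels mechanical."""
    return FRAME_DELAY_MS + random.randint(-JITTER_MS, JITTER_MS)

def next_color() -> str:
    """Choose the next display color at random from the palette."""
    return random.choice(PALETTE)

# An earlier version, deactivated but not deleted: the kind of
# "commented out" passage the students treated as evidence of the
# artist's working process, like an underdrawing.
# def next_color(frame_count: int) -> str:
#     return PALETTE[frame_count % len(PALETTE)]
```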
Engel concluded by mentioning that the field of reproducibility in scientific research is also involved with documenting and preserving source code, in order to replicate data-heavy scientific experiments. Of course, researchers in that field are more concerned with handling very large data sets, while museums are more concerned with replicating the look and feel of the user experience. Source code documentation will be one more tool to inform conservation decisions, complementing the artist interview and other documentation of software-based art.
Audience members asked several questions regarding intellectual property issues, especially where artists use proprietary software rather than open-source software. Questions were also raised about artists who are reluctant to share code. Glenn Wharton explained that MoMA tries to acquire code at the same time that the artwork is acquired. The museum can offer the option of a sort of embargo or source code “escrow,” whereby the code would be preserved but not accessed until some point in the future.