44th Annual Meeting- Electronic Media Session- Recovering the Eyebeam Collection following Superstorm Sandy- by Karen Van Malssen

This presentation highlighted the risks to important collections located outside of traditional museum or library environments. Eyebeam, a non-profit multimedia art space, was among the buildings inundated by flood waters in Manhattan’s West Chelsea neighborhood during Superstorm Sandy. Eyebeam is a collaborative workspace rather than a museum with a “permanent collection,” but like many alternative arts spaces and contemporary art galleries, it maintains a collection of work created by former fellowship recipients (something that looks a lot like a permanent collection).
Like many people on the East Coast who attempted to prepare for the storm, the art center’s staff had underestimated the magnitude of Sandy’s storm surge, since the storm had been downgraded from the lowest level of hurricane strength on the Saffir-Simpson Scale. The staff members had worked diligently to raise equipment off of the floors and to cover furniture and equipment with plastic sheeting. Unfortunately, three feet of water flooded the interior of the building, causing the loss of 1,500 media items and $250,000 worth of equipment. The presenter showed a video demonstrating the extent of damage to the media archive, contaminated with foul, polluted floodwater. Treatment primarily involved rinsing in clean water, but recovery required more than just the treatment process.

The presenter provided a convenient, numbered list of lessons learned:
Lesson 1. Know Your Context: Assess known risks and anticipate the worst-case scenario. Eyebeam was located near the water, but the staff members had not anticipated catastrophic damage affecting the entire region.
Lesson 2. Maintain Contacts with Local Responders: Assembling a network of contacts in advance of the disaster will greatly improve response time; plan a well-designed, scalable system for working with responders.
Lesson 3. Train ALL Staff for Recovery: You never know who will be available in an emergency; be prepared to break all procedures into simple steps for training. The two biggest risks during recovery were dissociation (separation of related parts or separation of labels and other identifying markings) and mishandling (outside expertise in video preservation may be scarce).
Lesson 4. Label Everything: This makes it possible to reunite parts that were separated during recovery.
Lesson 5. Make Hard Decisions in Advance: Maintain records of collection salvage priorities, so resources will not be wasted on low-value materials.
Lesson 6. Know What Roles You Will Need: Do not allow people to multi-task; each person needs a clearly defined scope of responsibility.
Lesson 7. Keep Critical Supplies on Hand: Regional disasters cause shortages of supplies that might be plentiful at retail under normal circumstances.
Lesson 8. Adrenaline Wears off: Schedule breaks from work, and assign someone to provide food, water, etc.
Lesson 9. Integrate Preparedness into Institutional Culture
Lesson 10. Strive to Avoid Negative Press: Many anonymous critics on social media complained that Eyebeam should not have maintained an archive of analog videos or hard copies of digital content, that all of the content should have been duplicated on some cloud server not affected by the storm.
Since the disaster recovery, Eyebeam has relocated to Brooklyn.

44th Annual Meeting- Joint Photographic Materials + Research and Technical Studies Session- Surface Roughness, Appearance and Identification of AGFA-Gevaert Photograph Samples- by Dr. W. Wei and Sanneke Stigter

Having encountered some very bizarre textures in matte Gevaluxe prints during a National Portrait Gallery internship several years ago, I was eager to learn more about the characterization of these interesting papers. The popular Gevaluxe papers (made by Belgian company Gevaert) often had a velvety matte appearance that was desired by many mid-twentieth century photographers.
This project was inspired by a concern that the increasing reliance of museums on digital surrogates for original photographs might not capture all of the original properties of the photograph. Even where a traditional silver-gelatin or chromogenic photograph has been used as a surrogate, the textured surface of the replacement paper might not match the original. The work Hoe Hoeker Hoe Platter by Dutch artist Ger Van Elk was used as an example of a mixed media photographic work where texture played an important role in conservation decisions. Texture can influence the perception of color, so it was important to characterize the essential properties of the paper’s texture.
Paul Messier’s research was considered an important first step, but Bill Wei’s research team in the Netherlands sought to leverage some of the technology from other industries where surface texture and roughness are systematically quantified (such as the auto industry). First, Wei gave an overview of some of the techniques employed in texture measurement: polynomial texture mapping and confocal white light profilometry. In this project, confocal white light profilometry was used to create a non-contact contour map with a resolution of 60 nanometers. Gloss measurements were also used; on a matte surface the difference between incident and reflected light is the light scattered, so the glossiness (or lack thereof) can be quantified.
The study compared human perception with quantitative texture measurements, asking observers to judge the apparent roughness or smoothness of textured papers. An Agfa-Gevaert sample book from the 1970s served as the source material. Only three of the samples were color papers, so they were more difficult to evaluate. The 25 samples were categorized into 5 groups. Some of the groups had a “macro” texture of waviness, versus a “micro” texture of roughness on a much smaller scale. Group 1 was smooth. Group 2 papers had a very fine texture. Papers assigned to Group 3 displayed the fine texturing of the Group 2 papers, combined with a large-scale waviness. Group 4 exhibited the waviness of Group 3, without the fine texture. Group 5, which included some of the color papers, consisted of a very regular pattern of raised circular nubs or dots. For anyone who has a lot of family photos from the 1970s, that dot texture will seem quite familiar.
The research is ongoing, so the presenter mentioned some preliminary observations without drawing any conclusions. There was not a direct relationship between roughness and gloss. For example, samples from Groups 4 and 5 were just behind Group 1 in gloss. The human observers demonstrated that their perceptions of smoothness did not always correlate with the quantitative measurements, especially for some papers in Group 2. It will be interesting to hear the follow-up results as the research team continues the project.

43rd Annual Meeting – Electronic Media Session, May 16, "Tackling obsolescence through virtualization: facing challenges and finding potentials” by Patricia Falcao, Annet Dekker, and Pip Laurenson

The presenters began by explaining that they had changed the title to reflect the emphasis of the presentation. The new title became "An exploration of significance and dependency in the conservation of software-based artwork."

Based upon their research, the presenters decided to focus on dependencies rather than obsolescence per se. The project was related to PERICLES, a pan-European risk assessment project for preserving digital content. PERICLES was a four-year collaboration that included systems engineers and other specialists, modeling systems to predict change.

The presenters used two case studies from the Tate to examine key concepts of dependencies and significant properties. Significant properties were described as values defined by the artist. Dependency is the connection between different elements in a system, defined by the function of those elements, such as the speed of a processor. The research focused on works of art where software is the essential part of the art. The presenters explained that there were four categories of software-based artwork: contained, networked, user-dependent, and generative. The featured case studies were examples of contained and networked artworks. These categories were defined not only in terms of behavior, but also in terms of dependencies.

Michael Craig-Martin's Becoming was a contained artwork. Its changing composition consists of animations of the artist’s drawings, displayed on an LCD screen using proprietary software. Playback speed is an example of an essential property that could be changed by a future change in hardware, for example.

Jose Carlos Martinat Mendoza's Brutalism: Stereo Reality Environment 3 was the second case study discussed by the presenters. This work of art is organized around a visual pun, evoking the Brutalist architecture of the Peruvian “Pentagonito,” a government Ministry of Defense office associated with the human rights abuses of a brutal regime. Both the overall physical form of the installation, when viewed merely as sculpture, and the photographic image of the original structure reinforce the architectural message. A printer integrated into the exhibit conveys textual messages gleaned from internet searches on brutality. While the networked connection permitted a degree of randomness and spontaneity in the information flowing from the printer, there was a backup MySQL database to provide content in the event of an interruption in the internet connection.

The presenters emphasized that the dependencies for software-based art were built around aesthetic considerations of function. A diagram was used to illustrate the connection between artwork-level dependencies. With "artwork" in the center, three spokes radiated outward toward knowledge, interface, and computation. An example of knowledge might be the use of a password to have administrative rights to access or modify the work. A joystick or a game controller would be examples of interfaces. In Brutalism, the printer is an interface. Computation refers to the capacity and processor speed of the computer itself.

Virtualization has been offered as an approach to preserving these essential relationships. It separates hardware from software, creating a single file out of many. It can act as a diagnostic tool and as a preservation strategy that mitigates hardware failure. The drawbacks are that it could mean copying unnecessary or undesirable files, and that the virtual machine (and the x86 virtualization architecture) could itself become obsolete. Another concern is that virtualization may not capture all of the significant properties that give the artwork its unique character. A major advantage of virtualization is that it permits the testing of dependencies such as processor speed. It also facilitates version control and comparison of different versions. The authors did not really explain the difference between emulation and virtualization, perhaps assuming that the audience already knew the difference. Emulation uses software to replicate the original hardware environment in order to run different operating systems, whereas virtualization uses the existing underlying hardware to run different operating systems. The hardware emulation step decreases performance.

The presenters then explained the process that is used at the Tate. They create a copy of the hardware and software. A copy is kept on the Tate servers. Collections are maintained in a High Value Digital Asset Repository. The presenters also described the relationship of the artist's installation requirements to the dependencies and significant properties. For example, Becoming requires a monitor with a clean black frame of specific dimensions and aspect ratio. The software controls the timing and speed of image rotation and the randomness of image changes, as well as traditional artistic elements of color and scale. With Brutalism, the language (Spanish to English) is another essential factor, along with the "liveness" of the searches.

During the question and answer period, the presenters explained that they were using VMware because it was practical and readily available. An audience member asked an interesting question about the limitations of virtualization for the GPU (graphics processing unit). The current methodology at the Tate works for the CPU (central processing unit) only, not the graphics unit. The presenters indicated that they anticipated future support for the GPU.

This presentation emphasized the importance of curatorship of significant properties and documentation of dependencies in conserving software-based art. It was important to understand the artist's intent and to capture the essence of the artwork as it was meant to be presented, while recognizing that the artist’s hardware, operating system, applications, and hardware drivers could all become obsolete. It was clear from the presentation that a few unanswered questions remain, but virtualization appears to be a viable preservation strategy.

43rd Annual Meeting-General Session, May 15, 2015, "Lighten Up: Enhancing Visitor Experience," by Linda Edquist and Sarah Stauderman

Postal Museum Paper Conservator Linda Edquist was unable to attend the conference, so Sarah Stauderman presented in her place. Sarah began by describing the practice of philately and placing it within the context of the recent 18,000 square foot expansion of the National Postal Museum. A collective cringe radiated through the audience like the “wave” in a football stadium, when Sarah revealed that a key component of the building program was the plan to expose a large bank of southwest-facing exterior windows over the new exhibit space. Fortunately, the museum was able to use a variety of active and passive approaches to control light in the galleries.
First, there were translucent window films printed with large images of famous stamps. These required approval by the local architectural review board, since they were not in keeping with the period of the historic building. The stamp windows added an interpretive element, while reducing the ambient light level in the sunlit galleries.
Motion detectors were used to activate LED lights in the “GEMS” gallery, which houses the “inverted Jenny” and other famous or infamous stamps. The ambient light levels were kept low, while “Why is this room so dark?” interpretive signage allowed the museum to provide preservation outreach within the gallery.
The Inverted Jenny stamp
A variety of interactive cases and open storage designs used a somewhat low-tech approach to reducing the light exposure of these works on paper. There was a series of pull-out frames filling the walls of what appeared to be a print reading room with the somewhat grandiose title of “National Stamp Salon.” A similar type of open storage housing was used in the Smithsonian Arts and Industries building in the 19th century. An updated version was manufactured by Goppion to meet current museum conservation and security standards in the Stamp Salon.
There were also cases with interactive lift-up doors that created an intimate viewing experience for each visitor. Horizontal pull-out cases were essentially glazed drawers set into exhibit cases. Visitor engagement was enhanced by the act of lifting and pulling to reveal the collection, a side benefit of the museum’s light-protection system. Magnetic switches permitted case lights to turn off when drawers were closed. The light switches in the lift-up cases were not always reliable, so the museum may try to redesign the lighting for these cases.

Lift-up doors (circled in red) in the Mail Marks History Exhibit

Collections staff members have been meeting monthly to clean the cases and to assess the security and mechanical stability of all of these moving cases, yet they have continued to rely on some stationary case designs. To avoid the physical stress of constant movement, the museum sought a passive solution for reducing light levels in exhibits of the most fragile paper documents. In the months following 9-11, letters contaminated with anthrax had been treated with chlorine dioxide gas, making the paper more vulnerable to light. The museum selected VariGuard SmartGlass for the exhibit vitrine, blocking more than 99% of ambient light without moving parts. The glass is a laminate that can switch from opaque to transparent when an electrical current is applied. The National Postal Museum’s blog provides more information about the technology behind this interesting product, along with photos of the anthrax letters on exhibit.
Anyone who deals with works on paper or other light-sensitive collections would be likely to see some ideas to steal from this presentation. There were a wide variety of approaches, suitable for documents and works of art on paper in different formats and states of condition. Balancing the needs of the visitors to see the exhibits with the preservation of the collection can be very challenging. Linda Edquist and her colleagues at the National Postal Museum have provided a great set of models for the rest of us.

43rd Annual Meeting-Book and Paper Session, May 15, 2015, "16-17th Century Italian Chiaroscuro Woodcuts: Instrumental Analysis, Degradation and Conservation" by Linda Stiber Morenus, Charlotte Eng, Naoko Takahatake, and Diana Rambaldi

The presenter, Linda Stiber Morenus, began her discussion of these complex prints with a description of the printing process. Chiaroscuro woodcuts were intended to emulate chiaroscuro drawings, which were comprised of black chalk shadows and white chalk highlights on colored paper. Colored oil-based printing inks were first used to print textiles in the 14th century and were in use on paper by the mid-15th century. The chiaroscuro woodblock prints required two to five separate woodblocks, inked with shades lighter and darker than the midtone colored paper.
In order to better characterize the media, Morenus collaborated with art historian Takahatake and conservation scientists Eng and Rambaldi from the Los Angeles County Museum of Art (LACMA). In addition to prints at LACMA, the team studied prints from the British Museum and the Library of Congress. Out of over 2000 surveyed woodcuts, 72 were studied in depth, with X-ray Fluorescence (XRF), Fiber Optic Reflectance Spectroscopy (FORS), and Raman spectroscopy. Inorganic compounds were indicated by XRF analysis. FORS was especially helpful for detection of indigo. Raman spectroscopy provided additional information about organic colorants.
Renaissance artists’ manuals, such as Cennino Cennini’s Libro dell’Arte guided the research by providing information on the most likely colorants for printing inks. Inorganic pigments included lamp black, lead white, ochres, vermillion, verdigris, and orpiment. Organic pigments included indigo and a variety of lake pigments.
After providing background information, the presenter began to focus on deterioration and conservation of the chiaroscuro prints. The prints from the Niccolo Vicentino workshop had a high lead content. The inks typically had a low vehicle-to-pigment ratio, tending to turn gray around the edges, due to the presence of lead sulphide. Verdigris corrosion was also a common problem, as found on “Christ Healing the Paralytic Man” by Giuseppe Niccolo Vicentino, as well as 13 other prints from the same workshop. Typical copper-induced paper degradation included yellow-brown halos around inked areas and cracks in the paper.
Fading and discoloration were major problems for the organic colorants, such as indigo and the yellow lakes. Morenus compared copies of Ugo da Carpi’s “Sybil Reading a Book” in the British Museum and the Library of Congress, finding clear evidence that the indigo in the British copy had faded. The British Museum had confirmed the presence of indigo through Raman spectroscopy. At least 8 of the prints were found through XRF to have high levels of calcium in the same areas where indigo had been identified, suggesting the presence of chalk-based lakes. Organic greens had shifted to blue or brown where organic yellows had faded or become discolored.
The presenter concluded with suggestions and caveats for conservation treatment. First, she advised conservators to exercise caution in aqueous treatment, in order to preserve the topography of the prints. The woodblock creates a relief impression in the paper, and the layering of the inks adds another level of texture that might be altered by humidification, flattening, washing, or lining treatments. The low binder content also makes the inks more vulnerable to saponification and loss during alkaline water washing. Morenus warned that the hydrogen peroxide color reversion treatment for darkened lead white would be particularly risky, because the white lead sulphate end product has a lower refractive index than the original basic lead carbonate pigment. This means that treated lead white becomes more translucent, and the lower “hiding power” shifts the tonal balance of the print to appear darker overall.
For exhibit recommendations, Morenus suggested that we should always expect to find fugitive organic colorants in chiaroscuro prints, so exhibit rotations should be planned accordingly. Maximum exhibit conditions should be 5 foot-candles (50 lux) of visible light for 12 weeks of exposure, no more often than every three years. She also indicated that overmatting should be avoided to reduce the risk of differential discoloration.
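The exposure recommendation above implies a simple cumulative light-dose budget. The sketch below works through the arithmetic; the gallery schedule (8 hours per day, 7 days per week) is a hypothetical assumption for illustration, not a figure from the talk.

```python
# Light-dose budget implied by the exhibit recommendation:
# 50 lux maximum, 12-week rotations, at least 3 years apart.
LUX = 50                 # 5 foot-candles is roughly 50 lux
HOURS_PER_DAY = 8        # assumed gallery hours (hypothetical)
DAYS_PER_WEEK = 7        # assumed open every day (hypothetical)
EXHIBIT_WEEKS = 12       # maximum exposure per rotation
YEARS_BETWEEN = 3        # minimum interval between rotations

# Total dose accumulated during one rotation, in lux-hours
dose_per_rotation = LUX * HOURS_PER_DAY * DAYS_PER_WEEK * EXHIBIT_WEEKS

# Averaged over the rest period, the effective annual dose
annualized_dose = dose_per_rotation / YEARS_BETWEEN

print(dose_per_rotation)  # 33600 lux-hours per rotation
print(annualized_dose)    # 11200.0 lux-hours per year, averaged
```

Under these assumed hours, one rotation delivers about 33,600 lux-hours, averaging out to roughly 11,200 lux-hours per year over the three-year cycle.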
During the Question and Answer period, Morenus clarified the color order used in printing. Some prints were inked from dark to light, but most were printed with the lightest color first.
I thoroughly enjoyed learning about these beautiful prints, but I think that the discussion of the lead white conversion treatment-induced refractive index shift was the most important “take-away” from the presentation.

42nd Annual Meeting- Electronic Media Session, May 31, 2014, "The California Audiovisual Preservation Project: A Statewide Collaborative Model to Preserve the State’s Documentary Heritage by Pamela Jean Vadakan"

The California Light and Sound Collection is the product of a collaboration between 75 partner institutions holding original audiovisual recordings documenting California. A 2007 statewide collection survey using the University of California’s CALIPR sampling tool revealed that over 1 million recordings were in need of preservation. In 2010, the California Audiovisual Preservation Project (CAVPP) was founded. With a National Leadership Grant from the Institute of Museum and Library Services (IMLS), CAVPP uploaded its first video in 2011.
Like previous statewide initiatives within the California Preservation Program, the project is based at the University of California at Berkeley, where Barclay Ogden provides leadership for the project. By repurposing existing staff and existing tools, the project is able to realize a high level of efficiency. Each partner institution is responsible for surveying its own collection with CALIPR, adding its own records to CONTENTdm, and sending its own recordings with metadata to CAVPP. It is anticipated that the open-source tool Omeka will replace CONTENTdm, because the project partners should not be dependent upon costly proprietary software site licenses.
CAVPP adds administrative metadata, confirms the descriptive metadata, and sends content to the vendors. The vendors include MediaPreserve, Scenesavers, and Bay Area Video Coalition. The vendors produce a preservation file, a mezzanine file, and an access file for each item. Moving forward the project will discontinue creating the mezzanine file, because the preservation file is more useful. Two copies of each file are saved to Linear-tape-open (LTO) and one on the Internet Archive’s servers. Storage costs are about five dollars per recording. CAVPP is also responsible for running checksums and checking video quality. Problems have included out of sync audio, shifts in hue or saturation (chrominance), and shifts in value (luminance). The AV Artifact Atlas has proven essential to the quality control process.
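The fixity-checking step described above can be sketched as a small script: compute a checksum when a preservation file is created, store it, and recompute it later to confirm the file is unchanged. This is a generic illustration of the technique; the digest algorithm and function names are assumptions, not details from the presentation.

```python
import hashlib


def file_checksum(path, algorithm="md5", chunk_size=1 << 20):
    """Compute a fixity checksum by streaming the file in chunks,
    so even large preservation video files never load fully into memory."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_fixity(path, expected, algorithm="md5"):
    """Return True if the stored checksum still matches the file on disk."""
    return file_checksum(path, algorithm) == expected
```

In a workflow like CAVPP's, the checksum recorded at ingest would be re-verified against each stored copy (LTO and server) on a regular schedule.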
It is crucial for a project this large to have a clear scope in terms of both content and format. The criteria for selection include statewide or local significance, unpublished or original source material, and public domain content. The project also encompasses content for which rights have already been acquired. In some cases, “unknown” has been used as a placeholder for missing copyright information. The materials are also subject to triage in terms of the original physical format and condition (preservation need). The project is limited to digital conversion. Film-to-film conversion is outside the scope of the project, but it is hoped that project partners can leverage this project to facilitate projects for high-definition video and film-to-film conversion.
The project has already exceeded its original goals. In the first year, CAVPP uploaded 50 recordings. Now the project has grown to 75 institutions and over 1,400 recordings. It is anticipated that there will be over 3,000 recordings by the end of 2014. Future steps include an assessment of who is using the collection and how they are using it. The project also includes outreach workshops scheduled for project partners in 2014 and 2015.

42nd Annual Meeting- Book and Paper Session, May 29, 2014, "The impact of digitization on conservation activities at the Wellcome Library by Gillian Boal"

The Wellcome Library relies on cross-training and written policies to facilitate the increased involvement of non-conservators in the digitization workflow. Gillian Boal explained that the Wellcome Library, the UK’s largest medical library at over 4 million volumes and the public face of one of the world’s largest private charities, aims to digitize its entire holdings. In order to provide free online access to the entire collection, they have to involve a large group of internal and external partners. Some items are scanned in-house, while others are contracted out to the Internet Archive.
The role of the conservators is primarily to ensure safe handling of the original physical items. To that end, they have trained allied professionals to serve as digital preparators, empowered to perform minor conservation procedures. Treatments are divided into two groups: dry and wet. Dry treatment includes removal of paperclips and staples, for example. These dry procedures are often performed outside of a conservation lab by archivists and librarians in many institutional contexts where there are no conservators. Those procedures are an obvious fit for the non-conservators working on the project. Wet procedures include both aqueous and solvent treatments. Wet treatments are more likely to require the skills of conservation personnel with lab equipment.
Complex folded items presented a special challenge that was met with creativity. The presentation included examples where overlapping parts were lifted onto a cushion of Plastazote™ cross-linked polyethylene foam during digitization. Boal pointed out the shadows visible in the scanned documents where overlapping parts were supported by these foam shims. This is important because the customary use of a glass plate to hold materials flat for photography would have added extra stress or new creases in the absence of a cushion. The digital preparators were empowered to use their own judgement to open non-brittle folded items without humidification; such items were held flat under glass for scanning. Other items were photographed without glass, to accommodate three-dimensional paper structures.
The Internet Archive also acted as a preservation partner, re-routing items to conservation as needed. For example, a volume with a torn page was intercepted by the Internet Archive’s assessment process in order to receive treatment by the conservators.
The digitization of collections is primarily about access. To enhance that access, the Wellcome Library developed “the player” as a tool to view a variety of different types of content from the same interface. It enables downloading or embedding a zoomed-in part of a page, in addition to options for high-resolution and low-resolution images. “The player” also functions as a sort of e-reader interface for books, and it responds dynamically to create the appropriate interface for the type of item accessed, including audiovisual files. It supports both bookmarking and embedding content into other webpages. The Wellcome library is offering the digital asset player as an open-source tool through GitHub.
Boal emphasized the role of policies and documentation in ensuring good communication and trust between partners in such a large project. She also showed examples of handling videos that were created for the project. She would like to see the use of videos expanded to help to create a common vocabulary between conservators, allied professionals, and other stakeholders. The responsibility for collection care is not the exclusive territory of the Collection Care Department, so the key to the ongoing digitization process at the Wellcome Library is the distribution of that responsibility to all of the staff (and external contractors) involved in the project, guided by training, planning, and policies.

42nd Annual Meeting- Book and Paper Session, May 29, 2014, "Preserving the African American Scrapbook Collection of Emory University Libraries by Ann Frellsen, Kim Norman, Brian Methot"

This three-year Save America’s Treasures project was presented by a three-person tag team. The project represented a collaboration between the Emory University Libraries Preservation Office, Digital Curation Center, and Manuscript, Archives, and Rare Book Library (MARBL). Emory University Libraries Conservator Ann Frellsen began the presentation with a project overview. The large scale of the project, combined with the diverse materials in the scrapbooks, required rigorous assessment and planning to fit the project schedule. The forty scrapbooks were narrowed to a group of 34 high-priority items. Then the scrapbooks were assigned to three treatment “levels,” roughly approximating the categories used in ARL (Association of Research Libraries) statistics, based on time and degree of difficulty. The past preservation approach for the collection had been to simply limit patron access by “boxing and forgetting.” For this collection of African American scrapbooks, the library intended to expand access through digitization.
The initial project proposal would have relied on a conservation technician, but the complex construction and fragile materials required a conservator’s skills. The project plan was adjusted, and the second presenter, Kim Norman, became the project conservator. The initial decision tree was quickly abandoned as the diverse scrapbooks were assigned unique treatment plans, corresponding to the specific preservation problems presented by each item. Common problems included folded items, detached components, and overlapping elements.  Adhesive stains were generally left in place, where they provided useful evidence in positioning detached items.
In some instances, new support pages were created, but many of the album pages could be encapsulated or placed into unsealed Melinex sleeves to provide support at the page level.
The third presenter, Brian Methot, described the workflow for digitization. The reflectivity of the Melinex was a hindrance to photography, so the sequence of operations was adjusted to provide for encapsulation after digitization. Ethafoam and binders’ board shims augmented cradles used during scanning. Custom-built platforms supported delicate fold-outs during photography. There was a vacuum table as a part of the photo studio, but the thickness of the materials made the vacuum ineffective. Instead, the photography used a sheet of Acrylite acrylic. Blank pages were also scanned to maintain the correct order and appearance of the pages.
Brian also described the technical requirements for the project. The library needed a camera with a faster scanning back to capture details of these large and complex pages. Cool LED lights replaced hot tungsten studio lights as well. The Phase One camera with a Mamiya scanning back was tethered to an Apple computer running Capture One software, which handles both image processing and metadata. The process generated three file types: MOS native camera format files, Archival Master files (TIFF at 400 ppi with color target and ruler), and Production Master files (TIFF at 400 ppi, cropped to image).
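The three-file scheme can be sketched in a few lines. This is not Emory’s actual pipeline, and the file-naming suffixes (`_am`, `_pm`) are hypothetical; the sketch only shows how each capture fans out into the three file types described above.

```python
# Hypothetical sketch of the three-derivative scheme: each capture
# yields a native camera file, an archival master, and a cropped
# production master. Naming conventions here are invented for
# illustration, not taken from the project.
from pathlib import Path

def derivative_names(capture: str) -> dict:
    """Map one capture ID to the three file types in the workflow."""
    stem = Path(capture).stem
    return {
        "native": f"{stem}.mos",        # MOS native camera format
        "archival": f"{stem}_am.tif",   # 400 ppi TIFF, with color target and ruler
        "production": f"{stem}_pm.tif", # 400 ppi TIFF, cropped to image
    }

print(derivative_names("scrapbook12_p034"))
```

In a real workflow the archival master is retained untouched while the production master is the working derivative, so a mapping like this is typically generated once per capture and recorded alongside the descriptive metadata.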
The presenters further clarified the project details during the question and answer session. A major objective of the project was to make restricted items more accessible to the public. It is hoped that additional metadata can be crowdsourced through the open online repository holding the digitized scrapbooks. There will be a digital-first access policy, so researchers will have to request special access to the originals. Regarding the conservation treatment, the paper was not deacidified prior to encapsulation; because the pages were not necessarily sealed on all four edges, trapped acidity should not be a problem.
The project was successful in providing structural stabilization for the original scrapbooks, while also enhancing access to their contents. It was discussed in Kim Norman’s blog and in the New York Times, increasing public awareness of the collection. The project has also begun to yield broader results by connecting community members with collection items.

41st Annual Meeting-Electronic Media Session, May 31, "Technical Documentation of Source Code at the Museum of Modern Art" by Deena Engel and Glenn Wharton

Glenn Wharton began with an overview of the conservation of electronic media at the Museum of Modern Art (MoMA). When he set up the Media Conservation program at MoMA in 2005, there were over 2,000 media objects, mostly analog video, and only 20 software objects. The main focus of the program was digitizing analog video and audio tapes. Wharton was a strong advocate for the involvement of IT experts from the very beginning of the process. Over time, they developed a working group representing all seven curatorial departments, collaborating with IT and artists to assess, document, and manage electronic media collections.
Wharton described the risk assessment approach that MoMA has developed for stewardship of its collections, which includes evaluating software and operating-system dependencies for digital objects. They have increased the involvement of technical experts, and they have collaborated with Howard Besser and moving image archivists.
The presenters chose to focus on project design and objectives; they plan to publish their findings in the near future. Glenn Wharton described the three case-study artworks: Thinking Machine 4, Shadow Monsters, and 33 Questions per Minute. He explained how he collaborated with NYU computer science professor Deena Engel to enlist a group of undergraduate students to provide basic research into source code documentation. Thinking Machine 4 and Shadow Monsters were both written in Processing, an open-source programming language built on Java. 33 Questions per Minute, on the other hand, was written in Delphi, a language derived from Pascal; because Delphi is not very popular in the US, the students were challenged to learn an unfamiliar language.
Engel explained that source code can be read and understood by anyone who knows the language, just as one might read and comprehend a foreign language. She discussed the need for software maintenance, a need common across many industries and not unique to software-based art. Software maintenance is needed when the hardware is altered, the operating system is changed, or the programming language is updated. She also described four types of code documentation: annotation (comments) in the source code, narratives, visuals, and Unified Modeling Language (UML) diagrams.
Engel discussed the ways that source code affects the output and the user experience, and the need, unique to artistic software, to capture the essential elements of presentation in an artwork. In 33 Questions per Minute, the system configuration includes a language setting with options for English, German, or Spanish. Some functions were operating system-specific, such as the Mac-Unix scripts that allow the interactive artwork Shadow Monsters to reboot if overloaded by a rambunctious school group flooding the gallery with moving shadows. Source code specified aesthetic components such as color, speed, and randomization for all of the case-study artworks.
One interesting discovery was the amount of code that was “commented out.” Similar to studies, underdrawings, or early states of a print, these were areas of code that had been deactivated without being deleted, and they could be examined as evidence of the artist’s working methods.
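Both observations can be illustrated with a short hypothetical fragment (none of this is from the case-study works, and the names are invented): aesthetic parameters such as color and randomization live directly in the source, and a commented-out line preserves an earlier state of the work the way an underdrawing does.

```python
# Hypothetical fragment illustrating the kinds of evidence the
# students catalogued. Not from any of the case-study artworks.
import random

SPEED = 0.8                                  # animation speed: an aesthetic constant
PALETTE = ["#1b1b1b", "#e63946", "#f1faee"]  # colors fixed in the source

def pick_color():
    """Randomization as an aesthetic choice, specified in code."""
    # return random.choice(PALETTE[:2])   # earlier state: deactivated, not deleted
    return random.choice(PALETTE)         # current behavior: full palette

print(pick_color())
```

A conservator reading this fragment learns two things the running artwork alone cannot show: the exact palette and speed the artist fixed, and the earlier two-color version that survives only as a comment.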
Engel concluded by noting that researchers concerned with reproducibility in science are also working to document and preserve source code, in order to replicate data-heavy experiments. Of course, they are more concerned with handling very large data sets, while museums are more concerned with replicating the look and feel of the user experience. Source code documentation will be one more tool to inform conservation decisions, complementing the artist interview and other documentation of software-based art.
Audience members asked several questions regarding intellectual property issues, especially for artists using proprietary rather than open-source software. There were also questions about artists who were reluctant to share code. Glenn Wharton explained that MoMA tries to acquire code at the same time that the artwork is acquired. The museum can offer a sort of embargo or source code “escrow,” in which the code is preserved but not accessed until some point in the future.

41st Annual Meeting-Book and Paper Session, May 30, "Treatment and Housing Techniques for Pastel Paintings on Paper-Case Studies" by Soyeon Choi and Jessica Makin

Soyeon Choi, Senior Paper Conservator, and Jessica Makin, Manager of Housing and Framing Services, divided their presentation into two parts: first they addressed the treatment of one pastel portrait, then they described a variety of housing options used at the Conservation Center for Art and Historic Artifacts, a regional conservation center in Philadelphia. The treatment and rehousing protocols were all intended to reduce the loss of friable pastel media and to protect the weak, often brittle, paper supports. All of the examples were originally mounted onto wooden stretchers or strainers, further complicating treatment and rehousing efforts, and most retained original or period frames.
Soyeon Choi began by describing the work of folk artist Micah Williams (1782–1837), who was active in the early 19th century. He created 274 known portraits, and he tended to line them with newspaper, a habit that has provided valuable provenance and date information. The first case-study portrait was mounted onto a white pine stretcher, and treatment was performed in situ.
[image: Micah Williams pastel 2]
Soyeon Choi showed how she used a mockup of a complex, sprung tear to devise a sympathetic repair for one of the portraits. Repair adhesives were chosen according to the location of the tears. In general, Klucel G was strong enough to hold most tears, yet weak enough that it did not place too much stress on the fragile paper support. The Klucel was applied to thin kozo in advance, and individual repair strips were reactivated with ethanol as needed. More traditional wheat starch paste repairs were possible along the edges, where the paper was in contact with the strainer and more pressure could be applied safely. Lascaux 498 HV was also used for some pastels, but I didn’t hear exactly what mix was used or how it was activated. Choi also explained how she used ground pastels, powdered colored pencils, and dry pigments with ethanol to inpaint losses in the portraits.
[image: Micah Williams pastel 1]
In the second half of the presentation, Jessica Makin showed photographs and diagrams of different spacer configurations and frame profiles. The spacers were wrapped with toned, 1-ply Bainbridge matboard attached to the lignin-free, corrugated board with 3M 415 tape. Most of the frames were altered by building up the backs to accommodate the additional thickness of the spacers and glazing. In the case of a pastel by Mary Cassatt, the frame could not be altered, so Makin constructed a tray with thin sides to contain the pastel and the glazing, while also supporting the glazing away from the media surface.
I feel that this presentation loses a lot without the photographs and diagrams, so I will ask the authors to share a link to images of at least one example to better illustrate their work.