42nd Annual Meeting, Paintings & Wooden Artifacts Joint Session, May 31, "The Analysis and Reduction of an Intractable Coating for the Panel Painting by Lluis Borrassa, Christ Before Pilate," by William P. Brown & Dr. Adele De Cruz

The presentation by William P. Brown and Dr. Adele De Cruz was an awe-inspiring glimpse at the future of conservation. Through the collaboration of the North Carolina Museum of Art and conservation scientists from the University of Pisa and Duke University, an intractable layer of cross-linked drying oil, animal glues, and pigmented varnish was removed from the surface of Spanish painter Lluis Borrassa’s panel painting, Christ Before Pilate, 1420-25.
The painting, which had not been exhibited for over 40 years, was the victim of previous cleaning and coating campaigns; several layers of consolidants, paints, and glazes had been applied to the blue passages of Christ’s robe. As a result of the cross-linking of these consolidants and the dark pigmentation of a concealing varnish layer, Christ’s robe appeared almost black.
During treatment at the North Carolina Museum of Art, solvents were successful in removing the toned varnish from the painting. However, the reduction of the complex layer of intractable material covering Christ’s robe (the abstract describes this as a composite of old consolidant, cross-linked drying oil, and restoration materials) was not so straightforward. Conservation scientists (from the aforementioned institutions) used FTIR, SEM, and GC-MS analysis to identify the components of the intractable layer and to distinguish them from original material, which consisted of lapis, indigo, and orpiment pigments in egg tempera and glue or bee pollen.
Dr. De Cruz took the podium at this point in the talk to describe the methods used to reduce the intractable composite material. Essentially, laser ablation was employed, which before this talk I was only familiar with in the context of dentistry. I have to admit that my initial reaction to hearing the terms ‘laser’ and ‘art conservation’ used together might have been a wary one, but refamiliarizing myself with the techniques involved in laser ablation (and recalling the established use of this technique on the delicate enamel surfaces of our teeth) was an encouraging and exciting reminder of the vast potential of interdisciplinary approaches to art conservation.
Dr. De Cruz explained that the 2940 nm Er:YAG (erbium) laser operates using an intense monochromatic wave of light (2.94 microns) at 15 pulses per second to vaporize the intractable material. The depth of penetration is highly controllable and remains shallow, between 3 and 5 microns. This light pulse is strongly absorbed by water, producing a near-instantaneous steam distillation. A glass cover slip is placed over the dirt, varnish, and paint layer. The laser is used to break up the intractable surface, which is ejected and contained by the glass cover slip. The debris is then swabbed from the surface of the painting and can be used for analysis.
There are several immediately obvious benefits to this method: it eliminates the need for toxic solvents, it allows for a highly controllable, shallow depth of penetration, there is no risk of chemical change to the substrate, and the reaction is low temperature.
Dr. De Cruz went into incredible depth during this talk, and I realize that my summary only touches on the amount of information she provided. I was furiously scribbling notes the entire time, and certainly wished I had a camera to take photos of her slides. I look forward to hearing more about this topic in the future, and am excited for the ongoing collaboration of conservation and science.

42nd Annual Meeting – Architecture + Objects Joint Session, 29 May, 2014, “Conservation Realities and Challenges: from Auto Regulation to Imposition at Archaeological and Historical Sites in Colombia” by Maria Paula Alvarez

I was drawn to this presentation on account of my background in archaeology. Although I have never had the chance to visit Colombia, I was very interested to hear about the challenges that Colombian conservators, archaeologists, and other allied professionals encounter in their efforts to preserve their country’s archaeological and historical sites.
Maria Paula Alvarez, Director at the Corporacion Proyecto Patrimonio, presented a number of interesting case studies to illustrate the types of conservation and preservation problems that she and her colleagues face and work on solving. Her examples included assessments, research, testing, and treatments at
1)         archaeological sites, such as:

  • The Archaeological Site of Fuente de Lavapatas, where the conservation issue was stone deterioration. Extensive studies – including the evaluation of the environmental conditions at the site and the geological and physical properties of the affected stone – were conducted to determine the causes of deterioration. As well, testing of treatment materials – including biocides for controlling biodeterioration and consolidants for disintegrated areas – was undertaken.
  • The Archaeological Park of Facatativa, where panels of rock art were deteriorating not only as a result of exposure to the natural environment, but also as a result of exposure to humans. Both biodeterioration and vandalism in the form of graffiti were damaging to the rock art panels. The panels received conservation attention for both problems.

2)         and historical monuments, such as:

  • The Jimenez de Quesada Monument in the city of Bogota, which had been damaged as a result of vandalism in the form of graffiti. The monument received a conservation treatment that included both the removal of the graffiti as well as the application of a coating to protect the monument against future graffiti vandalism.

In all of the cases that she presented, Maria spoke about the effect of the political, social, and economic climate on the sites’ conservation and preservation. She stressed the impact that such climates have on cultural heritage, from the care to the destruction of sites. She explained how various political, social, and economic circumstances have led her and her colleagues to determine goals and procedures for conservation and preservation projects. I found these concepts very powerful. For me, this presentation was a strong reminder of the complexities involved in the preservation of cultural heritage.

42nd Annual Meeting- Textile Group Session, May 30, 2014 "Blown-Up: Collaborative conservation and sustainable treatment for an inflatable dress" by Chandra Obie

Chandra Obie, textile conservator at the Cincinnati Art Museum, presented her work on the conservation of an Issey Miyake pneumatic dress with inflatable puffy sleeve caps. The circa 2000 dress had lost its ability to remain inflated when the adhesion of the rubber valve sleeve stoppers failed. The dress was donated to the museum by Mary Baskett, a collector of Japanese contemporary fashion, whose costume has been displayed in the 2007 Cincinnati exhibition Where would you wear that? and in 2009 at The Textile Museum in DC. This particular dress presented a unique challenge: Mary fully intended to wear the dress in public to special events after treatment. Therefore, the treatment methodology had to balance the wishes of a major donor with conserving the original shape of the sleeve.
The dress came with a 4-page care/construction tag stating that the dress was 42% nylon, 40% polyester, and 18% polyurethane. The photo-oxidation of the urethane caused yellowing and deterioration of adhesion around the valves, which prevented the sleeves from remaining inflated. Chandra further consulted with scientists and conservators via the Conservation DistList before beginning treatment. Step one involved testing different methods for recreating the inflated sleeve shape. Initial solutions of creating a cage structure inside the sleeve or using a medical plastic balloon failed due to access and stability of materials. The re-adhesion of the vinyl inflation valves was attempted with craft glue, silicone, and BEVA, all of which failed to adhere. Step two involved the creation of a sleeve pattern using Stabiltex, a semi-transparent lightweight polyester fabric, filled with polystyrene beads and a polyethylene foam cap. The Stabiltex edges were finished using a heated spatula to weld the polyester and prevent fraying, and a double layer was used for strength. The sleeve was inserted into the cap and carefully placed along the original pattern, while a funnel was used to fill the cap with polystyrene beads. The inflatable valves were tacked back into place with a few stitches. After treatment, Mary Baskett wore her Miyake dress out for her birthday party and was very pleased with the return of the inflatable sleeve shape. The only noticeable difference while wearing the garment was the tendency for the sleeves to shift forward on her body.
While the treatment was successful, post-talk discussions with other conservators raised concerns about the long-term stability of the polystyrene beads.

42nd Annual Meeting – Electronic Media Session, May 31, 2014, "The California Audiovisual Preservation Project: A Statewide Collaborative Model to Preserve the State’s Documentary Heritage" by Pamela Jean Vadakan

The California Light and Sound Collection is the product of a collaboration between 75 partner institutions holding original recordings of audiovisual content in California. A 2007 statewide collection survey using the University of California’s CALIPR sampling tool found that over 1 million recordings were in need of preservation. In 2010, the California Audiovisual Preservation Project (CAVPP) was founded. A recipient of a National Leadership Grant from the Institute of Museum and Library Services (IMLS), CAVPP uploaded its first video in 2011.
Like previous statewide initiatives within the California Preservation Program, the project is based at the University of California at Berkeley, where Barclay Ogden provides leadership for the project. By repurposing existing staff and existing tools, the project is able to realize a high level of efficiency. Each partner institution is responsible for surveying its own collection with CALIPR, adding its own records to CONTENTdm, and sending its own recordings with metadata to CAVPP. It is anticipated that the open-source tool Omeka will replace CONTENTdm, because the project partners should not be dependent upon costly proprietary software site licenses.
CAVPP adds administrative metadata, confirms the descriptive metadata, and sends content to the vendors. The vendors include MediaPreserve, Scenesavers, and the Bay Area Video Coalition. The vendors produce a preservation file, a mezzanine file, and an access file for each item. Moving forward, the project will discontinue creating the mezzanine file, because the preservation file is more useful. Two copies of each file are saved to Linear Tape-Open (LTO) tape and one to the Internet Archive’s servers. Storage costs are about five dollars per recording. CAVPP is also responsible for running checksums and checking video quality. Problems have included out-of-sync audio, shifts in hue or saturation (chrominance), and shifts in value (luminance). The AV Artifact Atlas has proven essential to the quality control process.
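For anyone curious what “running checksums” looks like in practice, here is a minimal sketch in Python of a fixity check of the kind described above – my own illustration, not CAVPP’s actual tooling. A checksum is computed when a file is received, stored alongside the metadata, and recomputed later to confirm the file has not silently changed; the file name and expected value in the usage line are hypothetical.

```python
import hashlib
from pathlib import Path

def md5_checksum(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute an MD5 checksum, reading the file in chunks to keep memory use low."""
    digest = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_fixity(path: Path, expected: str) -> bool:
    """Return True if the stored checksum still matches the file on disk."""
    return md5_checksum(path) == expected

# Hypothetical usage: the expected value would come from the project's stored metadata.
# verify_fixity(Path("recording_0001_prsv.mov"), "d41d8cd98f00b204e9800998ecf8427e")
```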
It is crucial for a project this large to have a clear scope in terms of both content and format. The criteria for selection include statewide or local significance, unpublished or original source material, and public domain content. The project also encompasses content for which rights have already been acquired. In some cases, “unknown” has been used as a placeholder for missing copyright information. The materials are also subject to triage in terms of the original physical format and condition (preservation need). The project is limited to digital conversion. Film-to-film conversion is outside the scope of the project, but it is hoped that project partners can leverage this project to facilitate projects for high-definition video and film-to-film conversion.
The project has already exceeded its original goals. In the first year, CAVPP uploaded 50 recordings. Now the project has grown to 75 institutions and over 1,400 recordings. It is anticipated that there will be over 3,000 recordings by the end of 2014. Future steps include an assessment of who is using the collection and how they are using it. The project also includes outreach workshops scheduled for project partners in 2014 and 2015.

42nd Annual Meeting – Book and Paper Session, May 29, 2014, "The impact of digitization on conservation activities at the Wellcome Library" by Gillian Boal

The Wellcome Library relies on cross-training and written policies to facilitate the increased involvement of non-conservators in the digitization workflow. Gillian Boal explained that the Wellcome Library, the UK’s largest medical library at over 4 million volumes and the public face of one of the world’s largest private charities, aims to digitize its entire holdings. In order to provide free online access to the entire collection, they have to involve a large group of internal and external partners. Some items are scanned in-house, while others are contracted out to the Internet Archive.
The role of the conservators is primarily to ensure safe handling of the original physical items. To that end, they have trained allied professionals to serve as digital preparators, empowered to perform minor conservation procedures. Treatments are divided into two groups: dry and wet. Dry treatment includes removal of paperclips and staples, for example. These dry procedures are often performed outside of a conservation lab by archivists and librarians in many institutional contexts where there are no conservators. Those procedures are an obvious fit for the non-conservators working on the project. Wet procedures include both aqueous and solvent treatments. Wet treatments are more likely to require the skills of conservation personnel with lab equipment.
Complex folded items presented a special challenge that was met with creativity. The presentation included examples where overlapping parts were lifted onto a cushion of Plastazote™ cross-linked polyethylene foam during digitization. Boal pointed out the shadows visible in the scanned documents where overlapping parts were supported by these foam shims. This is important because the customary use of a glass plate to hold materials flat for photography would have added extra stress or new creases in the absence of a cushion. The digital preparators were empowered to use their own judgement to open non-brittle folded items without humidification; such items were held flat under glass for scanning. Other items were photographed without glass, to accommodate three-dimensional paper structures.
The Internet Archive also acted as a preservation partner, re-routing items to conservation as needed. For example, a volume with a torn page was intercepted by the Internet Archive’s assessment process in order to receive treatment by the conservators.
The digitization of collections is primarily about access. To enhance that access, the Wellcome Library developed “the player” as a tool to view a variety of different types of content from the same interface. It enables downloading or embedding a zoomed-in part of a page, in addition to options for high-resolution and low-resolution images. “The player” also functions as a sort of e-reader interface for books, and it responds dynamically to create the appropriate interface for the type of item accessed, including audiovisual files. It supports both bookmarking and embedding content into other webpages. The Wellcome Library is offering the digital asset player as an open-source tool through GitHub.
Boal emphasized the role of policies and documentation in ensuring good communication and trust between partners in such a large project. She also showed examples of handling videos that were created for the project. She would like to see the use of videos expanded to help to create a common vocabulary between conservators, allied professionals, and other stakeholders. The responsibility for collection care is not the exclusive territory of the Collection Care Department, so the key to the ongoing digitization process at the Wellcome Library is the distribution of that responsibility to all of the staff (and external contractors) involved in the project, guided by training, planning, and policies.

42nd Annual Meeting – Painting Session, May 30, "A Hangover, Part III: Thomas Couture's Supper After the Masked Ball"

Conservators are often faced with objects that have had extensive past treatments. While undertaken with the best intentions, some treatments have resulted in aesthetically jarring effects and loss of original information embedded in the construction of the work. Fiona Beckett explored these challenges of decision-making within the treatment of Thomas Couture’s Supper After the Masked Ball (1855).
The large painting is a depiction of a scene in the Maison d’Or in Paris following a party in the infamous hangout for artists and writers. The hungover revelers acted as vehicles for Couture’s commentary about the degradation of society’s morals. Although the composition was originally intended for use as a wallpaper design, Couture seemed to have had a soft spot for this scene, and the finished painted version was kept in his studio, as illustrated by its numerous appearances in drawings and depictions of the studio space.

Thomas Couture’s Supper After the Masked Ball (1855)
Courtesy of the National Gallery of Canada

Supper After the Masked Ball had undergone two linings and at least two cleaning treatments in the past. It had been relegated to storage for the last 90 years because of its problems. While one lining was done with glue paste, the second used wax resin, resulting in an uneven combination of the two residues on the verso of the canvas. Ms. Beckett described the factors that had to be considered before removal of the lining. Some of the effects from the lining treatments included wax residue stains, shrinkage of the canvas and compression tenting from the glue paste, and flattening caused by the irons. Additionally, Couture’s habit of testing tints of colors on the verso of his paintings was obscured by the lining’s presence. The condition of the lining was such that it had already begun to separate fairly easily from the original canvas, and it was decided, after determining that it was not appreciably stabilizing the painting, to remove it. After removal, the color tints were indeed visible on the verso of the canvas. Another interesting aspect of Ms. Beckett’s treatment was her use of gellan gum to locally moisten and soften the glue residues on the verso prior to mechanical removal with a spatula.
The decision not to re-line Supper After the Masked Ball followed the current trend of refraining from re-lining, but it was also informed by other factors specific to the painting. The original canvas was in good condition after the lining removal, and the previous linings appeared not to have been necessary. The residual glue and wax seemed to have added strength to the canvas as well. Lastly, the absence of the lining allowed easy viewing of the brush marks on the verso.
Final steps in the treatment included a spray application of B-72 to the verso, strip lining with Lascaux P110 fabric and BEVA, and building up the face of the stretchers to an even surface with the addition of mat board and a felt-like non-woven polyester.
Supper After the Masked Ball was an excellent case study to illustrate the decision-making processes conservators must use when approaching prior extensive treatments. Ms. Beckett made an astute observation that it is quite easy for us to criticize these past treatments, but we must acknowledge that they were carried out with the intention to preserve and stabilize using the most advanced technology available at the time. Often these linings really did have a positive effect on the preservation of the pictorial surface, although such measures sometimes need to be undone in the present day, when we have less invasive and more effective processes available.

42nd Annual Meeting – Book and Paper Group Session, May 30, “Conserving the Iraqi Jewish Archive for Digitization” by Katherine Kelly and Anna Friedman

Katherine Kelly and Anna Friedman presented on a two-year project funded by the Department of State and carried out at the National Archives and Records Administration (NARA) to conserve and digitize the Iraqi Jewish Archive. This is not an archive that was collected in the traditional sense, but rather materials taken from the Jewish community over many years–the collections were discovered in the flooded basement of the Iraqi Intelligence Headquarters in Baghdad in 2003.
National Archives conservators Doris Hamburg and Mary Lynn Ritzenthaler traveled to Iraq shortly after the discovery to advise on recovery and preservation of the collection. The damaged materials were frozen and flown to the US, where they were vacuum freeze-dried. Following a smaller-scale project in 2006 to assess the collection, the hard work to clean, stabilize, and digitize the heavily-damaged and moldy collections was carried out during the two year project that was the focus of this presentation.
I am always amazed at the sheer scale of projects undertaken at NARA and the organization required to tackle the work within a limited timeframe. Katherine and Anna’s presentation included discussion of adaptations of the usual National Archives workflows to increase the efficiency of the project and to aid conservators in their work. For most materials, the first step in stabilization was to remove inactive mold. Distorted items were humidified and flattened, and tears were mended. Items that had originally been attached to documents with water-soluble adhesive, like stamps and some photographs, had often released due to the flood waters and subsequent humidity; these items were repositioned and reattached whenever possible. Once stabilized, materials could be rehoused, catalogued, and digitized. Through every step of the process, materials were tracked through the workflow using SharePoint software.
The culmination of the project is a digital collection of all 3846 items, which allows the materials to be made available to everyone. An exhibition featuring highlights of the collection was shown both at the National Archives in DC and at the Museum of Jewish Heritage in New York. Another component of the project was the creation of a website with detailed information about the collection and its history, documentation of procedures, and an online version of the exhibit. I particularly enjoyed the short video describing the history of the project, featuring many of the conservators who were involved over the years.
I often listen to NPR while working in the lab, and last November I was excited to hear my former classmate Katherine Kelly in a feature on All Things Considered. If you missed Katherine and Anna’s presentation in San Francisco, I highly recommend a visit not only to the project website, but also to the NPR feature to learn more about the important work to preserve this collection and make it accessible.

42nd Annual Meeting – Electronic Media Group Luncheon, May 30, “Sustainably Designing the First Digital Repository for Museum Collections”

Panelists:
Jim Coddington, Chief Conservator, The Museum of Modern Art
Ben Fino-Radin, Digital Repository Manager, The Museum of Modern Art
Dan Gillean, AtoM Product Manager, Artefactual Systems
Kara Van Malssen, Adjunct Professor, NYU MIAP, Senior Consultant, AudioVisual Preservation Solutions (AVPreserve)
This informative and engaging panel session provided an overview of The Museum of Modern Art’s development of a digital repository for their museum collections (DRMC) and gave attendees a sneak peek at the beta version of the system. The project is nearing the end of the second phase of development, and the DRMC will be released later this summer. The panelists did an excellent job outlining the successes and challenges of their process and offered practical suggestions for institutions considering a similar approach. They emphasized the importance of collaboration, communication, and flexibility at every stage of the process, and as Kara Van Malssen stated towards the end of the session, “there is no ‘done’ in digital preservation” — it requires an inherently sustainable approach to be successful.
This presentation was chock-full of good information and insight, most of which I’ve just barely touched on in this post (especially the more technical bits), so I encourage the panelists and my fellow luncheon attendees to contribute to the conversation with additions and corrections in the comments section.
Jim Coddington began with a brief origin story of the digital repository, citing MoMA’s involvement with the Matters in Media Art project and Glenn Wharton’s brainstorming sessions with the museum’s media working group. Kara, who began working with Glenn in 2010 on early prototyping of the repository, offered a more detailed history of the process and walked through considerations of some of the pre-software development steps of the process.
Develop your business case: In order to make the case for creating a digital repository, they calculated the total gigabytes the museum was acquiring annually. With large and ever-growing quantities of data, it was necessary to design a system in which many of the processes – like ingest, fixity checks, and migration – could be automated. They used the OAIS (Open Archival Information System) reference model (ISO 14721:2012), adapting it for a fine art museum environment.
Involve all stakeholders: Team members had initial conversations with five museum departments: conservation, collections technologies, imaging, IT applications and infrastructure, and AV. Kara referenced the opening session talk on LEED certification, in which we were cautioned against choosing an architect based on their reputation or on how their other buildings look. The same goes for choosing software and/or a software developer for your repository project – what works for another museum won’t necessarily work for you, so it’s critical to articulate your institution’s specific needs and find or develop a system that will best serve those needs.
Determine system scope: Stakeholder conversations helped the MoMA DRMC team determine both the content scope – will the repository include just fine arts or also archival materials? – and the system scope – what should it do and how will it work with other systems already in place?
Define your requirements: Specifically, functional requirements. The DRMC team worked through scenarios representing a variety of different stages of the process in order to determine all of the functions the system is required to perform. A few of these functions include: staging, ingest, storage, description & access, conservation, and administration.
Articulate your use cases: Use cases describe interactions and help to outline the steps you might take in using a repository. The DRMC team worked through 22 different use cases, including search & browse, adding versions, and risk assessment. By defining their requirements and articulating use cases, the team was able to assess what systems they already had in place and what gaps would need to be filled with the new system.
At this point, Kara turned the mic over to Ben Fino-Radin, who was brought on as project manager for the development phase in mid-2012.
RFPs were issued for the project in April 2013; three drastically different vendors responded – the large vendor (LV), the small vendor (SV), and the very small vendor (VSV).
Vetting the vendors: The conversation about choosing the right vendor was, in this blogger’s opinion, one of the most important and interesting parts of the session. The LV, with an international team of thousands and extremely polished project management skills, was appealing in many ways. MoMA had worked with this particular vendor before, though not extensively on preservation or archives projects. The SV and VSV, on the other hand, did have preservation and archives domain expertise, which the DRMC team ultimately decided was one of the most important factors in choosing a vendor. So, in the end, MoMA, a very big institution, hired Artefactual Systems, the very small vendor. Ben acknowledged that this choice seemed risky at first, since the small, relatively new vendor was unproven in this particular kind of project, but the pitch meeting sold MoMA on the idea that Artefactual Systems would be a good fit. Reiterating Kara’s point from earlier, that you have to choose a software product/developer based on your own specific project needs, Ben pointed out that choosing a good software vendor wasn’t enough; choosing a vendor with domain expertise allowed for a shared vocabulary and a more nimble process and design.
Dan Gillean spoke next, offering background on Artefactual Systems and their approach to developing the DRMC.
Know your vendor: Artefactual Systems, which was founded in 2001 and employs 17 staff members, has two core products: AtoM and Archivematica. In addition to domain expertise in preservation and archives, Artefactual is committed to standards-based solutions and open source development. Dan highlighted the team’s use of agile development methodology, which involves a series of short term goals and concrete deliverables; agile development requires constant assessment, allowing for ongoing change and improvement.
Expect to be involved: One of the advantages of an agile approach, with its constant testing, feedback, and evolution, is that there are daily discussions among developers as well as frequent check-ins with the user/client. This was the first truly agile project Artefactual has done, so the process has been beneficial to them as well as to MoMA. As development progressed, the team conducted usability testing and convened various advisory groups; in late 2013 and early 2014, members of cultural heritage institutions and digital preservation experts were brought in to test and provide feedback on the DRMC.
Prepare for challenges: One challenge the team faced was learning how to avoid “scope creep.” They spent a lot of time developing one of the central features of the site – the context browser – but recognized that not every feature could go through so many iterations before the final project deadline. They had to keep their focus on the big picture, developing the building blocks now and allowing refinement to happen later.
At this point in the luncheon, the DRMC had its first public demo. Ben walked us through the various widgets on the dashboard as well as the context browser feature, highlighting the variety and depth of information available and the user-friendly interface.
Know your standards: Kara wrapped up the panel with a discussion of ‘trustworthiness’ and noted some tools available for assessing and auditing digital repositories, including the NDSA Levels of Digital Preservation and the Audit and Certification of Trustworthy Digital Repositories standard (ISO 16363:2012). MoMA is using these assessment tools as planning tools for the next phases of the DRMC project, which may include more software development as well as policy development.
Development of the DRMC is scheduled to be complete in June of this year and an open source version of the code will be available after July.

42nd Annual Meeting: Health and Safety Session, ‘Solvents, Scents and Sensibility: Swapping – Solvent Substitution Strategies’ by Chris Stavroudis

Part I of ‘Solvents, Scents, and Sensibility: Sequestering and Minimizing’ was presented on Friday and encouraged the use of Pemulen TR-2 in cleaning as an alternative to solvents or as a vehicle for solvents.
The topic of Part II was substituting safer solvents for more hazardous ones. Chris Stavroudis began the talk with a warning: There is no perfect substitute for Xylenes. He did, however, address some alternatives later in his talk.
Some of the harmful solvents that Chris suggested replacing were:
Benzene (a carcinogen) – can be replaced with xylene or toluene (although these alternatives are also hazardous)
n-Hexane (a neurotoxin) – can be replaced with n-Heptane
DMF – replace with n-methyl-2-pyrrolidone (NMP), although this may also be hazardous
Methanol – replace with Ethanol
Cellosolve and Cellosolve Acetate – just don’t use them! May be able to substitute butyl Cellosolve
Chlorinated Solvents – don’t use them. 1,1,1 trichloroethane is the least of the evils, but is terrible for the environment
Xylenes (a mixture of isomers containing varying levels of ethyl benzene) – it may be safer to use xylene (a single isomer), but this hasn’t been adequately tested.
 
Stavroudis stressed the fact that there is a difference between a safe solvent and an untested solvent. The two should not be confused, and proper safety precautions must be taken. He gave multiple examples of solvents that were once considered safe and that we now know can be hazardous (e.g., d-limonene).
The use of silicone solvents was encouraged because they are versatile – they can be cyclic or linear – and have a very low polarity. Silicone solvents may be safer than alternative solvents. They are found in make-up and are practically odorless (although this makes exposure difficult to gauge).
Another safer solvent that Chris mentioned was benzyl alcohol, which has both aromatic and alcohol functionality, although it is toxic to the eyes.
Chris ended his talk with a review and discussion of solubility theory, including the Hildebrand and Hansen solubility parameters and the TEAS diagram. This review was focused on the problem of finding a replacement for xylene – a solvent blend that would have the same solubility characteristics. Chris’ Modular Cleaning Program is a greener and healthier technique/tool and incorporates the Hildebrand, Hansen, and TEAS solubility theories. Using these theories, the solvent mix that most closely matches the solubility characteristics of xylene is a mixture of nonane and benzyl alcohol. There is more experimentation to be done, and the next version of the MCP can help you experiment with solvent mixtures and solubilities.
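To give a flavor of how that solubility matching works, here is a minimal sketch in Python – my own illustration, not the Modular Cleaning Program itself – that estimates a blend’s Hansen parameters as a volume-fraction-weighted average of its components and measures the standard Hansen distance to a target solvent. The parameter values and the 75:25 nonane/benzyl alcohol ratio are approximate, illustrative figures only and should be checked against a Hansen reference before use.

```python
def blend_parameters(components):
    """components: list of (volume_fraction, (dD, dP, dH)) tuples; fractions should sum to 1."""
    return tuple(
        sum(fraction * params[i] for fraction, params in components)
        for i in range(3)
    )

def hansen_distance(a, b):
    """Standard Hansen distance Ra between two (dD, dP, dH) triples, in MPa^0.5."""
    return (4 * (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2) ** 0.5

# Approximate literature values, for illustration only.
xylene = (17.8, 1.0, 3.1)
nonane = (15.7, 0.0, 0.0)
benzyl_alcohol = (18.4, 6.3, 13.7)

mix = blend_parameters([(0.75, nonane), (0.25, benzyl_alcohol)])
print("blend parameters:", mix)
print("distance to xylene:", hansen_distance(mix, xylene))
```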

42nd Annual Meeting – Collections Care Specialty Session, May 29, 2014, "Simple Method for Monitoring Dust Accumulation in Indoor Collections" by Bill Wei

“Simple Method for Monitoring Dust Accumulation in Indoor Collections,” by Bill Wei was the first session in the Collections Care specialty section given on Thursday afternoon. As a museum technician in Preventive Conservation, dust is something I deal with on an almost daily basis. I thought that Bill’s talk could lend some valuable insight to my work, and I wasn’t wrong. Bill Wei is a Senior Conservation Scientist at the Rijksdienst voor het Cultureel Erfgoed, and in his session he presented a simple and easily implemented way a museum can monitor how fast dust accumulates in an indoor collections space. He used the Museum de Gevangepoort and the Galerij Prins Willem V to demonstrate the method.
The talk started off with a humorous introduction by Bill about views on dust in museum spaces: how some people, museum professionals in particular, take a defensive stance on dust, as if it implies we aren’t doing our jobs, while for others, dust adds an element of age that seems appropriate. He also mentioned that googling the words “dusty museum” returns over 12,000 hits; apparently more than just museum professionals see dust. Bill brought up the fact that dust is not only an aesthetic issue in museums – it can present chemical and health issues, and it can be costly and time-consuming to remove. The two sites were then introduced, both of which house collections and are historic buildings. Construction was being done near the sites, and there was concern about how much more dust accumulation this might cause, so they provided a good case study. Bill then introduced the question: how do you monitor dust?
Bill explained that dust on the surface of an object causes light to bounce off at many different angles rather than at the same angle, which makes the surface look matte. The resulting matte surface can then be considered to have lost gloss. This loss of gloss is something that can be measured using a glossmeter; the glossmeter used during this test was made by Sheen. Bill was careful to point out that this test doesn’t measure how much dust you have, but how quickly it accumulates. For this run of the test, Bill used microscope glass slides because they are cheap, reusable, and glossy. The steps of the test are as follows:

  1. Using the glossmeter, measure a clean slide on a white background (copy paper is suitable; use the same background throughout testing).
  2. Put slides out at the various locations you wish to test, remembering that the more slides you put out, the more work you will have to do. The slides should be placed in out-of-the-way locations, and staff should be told about them.
  3. After a predetermined amount of time (e.g., one month), measure the slide with the glossmeter on the same background that you used in step 1.
  4. Clean the slide and reuse it, starting over at step 1.

The calculation that is then used to determine the rate of dust accumulation over the time period is:
Fraction change = (dusty slide measurement after one month – clean slide measurement) / (clean slide measurement)
Multiply that by 100 to get the percentage.
Bill explained that for every month that you take a glossmeter measurement, you add the value of the new measurement to the previous total; since this is cumulative, you will go over 100% at some point. You can then plot these values in a graph over time.
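As a small illustration of that bookkeeping – my own sketch, not something shown in the talk – the snippet below turns a clean-slide reading and a series of monthly dusty-slide readings into the cumulative percentage described above. The readings in the last line are made up, and I take the magnitude of each monthly change since dust lowers the gloss reading.

```python
def fraction_change(clean_gloss: float, dusty_gloss: float) -> float:
    """Fractional change in gloss after one exposure period."""
    return (dusty_gloss - clean_gloss) / clean_gloss

def cumulative_percentages(clean_gloss, monthly_dusty_readings):
    """Running total of monthly percentage changes in gloss, as described in the talk."""
    totals, running = [], 0.0
    for dusty in monthly_dusty_readings:
        running += abs(fraction_change(clean_gloss, dusty)) * 100
        totals.append(running)
    return totals

# Hypothetical readings: a clean slide measuring 100 gloss units, then the same
# slide measured at the end of each month as dust accumulates.
print(cumulative_percentages(100.0, [97.0, 95.5, 93.0]))
```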
If you wanted to test the dust samples, to find out where the dust was coming from and what it was made of, you could incorporate small conductive carbon stickers on the slides. Since this talk focused on the accumulation, not the source of the dust, this topic was not discussed in detail.
At one point the slides were placed both vertically and horizontally. The vertical placement was done to mimic how much dust a painting might accumulate. However, the vertically placed slides needed a much longer period of time to really show a loss in gloss, so running both types of slide placement was not considered necessary.
When it came to analyzing the results of this test, one finding was that the slide nearest the entry had the most dust; when its results were plotted onto a graph, it produced the steepest slope over time. The more visitors a museum has, the more dust accumulation occurs. During peak tourist times there was a corresponding peak in dust accumulation. It was also noticed at the Museum de Gevangepoort that during construction periods there was a rise in dust accumulation. The results confirmed a long-held thought that visitors are one of the main sources of dust in museums.
Bill then talked briefly about the chemistry of dust. When the dust was analyzed, it was found to contain salts, iron, chalk, sand, clay, and concrete, among other things. When the makeup of the dust was examined, it was possible to notice trends; for example, during the winter months – February in particular – there was a noticeable rise in the amount of salts found. Looking at what the dust was comprised of could allow scientists to identify its source.
Bill pointed out that the idea of too much dust isn’t really something that is definable in terms of science; it’s defined more by people’s perception of it. Different surface types can be just as dusty as one another, but if the dust is more visible on one type of surface, say plexi, the viewer reads that surface as being less clean.
In discussing an action plan for dust monitoring, Bill said you have to determine why you are doing it – for example, to see if your new HVAC system is producing better results – and that it’s important to define “too much dust” as a difference in gloss.
The questions asked after Bill’s presentation included how many gloss measurements should be taken and at what angle, to which Bill answered that one measurement at 85 degrees is sufficient. He was also asked how often measurements should be taken; he said that every three to four weeks at most will produce good results, since if you measure too soon, a change won’t be seen.
Bill’s presentation was informative and lively. He presented a system for testing dust accumulation that could easily be implemented and followed. Thanks to Bill for a great talk!