45th Annual Meeting, Book and Paper + RATS Session, May 31: “Contacts that Leave Traces: Investigations into the Contamination of Paper Surfaces from Handling,” by Karen van der Pal

In libraries, archives, and museums around the world, those in charge of protecting cultural heritage struggle with the question: gloves or no gloves? Karen van der Pal’s talk on the contamination of paper surfaces from handling brings measurable data to the debate.

Van der Pal is conducting her research in forensic analysis at Curtin University in Western Australia, collaborating with the Indianapolis Museum of Art on the chemistry of latent fingerprints and with Flinders University in South Australia.

Van der Pal received paper samples from an Australian paper mill for her research. She first settled on her own protocol for not contaminating the papers she was testing: by wearing cotton gloves underneath nitrile gloves, she could remove the top layer and replace it with a fresh pair during the process without any of her own marks coming through.

Historically, we know that dark fingerprints appear on paper, and that the edges of leaves in books become discolored as well. But is this a result of dirt, or could it be because of fingerprint oils? Van der Pal explained that the residue left by fingermarks includes aqueous deposits, lipids, and dead skin; the proportions vary with a person’s age, gender, and diet. Environmental exposure is another variable in the kind of mark that is left: if the contaminated pages are kept in the dark, there is little discoloration, but exposure to light causes the marks to darken.

Fingerprint deposits can be a combination of sebaceous oils and sweat from eccrine and apocrine glands. Typically, van der Pal explained, when a fingerprint is left, the oily sebaceous residue sits on top of the paper and eventually evaporates, while the amino acids sink into the fibers. In van der Pal’s experiments the fingerprints were not visible to the naked eye, so it was necessary to apply an indicator agent that could show the intensity and saturation of the print left on her test papers. Ninhydrin, which develops a fingerprint to a pink-purple color, has historically been used. 1,2-Indanedione/zinc chloride exhibits both color and luminescence and can reveal marks up to 150 years old, so van der Pal selected it as her indicator.

The goal of the speaker’s most recent experiments was to determine how effective hand washing is, whether contaminants pass through gloves, and what effect hand gels and sanitizers have on papers. Using the 1,2-indanedione/zinc chloride indicator, van der Pal determined that no contaminants come through nitrile gloves for up to 2 hours of wear. She cautioned that fingerprints and oils can still be picked up on the outside of the nitrile gloves if one handles doorknobs and keyboards, for example. One also has to be mindful that wearing nitrile gloves for an extended amount of time is very unpleasant, so one option is to wear cotton gloves underneath.

Van der Pal’s experiments show that skin oils return within 5 minutes of handwashing, and that 15 minutes after washing there is more oil than before washing, because the body works to replace the oil it has lost.

Hand creams are left on the surface of the paper.

Antibacterial gels also do not prevent oils from being left on paper.

In the future van der Pal expects to study how drying and aging affect a wider range of papers, how long the fingermarks last on the paper, and what affects whether the marks darken.

Questions from the Floor:

Q1: Can you still detect marks on paper that has been washed? A1: Yes, marks can still be detected up to 3 months after the paper has been washed.

Q2: Regarding gels, how long did you wait until you tried to detect the oils? A2: We tested at different intervals of time.

Q3: Was there a transfer of the materials/paper to the gloves? A3: Reusing gloves can cause a transfer. Some gilding can attach to cotton gloves. Nitrile shouldn’t pick up much.

44th Annual Meeting – Research and Technical Studies Session, May 16, 2016, “Combining RTI with Image Analysis for Quantitative Tarnish and Corrosion Studies” by Chandra Reedy

This talk focused on the combination of two technologies, Reflectance Transformation Imaging (RTI) and image analysis. Much of the talk dealt with applying these two technologies to evaluate accelerated-aging (Oddy test) coupons in a quantitative manner. As the evaluation of Oddy tests has traditionally been subjective, making reproducibility problematic, I was particularly interested in the potential for quantitative analysis.
Reflectance Transformation Imaging (RTI) is a relatively inexpensive and simple tool that creates a mathematically synthesized image of an object’s surface from a series of images (typically ~36) lit from different angles and directions. The image produced by the RTI software can reveal visual information that is difficult to discern under normal conditions.
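For readers curious about what the “mathematical synthesis” involves: classic RTI implementations fit a per-pixel Polynomial Texture Map (PTM) to the captured image stack. The sketch below is my own minimal reconstruction of that fitting step in Python/numpy, assuming known light directions; it is not the code of any particular RTI package.

```python
import numpy as np

def fit_ptm(light_dirs, luminances):
    """light_dirs: (N, 2) array of (lu, lv) light directions;
    luminances: (N, H, W) stack of grayscale captures."""
    lu, lv = light_dirs[:, 0], light_dirs[:, 1]
    # One row per captured image: the PTM biquadratic basis.
    A = np.stack([lu**2, lv**2, lu * lv, lu, lv, np.ones_like(lu)], axis=1)
    n, h, w = luminances.shape
    # Per-pixel least-squares fit, solved for all pixels at once.
    coeffs, *_ = np.linalg.lstsq(A, luminances.reshape(n, -1), rcond=None)
    return coeffs.reshape(6, h, w)

def relight(coeffs, lu, lv):
    """Synthesize the surface under a new light direction."""
    basis = np.array([lu**2, lv**2, lu * lv, lu, lv, 1.0])
    return np.tensordot(basis, coeffs, axes=1)
```

Relighting with a raking (near-horizontal) direction is what makes subtle surface texture pop out in the RTI viewer.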
Image analysis software utilizes algorithms that enhance the visual separation of features and marks them for analysis, a process known as segmentation, thereby enabling those features to be quantified. The software used by the authors of this presentation was Image Pro Premier by Media Cybernetics, which has previously been used for thin section analysis of ceramics.
The authors used RTI and image analysis in combination to evaluate Oddy test coupons. The process aided in visual assessment, improved the documentation of the results, and provided quantitative results. Adding RTI and image analysis to the Oddy test protocol was not a cumbersome addition, requiring only ~20 minutes. It was noted that the type of coupon used made a big difference for this technique: foil and bent coupons were not ideal, since the added texture complicated interpretation of the results.
After exposure, the coupons were photographed and processed in batches by metal: silver, copper, and lead. A single image of the coupons was chosen from the RTI viewer and used for image analysis, with a different protocol for each metal. The image of the lead coupons was converted to grayscale and the colors inverted; background, control, and corrosion areas were defined; and the “Smart Segmentation” tool was used to separate and quantify them. The image of the copper coupons was not converted to grayscale, and the various corrosion types were all treated the same by the segmentation process. The image of the silver coupons was converted to grayscale or pseudo-color to enhance differences before segmentation. The software allows individual segmentation protocols to be saved and reused, and the percentage of tarnished to untarnished surface could be calculated for each metal. Comparison with visual evaluation of test coupons yielded the following results:
Control or clear pass: 1-4% tarnish
Pass for temporary use: 7-17% tarnish
Clear fail: 45-100% tarnish
The “temporary” category is particularly hard to judge when evaluating Oddy tests in the traditional manner, so this method seems to be especially useful in this case.
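As an illustration of the quantification step: Image-Pro Premier’s “Smart Segmentation” is proprietary, but the underlying idea of classifying pixels and computing a tarnish percentage can be sketched with ordinary open-source tools. The Python example below, using an assumed Otsu threshold on a cropped grayscale coupon image, is only a rough stand-in for the authors’ protocol.

```python
import numpy as np
from skimage import color, filters, io

def percent_tarnished(image_path):
    """Rough stand-in for a segmentation protocol: classify darker
    pixels as tarnish and report their share of the coupon area."""
    img = io.imread(image_path)
    gray = color.rgb2gray(img) if img.ndim == 3 else img / 255.0
    # Otsu's threshold separates darker (tarnished) from brighter metal.
    thresh = filters.threshold_otsu(gray)
    return 100.0 * (gray < thresh).mean()

# A coupon reading ~3% would fall in the "clear pass" band above,
# ~12% in "pass for temporary use", and ~50% would be a clear fail.
```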
In addition to Oddy test results, the authors used RTI and image analysis to evaluate rapid corrosion tests and coating tests. In each case, as with the Oddy tests, the process provided good documentation as well as the possibility of quantitative results. The combination of these techniques seems to have great potential for a number of applications, and their relative simplicity and low cost make them a great tool for institutions with limited analytical capabilities.

43rd Annual Meeting – Research and Technical Studies, May 15, “Parylene Treatment for Book/Paper Strengthening” by John Baty

In the 1990s, Don Etherington, David Grattan, and Bruce Humphrey performed a pioneering study on the use of parylene to strengthen brittle book paper. Their research did demonstrate that parylene strengthened weak, brittle paper, but it raised several concerns about the material’s long-term effects, such as reversibility and the uncertainty of its aging properties. John Baty and his colleagues at the Heritage Science for Conservation Research Center at Johns Hopkins University sought to reexamine the potential of parylene for strengthening brittle paper, given the improved scientific instruments and analysis methods available today. Their research sought to answer five primary questions: does parylene strengthen paper, what is the permanence of its effect, what are the side effects, how can parylene treatment be scaled up, and how can it be reversed? So far they have answered the first two, and research is ongoing.
Parylene is applied to brittle books in a chamber that draws a vacuum and pulls sublimated parylene through the system. The amount of parylene dimer added to the chamber directly correlates with the thickness of the deposited film. Previous research had not optimized the amount of parylene needed to achieve a desirable film layer, so this was a primary goal for Baty and his colleagues. The success of the treatment was evaluated using three mechanical paper strength tests: tensile testing, the MIT fold endurance test, and the Elmendorf tear test.
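To make the dimer-mass/film-thickness relationship concrete, here is a hedged back-of-envelope sketch. Every number in it (density, deposition yield, page area) is an illustrative assumption of mine, not a value reported by Baty’s team.

```python
# All numbers below are illustrative assumptions, not values from the talk.
PARYLENE_DENSITY = 1.11   # g/cm^3, approximate for parylene C
DEPOSITION_YIELD = 0.5    # assumed fraction of dimer ending up on the paper

def film_thickness_nm(dimer_mass_g, coated_area_cm2):
    """Estimated thickness of a uniform, conformal film."""
    volume_cm3 = dimer_mass_g * DEPOSITION_YIELD / PARYLENE_DENSITY
    return volume_cm3 / coated_area_cm2 * 1e7  # cm to nm

# A 3 g load spread over, say, 2 m^2 of page surface (20,000 cm^2)
# would give a film on the order of 700 nm under these assumptions.
print(round(film_thickness_nm(3.0, 20_000)))
```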
Baty and his team found that 3 grams of parylene was sufficient to strengthen brittle paper to the point that it behaved similarly to modern wood pulp paper, while only imparting a smoother appearance to its surface. Five grams of the dimer was too much: conservators inspecting the pages concluded that the paper had a more “plasticky” and stiff feel. The three mechanical tests did indicate that the brittle paper samples were strengthened by the addition of a parylene coating, but questions regarding the treatment’s reversibility and side effects remain to be answered by Baty and his team in subsequent research.

43rd Annual Meeting – Research and Technical Studies, May 15, “The Deacidification of Contemporary Drawings: A Safe Method Based on Nanotechnology” by Giovanna Poggi

The degradation of cellulose-based materials, such as paper and canvas, is exacerbated by acidity arising from the natural aging process, various sizings, surface coatings, inks, and other papermaking products. Conservators attempt to mitigate this problem by using alkaline compounds to deacidify the substrate and impart an alkaline reserve within the fibers to counteract future acidity. In the case of paper-based objects, deacidification is most commonly accomplished either by washing in an alkaline bath or by spraying on a solvent-based dispersion of magnesium or calcium microparticles.
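As a rough sense of the quantities involved: alkaline reserves are conventionally expressed as percent calcium carbonate equivalents per mass of paper, and calcium hydroxide carbonates to calcium carbonate one-to-one. The sketch below is my own illustrative arithmetic; the 2% target is a common rule of thumb in the literature, not a figure from the talk.

```python
# Illustrative arithmetic only; the 2% reserve target is an assumption.
M_CAOH2 = 74.09   # g/mol, calcium hydroxide
M_CACO3 = 100.09  # g/mol; Ca(OH)2 carbonates 1:1 to CaCO3

def caoh2_needed_g(paper_mass_g, reserve_pct=2.0):
    """Mass of Ca(OH)2 that, once carbonated, yields the target
    CaCO3-equivalent alkaline reserve."""
    caco3_g = paper_mass_g * reserve_pct / 100.0
    return caco3_g * M_CAOH2 / M_CACO3

# A 5 g drawing would need roughly 0.07 g of Ca(OH)2 for a 2% reserve.
print(round(caoh2_needed_g(5.0), 3))
```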
Dr. Poggi’s talk presented research into a new method for deacidifying paper-based objects using an apolar solvent dispersion of alkaline nanoparticles applied topically (an airbrush was used in these experiments), without the need for full immersion. The benefits of using nanoparticles for deacidification are that they have a higher surface area and so react more readily with acidic compounds, producing a faster neutralization reaction; they are more homogeneous in structure; and they are able to penetrate further through paper fibers, surface coatings, and sizing than micron-sized particles. This research was conducted as part of the broader Nano for Art project, which seeks to devise new methods for the conservation and preservation of art using nanotechnology. More information can be found at their website: http://www.nanoforart.eu/.
Through the use of solvothermal reactions, Dr. Poggi and her colleagues were able to produce nano-sized particles of crystalline calcium hydroxide, Ca(OH)2, in ethanol. They discovered that the alcohol-based system created a stable, highly concentrated dispersion without the need for further purification and was very effective at deacidification. However, this dispersion could not be applied to more modern papers containing inks that were sensitive to alcohol. Apolar solvents were explored because they would not adversely affect the topography of cellulose substrates. A variety of problematic inks were tested, such as ballpoint pen and felt tip marker, until cyclohexane was determined to be the most appropriate solvent. During experimentation on both mockups and actual works of art, the cyclohexane dispersion was found to affect neither modern inks nor the topography of the paper substrates. Aging tests performed on samples indicated that papers treated with the nanoparticle dispersion discolored less and retained a higher degree of polymerization than aged, untreated samples.
Dr. Poggi’s presentation was very interesting, and I’m looking forward to learning more about the use of these nanoparticle dispersions to achieve a more effective and, hopefully, longer-lasting form of deacidification.

41st Annual Meeting – Paintings Session, Thursday May 30, "Traditional Artist Materials in Early Paintings by Andy Warhol" by Christopher A. Maines

Photo of Christopher A. Maines of the National Gallery of Art giving his presentation at AIC 2013: Traditional Artist Materials in Early Paintings by Andy Warhol

I was looking forward to hearing this talk by Christopher Maines, Conservation Scientist at the National Gallery of Art, on the materials used by Andy Warhol in his early artworks, especially since it mentioned the possibility of traditional materials. Maines began with a brief summary of Warhol’s early techniques as a commercial artist between 1949 and 1960, specifically the blotted-line technique. Warhol’s first pop paintings, made between 1960 and 1962, consisted of acrylic paints on primed, stretched canvas which he hand-painted, such as 1962’s A Boy for Meg. Warhol then moved into using hand-cut silk screens with synthetic polymer paints, such as 1962’s Green Marilyn, and he continued to use silk screens and synthetic polymers into the 1980s, until his death in 1987. In summary, Warhol chose these particular materials because they were quick-drying and offered a thrill or element of chance, and he accepted any imperfections that occurred during the creative process, such as drips.
A Boy for Meg. Andy Warhol 1962 (left). 129 DIE IN JET!. Andy Warhol 1962 (right).

Maines went on to discuss synthetic polymer paints and how they were regarded when originally introduced. The NGA began analysis of Warhol’s A Boy for Meg in preparation for an upcoming exhibition, to determine its material composition. The artwork was sampled in four places, and GC-MS analysis revealed that Warhol was using drying oil and egg as he transitioned from his commercial work into his pop paintings. It is likely that Warhol used egg because he was already familiar with its behavior. The NGA was fortunate to be granted the opportunity to sample two other artworks from this period owned by museums in Germany: 129 DIE IN JET! and DAILY NEWS. Both also revealed drying oil and egg: acrylic paints over a ground layer consisting of drying oil and egg.
I found this talk very interesting, especially learning that Andy Warhol used a mixture of traditional and modern materials in his artworks. Scientific analysis can provide fantastic insight into the working materials and methods of artists, and I am very glad the NGA shared its findings on this period of Warhol’s career at this year’s AIC Annual Meeting.
Are there any other Warhol fans out there? What are some of your favorite Warhol works? If you could read the scientific analysis report for any famous artwork to find out exactly what the artist used, what would it be? Please share any thoughts or comments!
 
NOTE: Other authors on the lecture are Suzanne Q. Lomax, Organic Chemist, and Jay Krueger, Senior Conservator of Modern Paintings, both at the National Gallery of Art.
 

AIC RATS – Microclimates – June 1, 2011


Museum environmental guidelines and the implementations of change
Charlie Costain – Conservation and Scientific Services,
Canadian Conservation Institute (CCI)

A follow-up to the “plus/minus” dilemma we’ve been having, otherwise known as: “Should we loosen up the environmental restrictions on museum loans to other institutions?”

CCI supports 2,500 museums and 500 archives across Canada. They were looking for an approach that can be adapted to a variety of organizations.

Recap of “Plus/Minus Dilemma” at AIC 2010: http://www.iiconservation.org/dialogues/
Highlights:
• Jerry Podany: IIC – heritage conservation in the broader context of the modern world;
• Max Anderson: candor and honesty about what you’re doing, flexibility between parties, and the technical capacity of buildings and energy concerns;
• Nancy Bell: in the UK, the AVISO group asked staff at the Tate to reexamine conditions for loans and exhibitions – carbon emission reductions and new funding for research (IGOR);
• Karen Stothart: talked about balancing the need for exhibitions and loans with the protection of collections – 50% RH does shift during winter;
• Cecily Grzywacz: there is no magic bullet for conditions for all institutions;
• Stefan Michalski: he felt there was consensus that +/- 10% is OK for most collections;
• Terry Weisser: conservators are concerned about energy savings but also need to take care of collections; welcomes more research in this area toward wider and looser parameters.

ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers)
This is the organization that conservators look to when determining setpoint and HVAC standards for museums. An ASHRAE handbook is put out every three years that includes information about designing museums; however, the temperature and RH setpoints/standards have not changed since 1999! The handbook is written in engineering language: design parameters and system selection for engineers. ASHRAE proposes classes of control: AA, A1, A2, B, C, and D. Cool, cold, and dry are the best conditions (duh), and each class has its collection risks and benefits. See the list below:

Class of control / Relative humidity / Temperature
AA: 50% RH +/- 5% / 75°F (high) to 55°F (low), +/- 4°F
A1: 50% RH +/- 10% / 75°F (high) to 50°F (low), +/- 4°F
A2: 60% RH in summer, 40% RH in winter, +/- 5% / 75°F (high) to 50°F (low), +/- 4°F
B: 60% RH in summer, 40% RH in winter, +/- 10% / 75°F +/- 5°F, and cold +/- 5°F
C: 50% RH +/- 25% / below 85°F
D: below 75% RH / below 85°F
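To show how these bands might be used in practice, here is a minimal sketch that encodes a few of the simpler classes from the list above and checks a logged reading against them. The values are derived from the speaker’s summary, not from the ASHRAE handbook itself, and the seasonal classes (A2, B) are omitted for brevity.

```python
# Bands derived from the session notes above; illustrative only.
CLASSES = {
    "AA": {"rh": (45, 55), "temp_f": (51, 79)},   # 50% +/-5; 55-75F +/-4
    "A1": {"rh": (40, 60), "temp_f": (46, 79)},   # 50% +/-10; 50-75F +/-4
    "C":  {"rh": (25, 75), "temp_f": (None, 85)},
    "D":  {"rh": (None, 75), "temp_f": (None, 85)},
}

def meets_class(cls, temp_f, rh):
    """True if a single (temperature, RH) reading sits inside the band."""
    (lo_rh, hi_rh), (lo_t, hi_t) = CLASSES[cls]["rh"], CLASSES[cls]["temp_f"]
    rh_ok = (lo_rh is None or rh >= lo_rh) and rh <= hi_rh
    t_ok = (lo_t is None or temp_f >= lo_t) and temp_f <= hi_t
    return rh_ok and t_ok

print(meets_class("AA", 70, 48))  # True: within the tightest class
print(meets_class("AA", 70, 62))  # False: RH drifts outside +/-5
print(meets_class("C", 70, 62))   # True: class C tolerates wide swings
```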

ASHRAE also has building types; higher Roman numerals have more climate control
Climate control: (VI) collection vaults, (V) museums
Partial control: (IV) and (III)
Uncontrolled: (II) and (I)

Why ASHRAE?
• Consistent with risk approach to making decisions on collections
• Flexibility for different types of collections, locations, and building types
• Facilitates communication between engineers and collections folks

Example: Risk assessment on historic house in Ontario
The first questions during the assessment concerned the collections and the relative value of the collections; they created a collections “pie” chart, weighed the building vs. the collection, and broke down the collections overall by percentages.

They decided the building was the most important asset, but spalling was occurring because they were trying to maintain 45% RH inside; reducing the RH will reduce strain on the building envelope.
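A quick worked example of why high indoor RH strains a cold-climate envelope: the dew point of the indoor air sets the surface temperature below which moisture condenses inside the wall. The sketch below uses the standard Magnus approximation; the 21°C / 45% RH inputs echo the example above, but the code itself is my illustration, not CCI’s method.

```python
import math

def dew_point_c(temp_c, rh_pct):
    """Dew point via the Magnus approximation (standard coefficients)."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# At 21C and 45% RH the dew point is about 8.6C: any surface inside the
# wall assembly colder than that condenses moisture, and repeated wetting
# and freeze-thaw is one driver of spalling.
print(round(dew_point_c(21, 45), 1))  # 8.6
```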

There is some confusion about RH and temperature. There is a perception that RH is an issue of paramount importance, which can lead to inappropriate RH targets, as in the example above.
• Lack of awareness of options
• Lack of transparency in operations, loans and grant requirements
• As a result, discussions have begun on the federal level in Canada

“Saving Money, Preserving Collections” dialogue
• Overview of evolution of guidelines for museum environment
• Operation of facilities – facilities managers
• Current operations – shared opportunity for savings
• Conditions for loans/ funding
What temperature and relative humidity can we have and still satisfy loan and grant requirements?
Operating Conditions in National Museums
o Differs depending on type of building and collections
o Purpose-built facilities run at 50% RH +/- 5% in the summer and 43% RH +/- 5% in the winter, with a temperature of 21°C

Agreement on the following from museums across Canada:
• When sending materials to another Canadian museum, the lending museum will not demand better conditions than its own
• Museums will lend objects containing hygroscopic materials to institutions that can achieve Class A conditions
• Class A conditions will be a requirement for federal grants where applicable
• This is not a straitjacket – it is meant to be a starting point, with an obligation to explain why an object is not suitable – more candor and information exchange
• Having the discussion and getting agreement from some of these major players is a starting point
o Moving from a rules-based approach to a risk-based approach – there is work to be done in terms of communication and research

Any suggestions? CCI would appreciate your input – drop him a line! Charlie.costain@pch.gc.ca 613-998-3721

The Off-Grid Museum
National Museum in Denmark
Dr. Poul Larsen and Tim Padfield

Energy savings – Denmark has been doing it WAY before it was cool and hip. Tim Padfield has worked on saving energy in conservation for many years. The presentation was given by Dr. Larsen from the National Museum in Denmark.

He showed us buildings relevant to the subject of energy savings.

As you know, buildings depend on external energy sources to function – for light, temperature, RH, etc. Museums are big-time energy consumers, and Denmark is trying to create a building that doesn’t rely on external energy sources and uses only renewable ones, taking climatic exceptions into account.

One example he gave was the Nydam Iron Age warship, which was exhibited in a temporary shelter designed as a balloon. As you can imagine, energy use was quite large, because sealing a balloon is no small feat. Needless to say, the “balloon” leaked, causing an unstable climate – maybe class C or D in ASHRAE terms. Air conditioning depends heavily on the building envelope, and the envelope was failing in this example. To add some more fun to the mix, there was a 6-hour power outage and the balloon collapsed – it was only held up by wires. A constant energy supply may become a luxury rather than a given.

He showed us another warship that was transformed into an exhibition building – a submarine-turned-museum. The interior of the submarine had a kitchen, bath, and “artwork” (aka pinup girlie pictures). Hilarious!! This space has a very unstable climate – it doesn’t even rank in the ASHRAE system.

Example: a 1930s museum building with thick concrete walls, located in the open landscape. Truly off-grid – not even a telephone line – and all on one level. The museum has only natural lighting, insulation, and high ceilings to meet human health requirements. Natural ventilation is sufficient for air quality; there is no temperature or RH control. In winter heat is needed for the humans, and a ground heat pump is the most energy-efficient way to provide it.

Example: a gallery for a minerals display that has a ground heat pump for winter heating and underfloor heating, using 4-5 kilowatt-hours for heat. Wind can be used as an energy source – there is a lot of wind in winter, and it coexists well with the heat pump, which could be powered by wind energy if necessary. The gallery has a heavy structure for thermal stability and thermal insulation to reduce heat loss. Small windows reduce solar heating but allow for natural lighting. There is no humidity control, but there is unintended passive humidification from the walls – the buttresses are taking in rainwater. Efflorescence is occurring on the walls, but since it is not as dry in winter, this actually helps.

Example: a 17th-century house in Liselund Park. The house is open to the public only in the summer, is unheated, and has impermeable interior surfaces and finishes. Dehumidified air is injected into each room through small ducts in the floor to keep the RH down, and it works quite well according to the environmental data, although the temperature is not steady. The dehumidifier keeps the RH to about 60% – but the dehumidifier alone is keeping it in check, so if it fails you’re in trouble. Humidity-sensitive objects should be kept in microclimates, because mechanical systems cannot be relied upon. Energy consumption for dehumidification is constant over the year; a water turbine could power the dehumidification in summer, which would be a good off-grid solution. Mechanicals are unpredictable, but water freezes, so what about winter?

Example: the runic stones in Jelling, from 950-970 AD, which stand outside – no energy use at all (ha ha ha). The polychrome painting that was on the stones has been lost, however; copies are displayed in a nearby museum, where photovoltaic panels integrated into the skylights provide natural and artificial light. Solar energy is better in summer than in winter, which creates a seasonal stability problem, but combining solar and wind energy would meet the museum’s required energy needs.

Read more at www.conservationphysics.org

Some of the questions/ comments:
Isn’t there a substantial cost to building thick walls? They anticipate that the buildings will last many years, so the cost of construction will be recouped over time.

One should note the practical limitations of this type of off-grid environmental control in the USA. Denmark is in Zone 5 maritime, a mild climate; the USA varies from Zone 1 humid to Zone 7 dry, and its portion of maritime Zone 5 is geographically very small. A museum should be designed for its specific location and the environmental limitations that come with it.

New Technologies for Energy Storage Applied to Cultural Heritage Buildings: The Microclimatic Monitoring of Santa Croce Museum in Florence
Consiglio Nazionale delle Ricerche, Istituto di Scienze dell’Atmosfera e del Clima (CNR-ISAC)
Francesca Becherini

I will admit that I had a difficult time following this talk, so I apologize in advance for the lack of information here. The main idea was to demonstrate a method of conserving heat (as energy) at Santa Croce in Florence by using a special kind of material in the drywall. Storing and then releasing energy is the concept. I’m thankful I understood that!

Well… here goes nothing…

The folks at the Consiglio Nazionale want to develop, evaluate, and demonstrate an affordable multi-source energy storage system (thermal and electric) integrated into buildings, based on new materials, technologies, and control systems: http://www.messib.eu/

So they installed this system in two civil buildings, the Santa Croce Museum being one of them. Santa Croce has a radiator heating system but no air conditioning, and illumination is by halogen and metal halide lamps.

As I understood it, the materials that hold the energy or heat are phase change materials (PCMs), which have the capacity to store much more heat per unit volume. A suitable PCM also has a melting temperature within the desired operating range, a high specific heat, a small volume change on melting, and high thermal conductivity. This material (which I never actually got the name of – maybe they never said it because it’s proprietary?) is available as a paraffin or as hydrated salts and is used in heating panels and solar (as in sun) systems. There is not much information on the long-term durability of this material, nor is it inexpensive. The PCM was embedded in gypsum plasterboards, distributed at 20% with respect to the gypsum.
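To see why a PCM stores “much more heat per unit volume” than ordinary board, it helps to compare sensible heat alone with sensible plus latent heat across the melting point. The sketch below uses typical textbook values for gypsum and paraffin; the actual (unnamed) material in the talk may differ.

```python
# Typical textbook values; the proprietary PCM from the talk may differ.
GYPSUM_CP = 1.1          # kJ/(kg*K), specific heat of gypsum board
PARAFFIN_CP = 2.0        # kJ/(kg*K), specific heat of paraffin
PARAFFIN_LATENT = 180.0  # kJ/kg, latent heat of fusion of paraffin

def sensible_kj(mass_kg, cp, delta_t_k):
    """Heat stored by temperature change alone."""
    return mass_kg * cp * delta_t_k

def pcm_kj(mass_kg, cp, delta_t_k, latent):
    """Sensible heat plus latent heat absorbed while melting."""
    return sensible_kj(mass_kg, cp, delta_t_k) + mass_kg * latent

# Over a 4 K swing spanning the melting point, 1 kg of paraffin stores
# roughly 40x the heat of 1 kg of plain gypsum (about 188 vs 4.4 kJ).
print(sensible_kj(1.0, GYPSUM_CP, 4), pcm_kj(1.0, PARAFFIN_CP, 4, PARAFFIN_LATENT))
```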

So measurements were made, automatically and manually, of air temperature, relative humidity, and surface (contact) temperature on the surface of the art (I think) as it hung on the wall where the PCM material was infused into the gypsum. They chose rooms based on how much art was displayed in them. The main results are below (as best as I caught them):

They installed panels with monitors for air temperature, RH, and panel temperature, plus a contact panel of wood board to simulate a canvas painting. They also monitored VOCs in the museum and the lab, finding aromatic, chlorinated, alcohol, and terpene compounds, along with aldehydes and organic acids. The PCM emits low levels of VOCs, and these are strongly reduced when the PCM is in a gypsum panel. They can’t tell about the aldehydes and organic acids – perhaps the formic acid is from the gypsum panel, not the PCM? There were lots of graphs about the PCM effects in the lab. Honestly, the explanations weren’t totally clear – something about the melting point of the PCM? Maybe?

Anyway, according to all of the testing and graphing, they need more information on the material’s thermal behavior, its VOC emissions, and the interactions between VOCs and artifacts.