42nd Annual Meeting – Objects Session, May 31, "Pine Pitch: New Treatment Protocols for a Brittle and Crumbly Conservation Problem" by Nancy Odegaard, et al.

In this paper presented at Saturday’s Objects Session, Nancy Odegaard, Marilen Pool, and Christina Bisulca described a new treatment protocol they established, along with their colleagues Brunella Santarelli, Madeleine Neiman, and Gina Watkinson, for treating baskets with deteriorated pine pitch coatings.  The treatment protocol was devised after conducting a survey of the basket collection at Arizona State Museum, where the majority of the pitch-coated ethnographic baskets (70 out of about 100) had unstable, blanched, cracked, and brittle surfaces.  The baskets required treatment so that they could be moved to a new location.
The majority of the baskets were Apache and were made using a twining or coiling technique.  The pine pitch, obtained from the piñon pine, would have been applied to the surface of the baskets as a waterproofing measure.  Two colors of pitch were observed on the exterior of the baskets, each with different condition issues.  Some baskets were covered with a red pitch that appeared translucent.  The other baskets were covered with a dark brown to black, opaque pitch.  Both colors of pitch had suffered degradation due to factors such as UV, temperature, and pollutants; however, the red pitch appeared more unstable and had formed a series of fine cracks.  The darker pitch had deeper cracks.
Because the baskets had to be moved, a treatment protocol was established to stabilize the surfaces so the baskets could be safely transported to a new storage area.  Previous treatments for deteriorated pitch had included consolidation with solvents or the use of heat (using a butane torch!) to reintegrate the cracked, crumbly surface.  The ASM team was looking for another treatment option, one that took into consideration the vast number of objects that required treatment.  Borrowing from methods used to clean aged varnish in the field of paintings conservation, the conservators decided to reactivate the pitch using a solvent to stabilize the flaking material and reattach the crumbly surfaces.
Prior to any treatment, the conservators wanted to get a cultural perspective, since they did not want to add material or alter the pitch or the appearance of the baskets, and wanted to make sure the objects retained their cultural integrity and significance.  Nancy consulted with a Navajo weaver who said that pitch baskets should always look shiny, and therefore reactivating the pitch, and the subsequent shiny appearance the material would take, was acceptable.
Treatment
Because ethanol has been used successfully to clean aged, pine-based varnishes from paintings, it was chosen as the solvent for reactivating the pine pitch on the ethnographic baskets.

  • The first stage of the treatment was to place the baskets (many supported by foam rings or, if they fit, by large glass beakers) in an ethanol solvent chamber for 24 hours.  This would condition the surface and prepare it for further treatment.
  • The baskets were then removed from the solvent chamber and areas of the surface sprayed with ethanol using a Dahlia sprayer for a more direct application of the solvent.
  • Brushes, foam swabs wrapped in PTFE (Teflon) tape, and Kimwipes (lint-free wipes) soaked in ethanol were then used to reposition any loose flakes.
  • After one side was treated, the pitch was left to air dry for a few hours, then the basket was turned and the other side sprayed with ethanol and flakes reattached.
  • When the entire pitch surface had been treated, the basket was left to air dry for about 24 hours or until the pitch no longer felt tacky.

During treatment the conservators noticed that the transparent red pitch reacted faster to the ethanol.  The darker pitch was less soluble, and more pressure was needed to re-adhere fragments.  They also noticed that for areas with damaged basketry elements, the reactivated pitch served to reinforce the plant fiber, so that no further stabilization of those woven elements was required.
Analytical Investigations
In addition to the treatment, instrumental analysis was conducted to characterize the two types of pitch and determine if there were any changes in the pitch before and after treatment.  The analysis was conducted using Fourier Transform Infrared Spectroscopy (FTIR) and optical microscopy.
The first investigations looked at the two types of pitch and whether there were any changes observed before and after treatment.  Analysis showed that there were no differences before and after treatment and therefore reactivation and exposure to ethanol did not alter the material chemically.  There were differences, however, noted between the red and dark pine pitch. The transparent red pitch had a low aromatic component as opposed to the dark brown-black material, which had a high aromatic hydrocarbon content.
A series of experiments was then conducted in order to figure out what accounts for these differences, and it turns out it has to do with how clean the pine pitch is and at what temperature it was initially heated during application.  Under optical microscopy, the dark pitch seemed to contain woody materials and had inclusions of bark.  Could this be the explanation for the differences in the aromatic content?
Samples of resin from piñon pines in the Navajo area were collected, heated to different temperatures, and then examined using microscopy as well as FTIR.  It turns out that if the pitch is clean and does not contain any woody components, there are little to no aromatics.  However, when bark is present in the pitch, the aromatic content is similar to that seen in the pitch coating the ethnographic baskets.  The heating temperature also plays a role in the color, and a temperature of 180 °C produces pitch similar to that seen on the ASM baskets.
This was a really informative talk describing a new approach to not only the treatment of crumbly pine pitch, but also a protocol for treating large numbers of unstable baskets.  The talk was of particular interest to me because some close colleagues and I have often encountered similar types of condition issues with different resinous materials on archaeological objects (for example bitumen coatings on ceramics, bitumen or pitch on baskets, natural resins on Egyptian funerary objects and mummies) and have often discussed the need for approaches to the stabilization of these materials other than consolidation using synthetic resins.   The literature is a bit lacking in terms of the treatment of these types of materials and it’s wonderful that Nancy and her team at ASM are adding to this body of information by sharing their treatment methods and findings (and hopefully publishing them in the OSG Postprints or another publication!).
The next stage of the pine pitch/basketry project will be to work on the archaeological basketry collection, and I look forward to hearing about their approaches to the stabilization of pitch on those artifacts.

42nd Annual Meeting – Paintings Session, May 29, "Eclectic Materials and Techniques of American Painters: 1860-1910" by Lance Mayer and Gay Myers

Gay Myers, with the support of Lance Mayer, presented research on American artists gathered from primary sources including artists’ interviews, notebooks, letters, manuals, and suppliers’ catalogues, periodicals, and advertisements. Their presentation focused on a period when more Americans began traveling to Europe.
The influence of instruction from French academics like Thomas Couture (1815-1879) was particularly strong. The American painter Elizabeth Boott (1846-1888) wrote manuscripts about European techniques that delineated Couture’s studio instruction in Paris, William Morris Hunt’s (1824-1879) classes in Boston, and Frank Duveneck’s (1848-1919) practice in Munich. Couture advocated the method of painting thinly over brown underlayers (these paint layers become more transparent over time, and so, this method has sometimes led to problems). He influenced several nineteenth century American painters including Eastman Johnson (1824-1906), Winslow Homer (1836-1910), and Thomas Eakins (1844-1916). Hunt and his pupil Helen Knowlton (1832-1918) believed that caring too much about one’s technique was stifling. Duveneck employed large amounts of oil media in his paintings to achieve a “buttery” application and sealed his works with extremely glossy varnishes. Duveneck’s varnishes were so thick that the American painter John Singer Sargent (1856-1925), who preferred light varnishes, advised others not to let “D” or any of his boys varnish their paintings.
The Art Amateur (1879–1903), an American magazine edited by Montague Marks (1847-1905), used the artists’ advice columns to document Thomas Dewing’s (1851-1938) use of matte varnishes, the growing popularity of the shellac-based Soehnée’s varnish as both a retouching and final varnish, and the early beginnings of the tempera revival in America. The American author Albert Abendschein (1860-1914) was among those in opposition to the tempera revival and has been quoted stating “the egg is more useful taken internally and kept out of the studio.” Abendschein instead advocated for indirect painting, in which glazes are layered onto a monochromatic underpainting. In his 1906 book, The Secret of the Old Masters, Abendschein documented the growing tempera revival, commercially produced paints containing wax, as well as other art trends.
J.G. Vibert (1840-1902), Edward Dufner (1872-1957), Mary Louise McLoughlin (1847-1939), and other significant members of the art community discussed varnishing practices, pigments, added media, and supplementary topics in a series of interviews conducted by DeWitt McClellan Lockman (1870-1957). The French author Vibert advocated a preference for petroleum solvents, and similarly, the American artist Dufner began using kerosene oil instead of turpentine because it dries without a glossy sheen. Dufner considered glossy surfaces so undesirable that he wrote on the verso of one of his paintings: “This picture being in a light key is meant to be matte surface and should never be varnished.” Vibert was also a staunch believer that lead white was not compatible with vermillion or cadmium and offered zinc white as an alternative. Concern about the toxicity of lead white also led many artists, including McLoughlin, to start using zinc white. Since that time, technical analysis has confirmed zinc white is more prone to cracking than lead white.
This presentation effectively demonstrated the extent to which American painters experimented during the late nineteenth and early twentieth centuries. If you would like to learn more about the materials and techniques of American painters, Mayer and Myers have authored multiple publications including American Painters on Technique: The Colonial Period to 1860 (2011) and American Painters on Technique: 1860-1945 (2013).
American Painters on Technique
About the Speakers
Lance Mayer and Gay Myers graduated from the Oberlin College conservation program (1977 and 1978) and work as independent conservators for private collectors and public institutions including the Lyman Allyn Art Museum. The authors are Fellows of the American Institute for Conservation of Historic and Artistic Works (AIC) and have each served as chair of the AIC Paintings Specialty Group. They have collaborated on conservation and research projects for over thirty years, were awarded the Winterthur Advanced Research Fellowship (1999), were Museum Scholars at the Getty Research Institute (2003), and received the College Art Association/Heritage Preservation Award for Distinction and Scholarship in Conservation (2013).

42nd Annual Meeting – Photographic Materials, May 31, "László Moholy-Nagy: Characterization of his Photographic Work at the Art Institute of Chicago and his Working Practices" by Mirasol Estrada

Mirasol Estrada, the Andrew W. Mellon Fellow in Photograph Conservation at the Art Institute of Chicago, studied the work of László Moholy-Nagy in the museum’s collection for two years. Her talk was a comprehensive look at the photographer’s working practices as well as the specific characteristics of his photographs in the collection at the Art Institute of Chicago. Ms. Estrada was drawn to the work of Moholy-Nagy because of the experimental nature of his working practices and his philosophy of photography.
Moholy-Nagy always thought of himself as a painter, but he also produced drawings, film, and photographs. He came to Chicago in 1937 to direct the New Bauhaus, which then became the Institute of Design. He was an influential teacher, including teaching his philosophy about the practice of photography, published in his book “Vision in Motion”. This philosophy was summarized very nicely by Mirasol, who described Moholy-Nagy’s idea that there are eight varieties of seeing.
The first variety of seeing is “Abstract”, which includes photograms, which are direct records of shapes and shadows. The second is “Exact”, which is straightforward camera photography. The next is “Rapid”, which shows motion, followed by “Slow”, his description for long exposures. “Intensified” was using chemical manipulation such as solarization. “Penetrative” described x-rays, “Simultaneous” was the term for his photomontages, and lastly, “Distorted” was the term for mechanical or chemical manipulation of a print or negative. This was an interesting summary of Moholy-Nagy’s ideas about the variety of seeing correlated to his photographic method – a good window into the photographer’s thinking process and the categorization of his themes.
Ms. Estrada then took us through the characterization, both physical and analytical, of the thirty-nine photographs in the Art Institute’s collection. She grouped the prints physically by their size, tonal range, surface texture, finishing (coatings), and thickness. These groupings were displayed in a very clear, easy-to-read chart detailing the characteristics, including thumbnail photos of each object overall and in detail to show tone, surface texture, etc. Analytical data for each object was also included (XRF, FTIR) to complement her visual observations. Using her chart, one could compare the data clearly and easily, looking at tone, texture, and subject of the image.
A few interesting observations were made following the study. Moholy-Nagy most likely did not process his own photographs; he is known to have explained that he was allergic to the development chemicals. This may explain the diverse body of work and materials choices, since his students, wife, and daughter may all have been a part of the processing of his artwork. Moholy-Nagy used many different types of paper and other materials, especially noticeable after his move from Europe to the US, reflecting the marketplace at each time and place. Ms. Estrada suggests that it was perhaps more important for the artist to express his ideas, his complex categories of “varieties of seeing”, in the Bauhaus tradition, than to focus on the fabrication of his artwork.

42nd Annual Meeting – Workshop, May 28, 2014, “Dataloggers – Establishing and Maintaining Environmental Monitoring Systems” by Rachael Perkins Arenstein and Samantha Alderson

This workshop was a smorgasbord of dataloggers, filled with details about how they function, how the recorded information is moved from one device to another to be analyzed and repurposed, and how to think about choosing the right type of datalogger to match a particular environmental goal. I came into the workshop hoping to learn about new equipment that’s on the market now, to advance my institution’s upcoming project to re-invigorate our environmental monitoring and control program, in support of both energy and preservation goals. I got what I came for!

Workshop instructor and participants examining a long table with many types of dataloggers laid out in rows
Samantha Alderson and Suzanne Hargrove discussing datalogger options

The workshop was taught by Samantha Alderson and Rachael Perkins Arenstein, both of whom have advised institutions large and small about environmental monitoring programs, and clearly know what they are talking about. They recently updated the National Park Service Conserve-O-Gram (“Comparing Temperature and Relative Humidity Dataloggers for Museum Monitoring,” September 2011, Number 3/3, http://www.nps.gov/museum/publications/conserveogram/03-03.pdf ), which is worth reviewing, but with the caveat that the technology is changing so rapidly that vendors and specifications should be researched anew when you’re planning for a major purchase.
The presenters started by reviewing the basics of the hardware and connectivity, summarizing what kinds of data loggers can collect, how many loggers one needs, where to place them, and for how long. They also talked about general environmental management concepts so the less experienced in the audience wouldn’t be left behind.
They then explained a basic difference between two families of dataloggers:

  • Stand-Alone Loggers collect data which is then harvested either by direct wired connection to a computer, through an indirect intermediary device like a card reader or thumb drive, or wirelessly; this method is appropriate when you don’t need real-time data (a toy example of working with harvested data follows this list)
  • Connected Loggers, either wired (Ethernet) or wireless (radio, WiFi, cellular, etc.), transmit data to a receiver that then aggregates the data from one or more devices; this method is appropriate when you need real-time data, need to receive alerts, and when you need to manage a lot of devices

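To make the data-harvesting idea concrete, here is a toy Python sketch (my own illustration, not anything shown in the workshop) that reads a hypothetical CSV export from a stand-alone logger and reports how often readings fall within a target RH band; the file name and column names are assumptions and will differ by manufacturer.

```python
# Hypothetical sketch: summarizing data harvested from a stand-alone logger.
# Assumes a CSV export with "timestamp", "temp_c", and "rh" columns; real
# exports differ by manufacturer, so adjust the column names accordingly.
import csv

def percent_in_band(path, rh_low=40.0, rh_high=60.0):
    """Return the percentage of readings whose RH falls in [rh_low, rh_high]."""
    total = in_band = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rh = float(row["rh"])
            total += 1
            if rh_low <= rh <= rh_high:
                in_band += 1
    return 100.0 * in_band / total if total else 0.0

print(f"{percent_in_band('gallery_logger_export.csv'):.1f}% of readings in 40-60% RH")
```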
Other topics covered included datalogger software and data management, calibration, and a group activity in which we had to choose (and justify) a monitoring system for one or more specific scenarios. This activity was my favorite part of the workshop, and I wish we had spent more time on this. It was a practical test of how to figure out why and what you need to monitor, and how to maximize your resources to achieve that goal.
Helpful handouts included charts of various datalogger models/systems with comparison of many variables including costs. Options to consider include: connectivity, size and aesthetics of the logger, battery type and life, sensor quality, data capacity, cost, accessibility of the device once installed, built-in display of current readings, the ability to display and/or communicate alarms, sampling rate, calibration method, probe options, and software platform compatibility.
Here are some take-aways that for me will inform my upcoming work:

  • The landscape of available hardware is rapidly changing with developments in communications technologies; Bluetooth is the hot technology according to several vendors, so they are investing their development efforts into Bluetooth connectivity for their upcoming upgrades and new releases
  • Sensor quality matters, but there are also differences (reflected in the wide range of prices) in everything else around the sensor…most notably the architecture of the device, the circuitry and algorithms used to translate the sensor data into numbers. You get what you pay for, but that should be matched to what you need.
  • Sensors are very sensitive to organic vapors! They can be destroyed by a big whiff of solvents, and even thrown off by off-gassing from the plastic housing in which they are mounted.
  • Loggers need to be checked for accuracy (you can do this yourself with saturated salt solutions according to instructions in another helpful handout), and if they have drifted, they need to be recalibrated (some you can do yourself, others have to be sent to the manufacturer); battery replacement is also variable (some are DIY, others not). A rough sketch of such a drift check follows this list.
  • Most connected loggers require IT support for installation in an institution, so include your IT staff during the planning phase; be sure to ask them about WiFi encryption requirements
  • Wireless technologies may be affected by building/exhibit case material and construction, as well as nearby noise-emitting sources
  • Software varies a lot, but some of the systems can import data from other manufacturers’ devices; again, you get what you pay for, but the options I favor include the ability to import climatic data, graphical visualization of the data in a format that’s understandable by a range of audiences, and good tech support.
  • Get a demo set from the vendor prior to purchasing the whole system to make sure it works for your building

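As a rough illustration of that drift check (my own sketch, not the handout’s procedure), the widely published equilibrium RH values of saturated salt solutions, e.g. NaCl at about 75.3% RH and MgCl2 at about 32.8% RH near 25 °C, can serve as reference points:

```python
# Hypothetical sketch of a single-point drift check against a saturated salt
# solution. Reference values near 25 C are widely published: NaCl ~75.3% RH,
# MgCl2 ~32.8% RH. This is only the idea, not the workshop handout's procedure.

SALT_REFERENCE_RH = {"NaCl": 75.3, "MgCl2": 32.8}

def drift(logger_reading_rh: float, salt: str = "NaCl", tolerance: float = 3.0):
    """Return (offset, needs_recalibration) for a logger sealed over the salt."""
    offset = logger_reading_rh - SALT_REFERENCE_RH[salt]
    return offset, abs(offset) > tolerance

offset, recalibrate = drift(79.1, "NaCl")
print(f"Logger reads {offset:+.1f}% RH relative to the NaCl reference; "
      f"{'recalibrate' if recalibrate else 'within tolerance'}")
```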
At many points throughout the Collections Care-focused Annual Meeting, I noticed that careful environmental monitoring and interpretation of the data become a fundamental part of energy savings, decision-making, grant funding, and the construction/renovation of storage spaces. I almost wish the workshop had happened right after the meeting instead of before, because I would have had many more big-picture questions to ask of the presenters. Mostly, I want to hear a more substantive discussion about why we monitor, and how to translate the data into words that advance preservation priorities. Environmental monitoring is a time- and resource-intensive process, so we should be thoughtful and strategic about it.

42nd Annual Meeting- General Session, May 30, "Using Webinars to Tackle Conservation Misinformation in Ontario's Community Museums" by Fiona Graham

“Conservation is an elusive practice just outside of budgetary reality.”  Fiona Graham, a conservation consultant in Kingston, Ontario, received this comment in a survey filled out by a small museum in Ontario, and it made her take notice.  The belief that conservation equates only to (costly) treatment leaves no room for implementing best practices or taking vital preventive measures, and leads to a general misunderstanding of the basic principles of preservation.  Graham set out to change the perceptions of these museums and chose webinars as her format.
Who: Ontario’s Community Museums–roughly 300 institutions that range in size but are not art galleries, private collections, or national museums.  Only 14 have in-house conservators (in one case, 9 museums share one conservator!).  The collection care for the remaining 286 falls into the hands of non-conservators.
Why: 185 of those Ontario Community Museums receive operating grants from the Ministry’s Museum Unit to survive economically.  In order to receive these grants, the museums must meet regulatory requirements, including a conservation standard.  To assess the state of conservation and preservation in the museums, a questionnaire was distributed to the museums, and Graham and her team discovered some startling misunderstandings.  For example, many respondents believed that light damage was caused only by UV, that pesticides are still needed, and that cold temperatures are always bad for collections.  (Since they are in colder climates, it’s especially disconcerting to think of the expenses paid to raise temperatures in these museums.)
What was done:  To debunk misunderstandings at as many of the museums as possible, the Ministry funded two 1.5-hour-long webinars.  The webinar format was chosen because it can reach a targeted audience, has wide accessibility and the ability to be interactive, is inexpensive to produce, and has been successful through the Ontario Museum Association (an organization that provides training in museum work).  After institutions answered preliminary questions on their registration forms, webinars were conducted as PowerPoint presentations narrated live by a conservator using the iCohere platform.  The first webinar, Conservation 2.0, was a “good practice” refresher course meant for non-conservators, while the second, Climate Control: what do you really need?, focused on misinformation hot spots.  Participants used their own computers and sent questions to a moderator who passed them to the conservator to answer.  The Ontario Museum Association posted the slide deck and audio to their website after the webinars ended.
More details?  The prep questions: Define what conservation means in the context of your museums? What question about conservation would you like answered in this webinar? What do you think relative humidity and temp levels should be in your museum’s collection areas? Do you monitor RH and/or T; do you actively control RH? (The webinars included a disclaimer that “this webinar is not a substitute for proper training.”)
Results:  The webinars were open to all, not just the Ministry-funded institutions, and 55 organizations participated during the live broadcasts.  The prep questions from the registration forms informed the content of the webinars.  There was positive feedback overall, with requests for more programs.  The negative feedback regarded the amount of detailed information on conservation.  Graham recommends being very clear on expectations.  The webinar team will be able to gauge the long-term results of the refresher courses during the next audit in 2018.
(Author’s comments: This talk was part of the general session on Engaging Communities in Collections Care.  The U.S. Heritage Preservation organization also offers webinars to help smaller institutions with collections care.  Their webinars are part of their Connecting to Collections (C2C) online community.  Past programs are available in their archives.)

AIC’s 42nd Annual Meeting – Opening Session, May 29, “Precaution, proof, and pragmatism: 150 years of expert debate on the museum environment” by Foekje Boersma, Kathleen Dardes, and James Druzik

Foekje Boersma, along with Kathleen Dardes and James Druzik, provided an informative summary of the debate regarding environmental standards in their presentation “Precaution, proof, and pragmatism: 150 years of expert debate on the museum environment.”  The presentation began with a historical review, based in part on information obtained from AIC’s Conservation Wiki.
The Museum of Fine Arts Boston and the Cleveland Museum of Art were the first museums to set specific humidity recommendations, in 1908 and 1915, respectively.  It is often stated that the development of environmental standards arose as a by-product of the storage of artworks in salt and coal mines during World War II, so I was interested to learn of earlier attempts at environmental control.
In 1940, Harold Plenderleith and George Stout said there was not adequate information to fix an “absolute standard” but suggested 60 – 65% relative humidity, chosen because it was easiest to maintain with stability.  Later, Plenderleith, now working with Paul Philippot, prescribed a “region of security” of 50 – 65% RH.  According to Boersma, these early conservators were pragmatic: although a set temperature and RH were specified, a greater emphasis was made on avoiding extremes.  The local climate and historical conditions of the objects were also to be taken into account.  Garry Thomson, who is often assigned either the credit or blame, depending on whom you ask, for the 50% RH/70° F standard, is misinterpreted according to Boersma.  He was also pragmatic.  Rather than endorsing the 50/70 rule, he merely predicted the increasing number of museum loans would lead to museums adopting that rigid standard.
Boersma attributes the widespread implementation of the 50/70 rule to the museum building boom in the 1970s.  Architects and engineers wanted numerical targets, and conservators were happy to specify safe conditions.  Sustainability was not much of a concern given cheap energy costs.  But already by 1979, CCI was advising seasonal variations with gradual fluctuations.  Boersma then skipped ahead to the 1990s and the controversial research of Charles Tumosa and Marion Mecklenburg at MCI, which said that materials aren’t as sensitive as previously thought.
Today, the debate on the museum environment has moved from conservators to museum directors and administrators.  The Bizot Group, concerned about environmental and economic sustainability, pushed to broaden environmental standards by adopting new Guiding Principles and Interim Guidelines, influenced by those developed by the NMDC (the National Museum Directors’ Council). In response, guidelines were published by many other groups, such as AIC, BSI, AICCM, and the Doerner Institut.
In order to clarify the debate, Boersma divides prevailing views into three categories: precautionary safety, proven safety, and pragmatic risk management.  Precautionary safety, embodied by the Doerner Institut’s Munich Position, centers around the belief that “stable is safe.”  Not enough research has been done on the response of objects to wider environmental conditions.  To eliminate risk, objects should be kept under a narrow set of conditions.  Supporters of the proven safety approach acknowledge that actual conditions are wider than 50/70 because tight standards are impossible to maintain.  The proofed fluctuations of 40 – 60% RH and 50 – 70˚ F are acceptable.  Pragmatic risk management reflects ideas of risk assessment developed in the 1990s.  Resources should go to the reduction of the biggest risks to collections, which may or may not be climatic fluctuation.
In conclusion, Boersma wonders how conservators can function as a profession given such different views on a central topic.  She references her ongoing research as part of GCI’s Managing Collection Environments Initiative, which is working to answer questions generated by the debate.

42nd Annual Meeting – Opening Session, 29 May, "Quantifying cost effectiveness of risk treatment options (aka preventive conservation)" by Stefan Michalski and Irene F. Karsten

Preventive conservation was the topic of much discussion at this year’s annual meeting, from how to teach it to what exactly it entails. In this talk, Stefan Michalski discussed the quantification of preventive conservation.
He began by reminding us that we base our ideas of preventive conservation on the “proofed fluctuation” argument: if fluctuation in the past has not caused significant damage, then similar future fluctuations will not either. He also defined preventive conservation. First, we assess risks. Then, we ‘treat’ risks;  this second part is Preventive Conservation. We have to remember that ‘treat’ has a different meaning in this context than in remedial conservation, and despite being a loaded word, accurately describes what we do. These definitions are simultaneously straightforward and complicated; we struggle with them and yet we need them for our daily work.
Michalski continued by defining the four steps to successful preventive conservation:
1. Identify Options
2. Analyze
3. Evaluate
4. Implement
Steps 2-3 require quantification, and it’s vital that this quantification is transparent and well-documented. This is where Michalski and Karsten’s research comes in. They assessed the financial risk of every preventive option available for a variety of institutions, including an archive and a historic house.
In order to quantify reduction in risk, calculations were made using the following formulas:

  • Option effectiveness = size of risk reduction = size of original risk − size of reduced risk
  • Cost-effectiveness = risk reduction / cost = (% of collection saved / $ spent) per year (a minimal worked example follows below)

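Here is a minimal worked example of that arithmetic with entirely hypothetical numbers (not Michalski and Karsten’s data), just to show how the ratio lets you rank options:

```python
# Illustrative only: hypothetical numbers, not Michalski and Karsten's data.
# Option effectiveness = original risk - reduced risk (expressed here as the
# expected % of collection value lost per year); dividing by annualized cost
# gives % of collection saved per dollar per year, which is used for ranking.

def cost_effectiveness(original_risk, reduced_risk, annual_cost):
    risk_reduction = original_risk - reduced_risk   # % of collection saved per year
    return risk_reduction / annual_cost             # % saved per $ per year

options = {
    "new roof over archive":      cost_effectiveness(2.0, 0.1, 15000),
    "UV film on gallery windows": cost_effectiveness(0.5, 0.05, 1200),
    "silica gel in one showcase": cost_effectiveness(0.02, 0.01, 150),
}
for name, score in sorted(options.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2e} % of collection saved per $ per year")
```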
I had never encountered this calculation before, or considered this as a feasible method of determining cost-effectiveness and ranking options, and I don’t think I’m alone in the conservation field in this. I wish that this had been covered by one of my graduate courses, because while it may seem obvious in some ways, the explanation was exceptionally helpful, and is something that I will take to my professional practice.
The numbers produced graphs on a logarithmic scale, in terms of percent saved per dollar. By evaluating options on this scale, it was possible to see how cost-effective various options are. What was highlighted with this calculation is that the cost-effectiveness of an action is a function of the magnitude of risk – the bigger the risk, the better the return on percentage saved. This is in line with the economic principle of ‘economies of scale’. What Michalski noted was that it is important to remember that the scale referred to is internal, not external, which means that small museums can be just as cost-effective as larger museums.
I loved this talk, and I felt like I learned a huge amount about quantification of risk. ‘Risk assessment’ is a term that we are all familiar with; to be able to go more in-depth is a skill, and Stefan Michalski did an excellent job of teaching that skill. His results are hugely applicable to museums and institutions of all sizes, and we should all learn and apply this method to aid in our decision-making for preventive conservation.

42nd Annual Meeting – Collection Care & HVAC, May 31, "Some trends in examining six years of utility and climate data at the Museum of Modern Art" by Jim Coddington

Jim Coddington, the chief conservator at the Museum of Modern Art (MoMA) in New York, presented some trends that were found from analyzing the environmental data collected at MoMA over the past six years. This was particularly interesting because it compared two relatively new or newly renovated buildings with different types of usage/functionality and HVAC systems. The building on 53rd Street, Jim admits, is very leaky from a number of sources, including the many doors through which thousands of people pass, and has a steam and electric HVAC system. The building in Queens (QNS), on the other hand, is mostly concrete with very little glass and has a gas-powered HVAC system. The data that Jim presented was collected from across the museum, including finance, operations, conservation, and visitor services. Needless to say, there are a lot of people invested in this.
Jim showed mostly graphs and charts. These included data showing the temperature and %RH outside and inside the buildings, the dew point, and comparisons of this with energy usage. I’ve included images of the graphs that I found most interesting or informative.

NYC average monthly temperature (6-year average) showing periods of cooling and heating inside the QNS building. Most graphs showed what the temperature was at 1 PM each day.

This graph shows the indoor RH calculated from a fixed outdoor dew point and a variable indoor set-point temperature.

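As a rough illustration of the relationship behind that graph (my own sketch using the Magnus approximation, not MoMA’s actual model): if outside air at a fixed dew point is warmed to an indoor set-point temperature without adding or removing moisture, the resulting indoor RH can be estimated as follows.

```python
# Rough illustration (not MoMA's actual calculation): air at a given outdoor
# dew point, warmed indoors to the set-point temperature with no moisture
# added or removed, ends up at an RH given by the Magnus approximation.
import math

def saturation_vp(t_c):
    """Saturation vapour pressure in hPa (Magnus approximation)."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def indoor_rh(outdoor_dew_point_c, indoor_setpoint_c):
    return 100.0 * saturation_vp(outdoor_dew_point_c) / saturation_vp(indoor_setpoint_c)

for dew_point in (-10, 0, 10, 15):
    print(f"dew point {dew_point:>3} C -> {indoor_rh(dew_point, 22):.0f}% RH at a 22 C set point")
```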
In QNS there is a large expenditure of gas in August and dips in winter. This is because they are able to use free cooling to extract excess heat for 8 or 9 months, or 3 out of 4 seasons, through a heat exchanger on the roof. In this process, heat is absorbed from the condenser water by air-chilled water. The length of time they are able to use free cooling is based on set points of T and RH (see the second image) and is affected by air temperature, relative humidity, and water supply temperature. Non-free cooling, with the RH set at 50%, happens over the summer and is longer at lower temperatures, so during the summer the temperature set point is allowed to drift to 22 °C. Jim mentioned that having a narrower set point may actually equal cost savings, but they have no data for that.
On the analysis for the 53rd Street building, Jim highlighted that this is a very different situation. It is a high-use building, with lots of leakage points and demand on the systems (steam and electric principally). Therefore, the energy usage is much higher.
It has been asked whether heat from visitors is significant. According to Chris McGlinchey’s calculation, based on the roughly 360 kJ/hr given off by a visitor over a typical stay of 4 hours, this is not a huge contributing factor.
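For scale, here is a quick back-of-envelope version of that calculation (with a hypothetical occupancy figure, since actual visitor numbers were not given in the talk):

```python
# Back-of-envelope scale check; the visitor count is hypothetical, not MoMA's.
# 360 kJ/hr per visitor is about 100 W, i.e. roughly one incandescent bulb.
visitors_in_building = 2000          # hypothetical instantaneous occupancy
heat_per_visitor_kw = 360 / 3600     # 360 kJ/hr = 0.1 kW
print(f"{visitors_in_building * heat_per_visitor_kw:.0f} kW of visitor heat")
# The talk's conclusion: not a huge contributing factor for a building this size.
```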
The combined energy usage in kJ/m2 at the 53rd street and QNS buildings.

In Jim’s summary and conclusions, he stated the expected: they are consuming more energy in the 53rd Street building than at QNS. This is mostly in winter (see the third image). The QNS building is more efficient because of the free cooling and the lower set-point temperature, and its efficient building design equates to lower energy usage.
Online Resources:

  • Steam- natural gas utility converter: http://www.coned.com/steam/default.asp
  • NIST Guide for the Use of the International System of Units (SI) 2008: http://physics.nist.gov/cuu/pdf/sp811.pdf
  • Humidity converter: http://www.cactus2000.de/uk/unit/masshum.shtml
  • Dewpoint calculator: http://www.decatur.ed/javascript/dew/index.html
  • NOAA National Climatic Data Center: http://www.ncdc.noaa.gov/

42nd Annual Meeting – OSG, May 31, "Restoration by Other Means: CT Scanning and 3D Computer Modeling for the Re-Restoration of a Previously Restored Skull from the Magdalenian Era" by J.P. Brown and Robert D. Martin

After collaborating with JP at the Field Museum on rendering CT scans a few years ago and seeing his article about this work in the spring MRCG newsletter, I was excited to see some images about this in person. JP has been working with CT scanners since 2006, starting out by taking advantage of the kindness of local hospitals and more recently renting a portable unit that came to the museum on a truck.
As many of us know, CT scanners can look inside objects non-destructively and provide accurate images with 3D geometric accuracy. JP started the talk by reviewing some of the physics of getting a CT scan done, the benefits, and the limitations. Here’s a run-down:
1. The scanner has a donut shaped gantry consisting of a steel ring containing the X-ray tube and curved detector on the opposite side, so your object has to fit within the imaging area inside the steel ring.
2. On each revolution you get lots of images, scanned within 30 seconds to 5 minutes; this is very fast.
3. The biggest logistical challenge is moving objects to and from the hospital safely.
4. During the scanning you immediately get slices, which are cross-section images from three different directions. Volumetric rendering is done from the slices, and there is free software for this.
5. Apparently it is relatively easy, if time-consuming, to do segmentation (segmenting out regions of interest) and to extract wireframe models. From there you can get images of the surface and texture and can even print the models. It is relatively easy to go from slice to wireframe, but harder to achieve a manufacturing mesh to produce a 3D print, which can be expensive in comparison to traditional molding and casting. (A minimal sketch of the slice-to-mesh step follows this list.)
6. PROs of scanning and printing: there is no contact with the object, complex geometry is not a problem, the scans and volumetric rendering are dimensionally accurate, you can print in lots of materials; prints can be scaled to make large things handleable or small things more robust for handling or increase visibility; subtractive manufacture, in which you can use a computerized milling machine to cut out a positive or negative, is also a possibility.
7. CONs of scanning and printing: printing is slow, the build volume is limited, a non-traditional skill set is required of conservators to produce the final product, and only a few materials age well. The best material is sintered nylon, extruded polyester may also be safe, but it doesn’t take paint well; it is hard to get the industry to think about permanence.
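Here is a minimal sketch of that slice-to-mesh step using common open-source tools (NumPy and scikit-image’s marching cubes); these particular libraries, file names, and the threshold value are my assumptions, not necessarily what the Field Museum team used.

```python
# Minimal sketch of going from CT slices to a surface mesh, using open-source
# tools (NumPy + scikit-image). These particular libraries are an assumption
# on my part, not necessarily what the Field Museum team used.
import numpy as np
from skimage import measure

# Stack the 2D slices (e.g. previously converted to .npy arrays) into a volume.
slices = [np.load(f"slice_{i:03d}.npy") for i in range(300)]   # hypothetical files
volume = np.stack(slices, axis=0)

# Extract an isosurface at a threshold meant to separate bone from air/plaster.
# Choosing this level is exactly where low bone/plaster contrast causes trouble.
verts, faces, normals, values = measure.marching_cubes(volume, level=500)

print(f"wireframe mesh: {len(verts)} vertices, {len(faces)} faces")
# The mesh can then be cleaned up and exported (e.g. to STL) for 3D printing.
```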
The object at the center of this project was a Magdalenian skull. The skeleton itself is of considerable importance, because it is the only nearly complete Magdalenian-era skeleton. A little history: it was excavated, quite professionally, in 1911 when they lowered the floor of the site. Unfortunately the burial was discovered when someone hit the skull with a pickax. Needless to say, the skull did not come out in one piece. In 1915 the full skeleton was removed in two blocks. My notes are a little fuzzy here, but basically at some point after the excavation the skull was restored and went from being 2 pieces to 6 pieces, as documented in a 1932 publication by von Bonen. It appears that at that point the skull was also skim coated with plaster. Thankfully (?) those repairs have held up. Great, so why did they need to scan and reconstruct the skull? Well, according to Dr. Robert Martin, JP’s colleague at the Field Museum, the skull doesn’t look anatomically correct. Apparently during the time period when it was put together there was an interest in race, and the skull fragments could have been lined up incorrectly, accentuating cultural assumptions.

Previous condition documentation image

One image slice from the CT scan

 
A previous x-ray showed that two fragments in the forehead are secured with a metal pin. In 2012, when the mobile CT scanner came to the museum, they were all geared up to start with the Magdalenian skull. Unfortunately there was not much difference in attenuation between bone and plaster, making it tricky to distinguish between the two materials in the scans. JP consulted a cranial reconstruction group and asked them to pretend this was a pediatric car crash victim with a cranial injury; they asked, why aren’t you using the Mimics software package?
 
In this scanner, the object sits on a rotating table, while the source and detector stay still. Since these are fixed, a full scan has to be done in parts depending on the size of the object.

JP and his team also imaged the skull with a micro CT scanner that has a 0.1 mm resolution versus the normal modern setting of 0.3 mm. They had previously identified 36 fragments of bone from the previous scan. It was hard to tell if some of those separations were just cracks or actual breaks between fragments. The hope was that the micro CT scanner could better define these areas. The micro CT scanner works opposite to the industrial/medical scanner: as you can see in the image to the left, the tube and detector are fixed, while the sample is rotated. Other differences are that it is slower (one scan takes 30-90 minutes) and, because of the scanner geometry, the skull had to be imaged in two scans. Because of this, JP used the previous scan to mill out a contoured support to hold the skull in the exact position. JP noted that digitally filling in the holes of the skull to create the support was the most time-consuming part of that process, and he suggests using different radio-opaque marker dots to identify left and right for orientation during the later stitching process. With the new scans, at least three separations were identified as cracks vs. breaks.
Now for the virtual reconstruction… the biggest obstacle in this stage was how to achieve something more anatomically correct using the virtual fragments when they have no boundaries. The fragments don’t push back in the computer, and they can easily move into each other. With the software, JP used mostly the translation and rotation functions and the free animation software Blender (which has a high learning curve and took several days to get accustomed to) to create hierarchical parent-child relationships between the fragments as he joined them together. Just like putting a vessel together, right? In the virtual world, at least, there is no worry about lockout. They had a 3D print made of the final skull reconstruction and had an artist do a facial reconstruction, which JP thinks always looks related to Jean-Luc Picard… So how successful was this? From a conservation perspective: awesome, it’s fully reversible! Scientifically, though, it’s decent, well documented, and scientifically justifiable. However, someone else could go through the same process and come up with a different reconstruction because of the reliance on left-right symmetry for this reconstruction.
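To give a flavor of the parent-child approach, here is a toy sketch in Blender’s Python API (bpy); the object names are hypothetical and this is not JP’s actual reconstruction script.

```python
# Toy sketch of the parent-child idea in Blender's Python API (bpy); object
# names are hypothetical, and this is not JP's actual reconstruction script.
import bpy
import math

parietal = bpy.data.objects["fragment_parietal"]   # hypothetical fragment meshes
frontal = bpy.data.objects["fragment_frontal"]

# Parent the frontal fragment to the parietal one, so moving the parent
# carries the child with it; setting matrix_parent_inverse keeps the child
# at its current world position when the parent link is created.
frontal.parent = parietal
frontal.matrix_parent_inverse = parietal.matrix_world.inverted()

# Nudge the parent: translate 2 mm along X and rotate 3 degrees about Z.
parietal.location.x += 0.002
parietal.rotation_euler.z += math.radians(3)
```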
 
Creating the virtual reconstruction

Comparison of the current restoration (left) and the virtual restoration (right)

So what did I take away from this talk? This was a very cool project, and if I have a question about CT scanning and 3D renderings, I will call JP! The scans can be extremely informative, and there seems to be a lot of potential in their use for mount-making, crates, storage, and possibly virtual reconstructions. Hopefully at some point in the future the software will become more intuitive and easier to use so that more of these types of projects can be done.

42nd Annual Meeting, Paintings & Wooden Artifacts Joint Session, May 31, "The Analysis and Reduction of an Intractable Coating for the Panel Painting by Lluis Borrassa, Christ Before Pilate," by William P. Brown & Dr. Adele De Cruz

The presentation by William P. Brown and Dr. Adele De Cruz was an awe-inspiring glimpse at the future of conservation. Through the collaboration of the North Carolina Museum of Art and conservation scientists from the University of Pisa and Duke University, an intractable layer of cross-linked drying oil, animal glues, and pigmented varnish was removed from the surface of Spanish painter Lluis Borrassa’s panel painting, Christ Before Pilate, 1420-25.
The painting, which had not been exhibited for over 40 years, was the victim of previous cleaning and coating campaigns, and several layers of consolidation materials, paints, and glazes had been applied to the blue passages of Christ’s robe. As a result of the cross-linking of these consolidants and the dark pigmentation of a concealing varnish layer, Christ’s robe appeared almost black.
During treatment at the North Carolina Museum of Art, solvents were successful in removing the toned varnish from the painting. However, the reduction of the complex layer of intractable material covering Christ’s robe (the abstract describes this as a composite of old consolidant, cross-linked drying oil, and restoration materials) was not so straightforward. Conservation scientists (from the aforementioned institutions) used FTIR, SEM, and GC-MS analysis to identify the components of the intractable layer and to discern them from original material, which consisted of lapis, indigo, and orpiment pigments in egg tempera and glue or bee pollen.
Dr. De Cruz took the podium at this point in the talk to describe the methods used to reduce the intractable composite material. Essentially, laser ablation was employed, which before this talk I was only familiar with in the context of dentistry. I have to admit that my initial reaction to hearing the terms ‘laser’ and ‘art conservation’ used together might have been a wary one, but refamiliarizing myself with the techniques involved in laser ablation (and recalling the established use of this technique on the delicate enamel surfaces of our teeth) was an encouraging and exciting reminder of the vast potential of interdisciplinary approaches to art conservation.
Dr. De Cruz explained that the 2940 nm Er:YAG (erbium) laser operates using an intense monochromatic wave of light (2.94 microns) at 15 pulses per second to vaporize the intractable material. The depth of penetration is very controllable, maintaining a shallow depth of penetration between 3-5 microns. This light pulse is highly absorbed by water and produces a near-instantaneous steam distillation. A glass cover slip is placed over the dirt, varnish, and paint layer. The laser is used to break up the intractable surface, which is ejected and contained by the glass cover slip. The debris is then swabbed from the surface of the painting and can be used for analysis.
There are several immediately obvious benefits to this method. It eliminates the need for toxic solvents, and it allows for a highly controllable and shallow depth of penetration. There is also no risk of chemical change to the substrate, and the reaction occurs at low temperature.
Dr. De Cruz went into incredible depth during this talk, and I realize that my summary only touches on the amount of information she provided. I was furiously scribbling notes the entire time, and certainly wished I had a camera to take photos of her slides. I certainly look forward to hearing more about this topic in the future, and am excited for the future and ongoing collaboration of conservation and science.