AIC’s 42nd Annual Meeting – Opening Session, May 29, “Precaution, proof, and pragmatism: 150 years of expert debate on the museum environment” by Foekje Boersma, Kathleen Dardes, and James Druzik

Foekje Boersma, along with Kathleen Dardes and James Druzik, provided an informative summary of the debate regarding environmental standards in their presentation “Precaution, proof, and pragmatism: 150 years of expert debate on the museum environment.”  The presentation began with a historical review, based in part on information obtained from AIC’s Conservation Wiki.
The Museum of Fine Arts Boston and the Cleveland Museum of Art were the first museums to set specific humidity recommendations, in 1908 and 1915, respectively.  It is often stated that the development of environmental standards arose as a by-product of the storage of artworks in salt and coal mines during World War II, so I was interested to learn of earlier attempts at environmental control.
In 1940, Harold Plenderleith and George Stout said there was not adequate information to fix an “absolute standard” but suggested 60 – 65% relative humidity, chosen because it was easiest to maintain with stability.  Later, Plenderleith, now working with Paul Philippot, prescribed a “region of security” of 50 – 65% RH.  According to Boersma, these early conservators were pragmatic: although a set temperature and RH were specified, greater emphasis was placed on avoiding extremes.  The local climate and the historical conditions of the objects were also to be taken into account.  Garry Thomson, who is often assigned either the credit or the blame, depending on whom you ask, for the 50% RH/70° F standard, is misinterpreted according to Boersma.  He was also pragmatic.  Rather than endorsing the 50/70 rule, he merely predicted that the increasing number of museum loans would lead museums to adopt that rigid standard.
Boersma attributes the widespread implementation of the 50/70 rule to the museum building boom in the 1970s.  Architects and engineers wanted numerical targets, and conservators were happy to specify safe conditions.  Sustainability was not much of a concern given cheap energy costs.  But already by 1979, CCI was advising seasonal variations with gradual fluctuations.  Boersma then skipped ahead to the 1990s and the controversial research of Charles Tumosa and Marion Mecklenburg at MCI, which said that materials aren’t as sensitive as previously thought.
Today, the debate on the museum environment has moved from conservators to museum directors and administrators.  The Bizot Group, concerned about environmental and economic sustainability, pushed to broaden environmental standards by adopting new Guiding Principles and Interim Guidelines, influenced by those developed by the NMDC (the National Museum Directors’ Council). In response, guidelines were published by many other groups, such as AIC, BSI, AICCM, and the Doerner Institut.
In order to clarify the debate, Boersma divides prevailing views into three categories: precautionary safety, proven safety, and pragmatic risk management.  Precautionary safety, embodied by the Doerner Institut’s Munich Position, centers on the belief that “stable is safe”: not enough research has been done on the response of objects to wider environmental conditions, so to eliminate risk, objects should be kept under a narrow set of conditions.  Supporters of the proven safety approach acknowledge that actual conditions are wider than 50/70 because tight standards are impossible to maintain; the proofed fluctuations of 40 – 60% RH and 50 – 70˚ F are acceptable.  Pragmatic risk management reflects ideas of risk assessment developed in the 1990s: resources should go to reducing the biggest risks to collections, which may or may not be climatic fluctuation.
In conclusion, Boersma wonders how conservators can function as a profession given such different views on a central topic.  She references her ongoing research as part of GCI’s Managing Collection Environments Initiative, which is working to answer questions generated by the debate.

42nd Annual Meeting – Opening Session, 29 May, "Quantifying cost effectiveness of risk treatment options (aka preventive conservation)" by Stefan Michalski and Irene F. Karsten

Preventive conservation was the topic of much discussion at this year’s annual meeting, from how to teach it to what exactly it entails. In this talk, Stefan Michalski discussed the quantification of preventive conservation.
He began by reminding us that we base our ideas of preventive conservation on the “proofed fluctuation” argument: if fluctuation in the past has not caused significant damage, then similar future fluctuations will not either. He also defined preventive conservation. First, we assess risks. Then, we ‘treat’ risks;  this second part is Preventive Conservation. We have to remember that ‘treat’ has a different meaning in this context than in remedial conservation, and despite being a loaded word, accurately describes what we do. These definitions are simultaneously straightforward and complicated; we struggle with them and yet we need them for our daily work.
Michalski continued by defining the four steps to successful preventive conservation:
1. Identify Options
2. Analyze
3. Evaluate
4. Implement
Steps 2-3 require quantification, and it’s vital that this quantification is transparent and well-documented. This is where Michalski and Karsten’s research comes in. They assessed the financial risk of every preventive option available for a variety of institutions, including an archive and a historic house.
In order to quantify reduction in risk, calculations were made using the following formulas:

  • Option effectiveness = size of risk reduction = size of original risk – size of reduced risk
  • Risk reduction / cost = [% of collection saved / $ spent] /year

I had never encountered this calculation before, or considered this as a feasible method of determining cost-effectiveness and ranking options, and I don’t think I’m alone in the conservation field in this. I wish that this had been covered by one of my graduate courses, because while it may seem obvious in some ways, the explanation was exceptionally helpful, and is something that I will take to my professional practice.
The numbers produced graphs on a logarithmic scale, in terms of percent saved per dollar. By evaluating options on this scale, it was possible to see how cost-effective various options are. What this calculation highlighted is that the cost-effectiveness of an action is a function of the magnitude of risk – the bigger the risk, the better the return on percentage saved. This is in line with the economic principle of ‘economies of scale’. Michalski noted that it is important to remember that the scale referred to is internal, not external, which means that small museums can be just as cost-effective as larger museums.
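To make the arithmetic concrete, here is a minimal sketch of the two formulas above applied to a few invented options. The option names, the numbers, and the way risk is expressed (expected % of collection value lost per year) are my own illustrative assumptions, not figures from the talk.

```python
# Illustrative only: invented options, with risk expressed as the expected
# % of collection value lost per year.
options = [
    # (name, original risk, reduced risk, annualized cost of the option in $)
    ("Add water-leak detection",  0.60, 0.10, 1_500.0),
    ("Reframe works on paper",    2.00, 0.50, 4_000.0),
    ("Upgrade storage shelving",  1.00, 0.80, 10_000.0),
]

results = []
for name, original, reduced, cost in options:
    effectiveness = original - reduced           # size of risk reduction (% of collection / yr)
    cost_effectiveness = effectiveness / cost    # % of collection saved per $ spent per year
    results.append((name, effectiveness, cost_effectiveness))

# Ranking by cost-effectiveness (largest first) shows which options deliver the
# most risk reduction per dollar; plotting these values on a logarithmic scale
# is what makes such different magnitudes comparable on one chart.
for name, eff, ce in sorted(results, key=lambda r: r[2], reverse=True):
    print(f"{name:28s} {eff:4.2f} %/yr saved   {ce:.2e} %/$/yr")
```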
I loved this talk, and I felt like I learned a huge amount about quantification of risk. ‘Risk assessment’ is a term that we are all familiar with; to be able to go more in-depth is a skill, and Stefan Michalski did an excellent job of teaching that skill. His results are hugely applicable to museums and institutions of all sizes, and we should all learn and apply this method to aid in our decision-making for preventive conservation.

42nd Annual Meeting – Collection Care & HVAC, May 31, "Some trends in examining six years of utility and climate data at the Museum of Modern Art" by Jim Coddington

Jim Coddington, the chief conservator at the Museum of Modern Art (MoMA) in New York, presented some trends found by analyzing the environmental data collected at MoMA over the past six years. This was particularly interesting because it compared two relatively new or newly renovated buildings with different types of usage/functionality and HVAC systems. The building on 53rd Street, Jim admits, is very leaky from a number of sources, including the many doors through which thousands of people pass, and has a steam and electric HVAC system. The building in Queens (QNS), on the other hand, is mostly concrete with very little glass and has a gas-powered HVAC system. The data that Jim presented was collected from across the museum, including finance, operations, conservation, and visitor services. Needless to say, there are a lot of people invested in this.
Jim showed mostly graphs and charts. These included outdoor and indoor temperature and %RH, dew point, and comparisons with energy usage. I’ve included images of the graphs that I found most interesting or informative.

NYC average monthly temperatures (6 year average) showing periods of cooling and heating inside the buildings.
NYC average monthly temperature (6 year average) showing periods of cooling and heating inside the QNS building. Most graphs showed what the temperature was at 1 PM each day.

Indoor RH
This graph shows indoor RH as a function of a fixed outdoor dew point and a variable indoor set-point temperature.

In QNS there is a large expenditure of gas in August and a dip in winter. This is because they are able to use free cooling to extract excess heat for 8 or 9 months, or 3 out of 4 seasons, through a heat exchanger on the roof. In this process, heat is absorbed from the condenser water by air-chilled water. The length of time they are able to use free cooling is based on set points of temperature and RH (see the second image) and is affected by air temperature, relative humidity, and water supply temperature. Non-free cooling with the RH set at 50% happens over the summer and lasts longer at lower temperatures, so during the summer the temperature set point is allowed to drift to 22 °C. Jim mentioned that having a narrower set point may actually equal cost savings, but they have no data for that.
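The relationship behind the second graph, indoor RH driven by a fixed outdoor dew point and a variable indoor set-point temperature, can be sketched with the Magnus approximation for saturation vapour pressure. This is my own illustration of the underlying physics, not MoMA’s calculation.

```python
import math

def saturation_vapour_pressure(temp_c: float) -> float:
    """Saturation vapour pressure in hPa (Magnus approximation, over water)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def indoor_rh(outdoor_dew_point_c: float, indoor_setpoint_c: float) -> float:
    """Indoor RH (%) when air at a given outdoor dew point is conditioned to the
    indoor set-point temperature with no humidification or dehumidification."""
    return 100.0 * (saturation_vapour_pressure(outdoor_dew_point_c)
                    / saturation_vapour_pressure(indoor_setpoint_c))

# Example: a 10 °C outdoor dew point at a 22 °C set point gives roughly 46% RH.
# For a given dew point, a higher set-point temperature means a lower indoor RH,
# and therefore less dehumidification work for the system.
print(f"{indoor_rh(10.0, 22.0):.0f}% RH")
```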
For the 53rd Street building, Jim highlighted that the analysis presents a very different situation. It is a high-use building, with many leakage points and heavy demand on the systems, principally steam and electric. Therefore, the energy usage is much higher.
It has been asked whether heat from visitors is significant. By Chris McGlinchey’s calculation, based on the 360 kJ/hr (roughly 100 W) given off by visitors over a typical stay of 4 hours, it is not a huge contributing factor.
The combined energy usage in kJ/m2 at the 53rd street and QNS buildings.

In his summary and conclusions, Jim stated the expected: they are consuming more energy in the 53rd Street building than in QNS, mostly in winter (see the third image). The QNS building is more efficient because of its free cooling, lower set-point temperature, and efficient building design, all of which equate to lower energy usage.

Online Resources:

  • Steam- natural gas utility converter: http://www.coned.com/steam/default.asp
  • NIST Guide for the Use of the International System of Units (SI) 2008: http://physics.nist.gov/cuu/pdf/sp811.pdf
  • Humidity converter: http://www.cactus2000.de/uk/unit/masshum.shtml
  • Dewpoint calculator: http://www.decatur.ed/javascript/dew/index.html
  • NOAA National Climatic Data Center: ncdc.noaa.gov/

42nd Annual Meeting – Opening Session, 29 May, "Being a Gallery in a Park – balancing Sustainability, Access and Collection Care" by Nicola Walker and Ann French

This talk revolved around the Whitworth Art Gallery, part of the University of Manchester in the UK. I was interested in this talk in particular because I wanted to see the differences between UK and US approaches to sustainability, and how sustainability is weighed against other principles such as access and recommended storage conditions.
One of the central themes of this talk was that “access is central to all of the gallery activities”. This resulted in some interesting decisions, which strike a balance between the practical and the ideal. One that stuck out to me personally was the presence of an IPM working group which meets weekly to discuss what needs to be done to ensure that events like festivals and those involving food can be pulled off. Their maintenance of a ‘can do’ attitude is inspiring, and ensures that the museum works with its surroundings – a park, which families want to be able to visit and enjoy in tandem with the museum.
The process which the museum went through in order to add an addition to the building was also discussed. A few points stood out there, as well:
– A new route was introduced to separate catering delivery from art movement and delivery (which is also related to the IPM working group).
– A green, bio-diverse roof was put into place on part of the building.
– Stores were relocated into a basement, where the environment can be controlled with passive techniques rather than air conditioning.
– Solar panels were added to the roof.
– Daylight was introduced into some galleries.
– A ground source heat pump was installed.
The idea of the green, bio-diverse roof was fascinating. In order to prevent it from drawing unwanted pests into the museum, they worked with entomologists to ensure that they only attracted specific insects – those who don’t want to eat their lovely textile collection. The introduction of daylight into galleries as discussed here formed a funny comparison to another talk given on sustainability and environmental consciousness.
Another aspect to sustainability was also discussed: the development of working patterns which allow the collection to be feasibly managed and kept in the best condition. One of the theories they work under is known as the Pareto 80:20 principle, which says that 80% of results come from 20% of issues, or in this case, 20% of objects. They use this principle to target their work-flow, focusing on the 20% which give the most result and working on the other 80% on a “modular” basis.
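As a rough illustration of how that 80:20 targeting could be operationalised, here is a short sketch that ranks objects by how much work they generate and finds the smallest group accounting for roughly 80% of it. The object names and issue counts are invented; the Whitworth’s actual workflow planning was not described in this level of detail.

```python
# Invented data: number of condition/handling issues logged per object.
issue_counts = {
    "textile_001": 40, "textile_002": 25, "print_003": 12, "print_004": 8,
    "textile_005": 5, "print_006": 3, "print_007": 2, "textile_008": 2,
    "print_009": 1, "print_010": 1, "textile_011": 1,
}

total = sum(issue_counts.values())
running, priority = 0, []
for obj, n in sorted(issue_counts.items(), key=lambda kv: kv[1], reverse=True):
    priority.append(obj)
    running += n
    if running / total >= 0.8:        # stop once ~80% of the issues are covered
        break

print(f"{len(priority)} of {len(issue_counts)} objects account for "
      f"{100 * running / total:.0f}% of the issues: {priority}")
```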
This cross of sustainable environment and sustainable work practices extends to the methods they use to package their 2D objects, as well. This category of object is packaged in a way that it can be easily switched from storage to display or vice versa, and the packaging provides a buffering layer that reduces the need for strict environmental control.
I would have loved to hear more about these storage/display procedures, as I think they could be useful for other museums. I’m also curious to have a more specific list of the plants they used in their bio-diverse roof garden, because that too could be useful in other places. Their practices seem to be very widely applicable, and their attitudes towards having a museum that works for the public and within its environment are admirable. I would love to see other museums adopt these approaches, to be environmentally friendly and to sustain the working environment of conservation professionals.

42nd Annual Meeting: Emerging Conservation Professionals Network (ECPN) Luncheon, May 29th- Speed Networking and Career Coaching

The Scene: Anyone entering the hotel atrium at the AIC Annual Meeting on Thursday from 12-2 might have mistaken the pairs of people at numbered tables for a new conservation speed-dating event.  To give members a similar ability to make multiple connections in a limited time, the Emerging Conservation Professionals Network (ECPN) organized a speed-networking event that enabled approximately 100 conservators to meet face-to-face.

The Set-up: Before the networking began and while enjoying a boxed lunch, participants heard from various speakers.

  • Anna Zagorski and Angela Escobar, members of the Communications Group of the Getty Conservation Institute (GCI), spoke on behalf of GCI, the event sponsor.  GCI strives to advance the field of conservation through the creation and dissemination of knowledge to the field and for the field, using a variety of resources.  Information on GCI can be found on the GCI website.
  • Elena Torok and Greta Glaser gave a moving and heartfelt memorial to their WUDPAC classmate of 2013 and fellow emerging professional, Emily Schuetz Stryker, who passed away unexpectedly in February.
  • Stephanie Lussier, the AIC Board liaison to ECPN, gave a brief history of ECPN and lauded its accomplishments thus far, including the webinar series and portfolio symposium from the 2012 Annual AIC meeting.
  • Megan Salazar Walsh, current ECPN Vice Chair and upcoming Chair, also recognized current ECPN officers and liaisons, AIC staff Ruth Seyler and Ryan Winfield, and the specialty groups who contributed to the event.

The Conservators Conversing: In the weeks leading up to the event, participants filled out a questionnaire ranking their preferences for matches and the topics to be discussed.  Each person received a different match for each of the three 15-minute sessions based on their responses.  At the event the participants were given their matches’ information, as well as a handout of basic career and resume-building advice.  Pairs found one another for each session at designated tables, and soon the room was abuzz with enthusiastic energy from emerging and established conservators alike.  Two established professionals wandering by the event even joined in the fun and provided last-minute guidance.

Reviews:
“This was so much fun! I loved the variety of people I was matched up with. Thanks to all that organized this event for all of your hard work! I found it extremely rewarding.” – Alexandra Nichols
“This was my favorite part of the conference! The mentors I was paired with gave thoughtful and useful advice, and I hope to continue contact with them. Thank you for this opportunity, and I hope that we can continue more events like this in the future.” – Jacinta Johnson
“I met some really wonderful people during this event. Thanks for all your hard work ECPN!” – Amy Hughes
“You guys really outdid yourselves! Thank you for setting up such a fun and helpful event!” – Jackie Keck
Thanks to everyone who participated to make this first networking event a success!
The author would like to dedicate this blogpost to Eliza Spaulding in recognition of her hard work as ECPN Chair through 2012-2013.  Thank you, Eliza.

42nd Annual Meeting – Health & Safety Group Session, May 31st, "Medical Evaluations for Museum and Collection Care Professionals" by Ruth Norton and David Hinkamp

Ruth Norton and David Hinkamp presented “Medical Evaluations for Museum and Collection Care Professionals” at the first Health and Safety Session at AIC.
Ruth Norton started off by discussing a case study of a natural history collection that had been treated with toxic chemicals such as arsenic to deter infestations.  To determine if there were residual chemicals in storage and in the ambient environment, a variety of tests were conducted for lead, arsenic, mercury, organochlorines, and organophosphates.  The study concluded that most chemicals were below accepted target levels.
The museum instituted written procedures for working in collection areas and handling arsenic-contaminated objects.  Good hygiene practices and the use of personal protective equipment (PPE) were also employed.  Finally, every workroom and storage room was “deep cleaned” annually to mitigate the spread of dust and other airborne contaminants.
Dave Hinkamp, a physician in occupational medicine, followed Ruth’s talk with a discussion of health and safety hazards in collection work: assessing hazards in the workplace, working with healthcare professionals, what information to give one’s healthcare professional, and what steps one can take now.
When approaching health care professionals, tell them about your work duties, materials used (e.g. adhesives, formaldehyde, etc.), and unique aspects of your work.  Concerns that should be discussed include acute episodes such as asthma, personal health issues such as pregnancy, chronic exposure such as long periods of work with possible hazard, and any other health problems.
Two points that really stood out to me:

  1. Identify your work hazards and their effects.
  2. Protect yourself by either eliminating, substituting, controlling, or limiting your exposure to toxic substances.  Employ PPE and don’t eat, drink, or touch your face at your workstation!


42nd Annual Meeting – Electronic Media Group Luncheon, May 30, “Sustainably Designing the First Digital Repository for Museum Collections”

Panelists:
Jim Coddington, Chief Conservator, The Museum of Modern Art
Ben Fino-Radin, Digital Repository Manager, The Museum of Modern Art
Dan Gillean, AtoM Product Manager, Artefactual Systems
Kara Van Malssen, Adjunct Professor, NYU MIAP, Senior Consultant, AudioVisual Preservation Solutions (AVPreserve)
This informative and engaging panel session provided an overview of The Museum of Modern Art’s development of a digital repository for their museum collections (DRMC) and gave attendees a sneak peek at the beta version of the system. The project is nearing the end of the second phase of development and the DRMC will be released later this summer. The panelists did an excellent job outlining the successes and challenges of their process and offered practical suggestions for institutions considering a similar approach. They emphasized the importance of collaboration, communication, and flexibility at every stage of the process, and as Kara Van Malssen stated towards the end of the session, “there is no ‘done’ in digital preservation” — it requires an inherently sustainable approach to be successful.
This presentation was chock-full of good information and insight, most of which I’ve just barely touched on in this post (especially the more technical bits), so I encourage the panelists and my fellow luncheon attendees to contribute to the conversation with additions and corrections in the comments section.
Jim Coddington began with a brief origin story of the digital repository, citing MoMA’s involvement with the Matters in Media Art project and Glenn Wharton’s brainstorming sessions with the museum’s media working group. Kara, who began working with Glenn in 2010 on early prototyping of the repository, offered a more detailed history of the process and walked through considerations of some of the pre-software development steps of the process.
Develop your business case: In order to make the case for creating a digital repository, they calculated the total GB the museum was acquiring annually. With large and ever-growing quantities of data, it was necessary to design a system in which many of the processes – like ingest, fixity checks, migration, etc.- could be automated. They used the OAIS (open archival information system) reference model (ISO 14721:2012), adapting it for a fine art museum environment.
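As a small illustration of the kind of routine such a repository automates, here is a sketch of a fixity check: recompute each stored file’s checksum and compare it to a manifest. This is my own minimal example, not the DRMC’s implementation; the paths and the manifest format are hypothetical.

```python
import hashlib
import json
from pathlib import Path

def sha256(path: Path) -> str:
    """Compute the SHA-256 checksum of a file, reading it in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def fixity_check(storage_root: Path, manifest_file: Path) -> list:
    """Return the files whose current checksum no longer matches the stored
    manifest (a JSON mapping of relative path -> expected SHA-256)."""
    manifest = json.loads(manifest_file.read_text())
    return [rel for rel, expected in manifest.items()
            if sha256(storage_root / rel) != expected]

# Hypothetical usage: run on a schedule and alert staff if anything fails,
# so the file can be restored from a second copy.
# failures = fixity_check(Path("/repository/objects"), Path("/repository/manifest.json"))
```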
Involve all stakeholders: Team members had initial conversations with five museum departments: conservation, collections technologies, imaging, IT applications and infrastructure, and AV. Kara referenced the opening session talk on LEED certification, in which we were cautioned against choosing an architect based on their reputation or on how their other buildings look. The same goes for choosing software and/or a software developer for your repository project – what works for another museum won’t necessarily work for you, so it’s critical to articulate your institution’s specific needs and find or develop a system that will best serve those needs.
Determine system scope: Stakeholder conversations helped the MoMA DRMC team determine both the content scope – will the repository include just fine arts or also archival materials? – and the system scope – what should it do and how will it work with other systems already in place?
Define your requirements: Specifically, functional requirements. The DRMC team worked through scenarios representing a variety of different stages of the process in order to determine all of the functions the system is required to perform. A few of these functions include: staging, ingest, storage, description & access, conservation, and administration.
Articulate your use cases: Use cases describe interactions and help to outline the steps you might take in using a repository. The DRMC team worked through 22 different use cases, including search & browse, adding versions, and risk assessment. By defining their requirements and articulating use cases, the team was able to assess what systems they already had in place and what gaps would need to be filled with the new system.
At this point, Kara turned the mic over to Ben Fino-Radin, who was brought on as project manager for the development phase in mid-2012.
RFPs were issued for the project in April 2013; three drastically different vendors responded – the large vendor (LV), the small vendor (SV), and the very small vendor (VSV).
Vetting the vendors: The conversation about choosing the right vendor was, in this blogger’s opinion, one of the most important and interesting parts of the session. The LV, with an international team of thousands and extremely polished project management skills, was appealing in many ways. MoMA had worked with this particular vendor before, though not extensively on preservation or archives projects. The SV and VSV, on the other hand, did have preservation and archives domain expertise, which the DRMC team ultimately decided was one of the most important factors in choosing a vendor. So, in the end, MoMA, a very big institution, hired Artefactual Systems, the very small vendor. Ben acknowledged that this choice seemed risky at first, since the small, relatively new vendor was unproven in this particular kind of project, but the pitch meeting sold MoMA on the idea that Artefactual Systems would be a good fit. Reiterating Kara’s point from earlier, that you have to choose a software product/developer based on your own specific project needs, Ben pointed out that choosing a good software vendor wasn’t enough; choosing a vendor with domain expertise allowed for a shared vocabulary and a more nimble process and design.
Dan Gillean spoke next, offering background on Artefactual Systems and their approach to developing the DRMC.
Know your vendor: Artefactual Systems, which was founded in 2001 and employs 17 staff members, has two core products: AtoM and Archivematica. In addition to domain expertise in preservation and archives, Artefactual is committed to standards-based solutions and open source development. Dan highlighted the team’s use of agile development methodology, which involves a series of short term goals and concrete deliverables; agile development requires constant assessment, allowing for ongoing change and improvement.
Expect to be involved: One of the advantages of an agile approach, with its constant testing, feedback, and evolution, is that there are daily discussions among developers as well as frequent check-ins with the user/client. This was the first truly agile project Artefactual has done, so the process has been beneficial to them as well as to MoMA. As development progressed, the team conducted usability testing and convened various advisory groups; in late 2013 and early 2014, members of cultural heritage institutions and digital preservation experts were brought in to test and provide feedback on the DRMC.
Prepare for challenges: One challenge the team faced was learning how to avoid “scope creep.” They spent a lot of time developing one of the central features of the site – the context browser – but recognized that not every feature could go through so many iterations before the final project deadline. They had to keep their focus on the big picture, developing the building blocks now and allowing refinement to happen later.
At this point in the luncheon, the DRMC had its first public demo. Ben walked us through the various widgets on the dashboard as well as the context browser feature, highlighting the variety and depth of information available and the user-friendly interface.
Know your standards: Kara wrapped up the panel with a discussion of ‘trustworthiness’ and noted some tools available for assessing and auditing digital repositories, including the NDSA Levels of Digital Preservation and the Audit and Certification of Trustworthy Digital Repositories (ISO 16363:2012). MoMA is using these assessment tools as planning tools for the next phases of the DRMC project, which may include more software development as well as policy development.
Development of the DRMC is scheduled to be complete in June of this year and an open source version of the code will be available after July.

42nd Annual Meeting: Health and Safety Session, ‘Solvents, Scents and Sensibility: Swapping – Solvent Substitution Strategies’ by Chris Stavroudis

Part I of ‘Solvents, Scents, and Sensibility: Sequestering and Minimizing’ was presented on Friday and encouraged the use of Pemulen TR-2 in cleaning as an alternative to solvents or as a vehicle for solvents.
The topic of Part II was substituting safer solvents for more hazardous ones. Chris Stavroudis began the talk with a warning: There is no perfect substitute for Xylenes. He did, however, address some alternatives later in his talk.
Some of the harmful solvents that Chris suggested replacing were:
  • Benzene (a carcinogen) – can be replaced with xylene or toluene (although these alternatives are also hazardous)
  • n-Hexane (a neurotoxin) – can be replaced with n-heptane
  • DMF – replace with N-methyl-2-pyrrolidone (NMP), although this may also be hazardous
  • Methanol – replace with ethanol
  • Cellosolve and Cellosolve acetate – just don’t use them! Butyl Cellosolve may be an acceptable substitute
  • Chlorinated solvents – don’t use them. 1,1,1-Trichloroethane is the least of the evils, but it is terrible for the environment
  • Xylenes (a mixture of isomers containing varying levels of ethylbenzene) – it may be safer to use a single xylene isomer, but this hasn’t been adequately tested
 
Stavroudis stressed that there is a difference between a safe solvent and an untested solvent. The two should not be confused, and proper safety precautions must be taken. He gave multiple examples of solvents that were once considered safe and that we now know can be hazardous (e.g., d-limonene).
The use of silicone solvents was encouraged because they are versatile (they can be cyclic or linear) and have very low polarity. Silicone solvents may be safer than the alternatives; they are found in make-up and are practically odorless (although this makes exposure difficult to gauge).
Another safer solvent that Chris mentioned was Benzyl Alcohol which has aromatic and alcoholic functionality, although it is toxic to the eyes.
Chris ended his talk with a review and discussion of solubility theory, including the Hildebrand and Hansen solubility parameters and the TEAS diagram. This review was focused on the problem of finding a replacement for xylene, a solvent that would have the same solubility characteristics. Chris’ Modular Cleaning Program (MCP) is a greener and healthier technique/tool and incorporates Hildebrand, Hansen, and TEAS solubility theories. Using these theories, the solvent mix that most closely matches the solubility characteristics of xylene is a mixture of nonane and benzyl alcohol. There is more experimentation to be done, and the next version of the MCP can help you experiment with solvent mixtures and solubilities.
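As a rough sketch of how this kind of solubility-parameter matching works, the snippet below compares a nonane/benzyl alcohol blend to xylene using Hansen parameters. The parameter values are approximate literature figures, the 50:50 blend ratio is purely illustrative, and none of this reproduces the Modular Cleaning Program’s actual calculation.

```python
import math

# Approximate Hansen parameters (dispersion, polar, hydrogen-bonding) in MPa^0.5.
HANSEN = {
    "xylene":         (17.8, 1.0, 3.1),
    "nonane":         (15.7, 0.0, 0.0),
    "benzyl alcohol": (18.4, 6.3, 13.7),
}

def blend(components):
    """Volume-fraction-weighted Hansen parameters of a mixture,
    e.g. blend([("nonane", 0.5), ("benzyl alcohol", 0.5)])."""
    return tuple(sum(frac * HANSEN[name][i] for name, frac in components)
                 for i in range(3))

def hansen_distance(a, b):
    """Hansen distance Ra between two parameter sets (smaller means more alike)."""
    return math.sqrt(4 * (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2)

mix = blend([("nonane", 0.5), ("benzyl alcohol", 0.5)])
print(f"Ra(mix, xylene) = {hansen_distance(mix, HANSEN['xylene']):.1f} MPa^0.5")
```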

42nd Annual Meeting – Collections Care Specialty Session, May 29, "Simple Method for Monitoring Dust Accumulation in Indoor Collections" by Bill Wei

“Simple Method for Monitoring Dust Accumulation in Indoor Collections,” by Bill Wei, was the first session in the Collections Care specialty section given on Thursday afternoon. As a museum technician in Preventive Conservation, dust is something I deal with on an almost daily basis. I thought that Bill’s talk could lend some valuable insight to my work, and I wasn’t wrong.  Bill Wei is a Senior Conservation Scientist at the Rijksdienst voor het Cultureel Erfgoed, and in his session he presented a simple and easily implemented way a museum could monitor how fast dust accumulates in an indoor collections space. He used the Museum de Gevangepoort and the Galerij Prins Willem V to demonstrate the method.
The talk started off with a humorous introduction by Bill about views on dust in museum spaces. For some people, museum professionals in particular, dust prompts a defensive stance, as if it implies we aren’t doing our jobs; for others, dust adds an element of age that seems appropriate. He also mentioned that googling the words “dusty museum” returns over 12,000 hits, so apparently more than just museum professionals see dust. Bill brought up the fact that dust is not only an aesthetic issue in museums; it can present chemical and health issues, and it can be costly and time-consuming to remove. The two sites were then introduced, both of which house collections and are historic buildings. Construction was being done near the sites, and there was concern about how much more dust accumulation this might cause, so they provided a good case study. Bill then introduced the question of how to monitor dust.
Bill explained that dust on the surface of an object causes light to bounce off at many different angles rather than at a single angle, which makes a surface look matte. The resulting matte surface can then be considered to have lost gloss, and this loss of gloss is something that can be measured using a glossmeter. The glossmeter used for this test was made by Sheen. Bill was careful to point out that this test doesn’t measure how much dust you have, but how quickly it accumulates. For this run of the test Bill used microscope glass slides, because they are cheap, reusable and glossy. The steps of the test are as follows:

  1. Using the glossmeter, measure a clean slide on a white background (copy paper is suitable. This should be the same background used throughout testing.)
  2. Put slides out at the various locations you wish to test, remembering that the more slides you put out, the more work you will have to do. The slides should be placed in out-of-the-way locations and staff should be told about them.
  3. After a predetermined amount of time (e.g. one month), use the glossmeter to measure the slide on the same background that you used in step 1.
  4. Clean the slide, and reuse, starting over at step 1.

The calculation then used to determine the rate of dust accumulation over the time period is:
Fraction change = (dusty-slide measurement after one month – clean-slide measurement) / (clean-slide measurement)
Multiply that by 100 to get the percentage.
Bill explained that for every month that you take a glossmeter measurement, you add the value of the new measurement to the previous one; since this is cumulative, you will go over 100% at some point. You can then plot these values in a graph over time.
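Here is a short sketch of that bookkeeping, following Bill’s description: compute each month’s fractional gloss change against the clean-slide baseline, convert it to a percentage, and keep a running total. The glossmeter readings are invented for illustration.

```python
clean_reading = 92.0                           # gloss of the clean slide on the white background
monthly_readings = [90.5, 89.2, 88.0, 85.9]    # invented dusty-slide readings, one per month

cumulative_pct = 0.0
for month, dusty in enumerate(monthly_readings, start=1):
    fraction_change = (dusty - clean_reading) / clean_reading   # negative, since gloss is lost
    monthly_pct = abs(fraction_change) * 100
    cumulative_pct += monthly_pct    # running total, as described; it can exceed 100% over time
    print(f"Month {month}: {monthly_pct:.1f}% gloss loss, cumulative {cumulative_pct:.1f}%")
```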
If you wanted to test the dust samples, to find out where the dust was coming from and what it was made of, you could incorporate small conductive carbon stickers on the slides. Since this talk focused on the accumulation, not the source of the dust, this topic was not discussed in detail.
At one point the slides were placed both vertically and horizontally. The vertical placement was done to mimic how much dust a painting might accumulate. However, the vertically placed slides needed a much longer period of time to show a real loss in gloss, so running both types of slide placement was not considered necessary.
When it came to analyzing the results, one finding was that the slide nearest the entry had the most dust; when its results were plotted onto a graph, it produced the steepest slope over time. The more visitors a museum has, the more dust accumulation occurs, and during peak tourist times there was a corresponding peak in dust accumulation. It was also noticed at the Museum de Gevangepoort that dust accumulation rose during construction periods. The results confirmed a long-held thought that visitors are one of the main sources of dust in museums.
Bill then talked briefly about the chemistry of dust. When the dust was analyzed it was found to contain salts, iron, chalk, sand, clay and concrete, among other things. Looking at the makeup of the dust made it possible to notice trends; for example, during the winter months, February in particular, there was a noticeable rise in the amount of salts found. Looking at what the dust was comprised of could allow scientists to identify its source.
Bill pointed out that the idea of too much dust isn’t really something that is definable in terms of science; it’s defined more by people’s perception of it. Different surface types can be just as dusty as one another, but if the dust is more visible on one type of surface, say plexi, the viewer reads that surface as being less clean.
In discussing an action plan for dust monitoring, Bill said you have to determine why you are doing it (e.g. to see if your new HVAC system is producing better results), and that it’s important to define “too much dust” as a difference in gloss.
The questions asked after Bill’s presentation included how many measurements should be taken and at what angle, to which Bill answered that one measurement at 85 degrees was sufficient. He was also asked how often one should take measurements; he said that three to four weeks at most will produce good results, and that if you measure too soon a change won’t be seen.
Bill’s presentation was informative and lively. He presented a system for testing dust accumulation that could easily be implemented and followed. Thanks to Bill for a great talk!

42nd Annual Meeting – Paintings Session, May 30, "Aspects of Painting Techniques in 'The Virgin and Child with Saint Anne' Attributed to Andrea Salai" – Sue Ann Chui and Alan Phenix.

This paper, presented by Sue Ann Chui, intrigued and enticed us to want more. She noted at the beginning that the title had changed to “Leonardo’s Obsession: A Workshop Variant of his ‘Virgin and Child with Saint Anne’ from the Hammer Museum, UCLA.” This is a pertinent point to keep in mind in the broader scope of the day’s PSG talks.
Leonardo da Vinci spent fifteen years working on the painting of “Virgin and Child with Saint Anne” (now at the Louvre), keeping it in his possession and leaving it unfinished at the time of his death. While he continued to work on it in his studio, other variants were being created in the workshop. It was noted that the Hammer painting is in remarkable condition (both structurally and aesthetically) and that the panel is virtually unaltered.
The oil-on-wood-panel painting, in storage for many years and thought to be an early copy, was attributed to Salai (Gian Giacomo Caprotti da Oreno, 1480-1524). The panel support, estimated to be poplar with coniferous wood battens, tangentially cut and not thinned, is remarkably close to Leonardo’s original panel (the “Louvre” panel), with similar tool marks and dowels. In addition to these similarities, the panel’s thickness (2-2.8 cm) would suggest that both wood panels came from the same workshop in northern Italy.
Analysis revealed the ground to be calcium sulfate and glue with an imprimatura of lead white. Compositional changes can be seen in the underdrawing (revealed by infrared imaging) of Saint Anne’s left foot and several other areas. Walnut oil was characterized as the binding medium in other samples. Pigments were characterized as lead white, carbon black, vermilion, lead tin yellow, red iron oxides, natural ultramarine, azurite, orpiment, and transparent glazes of copper green and red lake.
The Virgin’s mantle, with its complex stratigraphy, presents some interesting questions. Does the stratigraphy represent an original sequence or changes by the artist? Analysis of the blue mantle reveals three applications of grey, along with ultramarine, and two applications of red lake glazes on top of the imprimatura and below the grey layers. Is a thinly applied transparent glaze as a preliminary layer, similar to Leonardo’s technique, intentional? The purple-toned sleeve of Saint Anne, comprised of reds, red lake, and layers of what appears to be retouching varnish, has changed from a red-brown to a purple color similar to that found in the Louvre painting.
Two interesting finds in the Hammer Museum’s panel were imprints from fabric and fingerprints. Historical references mention the use of a textile to even out a glaze, as seen in an area of blue on the panel, and the use of the palm of the hand to spread a glaze uniformly (leaving fingerprints in the paint – who might those fingerprints belong to?). Differing paint application in the scene’s plant foliage hints that the passages may be by two different hands. Fine brushstrokes in the face of Saint Anne suggest a very accomplished artist, leaving us to wonder if perhaps the master provided some assistance to workshop apprentices. It would seem the Hammer panel was almost certainly created in da Vinci’s studio.
The change in the title of the presentation tied in nicely with Elise Effmann Clifford’s presentation “The Reconsideration of a Reattribution: Pierre-Édouard Baranowski by Amedeo Modigliani.” In her talk Elise pointed out the biases and prejudices we all carry and need to be aware of. The need to look at each work afresh and to consider all the findings of technical analysis and provenance, along with curatorial knowledge and instinct, must inform how we approach artworks, while we remain mindful of our own biases.
As for my personal bias regarding the analysis of the Hammer panel I must admit that, like many in the attentive audience, I was hoping for a surprise ending that announced the Hammer painting would, in fact, be declared to be by the hand of the master. The session was packed full of high quality technical analysis (including a peek into workshop practices) suggesting deeper questions and the paint geek’s favorite, paint cross-sections!
————
Additional articles you may be interested in; be cognizant of biases, the writer’s and your own!
 
LA Times article on Hammer St. Anne:
http://articles.latimes.com/2013/feb/05/entertainment/la-et-cm-leonardo-getty-20130206
Recent article in The Art Tribune mentions the Armand Hammer, UCLA panel:
http://www.thearttribune.com/Saint-Anne-Leonardo-Da-Vinci-s.html
Guardian article on over cleaning of panel:
http://www.theguardian.com/artanddesign/2011/dec/28/louvre-leonardo-overcleaned-art-experts
ArtWatch article:
http://artwatchuk.wordpress.com/tag/leonardos-virgin-and-child-with-st-anne/

Credit: via Tumblr from WTF Art History
Workshop of Leonardo da Vinci, The Virgin and Child with Saint Anne, c. 1508-1513, oil on panel. University of California, Hammer Museum, Willitts J. Hole Art Collection, Los Angeles