42nd Annual Meeting – Photographic Materials, May 31, "László Moholy-Nagy: Characterization of his Photographic Work at the Art Institute of Chicago and his Working Practices" by Mirasol Estrada

Mirasol Estrada, the Andrew W. Mellon Fellow in Photograph Conservation at the Art Institute of Chicago, studied the work of László Moholy-Nagy in the museum’s collection for two years. Her talk was a comprehensive look at the photographer’s working practices as well as the specific characteristics of his photographs in the collection at the Art Institute of Chicago. Ms. Estrada was drawn to the work of Moholy-Nagy because of the experimental nature of his working practices and his philosophy of photography.
Moholy-Nagy always thought of himself as a painter, but he also produced drawings, films, and photographs. He came to Chicago in 1937 to direct the New Bauhaus, which then became the Institute of Design. He was an influential teacher, and he taught his philosophy about the practice of photography, published in his book “Vision in Motion”. This philosophy was summarized very nicely by Mirasol, who described Moholy-Nagy’s idea that there are eight varieties of seeing.
The first variety of seeing is “Abstract”, which includes photograms: direct records of shapes and shadows. The second is “Exact”, which is straightforward camera photography. The next is “Rapid”, which shows motion, followed by “Slow”, his description for long exposures. “Intensified” meant using chemical manipulation such as solarization. “Penetrative” described x-rays, “Simultaneous” was the term for his photomontages, and lastly, “Distorted” was the term for mechanical or chemical manipulation of a print or negative. This was an interesting summary of Moholy-Nagy’s ideas about the varieties of seeing as they correlate to his photographic methods – a good window into the photographer’s thinking process and the categorization of his themes.
Ms. Estrada then took us through the characterization, both physical and analytical, of the thirty-nine photographs in the Art Institute’s collection. She grouped the prints physically by their size, tonal range, surface texture, finishing (coatings), and thickness. These groupings were displayed in a very clear, easy-to-read chart detailing the characteristics, including thumbnail photos of each object overall and in detail to show tone, surface texture, etc. Analytical data for each object was also included (XRF, FTIR) to complement her visual observations. Using her chart, one could compare the data clearly and easily, looking at tone, texture, and subject of the image.
One interesting observation to come out of the study was that Moholy-Nagy most likely did not process his own photographs – he is known to have explained that he was allergic to the developing chemicals. This may explain the diverse body of work and materials choices, since his students, wife, and daughter may all have been a part of the processing of his artwork. Moholy-Nagy used many different types of paper and other materials, a shift especially noticeable after his move from Europe to the US, reflecting the marketplace at each time and place. Ms. Estrada offered that it was perhaps more important for the artist to express his ideas, his complex categories of “varieties of seeing”, in the Bauhaus tradition, than to focus on the fabrication of his artwork.

42nd Annual Meeting – Workshop, May 28, 2014, “Dataloggers – Establishing and Maintaining Environmental Monitoring Systems” by Rachael Perkins Arenstein and Samantha Alderson

This workshop was a smorgasbord of dataloggers, filled with details about how they function, how the recorded information is moved from one device to another to be analyzed and repurposed, and how to think about choosing the right type of datalogger to match a particular environmental goal. I came into the workshop hoping to learn about new equipment that’s on the market now, to advance my institution’s upcoming project to re-invigorate our environmental monitoring and control program, in support of both energy and preservation goals. I got what I came for!

Samantha Alderson and Suzanne Hargrove discussing datalogger options at a long table displaying many types of dataloggers

The workshop was taught by Samantha Alderson and Rachael Perkins Arenstein, both of whom have advised institutions large and small about environmental monitoring programs, and clearly know what they are talking about. They recently updated the National Park Service Conserve-O-Gram (“Comparing Temperature and Relative Humidity Dataloggers for Museum Monitoring,” September 2011, Number 3/3, http://www.nps.gov/museum/publications/conserveogram/03-03.pdf ), which is worth reviewing, but with the caveat that the technology is changing so rapidly that vendors and specifications should be researched anew when you’re planning for a major purchase.
The presenters started by reviewing the basics of hardware and connectivity, summarizing what kinds of data loggers can collect, how many loggers one needs, where to place them, and for how long. They also talked about general environmental management concepts so the less experienced in the audience wouldn’t be left behind.
They then explained a basic difference between two families of dataloggers:

  • Stand-Alone Loggers collect data that is then harvested either by a direct wired connection to a computer, through an intermediary device like a card reader or thumb drive, or wirelessly; this method is appropriate when you don’t need real-time data
  • Connected Loggers, either wired (Ethernet) or wireless (radio, WiFi, cellular, etc.), transmit data to a receiver that then aggregates the data from one or more devices; this method is appropriate when you need real-time data, need to receive alerts, or need to manage a lot of devices

Other topics covered included datalogger software and data management, calibration, and a group activity in which we had to choose (and justify) a monitoring system for one or more specific scenarios. This activity was my favorite part of the workshop, and I wish we had spent more time on this. It was a practical test of how to figure out why and what you need to monitor, and how to maximize your resources to achieve that goal.
Helpful handouts included charts of various datalogger models/systems comparing many variables, including costs. Options to consider include: connectivity, size and aesthetics of the logger, battery type and life, sensor quality, data capacity, cost, accessibility of the device once installed, built-in display of current readings, ability to display and/or communicate alarms, sampling rate, calibration method, probe options, and software platform compatibility.
Here are some take-aways that for me will inform my upcoming work:

  • The landscape of available hardware is rapidly changing with developments in communications technologies; Bluetooth is the hot technology according to several vendors, so they are investing their development efforts into Bluetooth connectivity for their upcoming upgrades and new releases
  • Sensor quality matters, but there are also differences (reflected in the wide range of prices) in everything else around the sensor…most notably the architecture of the device, the circuitry and algorithms used to translate the sensor data into numbers. You get what you pay for, but that should be matched to what you need.
  • Sensors are very sensitive to organic vapors! They can be destroyed by a big whiff of solvents, and even thrown off by off-gassing from the plastic housing in which they are mounted.
  • Loggers need to be checked for accuracy (you can do this yourself with saturated salt solutions according to instructions in another helpful handout; see the sketch after this list), and if they have drifted, they need to be recalibrated (some you can do yourself, others have to be sent to the manufacturer); battery replacement is also variable (some are DIY, others not).
  • Most connected loggers require IT support for installation in an institution, so include your IT staff during the planning phase; be sure to ask them about WiFi encryption requirements
  • Wireless technologies may be affected by building/exhibit case material and construction, as well as nearby noise-emitting sources
  • Software varies a lot, but some of the systems can import data from other manufacturers’ devices; again, you get what you pay for, but the options I favor include the ability to import climatic data, graphical visualization of the data in a format that’s understandable by a range of audiences, and good tech support.
  • Get a demo set from the vendor prior to purchasing the whole system to make sure it works for your building
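To make the saturated-salt accuracy check concrete, here is a minimal Python sketch of the kind of drift test the handout describes, assuming the standard published equilibrium RH values at 25 °C; the ±3% tolerance and the example readings are hypothetical, so substitute your logger’s spec sheet and your own measurements.

```python
# Sketch: checking a datalogger's RH readings against saturated salt solutions.
# Reference equilibrium RH values are approximate, at 25 C (Greenspan 1977);
# the tolerance and example readings below are hypothetical.

REFERENCE_RH = {
    "lithium chloride": 11.3,
    "magnesium chloride": 32.8,
    "sodium chloride": 75.3,
    "potassium sulfate": 97.3,
}

TOLERANCE = 3.0  # percent RH; set this from your logger's spec sheet

def check_drift(readings):
    """readings maps salt name -> RH the logger reported after
    equilibrating in a sealed chamber over that saturated solution."""
    for salt, measured in readings.items():
        offset = measured - REFERENCE_RH[salt]
        status = "OK" if abs(offset) <= TOLERANCE else "RECALIBRATE"
        print(f"{salt:>20}: read {measured:5.1f}%, expected "
              f"{REFERENCE_RH[salt]:5.1f}%, offset {offset:+.1f}% -> {status}")

# Hypothetical readings from one logger:
check_drift({"magnesium chloride": 34.1, "sodium chloride": 79.0})
```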

At many points throughout the collections-care-focused Annual Meeting, I noticed that careful environmental monitoring and interpretation of the data is becoming a fundamental part of energy savings, decision-making, grant funding, and the construction/renovation of storage spaces. I almost wish the workshop had happened right after the meeting instead of before, because I would have had many more big-picture questions to ask of the presenters. Mostly, I want to hear a more substantive discussion about why we monitor, and how to translate the data into words that advance preservation priorities. Environmental monitoring is a time- and resource-intensive process, so we should be thoughtful and strategic about it.

42nd Annual Meeting- General Session, May 30, "Using Webinars to Tackle Conservation Misinformation in Ontario's Community Museums" by Fiona Graham

“Conservation is an elusive practice just outside of budgetary reality.”  Fiona Graham, a conservation consultant in Kingston, Ontario, received this comment in a survey filled out by a small museum in Ontario, and it made her take notice.  When museums believe that conservation equates only to (costly) treatment, there is no room for implementing best practices or taking vital preventive measures, and a general misunderstanding of the basic principles of preservation follows.  Graham set out to change the perceptions of these museums and chose webinars as her format.
Who: Ontario’s Community Museums–roughly 300 institutions that range in size but are not art galleries, private collections, or national museums.  Only 14 have in-house conservators (in one case, 9 museums share one conservator!).  The collection care for the remaining 286 falls into the hands of non-conservators.
Why: 185 of those Ontario Community Museums receive operating grants from the Ministry’s Museum Unit to survive economically.  In order to receive these grants, the museums must meet regulatory requirements, including a conservation standard.  To assess the state of conservation and preservation in the museums, a questionnaire was distributed to the museums, and Graham and her team discovered some startling misunderstandings.  For example, many respondents believed that light damage was caused only by UV, that pesticides are still needed, and that cold temperatures are always bad for collections.  (Since they are in colder climates, it’s especially disconcerting to think of the expenses paid to raise temperatures in these museums.)
What was done:  To debunk misunderstandings at as many of the museums as possible, the Ministry funded two 1.5-hour webinars.  The webinar format was chosen because it can reach a targeted audience, has wide accessibility and the ability to be interactive, is inexpensive to produce, and has been successful through the Ontario Museum Association (an organization that provides training in museum work).  After institutions answered preliminary questions on their registration forms, the webinars were conducted as PowerPoint presentations narrated live by a conservator using the iCohere platform.  The first webinar, Conservation 2.0, was a “good practice” refresher course meant for non-conservators, while the second, Climate Control: what do you really need?, focused on misinformation hot spots.  Participants used their own computers and sent questions to a moderator who passed them to the conservator to answer.  The Ontario Museum Association posted the slide deck and audio to their website after the webinars ended.
More details?  The prep questions: What does conservation mean in the context of your museum? What question about conservation would you like answered in this webinar? What do you think relative humidity and temperature levels should be in your museum’s collection areas? Do you monitor RH and/or T; do you actively control RH? (The webinars included a disclaimer that “this webinar is not a substitute for proper training.”)
Results:  The webinars were open to all, not just the Ministry-funded institutions, and 55 organizations participated during the live broadcasts.  The prep questions from the registration forms informed the content of the webinars.  There was positive feedback overall, with requests for more programs.  The negative feedback regarded the amount of detailed information on conservation.  Graham recommends being very clear on expectations.  The webinar team will be able to gauge the long-term results of the refresher courses during the next audit in 2018.
(Author’s comments: This talk was part of the general session on Engaging Communities in Collections Care.  The U.S. Heritage Preservation organization also offers webinars to help smaller institutions with collections care.  Their webinars are part of their Connecting to Collections (C2C) online community.  Past programs are available in their archives.)

AIC’s 42nd Annual Meeting – Opening Session, May 29, “Precaution, proof, and pragmatism: 150 years of expert debate on the museum environment” by Foekje Boersma, Kathleen Dardes, and James Druzik

Foekje Boersma, along with Kathleen Dardes and James Druzik, provided an informative summary of the debate regarding environmental standards in their presentation “Precaution, proof, and pragmatism: 150 years of expert debate on the museum environment.”  The presentation began with a historical review, based in part on information obtained from AIC’s Conservation Wiki.
The Museum of Fine Arts Boston and the Cleveland Museum of Art were the first museums to set specific humidity recommendations, in 1908 and 1915, respectively.  It is often stated that the development of environmental standards arose as a by-product of the storage of artworks in salt and coal mines during World War II, so I was interested to learn of earlier attempts at environmental control.
In 1940, Harold Plenderleith and George Stout said there was not adequate information to fix an “absolute standard” but suggested 60 – 65% relative humidity, chosen because it was easiest to maintain with stability.  Later, Plenderleith, now working with Paul Philippot, prescribed a “region of security” of 50 – 65% RH.  According to Boersma, these early conservators were pragmatic: although a set temperature and RH were specified, a greater emphasis was made on avoiding extremes.  The local climate and historical conditions of the objects were also to be taken into account.  Garry Thomson, who is often assigned either the credit or blame, depending on whom you ask, for the 50% RH/70° F standard, is misinterpreted according to Boersma.  He was also pragmatic.  Rather than endorsing the 50/70 rule, he merely predicted the increasing number of museum loans would lead to museums adopting that rigid standard.
Boersma attributes the widespread implementation of the 50/70 rule to the museum building boom in the 1970s.  Architects and engineers wanted numerical targets, and conservators were happy to specify safe conditions.  Sustainability was not much of a concern given cheap energy costs.  But already by 1979, CCI was advising seasonal variations with gradual fluctuations.  Boersma then skipped ahead to the 1990s and the controversial research of Charles Tumosa and Marion Mecklenburg at MCI, which said that materials aren’t as sensitive as previously thought.
Today, the debate on the museum environment has moved from conservators to museum directors and administrators.  The Bizot Group, concerned about environmental and economic sustainability, pushed to broaden environmental standards by adopting new Guiding Principles and Interim Guidelines, influenced by those developed by the NMDC (the National Museum Directors’ Council). In response, guidelines were published by many other groups, such as AIC, BSI, AICCM, and the Doerner Institut.
In order to clarify the debate, Boersma divides prevailing views into three categories: precautionary safety, proven safety, and pragmatic risk management.  Precautionary safety, embodied by the Doerner Institut’s Munich Position, centers around the belief that “stable is safe.”  Not enough research has been done on the response of objects to wider environmental conditions.  To eliminate risk, objects should be kept under a narrow set of conditions.  Supporters of the proven safety approach acknowledge that actual conditions are wider than 50/70 because tight standards are impossible to maintain.  The proofed fluctuations of 40 – 60% RH and 50 – 70˚ F are acceptable.  Pragmatic risk management reflects ideas of risk assessment developed in the 1990s.  Resources should go to the reduction of the biggest risks to collections, which may or may not be climatic fluctuation.
In conclusion, Boersma wonders how conservators can function as a profession given such different views on a central topic.  She references her ongoing research as part of GCI’s Managing Collection Environments Initiative, which is working to answer questions generated by the debate.

42nd Annual Meeting – Opening Session, 29 May, "Quantifying cost effectiveness of risk treatment options (aka preventive conservation)" by Stefan Michalski and Irene F. Karsten

Preventive conservation was the topic of much discussion at this year’s annual meeting, from how to teach it to what exactly it entails. In this talk, Stefan Michalski discussed the quantification of preventive conservation.
He began by reminding us that we base our ideas of preventive conservation on the “proofed fluctuation” argument: if fluctuation in the past has not caused significant damage, then similar future fluctuations will not either. He also defined preventive conservation: first, we assess risks; then, we ‘treat’ risks, and this second part is preventive conservation. We have to remember that ‘treat’ has a different meaning in this context than in remedial conservation, and despite being a loaded word, it accurately describes what we do. These definitions are simultaneously straightforward and complicated; we struggle with them, and yet we need them for our daily work.
Michalski continued by defining the four steps to successful preventive conservation:
1. Identify Options
2. Analyze
3. Evaluate
4. Implement
Steps 2-3 require quantification, and it’s vital that this quantification is transparent and well-documented. This is where Michalski and Karsten’s research comes in. They assessed the financial risk of every preventive option available for a variety of institutions, including an archive and a historic house.
In order to quantify reduction in risk, calculations were made using the following formulas:

  • Option effectiveness = size of risk reduction = size of original risk – size of reduced risk
  • Cost-effectiveness = risk reduction / cost = [% of collection saved / $ spent] / year
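Since this calculation may be new to many of us, here is a minimal Python sketch of how the formulas above rank options; all of the risks and costs are invented round numbers, not figures from Michalski and Karsten’s assessments.

```python
# Sketch: ranking preventive conservation ('risk treatment') options by
# cost-effectiveness, per the formulas above. All numbers are invented.

options = [
    # (name, original risk, reduced risk, annual cost in $)
    # where risk = expected % of collection value lost
    ("Upgrade fire suppression",      10.0, 1.0, 5000.0),
    ("Reframe works with UV glazing",  2.0, 0.5,  800.0),
    ("Tighten RH control",             1.0, 0.8, 4000.0),
]

ranked = []
for name, original, reduced, cost in options:
    effectiveness = original - reduced           # size of risk reduction
    ranked.append((effectiveness / cost, name))  # % saved per $ per year

for score, name in sorted(ranked, reverse=True):
    print(f"{name:32} {score:.2e} %/$/yr")
```

Plotting these scores on a logarithmic scale, as Michalski did, makes the orders-of-magnitude differences between options easy to see.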

I had never encountered this calculation before, or considered this as a feasible method of determining cost-effectiveness and ranking options, and I don’t think I’m alone in the conservation field in this. I wish that this had been covered by one of my graduate courses, because while it may seem obvious in some ways, the explanation was exceptionally helpful, and is something that I will take to my professional practice.
The numbers produced graphs on a logarithmic scale, in terms of percent saved per dollar. By evaluating options on this scale, it was possible to see how cost-effective various options are. What was highlighted with this calculation is that the cost effectiveness of an action is a function of the magnitude of risk – the bigger the risk, the better the return on percentage saved. This is in line with the economic principle of ‘economies of scale‘. What Michalski noted was that it is important to remember that the scale referred to is internal, not external, which means that small museums can be just as cost-effective as larger museums.
I loved this talk, and I felt like I learned a huge amount about quantification of risk. ‘Risk assessment’ is a term that we are all familiar with; to be able to go more in-depth is a skill, and Stefan Michalski did an excellent job of teaching that skill. His results are hugely applicable to museums and institutions of all sizes, and we should all learn and apply this method to aid in our decision-making for preventive conservation.

42nd Annual Meeting – Collection Care & HVAC, May 31, "Some trends in examining six years of utility and climate data at the Museum of Modern Art" by Jim Coddington

Jim Coddington, the chief conservator at the Museum of Modern Art (MoMA) in New York, presented some trends found by analyzing the environmental data collected at MoMA over the past six years. This was particularly interesting because it compared two relatively new or newly renovated buildings with different types of usage/functionality and HVAC systems. The building on 53rd Street, Jim admits, is very leaky from a number of sources, including the many doors through which thousands of people pass, and has a steam and electric HVAC system. The building in Queens (QNS), on the other hand, is mostly concrete with very little glass and has a gas-powered HVAC system. The data Jim presented was collected from across the museum, including finance, operations, conservation, and visitor services. Needless to say, there are a lot of people invested in this.
Jim showed mostly graphs and charts. These included data showing the temperature and %RH outside and inside the buildings, the dew point, and comparisons with energy usage. I’ve included images of the graphs that I found most interesting or informative.

NYC average monthly temperature (6-year average) showing periods of cooling and heating inside the QNS building. Most graphs showed what the temperature was at 1 PM each day.

This graph shows the indoor RH derived from the fixed outdoor dew point and a variable indoor set-point temperature.

In QNS there is a large expenditure of gas in August and a dip in winter. This is because they are able to use free cooling to extract excess heat for 8 or 9 months, or 3 out of 4 seasons, through a heat exchanger on the roof. In this process, heat is absorbed from the chilled water by the condenser water. The length of time they are able to use free cooling is based on the T and RH set points (see the second image) and is affected by air temperature, relative humidity, and water supply temperature. Non-free cooling, with the RH set at 50%, happens over the summer and lasts longer at lower set-point temperatures, so during the summer the temperature set point is allowed to drift to 22 degrees C. Jim mentioned that having a narrower set point may actually equal cost savings, but they have no data for that.
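To unpack why those set points matter, here is a minimal sketch of the dew point to RH relationship that the second graph illustrates, using the standard Magnus approximation; the example temperatures are invented round numbers, not MoMA’s data.

```python
# Sketch: indoor RH that results from bringing outdoor air (characterized by
# its dew point) to an indoor set-point temperature, with no (de)humidification.
import math

def sat_vapor_pressure(t_c):
    """Saturation vapor pressure (hPa) at t_c degrees C (Magnus approximation)."""
    return 6.1094 * math.exp(17.625 * t_c / (t_c + 243.04))

def indoor_rh(outdoor_dew_point_c, indoor_setpoint_c):
    return 100.0 * (sat_vapor_pressure(outdoor_dew_point_c)
                    / sat_vapor_pressure(indoor_setpoint_c))

# Winter air with a -5 C dew point heated to a 20 C gallery:
print(f"{indoor_rh(-5, 20):.0f}% RH")   # ~18% -> why winter humidification costs energy
# Summer air with a 16 C dew point, letting the set point drift to 22 C:
print(f"{indoor_rh(16, 22):.0f}% RH")   # ~69% -> why summer needs dehumidification
```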
On the analysis for the 53rd street building, Jim highlighted that this is a very different situation. It is a high use building, with lots of leakage points and demand on the systems- steam and electric principally. Therefore, the energy usage is much higher.
It has been asked whether heat from visitors is significant. By Chris McGlinchey’s calculation, using the roughly 360 kJ/hr given off by each visitor and a typical stay of 4 hours, it is not a huge contributing factor.
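As a back-of-envelope check on that conclusion (the visitor count and floor area below are invented round numbers, not MoMA figures):

```python
# Rough visitor heat load per unit floor area; all inputs except the
# 360 kJ/hr figure from the talk are assumptions for illustration.
KJ_PER_HR_PER_VISITOR = 360   # roughly 100 W per person
STAY_HOURS = 4                # typical visit length from the talk

per_visit_kj = KJ_PER_HR_PER_VISITOR * STAY_HOURS   # 1440 kJ, about 0.4 kWh
visitors_per_day = 8000                              # assumption
floor_area_m2 = 60000                                # assumption

daily_kj_per_m2 = per_visit_kj * visitors_per_day / floor_area_m2
print(f"{per_visit_kj} kJ per visit; ~{daily_kj_per_m2:.0f} kJ/m2 per day from visitors")
```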
The combined energy usage in kJ/m2 at the 53rd street and QNS buildings.

In Jim’s summary and conclusions, the expected was confirmed: they are consuming more energy in the 53rd St building than in QNS, mostly in winter (see the third image). The QNS building is more efficient because of the free cooling and lower set-point temperature, which, together with an efficient building design, equate to lower energy usage.

Online Resources:

  • Steam- natural gas utility converter: http://www.coned.com/steam/default.asp
  • NIST Guide for the Use of the International System of Units (SI) 2008: http://physics.nist.gov/cuu/pdf/sp811.pdf
  • Humidity converter: http://www.cactus2000.de/uk/unit/masshum.shtml
  • Dewpoint calculator: http://www.decatur.edu/javascript/dew/index.html
  • NOAA National Climatic Data Center: http://www.ncdc.noaa.gov/

42nd Annual Meeting- OSG, May 31, "Restoration by Other Means: CT Scanning and 3D Computer Modeling for the Re-Restoration of a Previously Restored Skull from the Magdalenian Era" by J.P. Brown and Robert D. Martin

After collaborating with JP at the Field Museum on rendering CT scans a few years ago and seeing his article about this work in the spring MRCG newsletter, I was excited to hear about this project in person. JP has been working with CT scanners since 2006, starting out by taking advantage of the kindness of local hospitals and more recently renting a portable unit that came to the museum on a truck.
As many of us know, CT scanners can look inside objects non-destructively and provide accurate images with 3D geometric accuracy. JP started the talk by reviewing some of the physics of getting a CT scan done, the benefits, and the limitations. Here’s a run-down:
1. The scanner has a donut shaped gantry consisting of a steel ring containing the X-ray tube and curved detector on the opposite side, so your object has to fit within the imaging area inside the steel ring.
2. On each revolution you get many images; a full scan takes 30 seconds to 5 minutes, which is very fast.
3. The biggest logistical challenge is moving objects to and from the hospital safely.
4. During the scanning you immediately get slices, which are cross-section images from three different directions. Volumetric rendering is done from the slices, and there is free software for this (see the sketch after this list).
5. Apparently it is relatively easy, just time-consuming, to do segmentation (segmenting out regions of interest) and extract wireframe models. From there you can get images of the surface and texture and can even print the models. It is relatively easy to go from slice to wireframe, but harder to achieve a manufacturable mesh for a 3D print, which can be expensive in comparison to traditional molding and casting.
6. PROs of scanning and printing: there is no contact with the object, complex geometry is not a problem, the scans and volumetric rendering are dimensionally accurate, you can print in lots of materials; prints can be scaled to make large things handleable or small things more robust for handling or increase visibility; subtractive manufacture, in which you can use a computerized milling machine to cut out a positive or negative, is also a possibility.
7. CONs of scanning and printing: printing is slow, the build volume is limited, a non-traditional skill set is required of conservators to produce the final product, and only a few materials age well. The best material is sintered nylon; extruded polyester may also be safe, but it doesn’t take paint well. It is hard to get the industry to think about permanence.
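As a rough illustration of the slice-to-wireframe step mentioned in the run-down, here is a minimal Python sketch using scikit-image’s marching cubes; the file names, threshold, and voxel spacing are placeholders, not the Field Museum’s actual workflow.

```python
# Sketch: stack CT slices into a volume and extract a surface mesh.
import numpy as np
from skimage import measure

# Each slice is a 2D array of attenuation values (hypothetical .npy files).
slices = [np.load(f"slice_{i:03d}.npy") for i in range(300)]
volume = np.stack(slices, axis=0)

# Choose an attenuation threshold that isolates the material of interest;
# picking this level is exactly what was hard here, since bone and plaster
# attenuate similarly.
verts, faces, normals, values = measure.marching_cubes(
    volume, level=1200.0, spacing=(0.3, 0.3, 0.3))  # voxel size in mm

print(f"Mesh: {len(verts)} vertices, {len(faces)} faces")
# The verts/faces arrays can be exported (e.g., as STL) for 3D printing
# or imported into modeling software such as Blender.
```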
The object at the center of this project was a Magdalenian skull. The skeleton itself is of considerable importance, because it is the only nearly complete Magdalenian-era skeleton. A little history: it was excavated, quite professionally, in 1911 when they lowered the floor of the site. Unfortunately the burial was discovered when someone hit the skull with a pickax. Needless to say, the skull did not come out in one piece. In 1915 the full skeleton was removed in two blocks. My notes are a little fuzzy here, but basically at some point after the excavation the skull was restored and went from being 2 pieces to 6 pieces, as documented in a 1932 publication by von Bonen. It appears that at that point the skull was also skim-coated with plaster. Thankfully (?) those repairs have held up. So why did they need to scan and reconstruct the skull? According to Dr. Robert Martin, JP’s colleague at the Field Museum, the skull doesn’t look anatomically correct. Apparently during the time period when it was put together there was an interest in race, and the skull fragments could have been lined up incorrectly, accentuating cultural assumptions.

Previous condition documentation image

One image slice from the CT scan

A previous x-ray showed that two fragments in the forehead are secured with a metal pin. In 2012, when the mobile CT scanner came to the museum, they were all geared up to start with the Magdalenian skull. Unfortunately there was not much difference in attenuation between bone and plaster, making it tricky to distinguish between the two materials in the scans. JP consulted a cranial reconstruction group and asked them to pretend this was a pediatric car crash victim with a cranial injury; they asked, why aren’t you using the Mimics software package?

In this scanner, the object sits on a rotating table, while the source and detector stay still. Since these are fixed, a full scan has to be done in parts depending on the size of the object.

JP and his team also imaged the skull with a micro CT scanner that has a 0.1 mm resolution versus the normal modern setting of 0.3 mm. They had previously identified 36 fragments of bone from the earlier scan, but it was hard to tell if some of those separations were just cracks or actual breaks between fragments. The hope was that the micro CT scanner could better define these areas. The micro CT scanner works opposite to the industrial/medical scanner: as you can see in the image to the left, the tube and detector are fixed, while the sample is rotated. Other differences are that it is slower (one scan takes 30-90 minutes), and because of the scanner geometry the skull had to be imaged in two scans. Because of this, JP used the previous scan to mill out a contoured support to hold the skull in the exact position. JP noted that digitally filling in the holes of the skull to create the support was the most time-consuming part of that process, and he suggests using different radio-opaque marker dots to identify left and right for orientation during the later stitching process. With the new scans, at least three separations were identified as cracks rather than breaks.
Now for the virtual reconstruction… The biggest obstacle in this stage was how to achieve something more anatomically correct using virtual fragments that have no boundaries: the fragments don’t push back in the computer, and they can easily move into each other. JP worked mostly with the translation and rotation functions in the free animation software Blender (which has a steep learning curve and took several days to get accustomed to), creating hierarchical parent-child relationships between the fragments as he joined them together. Just like putting a vessel together, right? In the virtual world, at least, there is no worry about lockout. They had the final skull reconstruction 3D printed and had an artist do a facial reconstruction, which JP thinks always looks related to Jean-Luc Picard… So how successful was this? From a conservation perspective: awesome, it’s fully reversible! Scientifically, it’s decent: well documented and scientifically justifiable. However, someone else could go through the same process and come up with a different reconstruction, because this one relies on left-right symmetry.
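For a flavor of what that fragment parenting looks like, here is a minimal sketch in Blender’s Python API (bpy), assuming two already-imported fragment meshes; the object names and transform values are hypothetical, not JP’s actual script.

```python
# Sketch: join two virtual skull fragments with a parent-child relationship,
# then fine-tune the join with translation and rotation only.
import bpy
import math

parent = bpy.data.objects["cranium_base"]      # hypothetical anchor fragment
child = bpy.data.objects["frontal_fragment"]   # hypothetical fragment to join

# Parenting makes later adjustments to the base carry the child along,
# the way joins accumulate when reassembling a vessel.
child.parent = parent
child.matrix_parent_inverse = parent.matrix_world.inverted()

# No scaling: the CT geometry is dimensionally accurate.
child.location = (0.0, 2.5, 1.2)                      # offset in parent space
child.rotation_euler = (math.radians(3.0), 0.0, 0.0)  # small tilt to close a gap
```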
 
Creating the virtual reconstruction

Comparison of the current restoration (left) and the virtual restoration (right)

So what did I take away from this talk? This was a very cool project, and if I have a question about CT scanning and 3D renderings, I will call JP! The scans can be extremely informative, and there seems to be a lot of potential in their use for mount-making, crates, storage, and possibly virtual reconstructions. Hopefully at some point in the future the software will become more intuitive and easier to use so that more of these types of projects can be done.

42nd Annual Meeting, Paintings & Wooden Artifacts Joint Session, May 31, "The Analysis and Reduction of an Intractable Coating for the Panel Painting by Lluis Borrassa, Christ Before Pilate," by William P. Brown & Dr. Adele De Cruz

The presentation by William P. Brown and Dr. Adele De Cruz was an awe-inspiring glimpse at the future of conservation. Through the collaboration of the North Carolina Museum of Art and conservation scientists from the University of Pisa and Duke University, an intractable layer of cross-linked drying oil, animal glues, and pigmented varnish was removed from the surface of Spanish painter Lluis Borrassa’s panel painting, Christ Before Pilate, 1420-25.
The painting, which had not been exhibited for over 40 years, was the victim of previous cleaning and coating campaigns, and several layers of consolidation materials, paints, and glazes had been applied to the blue passages of Christ’s robe. As a result of the cross-linking of these consolidants and the dark pigmentation of a concealing varnish layer, Christ’s robe appeared almost black.
During treatment at the North Carolina Museum of Art, solvents were successful in removing the toned varnish from the painting. However, the reduction of the complex layer of intractable material covering Christ’s robe (the abstract describes this as a composite of old consolidant, cross-linked drying oil, and restoration materials) was not so straightforward. Conservation scientists (from the aforementioned institutions) used FTIR, SEM, and GC-MS analysis to identify the components of the intractable layer and to discern them from original material, which consisted of lapis, indigo, and orpiment pigments in egg tempera and glue or bee pollen.
Dr. De Cruz took the podium at this point in the talk to describe the methods used to reduce the intractable composite material. Essentially, laser ablation was employed, which before this talk I was only familiar with in the context of dentistry. I have to admit that my initial reaction to hearing the terms ‘laser’ and ‘art conservation’ used together might have been a wary one, but a refamiliarizing with the techniques involved with laser ablation (and recalling the established use of this technique on the delicate enamel surfaces of our teeth) was an encouraging and exciting reminder of the vast potential of interdisciplinary approaches to art conservation.
Dr. De Cruz explained that the 2940 nm Er:YAG (erbium) laser operates using an intense monochromatic pulse of light (2.94 micron wavelength) at 15 pulses per second to vaporize the intractable material. The depth of penetration is very controllable, remaining shallow at 3-5 microns. This light pulse is highly absorbed by water, producing a near-instantaneous steam distillation. A glass cover slip is placed over the dirt, varnish, and paint layer. The laser is used to break up the intractable surface, which is ejected and contained by the glass cover slip. The debris is then swabbed from the surface of the painting and can be used for analysis.
There are several immediately obvious benefits to this method: it eliminates the need for toxic solvents, and it allows for a highly controllable, shallow depth of penetration. There is also no risk of chemical change to the substrate, and the reaction is low temperature.
Dr. De Cruz went into incredible depth during this talk, and I realize that my summary only touches on the amount of information she provided. I was furiously scribbling notes the entire time, and certainly wished I had a camera to take photos of her slides. I certainly look forward to hearing more about this topic in the future, and am excited for the future and ongoing collaboration of conservation and science.

42nd Annual Meeting: BPG Tips Session, May 30, moderated by Emily Rainwater

There were sixteen tips presented by twelve speakers in this session, with a very lively question and comment period at the end.

Tip 1: ‘Beading: A Japanese technique used to relax laminated paper’ presented by Betsy Palmer Eldridge.

When two sheets of paper have been pasted together overall, the result is a sheet that is much thicker and stiffer than either sheet alone. Ms. Eldridge described her technique of using a string of beads, known as ura-suri, to soften, relax, and remove cockling from the laminated paper. She forms the beads into a coil, then makes a repeated circular motion over them with a flat hand. During the Q&A, Rachel Freeman mentioned that marbles or a Japanese printmaking baren work well too.

Tip 2: ‘Quick and Easy Plexi Paste’ presented by Cher Schneider.

Ms. Schneider developed this method for adhering two pieces of Plexiglas together to make mounts. Step 1: Collect Plexi shavings into a glass container. Step 2: Dissolve first in drops of acetone until it gets milky white, then add drops of toluene until it becomes transparent. Do not stir too much. Step 3: Apply to one side of the joint with a glass stir rod, then attach the other piece. Clear excess with a piece of matboard, then with a swab dampened with toluene. Step 4: Cure for 15-20 hours. Step 5: Clean glass tools by popping the dried Plexi paste right off. She does not recommend trying to re-use dried Plexi paste. During the Q&A, John Baty suggested a bake-out to cure the paste.

Tip 3: ‘Alt Training’ by Beth Doyle.

After struggling with the difficulties of providing care and handling training to temporary and permanent staff and students, Beth Doyle of Duke University Libraries figured out that using social media to make short training videos on specific topics is a great way to reach everyone in a timely manner. The Instagram videos are 15 seconds and the YouTube videos, such as this one, are 2 minutes. If you want to make your own, she recommends using multiple paths to reach the largest audience, exploiting what each platform has to offer, reusing and recycling clips where possible, accepting that what you have is good enough, and keeping it short.

Tip 4: ‘Studio-Lab Weight Sources’ by Stephanie Watkins.

Ms. Watkins reviewed the types of weights that conservators use, with suggestions for how to find or make your own. Because they are by nature heavy, she suggests above all that looking locally or making your own is the most cost-effective, and, in the spirit of the meeting, ecologically sound. If you do have to have some weights shipped, she recommends USPS flat-rate Priority Mail. Items that have been used as weights include magnets, sewing weights, scuba, exercise, and fishing weights, car tire balancing weights, glass scraps, paperweights, flat irons, shoe anvils, weights manufactured by conservation suppliers, hand-crafted weights, scrap metal, and heavy items from Freecycle. Home-made weight fillings include ball bearings, BB shot, coins, stones, sand, glass, beads, and beans. Modifications can include polishing, covering, and adding smooth boards, felts, handles, and fabric. Form follows function, so determine the size and shape needed, then look around to see what is available.
I commented on this tip to add that a friend who sometimes has to travel out of the lab to do conservation work on-site brings empty containers and fills them with water for makeshift weights.


Tip 5: ‘Cling and Release: Silicone Mylar + Japanese Paper + Wheat Starch Paste = A One-Step Hinge for Float Framing’ by Terry Marsh, read by Anisha Gupta.

This PDF (TerryMarsh-OneStepHinge) explains the process.

Tip 6:  ‘Aquazol as a heat-set adhesive’ by Adam Novak.

Mr. Novak presented two quick tips. First, he shared his recipe for heat-set tissue, based on research by Katherine Lechuga, summarized here. He makes a 6% solution of Aquazol 500 in deionized water and brushes it onto very thin (2 gram) tengujo paper. After cutting the repair strip, it can be set in place temporarily using the heat of his finger. Then, he places silicone release paper over the repair and sets it with a tacking iron. (When questioned later by Sarah Reidell, he indicated that he did not know the exact temperature used with the tacking iron, but supposed that it is in the range of 150 degrees F.) The repair may look shiny in comparison with the surrounding paper; if so, the shine can be reduced by brushing on a bit of ethanol.

Tip 7: ‘Using pH strips with filtered water’ by Adam Novak.

The second tip addressed the issue of very different readings between a pH strip and a pH meter when measuring deionized water solutions buffered with calcium hydroxide. This is something I had noticed in my lab at NYPL, and I was glad to hear an explanation. Mr. Novak has discovered that the conductivity of the calcium solutions is very low and that there is not enough ionization to get an accurate reading with the strips. This is only the case with calcium; other buffers have higher conductivity, and the strips read more accurately.

Tip 8:  ‘Cellulose Powder’ by Becca Pollak.
(photo of slide taken by Valerie Faivre)
Ms. Pollak described her technique of spraying cellulose powder with an airbrush to minimize local discoloration on paper, cover foxing, or prepare for inpainting. She sprays the powder directly through stencils and adds pigments for toning if necessary. The basic recipe is below as a starting point, but adjustments may need to be made depending on the moisture sensitivity of an object or the desired effect. She also sprays films onto Mylar and allows them to dry for future use; in that case, she sprays a layer of plain methylcellulose first to improve cohesion of the sheet. Ms. Pollak is preparing a tip sheet to be posted soon.
Basic recipe:
  • Approximately 20 mL of 0.5-1% Methocel A4M (Ms. Pollak reports that Elissa O’Loughlin prefers 1-2% of A15C, and Jim Bernstein prefers a mixture of cellulose ethers or gelatin.)
  • 5-10mL isopropanol
  • 1g of micro-cellulose powder

Tip 9: ‘Applying New Techniques On A Traditional Adhesive For Book Conservation’ by Marjan Anvari.

Traditional Western training in book and paper conservation centers on the use of wheat starch paste. Ms. Anvari is an Iranian conservator working on Middle Eastern objects, and she decided to develop a repair adhesive based on a traditional Iranian adhesive that is also flexible and reversible in water. This adhesive, used by artists and artisans and known as ‘green paste,’ is dark yellow in color and leaves a stain, so Ms. Anvari worked to purify it and came up with an acceptable recipe. She gave out samples at the end of the session. The paste can also be acquired from Raastak Enterprises, which can be contacted for more information.

Tip 10:  ‘Flattening translucent paper’ by Laura Neufeld.

Ms. Neufeld tested four techniques for flattening thin papers: Mylar flattening, the hard-soft sandwich technique, friction flattening, and edge flattening. A gampi-fibered paper was used for testing. The Mylar flattening technique, featured in the article ‘The conservation of three Whistler prints on Japanese paper’ by Catherine Nicholson, required the paper to be fully wet and gave the paper a slight sheen. The hard-soft sandwich technique, featured in the article ‘Architectural Drawings on Transparent Paper’ by Hildegard Homburger, did not require much moisture and removed severe creases. The sandwich calls for polypropylene fleece, but Ms. Neufeld found that this can be substituted with polyethylene fleece or Gore-Tex with the fuzzy side away from the object. Friction flattening, described in the article ‘The Use of Friction Mounting as an Aid to Pressing Works on Paper’ by Keiko Keyes, can give results similar to using a kari-bari and has been found to work well on both old master and Japanese prints. She found edge flattening to be the most difficult. This slide (Flattening_Slide) shows the results in normal and raking illumination.

Tip 11:  ‘Tek-wipes’ by Gwenanne Edwards.

Tek-wipes, which are used in the computer and custodial industries, were mentioned on the DistList and handed out at last year’s Tips Session, and it seems that the word is out; many people have been discovering uses for them in paper conservation. Ms. Edwards likes to use them for capillary washing, slant washing, suction washing, as a support for lining, for drying and flattening, and in emergency response. She recommends them because they are highly absorbent, strong, reusable, machine-washable, and dimensionally stable; you can vary their saturation; they pull discoloration out well; they are safe with solvents; and they are way cheaper than blotter. They are available from a number of sources under various trade names, such as Texwipe or Technicloth. The overwhelming majority of commenters at the end of the session wanted to talk about Tek-wipes and other blotter replacements. Seth Irwin uses them to pull tidelines from paper using a tacking iron. Betsy Palmer Eldridge suggested that they would work in some of the drying techniques tested by Laura Neufeld (above). In Australia, they use bamboo felt and interfacing in place of blotters. Bill Minter said that Christine Smith uses bath towels. Anna Friedman uses Sham-Wow (warning: this link takes you to the company page with a video commercial).

Tip 12: ‘Rare earth magnets to make solvent chambers’ by Anne Marigza.
(photo of slide taken by Valerie Faivre)
Ms. Marigza uses rare earth magnets in a solvent chamber. One on either side of the inverted glass or Mylar container will hold the solvent-saturated blotter (or other absorbent material) in place. The magnet can be discarded when it becomes powdery.

Tip 13: ‘Flattening Rolled Drawings for Digitization’ by Bill Minter.
(photo of slide taken by Valerie Faivre)
Mr. Minter developed a method for flattening architectural drawings by reverse-rolling. He places a cardboard tube at the edge of the table with an attached paper extension hanging down to the floor. He places the leading edge of the drawing in the roller, then rolls it the opposite way and lets it sit for a day. When unrolled, it lies flat enough for digitization.

Tip 14: ‘Velcro for Phase-Boxes’ by Bill Minter.

Do you find your Velcro hooks and loops so strongly attached to each other that they do not pull apart easily? Mr. Minter has discovered a less aggressive Velcro. It is not labelled as such; the only way to distinguish it is that the box is marked ‘clear.’ It comes in strips, discs, or rectangles.

Tip 15: ‘Dry-Tearing of Paper for Infills’ by Bill Minter.
(photo of samples taken by Valerie Faivre)
Lay some wire mesh on a flat surface and place the infill paper on top. Run the tip of an awl, needle, or other pointed instrument along the line you want to tear. This creates a perforation that can be dry-torn. McMaster-Carr sells wire mesh in different gauges and materials. Above are samples of two sizes.

Tip 16: ‘Toning of Paper’ by Bill Minter.

The Preval sprayer works great for small paper-toning projects. Clean it well after use. During the Q&A, we learned that replacement valves are available if yours get clogged, and that the glass jars can be saved and reused.

42nd Annual Meeting – Architecture + Objects Joint Session, 29 May, 2014, “Conservation Realities and Challenges: from Auto Regulation to Imposition at Archaeological and Historical Sites in Colombia” by Maria Paula Alvarez

I was drawn to this presentation on account of my background in archaeology. Although I have never had the chance to visit Colombia, I was very interested to hear about the challenges that Colombian conservators, archaeologists, and other allied professionals encounter in their efforts to preserve their country’s archaeological and historical sites.
Maria Paula Alvarez, Director at the Corporacion Proyecto Patrimonio, presented a number of interesting case studies to illustrate the types of conservation and preservation problems that she and her colleagues face and work on solving. Her examples included assessments, research, testing, and treatments at
1) archaeological sites, such as:

  • The Archaeological Site of Fuente de Lavapatas, where the conservation issue was stone deterioration. Extensive studies – including the evaluation of the environmental conditions at the site and the geological and physical properties of the affected stone – were conducted to determine the causes of deterioration. As well, testing of treatment materials – including biocides for controlling biodeterioration and consolidants for disintegrated areas – was undertaken.
  • The Archaeological Park of Facatativa, where panels of rock art were deteriorating not only as a result of exposure to the natural environment, but also as a result of exposure to humans. Both biodeterioration and vandalism in the form of graffiti were damaging to the rock art panels. The panels received conservation attention for both problems.

2) and historical monuments, such as:

  • The Jimenez de Quesada Monument in the city of Bogota, which had been damaged as a result of vandalism in the form of graffiti. The monument received a conservation treatment that included both the removal of the graffiti as well as the application of a coating to protect the monument against future graffiti vandalism.

In all of the cases that she presented, Maria spoke about the effect of the political, social, and economic climate on the sites’ conservation and preservation. She stressed the impact that such climates have on cultural heritage, from the care of sites to their destruction. She explained how various political, social, and economic circumstances have led her and her colleagues to determine goals and procedures for conservation and preservation projects. I found these concepts very powerful. For me, this presentation was a strong reminder of the complexities involved in the preservation of cultural heritage.