Maggie Barkovic and Olympia Diamond presented a case study outlining the decision-making process that led to the successful treatment of darkened, dirt-infused water stains on the bare canvas portion of a large-scale acrylic dispersion painting: Composition, 1963, by Justin Knowles. The authors attributed the treatment’s success to the combination of an extensive evaluation of Knowles’ materials and aesthetic aims and an understanding of new, innovative cleaning techniques designed for acrylic dispersion paintings (with the help of Bronwyn Ormsby, Tate, and Maureen Cross, Courtauld Institute of Art). This presentation served as an excellent complement to Jay Kruger’s presentation Color Field Paintings and Sun-Bleaching: An approach for removing stains in unprimed canvas, which discussed the treatment of acrylic solution and oil paintings on bare canvas.
Composition is a privately owned work that was brought to the Conservation and Technology Department at the Courtauld Institute of Art for treatment in 2013. The large-format work is a two-dimensional acrylic painting with brightly colored geometric forms juxtaposed against an unpigmented, acrylic-sized canvas. The painting had sustained disfiguring water stains along the top and bottom edges that disrupted the aesthetic reading of the image, rendering it unexhibitable.
Context
In the first step of the conservation process, Barkovic and Diamond assessed how the water stains affected the aesthetic interpretation of the painting. They explored where this painting fit into the artist’s oeuvre: it was part of a series of early, pivotal works in which Knowles explored his initial ideas of spatial tension using non-illusionistic geometric compositions that incorporate negative space in the form of unpainted canvas. The authors carried out technical examinations of four other paintings from this early stage in his career, finding that Composition was painted in a manner comparable to his other early works: a fine linen canvas was stretched on a wooden stretcher and then sized with an unpigmented p(EA/MMA) acrylic dispersion coating. Knowles then used pencil and pressure-sensitive tape to demarcate where he would paint the geometric forms with acrylic dispersion paints. Though he applied a transparent acrylic “size” layer over the linen/negative space, he still considered the visible canvas “raw” and unprimed. Through these examinations and research into Justin Knowles’ personal notes, the authors concluded that the characteristics and color of the linen canvas were as important to the interpretation of the work as the paint colors. As such, the canvas should be treated and the water stains removed if at all possible.
Replicas
Second, the authors explained that they needed to identify the components of the water stain (with no prior knowledge of the water-staining incident) in order to test cleaning methods. Replicas were made using linen and the same unpigmented acrylic polymer that Knowles most likely used, and were then stained with dirty water. Using XRF spectroscopy and empirical testing as a guide, a visually accurate and equally tenacious water stain was made with iron, calcium, and organic “dirt” components from aged linen. The test replicas were aged in a light box for two years to allow the stain to photo-oxidize and bond with the fabric and size layers.
Testing
Third, the authors needed to determine how to treat the water stain in the presence of the unpigmented acrylic dispersion size layer, which swelled in water and was affected by polar solvents. Their goal was to remove the stain, or to reduce its appearance enough to make successful inpainting possible. The authors looked to successful textile and paper conservation treatments for possible methods. The initial cleaning and/or retouching tests included solutions with various pH values and conductivities, chelating agents, surfactants, bleaching agents (sodium borohydride), the application of toasted cellulose powder, and pastel retouching.
The authors thoroughly explained the various test groups, but a recapitulation of all of these solutions is outside the scope of this blog post. In general, higher pH values (around 8) and higher conductivity values (above 2.5 µS) allowed for better cleaning efficacy. Perhaps more notably, the chelating agent DTPA (diethylene triamine pentaacetic acid) greatly outperformed TAC in cleaning efficacy. This is likely because DTPA is a much stronger chelator, far more suitable for sequestering the iron and calcium that XRF showed to be present in the stain. DTPA could be used safely because the acrylic size layer was unpigmented. Finally, the use of agar (rather than a free solution) was found to be useful in reducing the stain: the gel allowed for greater control of the solution’s distribution onto the stain and of dirt absorption into the gel. The most effective cleaning agent, which was eventually used to clean the painting, was a higher-concentration agar gel at 5% (w/v), with boric acid 0.5% (w/v), DTPA 0.5% (w/v), and TEA, at pH 8 and 2.4 mS.
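For readers who want to see the arithmetic behind those w/v percentages, here is a minimal sketch in Python. It assumes only the standard definition of % w/v (grams of solute per 100 mL of solution); the 250 mL batch volume is my own invented example, not the authors’ protocol.

```python
# Hypothetical batch math for the gel described above; quantities are
# illustrative only, derived from the standard definition of % w/v.

def grams_needed(percent_wv: float, volume_ml: float) -> float:
    """Convert a % w/v concentration into grams for a given batch volume."""
    return percent_wv * volume_ml / 100.0

batch_ml = 250.0  # invented batch size
for name, pct in [("agar", 5.0), ("boric acid", 0.5), ("DTPA", 0.5)]:
    print(f"{name}: {grams_needed(pct, batch_ml):.2f} g per {batch_ml:.0f} mL")
# agar: 12.50 g; boric acid: 1.25 g; DTPA: 1.25 g
# TEA is then added to adjust the solution to pH 8 (2.4 mS).
```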
Evaluation of Successful Treatment
While a successful treatment methodology was developed through empirical testing, an investigation into the effects on the surface morphology of an unpigmented acrylic dispersion size layer was thought necessary because of the different absorbencies among the test canvases, observed differences in retention times for the agar gel, and concerns about the higher pH required to reduce the stain. The lack of pigmentation and hard surface features made changes caused by cleaning more difficult to perceive, measure, and contextualize, so changes in surface gloss and stain reduction were evaluated with a spectrophotometer and by conservators’ subjective observations. The impact of the cleaning methodology on the surface of the size layer and canvas fibers was examined with dynamic atomic force microscopy (AFM) and high-resolution digital microscopy. Possible residues from cleaning were also investigated in a preliminary study using FTIR-ATR spectroscopy.
The number of samples for AFM was too small to draw concrete conclusions without more testing and additional analysis such as FTIR-ATR; however, a general trend was observed: increasing the gel concentration from 2.5% (w/v) to 5% (w/v) appeared to reduce the time in which fiber flattening occurred. In addition, FTIR-ATR showed a decrease in or complete removal of migrated surfactant from the acrylic size layer surface in all treated samples, regardless of the agar concentration in the gel; this, along with the swelling of the acrylic layer, was considered by the authors an acceptable risk of the treatment. IR bands corresponding to agar or the additives in the cleaning solutions were not detected.
Final Treatment
As mentioned previously, the cleaning agent eventually used to clean the painting was a 5% (w/v) agar gel with boric acid 0.5% (w/v), DTPA 0.5% (w/v), and TEA, at pH 8 and 2.4 mS. The agar was hand-cut to align precisely with the stain patterns on the canvas and weighted with sandbags to increase gel-canvas contact. Using this method, the stains were greatly reduced, though a few minor discolorations remained after cleaning. Further tests were carried out to determine the best inpainting method for these residual discolorations. Dry pigment with Lascaux, JunFunori, Aquazol 50, Aquazol 200, watercolour with gum arabic, and Paraloid B72 were all tested for optical effects, handling properties, and reversibility. The Aquazol 50 series was found to be the most effective overall and was used to inpaint the remaining discolorations.
Conclusion
The authors concluded by restating that the success of the treatment would not have been possible without the combination of art historical and material understanding of Knowles’ work and research into new cleaning methodologies for acrylic dispersion paint films. They thanked their project advisors Maureen Cross, Courtauld Institute, and Bronwyn Ormsby, Tate, and many others for their generous support and guidance throughout the project.
This is a joint paper by two objects conservators at the Metropolitan Museum of Art, Carolyn Riccardelli and Wendy Walker. Like many other papers at this conference, it concerns treatment and installation considerations for Renaissance-period glazed terracotta from the della Robbia workshop. The paper discusses two masterpieces by Andrea della Robbia (1435-1525), both pretty dramatic in the scope of their treatment.
The first, a lunette of Saint Michael the Archangel, starts with a tragedy. In 2008, it came crashing to the floor from over a doorway in the fifteenth-century galleries, where it had hung on display at the Met since 1996. If you search online, you can find articles about that event, but I will not link to any of them here. What I will link to, however, is the press release from April of last year announcing that the lunette is restored and back on view.
Riccardelli presents the treatment that took place over eight years, a massive undertaking mainly overseen by Walker. She describes how it offered the conservators a rare peek into the working methods of della Robbia. For example, they could see in a more intimate way exactly how the clay used to mold the lunette was wedged (not very well at all), which tells us that the makers must have understood their clay well enough to know this step wasn’t necessary. They also found evidence of tool marks and fingermarks – yes, even fingerprints! – from pressing the clay into the molds. The paper outlines the treatment of this work, which includes the use of the “Tullio blend” (3:1 B-72/B-48N in acetone with 6% ethanol) as the main adhesive, and a mount that incorporates brass clips to hold the panels to an aluminum backing panel. We are all left with beautiful after-treatment images of the lunette and a happy ending to the story.
The second della Robbia piece presented, a massive tondo of Prudence, starts with an exhibition announcement at the Museum of Fine Arts, Boston, Della Robbia: Sculpting with Color in Renaissance Florence. Along with pieces from Italy never seen in the United States before, as well as loans from the Brooklyn Museum and the Los Angeles County Museum of Art, the Met’s Prudence was featured.
Riccardelli presents the conservation efforts to get Prudence ready for loan and exhibition, with one year to do it. The piece consists of 16 molded and modeled sections – a central tondo surrounded by a colorful garland – and nearly every piece had old restorations that needed to be addressed, as did an unstable mount. Their paper outlines the treatment steps taken, including cleaning and restoration removal (steam, solvent, mechanical), and a well-engineered mounting system that employs carbon fiber clips and straps and a honeycomb aluminum backing panel. (More details about the use of carbon fiber clips in this treatment are presented in Riccardelli’s other paper at this conference, “Carbon Fiber Fabric and its Potential for Use in Objects Conservation.”)
It was during the cleaning phases that the conservators again made an exciting discovery, uncovering original markings and finger impressions that clearly indicate the proper order of the garland border pieces. More than this, the pre-treatment arrangement of the garland was incorrect! Their paper shows the dramatic shift from the previous arrangement to the corrected one, totally altering the feel of the piece and giving one the satisfaction of being able to return something home to its rightful place.
Glenn Wharton, Clinical Associate Professor in Museum Studies at New York University, opened the talk with the following challenge: how to organize and access the data created by time-based media conservators during the treatment of a contemporary artwork? Built on the MediaWiki platform, the project ended up grappling with larger issues encountered in time-based media conservation.
Conservation Documentation
Conservators create a lot of documentation in various formats (notes, videos, drawings, etc.), and one problem is how to organize this information and make it available within an institution. Also, as Wharton mentioned, at New York University instructors tend to help and encourage students to work and experiment with different programs.
The David Wojnarowicz Knowledge Base
Wharton followed by introducing David Wojnarowicz, an artist and activist who died of AIDS in 1992 and who produced, among other materials, paintings, drawings, and videos. His archives, left to NYU, were the primary sources of information – a page of his journal was shown as an example. To supplement these precious resources, the students interviewed several people who worked with the artist, and a computer scientist carried out technical research on the tools he would have used.
As Wojnarowicz receives more and more international attention today, people worry about how to preserve and exhibit his work. In that regard, the idea was to gather more information and make it available to researchers, curators, and conservators. One challenge was documenting his “Magic Box,” found under his bed and containing objects related to several of his artworks. The question here was how to capture the very complex relationships between those objects and the actual artworks in a searchable database.
The project goals and system requirements
Deena Engel, Clinical Professor in the Department of Computer Science at the Courant Institute of Mathematical Sciences of New York University, presented the goals for the future database. The idea was to work with the conservation students to think through the software development approach – in particular, how to capture the complex relationships between the different elements, with an easy-to-use interface and long-term preservation of the data.
In order to select suitable software, they established the following requirements for the future database:
Support a directed graph model;
Support user authentication;
Be open-source software;
Require only standard maintenance;
Support extensive discoverability for all;
Have clear navigation;
Support controlled vocabularies.
In the lab: Software testing
The students used the data collected earlier to test different software packages – such as Omeka, Drupal, Plone, CollectionSpace, and WordPress. After much research, they chose MediaWiki, an open-source platform with a strong user community that is easy to use and configure, supports text, image, audio, and video media (allowing them, for example, to publish conservation reports and audio interviews), and met their technical needs – in particular, they wanted the pages to be available on all types of devices (phones, tablets, etc.).
Discoverability
The content was organized in categories and subcategories; for example, the category “Works on Paper” was subdivided into “Drawings,” “Prints,” “Stencils,” and “Xeroxes.” The different pages related to each other are connected via hyperlinks; furthermore, the “What links here?” feature surfaces the pages that lead to the current page.
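To make the directed-graph idea concrete, here is a toy sketch in Python with invented page names (not the knowledge base’s actual schema or pages): forward hyperlinks are stored per page, and inverting them answers “What links here?”.

```python
from collections import defaultdict

# Invented pages and hyperlinks, for illustration only.
links = {
    "Works on Paper/Stencils": ["Magic Box"],
    "Journals": ["Magic Box", "Works on Paper/Stencils"],
    "Magic Box": ["Journals"],
}

# Invert the forward links to answer "What links here?" for any page.
backlinks = defaultdict(list)
for page, targets in links.items():
    for target in targets:
        backlinks[target].append(page)

print(backlinks["Magic Box"])
# ['Works on Paper/Stencils', 'Journals'] -> the pages that lead to "Magic Box"
```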
Launching of the database
A beta test session was organized with the NYU students, conservators, and archivists, where questions were asked, in particular about the user interface, the user experience, and the scholarly goals that had to be met.
On April 21, 2017, a symposium about David Wojnarowicz’s work was organized at the Fales Library & Special Collections, New York University, where the database was presented and launched.
The project is not over, though! The research is ongoing, and anyone can contribute by sending pieces of information to: fales.wojnarowicz@nyu.edu.
For the future, the scholars at New York University are interested in working with museum professionals on similar projects, using MediaWiki again or other software – Deena Engel mentioned that she would prefer to experiment with other tools.
This presentation let us appreciate the common effort made by scholars, archivists, and art historians, as well as computer scientists and curators, to make high-quality information about a contemporary artist’s complex work available in an accessible and intelligent form. Glenn Wharton added that a university is a great place for this kind of research, because of the possibility of research grants, the available time, and the deep interest and motivation of the students.
Andrew began his talk by very graciously acknowledging the many other people who contributed data that informed his paper. Andrew’s work builds on research begun by William Barrow, a paper chemist who worked at the Virginia State Library until the 1960s. Barrow’s research on books tried to draw a connection between physical properties and chemical content. He collected about 1,000 books published between 1500 and the 19th century and took various measurements such as fold endurance, pH, and alum content, trying to draw connections between those data sets to predict the ageing characteristics of the paper. The collection was obtained by the Library of Congress in the 1970s and is still used for destructive testing today. But where Barrow used macro- and micro-scale measurements, Andrew looks to the middle ground: polymer chemistry. For that, he uses size exclusion chromatography, or SEC.
SEC measures the degree of polymerization of cellulose using a sample of roughly 1 mm². (It may be helpful to think of the degree of polymerization as the molecular weight.) The degree of polymerization of a sample can be compared to known references. It should also be noted that papers contain a mixture of molecules of different sizes, so SEC provides a distribution curve. The more large molecules in a sample, the less degraded the cellulose, meaning that the paper is in better condition. Andrew discussed several examples of treatments of iron gall ink on paper where SEC was used to show the effects of those treatments on the papers.
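As a rough, back-of-the-envelope illustration of the link between molecular weight and degree of polymerization: cellulose’s anhydroglucose repeat unit weighs about 162 g/mol, so dividing an average molecular weight by that figure gives an approximate DP. The sample value below is invented; real SEC data give a whole distribution rather than a single number.

```python
# Approximate DP of cellulose from an average molecular weight, using the
# ~162 g/mol anhydroglucose repeat unit. The 81,000 g/mol value is invented.

ANHYDROGLUCOSE_G_PER_MOL = 162.1

def degree_of_polymerization(molecular_weight_g_per_mol: float) -> float:
    return molecular_weight_g_per_mol / ANHYDROGLUCOSE_G_PER_MOL

print(round(degree_of_polymerization(81_000)))  # ~500 monomer units
```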
Barrow’s research indicated that pH was the best indication of the future physical properties of paper. Andrew took about 80 samples from Barrow’s collection and confirmed that the molecular weights of paper correlate with pH (when the pH drops, the molecular weight drops). Andrew then looked to see if the molecular weight corresponded to physical properties; with newer papers, the molecular weight does tend to be smaller. Poor tear resistance also corresponds to low molecular weight. In general, he found that the molecular weight determined by SEC is a better indicator than pH for future physical properties for both newer and older books.
SEC certainly has advantages. The sample size is ridiculously small. It tells you about the physical building blocks of the paper, giving a better idea of what’s in it and what state the cellulose is in. There are some disadvantages to overcome before this technique is in every lab: the test itself takes a week to run, it requires extremely expensive equipment and organic solvents, and one must have the technical knowledge to interpret the data. Andrew’s ultimate goal is to turn this into a rapid, affordable technique, so that the molecular weight distribution of an object can be included in the object’s record and pulled up by a barcode. That’s an exciting prospect!
Andrew’s work presents a very interesting analytical option that future conservators might have access to. It would be nice to have a predicting model for the degradation of library objects. But it would be even more interesting to see the effects of treatment on paper. It is important that conservators continue to check our own work, and I’m glad to have assistance with that from scientists like Andrew.
Cindy’s talk was a mightily condensed summary of a few of the techniques for measuring the pH of paper that the Preservation Research and Testing Division (PRTD) at the Library of Congress has investigated over the last four decades. Her introduction summarized the challenges presented by this task. Due to the chemical structure of cellulose and the nature of paper, most methods can only approximate the pH of paper. The method of sample preparation can affect the results. Because of how paper ages, pH may differ from one region of a sheet to another. The ions that dictate the pH may not be soluble in water, making pH harder to measure. And atmospheric carbon dioxide can react with your solution and affect your results. Notice that I said “solution.” Cindy ended her introduction by noting that you can’t measure the pH of a solid. But you can approximate it, and the PRTD has been trying to identify the best way to do this for decades.
The PRTD’s focus on the pH of paper began in 1971 with the deacidification program. Chemist George B. Kelly used titrated alkalinity and titrated acidity as an “external yardstick,” along with four different extraction methods: the TAPPI cold method, the TAPPI hot method, surface pH measurement, and the “pulped” pH method. Kelly determined that for acidic papers, the method of measuring pH didn’t matter much, but alkaline papers could read as acidic despite their 1% alkaline reserve. The hot extraction method was shown to be much more accurate with alkaline papers, as it was likely better at getting all the ions into solution; the pulped method came close. Cindy then went on to talk about the uncertain origins of the pulped method (i.e., it’s not discussed in any published literature, but is mentioned frequently in internal documents from the PRTD). (I do wish that Cindy had gone into detail about the process of each extraction method, because I wasn’t too sure how each one worked. She didn’t mention pH strips, gels, or pH meters at all in this talk. And TAPPI stands for the Technical Association of the Pulp and Paper Industry.)
Then Cindy skipped to the late 1990s (she did mention that a few papers had been published in the intervening decades). By this time, the PRTD had ramped up its documentation efforts, as well as its protocols for sample collection and homogenization. Most of these protocols were put on their website. During the renovation of the instrument suite in 2007, the lab’s emphasis shifted to developing non-destructive and micro-invasive techniques, which are more appropriate for art objects than for circulating collections materials. This meant that the sampling methods had to adjust accordingly.
To address the new challenge of micro samples (or none at all), the PRTD tried to make surface pH measurement work, but found that tideline formation and sensitive media made that difficult. “Miniaturization” was another method the PRTD tried. For this technique, the sample size can range from a few milligrams to a few micrograms, depending on the paper-to-water ratio and other details of sample preparation. They found that slurrying helps, but filtration makes no difference in the pH measurement. In addition, controlling the amount of carbon dioxide was key to getting an accurate reading with acidic papers. Both purged bags and sealed vials were tried, with comparable standard deviations but slightly different pH readings. The pH readings from the micro methods agreed fairly well with macro methods.
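To picture what “depending on the paper-to-water ratio” means in practice, here is a hypothetical scaling sketch; the 20 mg/mL ratio is my own invention, since the talk did not specify the PRTD’s actual figures.

```python
# Scale a pH extraction down while holding an assumed paper-to-water
# ratio fixed. All numbers here are invented for illustration.

def water_needed_ul(sample_mg: float, ratio_mg_per_ml: float) -> float:
    """Microliters of water for a given sample mass at a fixed mg/mL ratio."""
    return sample_mg / ratio_mg_per_ml * 1000.0

ratio = 20.0  # assumed: 20 mg of paper per 1 mL of water
for sample_mg in (1000.0, 5.0, 0.5):  # macro, mini, and micro samples
    print(f"{sample_mg:g} mg paper -> {water_needed_ul(sample_mg, ratio):g} uL water")
# 1000 mg -> 50000 uL; 5 mg -> 250 uL; 0.5 mg -> 25 uL
```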
One of the best takeaways from Cindy’s talk was when she shared that during their renovation, the PRTD was sending out samples to a contractor for pH measurement, and papers that had an alkaline reserve were coming back with an acidic pH. Their conclusion was that not every method is appropriate for every type of paper, and that sample preparation can also affect results.
Here are some tips about each of the methods, taken from Cindy’s recap slide: The hot method is closest to an objective measurement, but takes two hours per sample. The pulped/blender method generally agrees with the hot method, but is faster. The ISO cold method has a much higher standard deviation than the TAPPI cold method. The surface pH method has the highest standard deviation of any method tested, and is difficult with alkaline papers, thick boards, and boards with adhesives; it also causes tidelines. And the mini method is also difficult with thick boards, but its results are comparable in repeatability to large-scale extraction methods.
So, what does it take to accurately measure the pH of a piece of paper? A focus on repeatability and an optimistic attitude! The scientists and preservation specialists at the PRTD struggle with many of the same challenges that the rest of us do, albeit with fancier equipment. It sounds like just getting a ballpark figure for pH is as close as we can hope for, for now. The PRTD is still investigating methods, and we should all look forward to their results!
Finally, one cool tip: You can make your own micro blender with a homemade Mylar blade attached to a Dremel tool!
In the abstract for their paper, Ariel O’Connor, Smithsonian American Art Museum (SAAM) objects conservator, and Dan Finn, SAAM media conservator, write that this presentation “aims to present a case study that is exemplary of the wide range of expertise that time-based media conservation can require, and the collaborative approach that it necessitates.” Their talk certainly demonstrates this, as it presents a myriad of challenges, from documentation tasks and working with living artists, to what to do when a massive cable failure occurs just minutes before the museum director is coming to see the work in action.
The paper discusses the kinetic sculpture titled “the willful marionette,” by Brooklyn-based artists Lilla LoCurto and Bill Outcault. The piece incorporates sculpture (a 3-D printed marionette of blue poly(lactic acid), a biodegradable plastic, with strings made of fishing line), software (Puppet Master), and electronics. The custom software is designed to interact with its audience, responding in real time to recognizable human gestures with gestures of its own. Meet the artists and get a glimpse of the marionette, affectionately named Little Bill, in this short video.
O’Connor and Finn outline the documentation process they employ at SAAM, making us all realize how incredibly detail-oriented the documentation of time-based media works really needs to be. This includes a testing and acceptance report, an identity report, various iteration reports, documentation photographs, artist interviews, copious notes, and organization and storage of all files, such as the STL files that can be used to reprint the sculpture in the future, if need be.
The authors candidly recount stories about working with this exciting and challenging piece and getting it ready for the museum director to review. For instance, an issue with Little Bill not blinking properly was fixed by the good old “CTRL-ALT-DEL” method. But when the 80-lb. line that mainly held up the sculpture spontaneously snapped, they had to be resourceful and quick on their feet, looking to the facilities crew for the right tools needed to remedy the situation.
Future challenges for this work are similar to those of many time-based media works, including what will happen to the proprietary software that operates Little Bill, as well as storage considerations for the plastic sculpture itself.
My personal area of interest and intended future practice is in the conservation of historic interiors. Therefore, I am always keen on portability both in tools and materials as well as forms of analysis. The other advantage to the techniques presented in this workshop is that physical sampling is not required, which is always attractive and music to a curator’s ears.
The workshop met my personal expectations, but the title “Effectively Using…” could have suggested to some that this was going to be more of a “boot camp” for being able to implement these techniques back home. This style of workshop was more of an information/demonstration session and is great for anyone considering buying similar instrumentation and/or for gaining a better understanding of the general benefits and limitations of portable spectroscopy.
Given the short duration of this workshop, I was initially concerned that I might have signed up for a 2½-hour lecture without any hands-on component. Participants were encouraged to bring their own samples, and indeed at least an hour was dedicated to looking at samples and exploring the instrumentation first-hand. We did run over the scheduled time, though, and were gently shuffled out of the room as hotel staff started to break down tables.
The workshop was led by Tom Tague, Ph.D., Applications Manager at Bruker, and Dr. Francesca Casadio, Director of the Conservation Science department at the Art Institute of Chicago. I really appreciated having these different perspectives. Tague did not assume the role of salesperson during the workshop, but as you would expect, he was very positive in his description of the capabilities of the Bruker instrumentation. Casadio kept Tague grounded in the realities of our complex samples and what can be confidently identified using these techniques. At the same time, it was useful to have Tague there to speak to the specifics of the instrumentation and push Casadio a little bit to consider what some of the newer technology could offer. There was also a Bruker sales representative present to assist with running the instrumentation and software and to offer information on pricing.
Overall the session was well organized. I know I was not the only attendee who was ecstatic that I got to take home a flash drive loaded with the presenters’ PowerPoint slides. The spectra from my samples that were analyzed were also loaded directly onto this flash drive before the end of the workshop.
The first part of the session did consist of pure lecture. Tague’s presentation focused on specifications of the Bruker portable instruments and descriptions of the techniques.
An interesting tip he offered was using sandpaper to take surface samples. He lightly abraded a painted surface and then placed the sandpaper in front of the portable FTIR (ALPHA)—no additional sample prep necessary.
Having just completed my Master’s degree in conservation, I was able to follow the presentation fairly well, but I fear that it may have been overly technical and too fast for someone who does not work with these analytical techniques on a regular basis. Nonetheless, I anticipated this to be an intermediate-level workshop when I signed up.
As would be expected based on the organizers of the workshop, the instrumentation provided and discussed were all Bruker models. Two ALPHA portable FTIR spectrometers were present. The ALPHA is set up to receive different “snap-on” modules. The two modules available for demonstration were the “External Reflectance” module and the “Platinum ATR” module. The BRAVO Handheld Raman spectrometer was also available for interaction.
Here are some key facts about each instrument:
The base ALPHA starts around $14,000, and each module costs on average an additional $6,000.
ALPHA “External Reflectance”
Does not require direct contact with a sample/object
No size limitations as long as unit can be mounted/held in appropriate orientation to the sample
Camera integrated in unit to help orient, find appropriate working distance/focus, and document sample location
Collects a reflectance spectrum, NOT absorbance
Can collect specular and diffuse reflection; reflective and non-reflective materials can be analyzed
Footprint of instrument is about 8” x 11”
Weighs about 13 lbs.
Can be tethered to a laptop
About 6 mm sampling area
Approximately 4 cm-1 spectral resolution
ALPHA “Platinum ATR”
There is pressure/direct contact with the sample
The IR beam does penetrate into the sample
BRAVO Handheld Raman
$45,000-$55,000
Slightly narrower than 8” x 11” (looks like an oversized ELSEC environmental data monitor; lighter than the ALPHA)
Class I safe laser
2 mm sampling spot size
No camera or viewing capability to help align collection area
Object needs to be in contact, but no pressure required
Approximately 8 cm-1 spectral resolution
Fluorescence mitigation built into software/data collection
Dual lasers built in and used/activated simultaneously
Optimal wavelength and reduced risk of damaging sample
Touch screen allows for control and data collection without tethering to laptop
Tethering to a laptop also possible via WiFi
In terms of the ALPHA “External Reflectance,” one of the big selling points is that there is no size restriction or need to balance the object on a stage. The trade-off in allowing data collection without physical sampling is that the spectra generated are in % reflectance. The majority of reference spectra available for free and through the Infrared and Raman Users Group (IRUG) are in % absorbance or % transmittance (which is related to absorbance logarithmically, not as a simple inverse). The Bruker software does offer the capability to convert the data using the Kramers-Kronig transformation. Francesca Casadio seemed to prefer analyzing the data in its original reflectance state. Characteristic peaks for bonds are slightly shifted from their locations in transmittance spectra, but at Casadio’s level of experience she is able to take these nuances into account with some ease. She was honest with the attendees, summarizing that this form of IR spectroscopy is “not like portable XRF; one needs to have experience and repetition for familiarity with interpreting spectra.”
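A quick aside on why transmittance and absorbance spectra line up only after a conversion: the two are related logarithmically, not as a simple reciprocal (reflectance data need the separate Kramers-Kronig treatment mentioned above). A one-line check, where the 50% transmittance value is just an example:

```python
from math import log10

# Absorbance from percent transmittance: A = 2 - log10(%T) = -log10(T).
percent_T = 50.0  # example value
A = 2 - log10(percent_T)  # equivalent to -log10(percent_T / 100)
print(round(A, 3))  # 0.301
```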
For those interested in more on interpreting reflectance spectra of art objects, Casadio recommended the following publications from a research group in Perugia, Italy:
“Reflection infrared spectroscopy for the non-invasive in situ study of artists’ pigments.” C. Miliani, F. Rosi, A. Daveri & B. Brunetti, Appl. Phys. Mater. Sci. Process. 106, 295–307 (2012) (http://dx.doi.org/10.1007/s00339-011-6708-2)
“In Situ Noninvasive Study of Artworks: The MOLAB Multitechnique Approach.” C. Miliani, F. Rosi, B.G. Brunetti & A. Sgamellotti, Acc. Chem. Res. 43, 728–738 (2010) (http://dx.doi.org/10.1021/ar100010t)
“Non-invasive identification of metal-oxalate complexes on polychrome artwork surfaces by reflection mid-infrared spectroscopy.” L. Monico, F. Rosi, C. Miliani, A. Daveri & B.G. Brunetti, Spectrochim. Acta Part A: Mol. Biomol. Spectrosc. 116, 270–280 (2013)
“In-situ identification of copper-based green pigments on paintings and manuscripts by reflection FTIR.” D. Buti, F. Rosi, B.G. Brunetti & C. Miliani, Anal. Bioanal. Chem. 405, 2699–2711 (2013)
It is important to keep in mind the basis of data collection to understand the limitations of what can be analyzed with the ALPHA “External Reflectance” on a given object. For example, with a varnished painting the spectral reflectance of the varnish will typically only allow the varnish itself to be detected (with some exceptions depending on thickness of the varnish and underlying pigment composition). Similar reflective material properties make plastics easily detectable with this technique. Matte objects are still good candidates for analysis with the ALPHA, but the data will be collected via diffuse reflection. The ALPHA does not seem like an appropriate technique for discerning between individual layers within a given structure unless coupled with other techniques.
One of the ALPHAs at the workshop was supplied by Casadio from the Art Institute’s lab, and she has extensive experience using it. Her presentation was more about working knowledge of the instrumentation. She polled the attendees and focused on case studies mainly of pigment analysis and identification of plastics. Casadio emphasized the benefit of the ALPHA as a mapping tool that does not require sampling: perhaps one or two samples could be taken from a work of art and more confidently characterized with bench top FTIR and/or GC-MS, and then the use of specific materials could be mapped without additional sampling using the ALPHA. Casadio’s case studies often combined multiple analytical techniques. She finds the ALPHA to be a nice complement to XRF. Overall, Casadio has found the ALPHA very useful in characterizing different plastics and also good at detecting surface deterioration products (e.g., zinc soaps), especially with modern and contemporary collections. She noted that the ALPHA detects very strong signals and peaks for waxes and PVA coatings. Casadio has also been able to use the ALPHA for collaborations with other institutions and collections, which is another boon of its portability.
I was disappointed that Casadio had not had previous experience with the BRAVO Handheld Raman. At the Art Institute she has a bench top Raman unit. She seemed skeptical about the BRAVO’s capabilities and some of the claims that Tague was making that it could “see” indigo and other organic pigments without surface enhanced Raman spectroscopy (SERS). Casadio stated that in her personal opinion with Raman it is better to bring the art to the unit than the other way around. By the end of the workshop she did seem impressed with the quality of spectra the BRAVO was generating, but there was not enough time to have further discussion and to tease out Casadio’s candid opinion on the instrument.
I was most excited for the practical demonstration with the instruments especially because I had come armed with over 10 samples. I was anticipating that I may not even get to analyze one sample, but was very pleased that I was able to look at 7 samples with the BRAVO portable Raman. This much time with the instrument was due in part to many participants not bringing samples.
If a similar workshop is organized in the future, it might be good to have participants sign up ahead of time for slots with the instrument if they are interested in analyzing a specific sample. It was a fairly large group – about 18 participants. Attendees that did not bring samples were still interested in watching the process of collecting data and interpreting the spectra. This was challenging; even with three instruments there tended to be 5-7 people crowding around a laptop screen. Dividing us into smaller groups, having the laptops hooked up to a projection screen, or further limiting the number of participants may be additional considerations for future workshops.
It seemed like the majority of participants were conservators rather than conservation scientists. I personally do not work with spectroscopic techniques on a regular enough basis to be able to confidently interpret spectra on the fly. Francesca Casadio was able to offer her expertise and interpretation while working with samples from the participants, but neither Tom Tague nor his Bruker colleague could offer specialized interpretation. Some of the participants seemed frustrated that the instruments were not connected to an art materials database for instant gratification and matching.
Both Tague and Casadio strongly emphasized the importance of each institution building its own reference database specific to the collection. The IRUG database was promoted, but as a supplement to an institution’s own reference database. Neither of the instructors felt that the database that comes with the Bruker software was appropriate for art materials.
My personal goal during the workshop was to pit these portable instruments against their stationary counterparts and to pit the two complementary techniques against each other. Therefore, I brought known samples from my institution’s reference collection of traditional paints. All the paints were oil-based and mixed with some degree of lead white. The reference pigments I chose were mostly organics (indigo, madder, cochineal). Colonial Williamsburg has had the opportunity to partner with the College of William and Mary to perform SERS on objects in the paintings collection. My colleagues and I were curious to see how this portable unit compared to spectra produced with SERS. With the minimal time available, I chose to focus on the BRAVO because our institution already has a bench top FTIR.
Tom Tague was set up at the BRAVO “station” during the practical session, and as I stated previously, he was not comfortable offering any interpretation of the data. I was excited to review the spectra we collected back at my home institution (Colonial Williamsburg Foundation/CWF) alongside Kirsten Travers Moffitt, the conservator in charge of our materials analysis lab. Moffitt performs a lot of FTIR analysis on our collection, but has less experience with Raman.
All the organic paint spectra from the BRAVO were certainly “neater” than the raw data I am used to seeing from a bench top Raman with oil paint samples. I personally would attribute the quality of the spectra to the dual-laser capability. I’m not sure how much impact the fluorescence mitigation had, because the spectra were still pretty noisy and it was challenging even for Moffitt to distinguish significant peaks. It appears that the fluorescence of an oil binder is still problematic with the BRAVO. In his presentation, Tague showed an example of indigo detection with the BRAVO, but this was on an illuminated manuscript, where fluorescence of the binding media would be less of an issue.
At CWF we only have a reference database for IR spectra, but looking at possible peaks in the indigo/lead white sample spectrum, the characteristic peaks for indigo that Tague mentioned (545, 1463, and 1578 cm-1) do not appear to be present. It seems that the lead white is dominant, with a strong peak around 1050 cm-1. In conclusion, Tague is partially right that the BRAVO can detect some organic pigments, but likely only if they are present in high enough concentrations (not mixed) and are not in highly fluorescent binding media (like oil).
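For what it’s worth, the peak comparison we did by eye amounts to something like the following sketch; the measured peak list and the 8 cm-1 tolerance are invented for illustration, not our actual data.

```python
# Check whether the indigo Raman shifts Tague cited (545, 1463, 1578 cm-1)
# appear among peaks picked from a measured spectrum, within a tolerance.

INDIGO_PEAKS_CM1 = (545.0, 1463.0, 1578.0)

def peaks_present(measured_peaks, references=INDIGO_PEAKS_CM1, tol=8.0):
    return {ref: any(abs(ref - m) <= tol for m in measured_peaks)
            for ref in references}

# Invented peak list dominated by lead white (strong band near 1050 cm-1):
measured = [421.0, 1050.0, 1372.0]
print(peaks_present(measured))
# {545.0: False, 1463.0: False, 1578.0: False} -> indigo not detected
```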
The other samples I looked at were reproduction wallpaper samples from Adelphi. I was curious to see if we could detect anything useful about the pigments on an object that would normally be challenging to sample and could not be brought to the lab if it were installed in a historic interior.
The resulting spectra were less noisy than those of the oil paint reference samples, again likely due to the non-oil binding medium on the wallpaper.
Despite the better quality of the spectra, we still did not have the resources (i.e. a good reference database for Raman and experience working with Raman spectra) to confidently characterize the pigments present. I am sharing this to illustrate Casadio’s point that the ALPHA and BRAVO require a certain level of expertise and do not provide instant answers.
One of the other participants, Ann Getts, a textile conservator at the De Young Museum in San Francisco, brought various sequins from a costume in storage with a suspicious vinegar odor. Getts had time to look at one of the sequins with both ALPHA modules, and her case study demonstrates some of the trade-offs with the non-contact “External Reflectance” module.
She began with the “External Reflectance” module, and the first hurdle was getting the instrument positioned at the appropriate working distance from the sample. Without an adjustable stand, we had to use trial and error to shim up the ALPHA so that the camera could focus on the sequin. The resulting spectrum suggested cellulose acetate (as Getts initially suspected), but even Casadio was hesitant to draw any concrete conclusions from this spectrum. Then the sequin was analyzed with the “Platinum ATR” module, and right away Casadio concluded that it was indeed cellulose acetate.
Each of these instruments has its advantages and disadvantages. Overall, the ALPHA seems like a good bang for your buck given the duality of the modules. The price point is pretty reasonable, also considering the portability.
The BRAVO is fairly new technology and the dual lasers seem promising, but at this point it does not seem like a must have for the average institution. I would encourage anyone thinking about purchasing any of these instruments to consult with both of the workshop leaders.
In general I would specifically recommend the ALPHA to:
Institutions that have a lot of sampling restrictions
Institutions with a lot of oversized works
Institutions that focus on modern and contemporary art (especially with plastics and large Color Field paintings)
Institutions with a conservation scientist on staff
In general I would specifically recommend the BRAVO to:
Institutions that have a lot of sampling restrictions
Institutions wanting to focus on analysis of paper-based art
Institutions with a lot of oversized works
Institutions that already have staff with Raman expertise
Institutions looking to purchase a Raman instrument
This blog represents my personal conclusions and understanding of the workshop. I would encourage any of the other participants and the instructors to post in the comments if they have differing opinions or think that I have misunderstood any of the technical aspects of the instrumentation.
Photograph conservator Diana Diaz introduced her presentation as a case study dealing with the “overwhelming protection” of photographic materials.
The project started in 2006, when the Harry Ransom Center acquired the photographer Arnold Newman’s archives, including various photographic and other materials, such as photographic albums, sketchbooks, documentation of many projects… and color transparencies.
More precisely, a corpus of 35 mm Kodak Kodachrome color slides in plastic mounts was found. The slides were wrapped together with sealing tapes, forming 16 sets. On the edge of each pack, the tapes displayed handwritten inscriptions indicating the dates and subjects of the photographs. The inscribed dates made it possible to date each project, the whole collection ranging from 1954 to 1972. Diana Diaz showed several examples of the images, such as one taken for a project shot in Spain in 1970 for Holiday Magazine.
These slide series are of interest because they shed light on the photographer’s working methods. For instance, they show the different croppings, compositions, and exposures experimented with within each series. One can see how Newman would play with light and color and produce variations of the same image, from which he would then make his final selection for publication. Diaz then listed all the assignment projects covered in the slides, shot in various places (Spain, Canada, California…) for different magazines, such as Harper’s Bazaar and Life.
However, when the slides were found, the images were still inaccessible: after the removal of the inscribed tape applied along one edge, another white tape underneath still held the stacks of slides together. Three types of tape were identified among the 16 sets:
a masking tape;
a discolored white tape;
a white tape that was still tacky.
The necessary conservation treatment was difficult to undertake because the tapes were in contact not only with the slide mounts, but also with the films themselves – on both the image and support sides.
Therefore, to remove the tape carriers, Diaz logically proceeded by type of tape.
The still-tacky white tape was removed mechanically with a spatula, leaving no adhesive residue at the end of the treatment.
The masking tape was strongly adhered and required a heated spatula combined with the use of solvents.
The discolored white tape was removed with the help of water vapor.
After all the carriers were removed, Diaz evaluated the materials and condition of the residual adhesives in order to determine which solvent to use. She referred to Smith et al.’s paper1, which presents not only the history of pressure-sensitive tapes and their ageing properties, but also appropriate solvents and suitable methods of application for their removal. Thus, Diaz used naphtha (a mixture of hydrocarbons) to successfully remove the rubber-based adhesive, and ethanol for the oily adhesives. The solvents were applied gently with a cotton swab, in a circular motion and working in one direction, to minimize scratching and increase efficiency.
Photographic documentation under ultraviolet illumination made it possible to confirm that all the adhesives had been removed. Finally, the slides were individually rehoused in conservation materials.
Although this treatment was successful, several questions remain: Are there solvent residues left in the photographic materials at the end of the treatment? Has the surface been scratched? Indeed, the effect of solvents on color transparencies, in particular their harmlessness to the photographic materials, would require further research to help photograph conservators choose a suitable treatment.
1 Bibliographic reference: Merrily A. Smith, Norvell M. M. Jones, Susan L. Page, and Marian Peck Dirda, “Pressure-Sensitive Tape and Techniques for its Removal From Paper,” JAIC 23, no. 2 (1984), article 3, pp. 101–113. http://cool.conservation-us.org/coolaic/jaic/articles/jaic23-02-003.html
The NanoRestArt project is a multinational network of conservators, scientists, and industry partners working to develop and test novel nanotechnology-based materials intended for the conservation and preservation of modern and contemporary cultural heritage. Funded by an EU Framework Programme for Research and Innovation Horizon 2020 grant, the project consists of 27 partners, most of them in the European Union. The research and development process for these tools is divided into four major categories: gels and nanostructured fluids for cleaning, nanocontainers and nanoparticles for surface protection and strengthening, sensors for molecular detection, and environmental impact of the new materials. As a partner in NanoRestArt, the Tate is investigating the development and evaluation of cleaning systems in collaboration with the Research Center for Colloids and Nanoscience (CSGI), which developed the nano-structured cleaning agents.
Dr. Angelova’s discussion of her work with the Tate and the NanoRestArt project focused on testing nano-structured cleaning systems, investigating their effects on Michael Dillon’s Op Structure sculptures and on mock-ups intended to replicate the properties of the artwork. Chemical analysis of one Op Structure sculpture revealed that it is made entirely of poly(methyl methacrylate), or PMMA, and adhered with PMMA cement. It is an excellent candidate for the NanoRestArt evaluation process because it is composed of a synthetic polymer material which cannot be easily treated with conventional conservation techniques and can benefit from wet surface cleaning. The plastic structure is in very good condition but does show evidence of dust accumulation and surface soiling from handling as well as adhesive labels in need of removal.
The mock-up samples were created by treating a series of semi-opaque acrylic polymer sheets with a variety of soiling materials to mimic finger grease, dirt, and adhesive labels, with some un-soiled control surfaces included. A range of materials was then used to clean the samples, including the novel NanoRestArt gels created by CSGI as well as typical cleaning agents used by conservators. PMMA is a highly glossy material that is easily scratched by surface wiping and dissolved by many common solvents. The NanoRestArt gels, which can be loaded with a variety of fluids for cleaning purposes, were therefore chosen as appropriate cleaning materials to avoid such issues during cleaning.
The evaluation process involved treating each soiled and control mock-up surface in triplicate with each cleaning method and evaluating the results using a Hirox microscope, gloss meter, colorimeter, and infrared spectrometer. Conservators rated each cleaning agent on a number system for ease of use, health and safety characteristics, control, soil removal effectiveness, tendency to leave residues, and gloss change. After treating the mock-up samples, the Tate research team found that, to the naked eye, simple cleaning solutions (such as saliva or deionized water) worked well to remove the soil, but left scratches and streaks when viewed under the microscope. Additionally, soiled surfaces cleaned with gels showed evidence of gel residues and microdroplets, as the sample surfaces are non-absorbent. The best cleaning results came from a microcloth moistened with a combination of a surfactant and the chelator triammonium citrate. As for the adhesive labels, some microemulsion cleaning agents succeeded in removing them from the mock-up samples. Dr. Angelova mentioned that they were not able to load the microemulsions into the NanoRestArt gels, but that this would probably be an ideal cleaning solution.
When working with the actual Op Structure sculptures, conservators chose to clean a small, inconspicuous soiled area, beginning with water and working up to the surfactant and chelator solution – a process which effectively removed dirt without scratching the surface. Based on the mock-up tests, conservators were able to successfully remove adhesive labels from the artwork using a solution of water and isopropyl alcohol.
APOYOnline (Association for Heritage Preservation of the Americas) is a non-profit organization that facilitates communication and exchange among heritage preservation professionals throughout Latin America and the Caribbean region. From August 30 through September 2, 2016, APOYOnline hosted its first regional conference and workshop in Medellin, Colombia. Attended by 73 participants from 15 countries, the conference took “exchanges and practical tips” as its theme. While presentations focused on a range of cultural materials, the primary emphasis was on the preservation of photographic heritage, due to the importance of photograph collections in Latin America and the urgency of addressing their needs. Based on the presentation given at the AIC Annual Meeting, the APOYOnline conference appeared informative, fun, well planned, and well received, and it succeeded in engendering international collaborations.
Colombia was chosen as the conference host country because of its central location within Latin America, and Medellin as the host city to promote the revitalized city. Logistical planning for the conference required coordinating team meetings across four different time zones, taking full advantage of communication technology such as WhatsApp and Skype. In addition, an incredible amount of fundraising supported the conference and its participants. Major initial backing came from Tru Vue, Banco de la República, and the University of Delaware, which then attracted more supporters, for a total of 21 financial donors. Through this campaign, APOYOnline was able to provide scholarships to all 73 participants – 60% partial grants and 40% full grants for conference attendance.
The program was divided into two major sections: paper presentations in the mornings and workshops in the afternoons. In total, the conference had 14 papers and 24 poster presentations. Paper topics focused on a wide range of preservation and risk management projects, including education, storage, collections care, the impact of microbiological research, emergency response, treatment of ceramic murals, and more. Posters discussed glass plate negative collections, conservation of audiovisual materials, and paper conservation in tropical climates, among other topics. All sessions were recorded and made freely available on the APOYOnline webpage. The workshop on conservation of photographs involved lectures, discussions, and hands-on demonstrations about the identification and preservation of photographic materials, and was translated into three languages for all participants. Originally intended for 25 people, the workshop was eventually opened to all attendees by the conference organizers. Some of the most important issues for photograph collections in Latin America include immediate inventory, cleaning, storage, and preventive preservation. The workshop therefore gave participants a better awareness of the needs of their collections and information they could bring back to their institutions.
During the conference, there was a meeting with the participants entitled “Vision 2020,” in which the future of APOYOnline was discussed. Suggestions from the session included hosting more events, dissemination of activities, and research. APOYOnline is therefore working to strengthen networks with universities, provide more professional training, and act as an international bridge by bringing people to Latin America and vice versa. The organizing team received a large number of thank-you notes from attendees on how the meeting impacted their work and collections. The next APOYOnline conference will take place in Antigua, Guatemala, to advocate more for countries in Central America and the Caribbean region.
Further information about APOYOnline can be found at www.apoyonline.org as well as through Twitter, Facebook, and YouTube.