This blog post is part of a series of observations about the London “Gels in Conservation” conference co-hosted by the Tate and IAP (International Academic Projects, Ltd). In mid-October, over the course of three days, some 41 authors presented research, techniques and ideas on gels in conservation. The talks were excellent, and I’ve focused on four that were notable for the wide range of materials treated and challenges faced. They ranged from coating and grime removal from a giant sequoia tree cross section, to dirt and varnish removal from Delacroix wall paintings, to removal of repairs from a fragile felt hat from an 18th-century shipwreck, to an experiment comparing residues left behind by various gels on paper.
The fourth talk I wanted to highlight is Michelle Sullivan’s “Rigid polysaccharide gels for paper conservation: a residue study” — of particular interest to me as a paper conservator. It was one of the few studies exploring quantitatively whether residue is left behind by gels used in the treatment of works on paper. And if so, how does that residue impact the paper? To easily track residue on the paper samples, fluorescein dyes visible in UV light were added to the gels tested. The experiment used agarose, gellan gum and methyl cellulose gels in three different concentrations applied to three different papers for three different time periods. In addition, a few variables were added to mimic treatment, such as applying the gels through Japanese paper and clearing the gels using a damp swab. Besides surface examination, cross sections of the samples were also taken to see if the gels were penetrating the paper surface. The cross sections seemed to suggest that gellan gum was being absorbed into the paper. Sullivan found that all the gels tested left a residue, with gellan gum apparently leaving behind the most. She found that applying the gels through a Japanese paper barrier was the most effective method of minimizing residue. After oven aging for 21 days, the rag sample treated with gellan gum darkened slightly, while all the other samples did not. Sullivan proposed that the darkening might be related to the gelatin content of the rag test paper. She plans to expand her test variables and continue to build on this research. This feels like very important research and I eagerly await the results of the next phase of her work.
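To get a sense of the scale of Sullivan’s experimental design, the base test matrix can be enumerated. A minimal sketch — the concentration and dwell-time labels here are my own placeholders, not her actual values:

```python
from itertools import product

# Sketch of the test matrix described above: three gels, at three
# concentrations, on three papers, for three time periods.
# Labels for concentrations/papers/times are placeholders.
gels = ["agarose", "gellan gum", "methyl cellulose"]
concentrations = ["low", "medium", "high"]
papers = ["paper 1", "paper 2", "paper 3"]
dwell_times = ["short", "medium", "long"]

matrix = list(product(gels, concentrations, papers, dwell_times))
print(len(matrix))  # 3 x 3 x 3 x 3 = 81 base combinations
```

And that count is before the added treatment variables (Japanese paper barrier, swab clearing), which multiply the sample set further — a useful reminder of how quickly a “simple” residue study grows.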
This blog series is a result of receiving the FAIC Carolyn Horton grant to help me attend the conference. I would like to gratefully acknowledge the FAIC for helping make it possible for me to attend this important conference.
AIC members from all specialty groups are invited to attend and participate in the event “A failure shared is not a failure: learning from our mistakes,” happening on Saturday, June 2nd, from 4:30 to 6:00 p.m. — click here to add it to your Sched. We will gather and share our cautionary tales, including treatment errors, mishaps, and accidents, with the idea of helping our colleagues not to repeat them.
Discussing mistakes is a hot topic that has already been embraced by others in our community. Two examples of events scheduled during the month of May are: “Mistakes were made,” a regular feature at the American Alliance of Museums conference, and the lecture “Conservation Confidential” hosted by our conservation colleagues across the pond in the Independent Paper Conservators’ Group.
Participants can speak for up to 5 minutes; if you prefer to remain anonymous, a reader will be happy to present your tale on your behalf. If you are unable to attend AIC’s Annual Meeting but would like to submit a tale to be read by one of our organizers or a colleague, please reach out.
Screens to project PowerPoint slides containing your images/video will be available (16:9 format), and a Dropbox folder will be made available for submissions. Please also bring your presentation on a USB Drive (highly encouraged). Time permitting, audience members inspired by their colleagues will be welcome to present. If appropriate (and acceptable to the speaker), the floor will be opened for questions and discussion following presentations. Extra points for suggesting safeguards and solutions!
Please note that this is a forum for sharing personal mistakes and solutions only. Participants are requested not to name other persons, organizations, work places, and avoid politics—institutional, national, and global!
The event will include a cash bar, so come, relax, unwind, share, laugh, groan, and learn. We plan to publish the event for those who wish to be included.
If you are interested in participating or have questions about the event, please contact Tony Sigel at tony_sigel@harvard.edu or by calling 617-767-1900 (cell), or Rebecca Gridley at rebecca.ec.gridley@gmail.com by May 10th.
Please include 2-3 quick sentences introducing your topic and indicate whether you plan to use a PowerPoint with images and/or video.
The Journal of American Institute for Conservation (JAIC) seeks submissions for a special issue on the topic of “Reflectance hyperspectral imaging to support documentation and conservation of 2D artworks.” Two-dimensional artworks include paintings, works on paper, tapestries, and photographic materials. The focus of this special issue is on hyperspectral systems that provide continuous reflectance spectra over the portion of the spectral range from the UV to the Mid-IR. Specific areas of interest include:
Description of the best methodologies and acquisition parameters of workflows for operating hyperspectral imaging cameras under museum conditions or in non-controlled environments such as when studying outdoor frescoes or murals;
Hyperspectral image cube processing workflows to mine datasets for useful information such as pigment or binder maps, or visualizing compositional changes or revisions;
Defining, testing, implementing, and developing specific criteria for optimizing the format of acquired data and processing procedures for analysis, storage, usage, and dissemination of hyperspectral imaging data and results;
Case studies on the identification of artists’ materials using reflectance hyperspectral imaging, mapping distribution or improving visualization of compositional paint changes or revisions.
Authors are invited to submit an abstract and article outline to the special issue organizers by January 31, 2018. Complete article submissions are due April 30, 2018. JAIC guidelines and its style guide are found at www.conservation-us.org/jaic. Articles selected by the guest organizers should be submitted through our online portal at jac.edmgr.com. Datasets can be included as supplemental information.
You may send inquiries about the issue to Julio M. del Hoyo-Meléndez, JAIC Editor-in-Chief, at jdelhoyo@muzeum.krakow.pl.
Send proposals to special issue guest organizers by January 31, 2018:
John K. Delaney at j-delaney@nga.gov
Senior Imaging Scientist, Scientific Research Department,
National Gallery of Art, Washington DC
Marcello Picollo at picollo@ifac.cnr.it
Research Scientist, Institute for Applied Physics “Nello Carrara” (IFAC)
National Research Council (CNR), Florence, Italy
I was particularly interested in “Preventive Conservation in the Renovation of the Harvard Art Museums: Before, During, and Ever After” by Angela Chang, Penley Knipe and Kate Smith, as my employer LACMA is currently undergoing a similar museum building project.
Angela Chang, who presented the paper, began her talk with a brief summary of the museum’s history, which concluded with the presentation of the new LEED Gold building by Renzo Piano as well as the new storage facility that housed the entire collection during the museum building’s construction. She demonstrated how Harvard’s conservators successfully integrated preventive conservation into an already established design and construction process. She also stressed the importance of cooperation and communication with external groups, such as administrators, donors, architects, and others, for the success of the project.
Angela discussed three main topics in conjunction with the new building.
Samples of all potential and existing materials in the construction of the storage facility and the new museum were tested using the Oddy Test. Results of the tests, among other topics, were discussed in weekly construction meetings held with architects, contractors, engineers, and project managers. Only 50% of 900 tested material samples passed the test, and some materials needed to be tested repeatedly due to sample mix-ups. Existing fireproofing material made of cementitious plaster, for instance, was completely removed from the storage facility for the sake of both the preservation of the collection and human health.
300 computerized and smart, single or double blinds control the light levels in the exhibition spaces and the conservation labs, but the new museum building turned out to be more flooded with light than initially expected. A seasonal programming schedule was derived from a light monitoring program based on over 50 readings and requirements from the facility department. Based on the seasonal occurrence of light leaks, conservation staff needed to identify exhibition areas not suited for light-sensitive artworks or works on permanent display in order to safely exhibit parts of the collection. Light-blocking films, for instance, are currently being tested to address light leaks.
Visitor incidents have recently begun to be recorded systematically and measured with Art Touch Cards, a program developed by Security, Conservation, Collections Management, and IT. The 46 guards can notify conservation and collections management staff immediately about urgent issues; minor issues are reported by filling out cards that are compiled and reviewed daily. Based on quarterly analysis of the data, artworks and galleries with a high incident rate can be identified and issues addressed. Improvements were made by adding colored lines of tape in the galleries as visual barriers, editing label texts, limiting the number of visitors in one room, staffing galleries, and training guards.
Angela summarized her presentation by pointing out that all departments serve a collective purpose and that relatively simple management systems, like the Art Touch Cards, can bridge interdepartmental communication gaps. She reiterated that the success of the building process, as well as its maintenance, depends on the close collaboration of different departments and external groups.
The final talk of the June 1st RATS session was by Jana Dambrogio, Thomas F. Peterson (1957) Conservator, MIT Libraries, Curation & Preservation Services. Jana has been working for several years on the subject of “letterlocking,” the many techniques by which a letter can be folded to form its own envelope. Some of these letters are folded very simply while others are outfitted with complex security features that indicate if a letter has been opened by someone other than the intended recipient. Jana’s research has even suggested that a single individual might have had more than one technique for folding letters.
Most of this research has been carried out by studying unfolded letters, examining folds, cuts, and other physical evidence in order to reverse engineer the original folded structure. Now, Jana and a team from Queen Mary, University of London are using Computed Microtomography (CT scanning) to discern the interior structure of unopened letters. A collection of 600 such letters is held by the Museum voor Communicatie in The Hague, Netherlands.
The letters are part of a group of 2,600 that came to the Museum stored in a 17th century trunk. Jana explained that in the period when the letters were written, the mail operated on a “cash on delivery” system. The letters in the trunk were never retrieved, and thus remained in the custody of the postmaster. While about 2,000 have previously been opened, the “Signed, Sealed & Undelivered” project team are studying the 600 that have never been opened, using a novel application of CT imaging.
During the talk, Jana shared many videos from the project website, demonstrating techniques for letterlocking and showing the potential of the imaging technique.
Author Sara Wohler discussed the fascinating history of Alexander Calder’s airplane model, Mexico #3, the last work he completed before his death, and then presented the conservation treatment of the model. Author Ralph Weigandt then discussed the technical analysis of the paint film on the airplane. This presentation served as a fun continuation of the painted airplane theme, following Lauren Horelick’s May 30th talk “When an Airplane Acts like a Painting: Applying Established Conservation Methodologies to Ephemeral Aircraft Materials.”
Background
Wohler described the beginning of Alexander Calder’s airplane-making career: In 1972, New York advertiser George Gordon approached Calder with the idea of painting a full-scale airplane. Calder loved the idea, as it would combine his experience in kinetic art and his background in engineering. Gordon paired Calder with Braniff International Airways, and Calder created the designs for two airplanes: Flying Colors of South America and Flying Colors of the United States. These were both tremendous public successes.
^Braniff International Airways employee ceremony, 1975, with Flying Colors of the United States.
The author then described the process by which Calder painted the planes: He began by experimenting with designs on several 1/25-scale Westway Aircraft Models. The chosen design from the model was then scaled up using graph paper that was attached to the full-size airplane. Calder and his team then used pounce wheels to poke holes through the design on the graph paper, and black spray paint was applied through the pounce holes. The graph paper was removed, and the paint colors were spray-applied by a Braniff team. Calder supervised the entire process, and hand-painted the engine nacelles during the spray process.
Then the author described the artistic process for the model Mexico #3. In 1976, Braniff commissioned a third plane from Calder, this one to celebrate the great relationship between the U.S. and Mexico. The author provided amazing historic film footage of Calder painting the Mexico #3 model plane. She noted that the plane itself was made of fiberglass, and Calder created his design using gouache. On November 11, 1976, Calder completed and signed the work, and tragically, passed away later that evening. Although the design was completed, Mexico #3 was not transferred to an airplane, as Calder was no longer alive to approve of the final result.
^Calder painting the Mexico #3 model.
Treatment
The model airplane was brought to Kuneij Berry Associates, Chicago, for conservation treatment. Through examination, the author found that the fiberglass model airplane had two priming layers, blue and grey, and a final, even, white coating. Calder painted onto the proprietary white surface using gouache, possibly one he made himself. While the airplane was quite dirty and had sustained a few structural losses, the treatment was relatively straightforward.
The plane was in poor aesthetic condition; it had previously been displayed in a planter with dirt and plants around it, exposing it to both dirt and moisture. Fortunately, the gouache paint layer was generally in good condition and intact, aside from a few abrasions. The synthetic varnish layer, which had protected the gouache layer, was covered in surface dirt and grime. The plane was first surface cleaned with deionized water and PVOH sponges, but a lot of the dirt remained embedded in the varnish. The synthetic varnish was removed with aromatic solvents. Care was taken to only thin the varnish on top of the gouache paints, as the paints were sensitive to aromatic solvents.
^Detail of the varnish removal, cleaned (left) and with remaining varnish (right).
Structurally, the plane had suffered a few chips to its wings and there were a few areas of flaking paint. The flaking paint was consolidated with Paraloid B72. To recreate the tips of the wings that had been chipped away, molds were made of Elastosil M4600 A/B and casts were taken in Milliput. The cast pieces were sanded and adhered to the wings using Paraloid B72.
Shallow losses in the white priming layer were filled and inpainted simultaneously with Golden MSA colors. Losses in the gouache colors were then inpainted with QoR watercolors. The model was then sprayed with a few, light, protective layers of RegalRez 1094. After the successful treatment, it was recommended that the painting be displayed in a new, more environmentally stable location.
^Sara Wohler inpainting Mexico #3.
Technical Analysis
The technical analysis of Calder’s gouache paint was carried out by Ralph Weigandt, who is currently the primary researcher on the collaborative National Science Foundation (NSF-SCIART) grant with the University of Rochester’s Integrated Nanotechnology Center to advance the scientific understanding and preservation of daguerreotypes. The authors carried out technical analysis of the gouache paint in order to better understand Calder’s materials and techniques, potentially inform the conservation treatment, and pioneer the use of Focused Ion Beam (FIB) milling for SEM-EDX analysis and PLM examination of paint films. Through Transmission Electron Microscopy, SEM-FIB allows for the elemental analysis of paint layers at the nanometer scale!
Weigandt explained in depth the sample preparation process, the Focused Ion Beam milling of the larger sample into a much smaller (~12 µm × 0.5 µm) cross section, and the comparison between traditional SEM-EDX spectroscopic elemental analysis and mapping and the elemental analysis and mapping capabilities of Transmission Electron Microscopy. In essence, FIB milling and TEM allow for highly precise, high-resolution elemental analysis and mapping, letting scientists and conservators see the inorganic composition of individual pigment particles. A poster from University of Rochester graduate student So Youn Kim outlines the project with excellent photographs and illustrations.
In the end, the elemental analysis did not contribute greatly to the decision-making process of the treatment, but did provide excellent information about Calder’s painting techniques and materials for Mexico #3, which can inform a discussion about his art-making process for this piece and his art in general. It is clear that this Focused Ion Beam technique coupled with Transmission Electron Microscopy and SEM-EDX elemental analysis is an exciting analytical technique that will be extremely useful in the precise identification of inorganic pigments, fillers, etc., in paint films. Furthermore, it is great to see yet another example of private conservators working with scientific departments at universities (or elsewhere) to investigate materials of cultural heritage objects!
Maggie Barkovic and Olympia Diamond presented a case study that outlined the decision-making process that led to the successful treatment of darkened, dirt-infused water stains on the bare canvas portion of a large-scale acrylic dispersion painting: Composition, 1963, by Justin Knowles. The authors attributed the treatment’s success to the combination of extensive evaluation of Knowles’ materials and aesthetic aims and the understanding of new, innovative cleaning techniques designed for acrylic dispersion paintings (with the help of Bronwyn Ormsby, Tate, and Maureen Cross, Courtauld Institute of Art). This presentation served as an excellent complement to Jay Kruger’s presentation “Color Field Paintings and Sun-Bleaching: An approach for removing stains in unprimed canvas,” which discussed the treatment of acrylic solution and oil paintings on bare canvas.
Composition is a privately-owned work that was brought to the Conservation and Technology Department at the Courtauld Institute of Art for treatment in 2013. The large-format work is a two-dimensional acrylic painting with brightly colored geometric forms juxtaposed against an unpigmented acrylic sized canvas. The painting had sustained disfiguring water stains along the top and bottom edges which disrupted the aesthetic reading of the image, rendering it unexhibitable.
Context
In the first step of the conservation process, Barkovic and Diamond assessed how the water stain affected the aesthetic interpretation of the painting. They explored where this painting fit into the artist’s oeuvre: it was part of a series of early, pivotal works where Knowles explored his initial ideas of spatial tension using non-illusionistic geometric compositions that incorporate negative space in the form of unpainted canvas. The authors carried out technical examinations of four other paintings from this early stage in his career, finding that Composition was painted in a comparable manner to his other early works: a fine linen canvas was stretched on a wooden stretcher and then sized with an unpigmented (pEA/MMA) acrylic dispersion coating. Then, Knowles used pencil and pressure-sensitive tape to demarcate where he would paint the geometric forms with acrylic dispersion paints. Though he applied a transparent acrylic “size” layer over the linen/negative space, he still considered the visible canvas “raw” and unprimed. Through the examinations and research on Justin Knowles’ personal notes, the authors assessed that the characteristics and color of the linen canvas were as important to the interpretation of the work as the paint colors. As such, the canvas should be treated and the water stains removed if at all possible.
Replicas
Second, the authors explained that they needed to identify the components of the water stain (with no prior knowledge of the water-staining incident) in order to test cleaning methods. Replicas were made using linen and the same unpigmented acrylic polymer that Knowles most likely used. The replicas were then stained with dirty water. Using XRF spectroscopy and empirical testing as a guide, a visually accurate and equally tenacious water stain was made with iron, calcium, and organic “dirt” components from aged linen. The test replicas were aged in a light box for two years to allow the stain to photo-oxidize and bond with the fabric and size layers.
Testing
Third, the authors needed to determine how to treat the water stain in the presence of the unpigmented acrylic dispersion size layer, which swelled in water and was affected by polar solvents. Their goal was to remove the stain or reduce its appearance enough to make successful inpainting possible. The authors looked to successful textile and paper conservation treatments for possible methods. The initial cleaning and/or retouching tests included the use of solutions with various pH values, conductivities, chelating agents, surfactants, bleach (sodium borohydride), the application of toasted cellulose powder, and pastel retouching.
The authors thoroughly explained the various test groups, but a recapitulation of all of these solutions is outside the scope of this blog post. In general, higher pH values (around 8) and higher conductivity values (above 2.5 µS) allowed for better cleaning efficacy. Perhaps more notably, the chelating agent DTPA (diethylene triamine pentaacetic acid) greatly outperformed TAC in cleaning efficacy, likely because DTPA is a much stronger chelator, far more suitable for sequestering iron and calcium (which XRF showed to be present in the stain). DTPA could be used safely because the acrylic size layer was unpigmented. Finally, the use of agar (rather than a free solution) was found to be useful in the reduction of the stain. The agar gel allowed for greater control of the solution’s distribution onto the stain and of dirt absorption into the gel. The most effective cleaning agent, which was eventually used to clean the painting, was made from a higher concentration of agar gel at 5% (w/v), with Boric Acid 0.5% (w/v), DTPA 0.5% (w/v), and TEA, at pH 8, 2.4 mS.
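For readers less familiar with % w/v notation, the recipe above translates directly into masses for a given batch volume (1% w/v = 1 g per 100 mL). A minimal sketch — the 500 mL batch size is my own assumption for illustration, not a figure from the talk:

```python
# Convert the % w/v concentrations reported above into grams for a batch.
# The batch volume is a hypothetical example, not from the presentation.

def grams_for_wv(percent_wv: float, volume_ml: float) -> float:
    """Grams of solute needed for a given % w/v and volume.
    By definition, 1% w/v = 1 g per 100 mL of solution."""
    return percent_wv * volume_ml / 100.0

batch_ml = 500.0  # hypothetical batch size
recipe = {"agar": 5.0, "boric acid": 0.5, "DTPA": 0.5}  # % w/v, per the talk

for component, pct in recipe.items():
    print(f"{component}: {grams_for_wv(pct, batch_ml):.1f} g in {batch_ml:.0f} mL")
```

(The TEA is used to adjust pH to 8, so it is added to target rather than by fixed mass, and is omitted here.)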
Evaluation of Successful Treatment
While a successful treatment methodology was developed through empirical testing, an investigation into the effects on the surface morphology of an unpigmented acrylic dispersion size layer was thought necessary due to the different absorbencies among the test canvases, observed differences in retention times for the agar gel, and concerns about the higher pH required to reduce the stain. The lack of pigmentation and hard surface features made changes caused by cleaning more difficult to perceive, measure, and contextualize, so changes in surface gloss and stain reduction were evaluated with a spectrophotometer and subjective observations by conservators. The impact of the cleaning methodology on the surface of the size layer and canvas fibers was examined with dynamic Atomic Force Microscopy (AFM) and high-resolution digital microscopy. Possible residues from cleaning were also investigated in a preliminary way using FTIR-ATR spectroscopy.
The number of samples for AFM was too small to draw concrete conclusions without more testing and additional analysis such as FTIR-ATR; however, a general trend was observed: increasing the gel concentration from 2.5% (w/v) to 5% (w/v) appeared to reduce the time in which fiber flattening occurred. In addition, FTIR-ATR showed a decrease in or complete removal of migrated surfactant from the acrylic size layer surface in all treated samples regardless of the agar concentration in the gel; this, along with the swelling of the acrylic layer, was considered by the authors an acceptable risk of the treatment. IR bands corresponding to agar or the additives in the cleaning solutions were not detected.
Final Treatment
As mentioned previously, the cleaning agent that was eventually used to clean the painting was made from a higher concentration of agar gel at 5% (w/v), with Boric Acid 0.5% (w/v), DTPA 0.5% (w/v), and TEA, at pH 8, 2.4 mS. The agar was hand-cut to align perfectly with the stain patterns on the canvas and weighted with sandbags to increase the gel-canvas contact. Using this method, the stains were greatly reduced. However, a few minor discolorations remained after the cleaning. Further tests were carried out to determine the best inpainting method for these residual discolorations. Dry pigment with Lascaux, Jun Funori, Aquazol 50, Aquazol 200, watercolour and gum arabic, and Paraloid B72 were all tested for optical effects, handling properties, and reversibility. The Aquazol 50 series was found to be the most effective overall and was used to inpaint the remaining discolorations.
Conclusion
The authors concluded by restating that the success of the treatment would not have been possible without the combination of art historical and material understanding of Knowles’ work and research into new cleaning methodologies for acrylic dispersion paint films. They thanked their project advisors Maureen Cross, Courtauld Institute, and Bronwyn Ormsby, Tate, and many others for their generous support and guidance throughout the project.
Andrew began his talk by very graciously acknowledging the many other people who contributed data that informed his paper. Andrew’s work is based on research begun by William Barrow, a paper chemist at the Library of Congress until the 1960s. Barrow’s research on books tried to draw a connection between physical properties and chemical content. He had collected about 1,000 books published between 1500 and the 19th century, and he took various measurements such as fold endurance, pH, alum content, etc. He tried to draw connections between those sets of data to predict the ageing characteristics of the paper. This collection was obtained by the LoC in the 1970s and is still used for destructive testing today. But where Barrow used macro- and micro-scale measurements, Andrew looks to the middle ground: polymer chemistry. For that, he uses size exclusion chromatography, or SEC.
SEC measures the degree of polymerization of cellulose using a roughly 1 mm² sample. (It may be helpful to think of the degree of polymerization as the molecular weight.) The degree of polymerization of a sample can be compared to known references. It should also be noted that papers contain a mixture of molecules of different sizes, and SEC provides a distribution curve. The more large molecules in a sample, the less degraded the cellulose, meaning that the paper is in better condition. Andrew discussed several examples of treatments of iron gall ink on paper where SEC was used to show the effects of those treatments on the papers.
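The parenthetical equivalence between degree of polymerization and molecular weight can be made concrete: each anhydroglucose repeat unit in a cellulose chain contributes roughly 162 g/mol, so the two quantities differ only by a scale factor. A small sketch, with example DP values that are purely illustrative (not from Andrew’s data):

```python
# Cellulose molecular weight from degree of polymerization (DP).
# One anhydroglucose repeat unit: glucose (180.16 g/mol) minus water
# (18.02 g/mol) ~= 162 g/mol.
ANHYDROGLUCOSE_G_PER_MOL = 162.0

def cellulose_mw(degree_of_polymerization: float) -> float:
    """Approximate average molecular weight (g/mol) of a cellulose chain."""
    return degree_of_polymerization * ANHYDROGLUCOSE_G_PER_MOL

# Illustrative DP values only, to show the scale of the conversion.
for label, dp in [("well-preserved paper", 1500), ("degraded paper", 300)]:
    print(f"{label}: DP {dp} -> ~{cellulose_mw(dp):,.0f} g/mol")
```

Since a real paper sample contains chains of many lengths, SEC reports a whole distribution of such weights rather than a single number, which is what makes it so informative about degradation.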
Barrow’s research indicated that pH was the best indication of the future physical properties of paper. Andrew took about 80 samples from Barrow’s collection and confirmed that the molecular weights of paper correlate with pH (when the pH drops, the molecular weight drops). Andrew then looked to see if the molecular weight corresponded to physical properties; with newer papers, the molecular weight does tend to be smaller. Poor tear resistance also corresponds to low molecular weight. In general, he found that the molecular weight determined by SEC is a better indicator than pH for future physical properties for both newer and older books.
SEC certainly has advantages. The sample size is ridiculously small, and the technique tells you about the physical building blocks of the paper, giving a better idea of what’s in it and what state the cellulose is in. There are some disadvantages to overcome before this technique is in every lab. The test itself takes a week to do. It requires extremely expensive equipment and organic solvents, and one must have the technical knowledge to interpret the data. Andrew’s ultimate goal is to turn this into a rapid, affordable technique, so that the molecular weight distributions of an object can be included in the object’s record and pulled up by a barcode. That’s an exciting prospect!
Andrew’s work presents a very interesting analytical option that future conservators might have access to. It would be nice to have a predicting model for the degradation of library objects. But it would be even more interesting to see the effects of treatment on paper. It is important that conservators continue to check our own work, and I’m glad to have assistance with that from scientists like Andrew.
Cindy’s talk was a mightily condensed summary of a few of the techniques for measuring the pH of paper that the Preservation Research and Testing Division (PRTD) at the Library of Congress has investigated over the last four decades. Her introduction was a summary of the challenges presented by this task. Due to the chemical structure of cellulose and the nature of paper, most methods can only approximate the pH of paper. The method of sample preparation can impact the measurement results. Because of how paper ages, different regions may have different pH values. The ions that dictate the pH may not be soluble in water, making pH harder to measure. And atmospheric carbon dioxide can react with your solution and affect your results. Notice that I said “solution.” Cindy ended her introduction by noting that you can’t measure the pH of a solid. But you can approximate it, and the PRTD has been trying to identify the best way to do this for decades.
The PRTD’s focus on the pH of paper began in 1971 with the deacidification program. Chemist George B. Kelly used titrated alkalinity and titrated acidity as an “external yardstick”, and four different extraction methods: TAPPI cold method, TAPPI hot method, surface pH measurement, and “pulped” pH method. Kelly determined that for acidic papers, the method of measuring pH didn’t matter much, but the alkaline papers had an acidic pH despite their 1% alkaline reserve. The hot extraction method was shown to be much more accurate with alkaline papers, as it was likely better at getting all the ions into solution. The pulp method came close. Cindy then went on to talk about the uncertain origins of the pulp method (i.e. it’s not discussed in any published literature, but is mentioned frequently in internal documents from the PRTD). (I do wish that Cindy had gone into detail about the process of each method of extraction, because I wasn’t too sure about how each process worked. She doesn’t mention pH strips, gels, or pH meters at all in this talk. And TAPPI stands for the Technical Association of the Pulp and Paper Industry.)
Then Cindy skipped to the late 1990s (she did mention that a few papers had been published in the intervening decades). By this time, the PRTD had ramped up its documentation efforts, as well as its protocols for sample collection and homogenization; most of these protocols were posted on its website. During the renovation of the instrument suite in 2007, the lab’s emphasis shifted to developing non-destructive and micro-invasive techniques, which are more appropriate for art objects than for circulating collections materials. The sampling methods had to adjust accordingly.
To address the new challenge of micro samples (or none at all), the PRTD tried to make surface pH measurement work, but found that tideline formation and sensitive media made that difficult. “Miniaturization” was another approach the PRTD tried. For this technique, sample size can range from a few milligrams down to a few micrograms, depending on the paper-to-water ratio and other details of sample preparation. They found that slurrying helps, but filtration makes no difference in pH measurement. In addition, controlling the amount of carbon dioxide was key to getting an accurate reading with acidic papers. Both purged bags and sealed vials were tried, with comparable standard deviations but slightly different pH readings. The pH readings from the micro methods agreed fairly well with the macro methods.
One of the best takeaways from Cindy’s talk came when she shared that, during their renovation, the PRTD was sending samples out to a contractor for pH measurement, and papers that had an alkaline reserve were coming back with an acidic pH. The conclusion: not every method is appropriate for every type of paper, and sample preparation can also affect results.
Here are some tips about each of the methods, taken from Cindy’s recap slide: The Hot Method is closest to an objective measurement, but takes two hours per sample. The Pulped/Blender Method generally agrees with the hot method, but is faster. The ISO Cold Method has a much higher standard deviation than the TAPPI cold method. The Surface pH Method has the highest standard deviation of any method tested, and is difficult with alkaline papers, thick boards, and boards with adhesives; it also causes tidelines. And the Mini Method is also difficult with thick boards, but its results are comparable in repeatability to large-scale extraction methods.
So, what does it take to accurately measure the pH of a piece of paper? A focus on repeatability and an optimistic attitude! The scientists and preservation specialists at the PRTD struggle with many of the same challenges that the rest of us do, albeit with fancier equipment. It sounds like a ballpark figure for pH is, for now, as close as we can hope to get. The PRTD is still investigating methods, and we should all look forward to their results!
Finally, one cool tip: You can make your own micro blender with a homemade Mylar blade attached to a Dremel tool!
My personal area of interest and intended future practice is in the conservation of historic interiors, so I am always keen on portability, both in tools and materials and in forms of analysis. The other advantage of the techniques presented in this workshop is that physical sampling is not required, which is always attractive and music to a curator’s ears.
The workshop met my personal expectations, but the title “Effectively Using…” could have suggested to some that this was going to be more of a “boot camp” for being able to implement these techniques back home. This style of workshop was more of an information/demonstration session and is great for anyone considering buying similar instrumentation and/or for gaining a better understanding of the general benefits and limitations of portable spectroscopy.
Given the short duration of this workshop, I was initially concerned that I might have signed up for a 2 ½ hour lecture without any hands-on component. Participants had been encouraged to bring their own samples, and indeed at least an hour was dedicated to looking at samples and exploring the instrumentation first-hand. We did run over the scheduled time, though, and were gently shuffled out of the room as hotel staff started to break down tables.
The workshop was led by Tom Tague, Ph.D., Applications Manager at Bruker, and Dr. Francesca Casadio, Director of the Conservation Science department at the Art Institute of Chicago. I really appreciated having these different perspectives. Tague did not assume the role of salesperson during the workshop, but as you would expect he was very positive in his description of the capabilities of the Bruker instrumentation. Casadio kept Tague grounded in the realities of our complex samples and what can be confidently identified using these techniques. At the same time, it was useful to have Tague there to speak to the specifics of the instrumentation and push Casadio a little bit to consider what some of the newer technology could offer. There was also a Bruker sales representative present to assist with running the instrumentation and software and to offer information on pricing.
Overall the session was well organized. I know I was not the only attendee who was ecstatic that I got to take home a flash drive loaded with the presenters’ PowerPoint slides. The spectra from my samples that were analyzed were also loaded directly onto this flash drive before the end of the workshop.
The first part of the session did consist of pure lecture. Tague’s presentation focused on specifications of the Bruker portable instruments and descriptions of the techniques.
An interesting tip he offered was using sandpaper to take surface samples. He lightly abraded a painted surface and then placed the sandpaper in front of the portable FTIR (ALPHA)—no additional sample prep necessary.
Having just completed my Master’s degree in conservation, I was able to follow the presentation fairly well, but I fear that it may have been overly technical and too fast for someone who does not work with these analytical techniques on a regular basis. Then again, I anticipated an intermediate-level workshop when I signed up.
As would be expected given the organizers of the workshop, the instruments provided and discussed were all Bruker models. Two ALPHA portable FTIR spectrometers were present. The ALPHA is set up to receive different “snap-on” modules; the two available for demonstration were the “External Reflectance” module and the “Platinum ATR” module. The BRAVO Handheld Raman spectrometer was also available for interaction.
Here are some key facts about each instrument:
The base ALPHA starts around $14,000, and each module averages an additional $6,000.
ALPHA “External Reflectance”
Does not require direct contact with a sample/object
No size limitations as long as unit can be mounted/held in appropriate orientation to the sample
Camera integrated in unit to help orient, find appropriate working distance/focus, and document sample location
Collects reflectance spectrum NOT absorbance
Can collect specular and diffuse reflection; reflective and non-reflective materials can be analyzed
Footprint of instrument is about 8” X 11”
Weighs about 13lbs.
Can be tethered to a laptop
About 6 mm sampling area
Approximately 4 cm-1 spectral resolution
ALPHA “Platinum ATR”
There is pressure/direct contact with the sample
The IR beam does penetrate into the sample
BRAVO Handheld Raman
$45,000-$55,000
Slightly narrower than 8” X 11” (looks like an oversized ELSEC environmental data monitor; less heavy than the ALPHA)
Class I safe laser
2 mm sampling spot size
No camera or viewing capability to help align collection area
Object needs to be in contact, but no pressure required
Approximately 8 cm-1 spectral resolution
Fluorescence mitigation built into software/data collection
Dual lasers built in and used/activated simultaneously
Optimal wavelength and reduced risk of damaging sample
Touch screen allows for control and data collection without tethering to laptop
Tethering also capable via WiFi to laptop
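As an aside, here is my own rough understanding of how dual-wavelength excitation can mitigate fluorescence; this is my gloss, not Bruker’s documentation, and the numbers below are synthetic and purely illustrative. If spectra are collected at two slightly shifted excitation wavelengths, the broad fluorescence background stays essentially fixed while the narrow Raman bands shift with the laser, so subtracting the two cancels the background:

```python
import numpy as np

wn = np.linspace(200, 2000, 2000)  # wavenumber axis (cm-1)

def raman_bands(wn, shift=0.0):
    # Narrow Raman bands: these move when the excitation wavelength shifts
    positions = [545, 1463, 1578]  # illustrative band positions (cm-1)
    return sum(np.exp(-((wn - p - shift) / 4.0) ** 2) for p in positions)

def fluorescence(wn):
    # Broad fluorescence background: essentially unchanged by a small shift
    return 5.0 * np.exp(-((wn - 1200) / 900.0) ** 2)

spec_a = fluorescence(wn) + raman_bands(wn, shift=0.0)   # laser 1
spec_b = fluorescence(wn) + raman_bands(wn, shift=10.0)  # laser 2, shifted

# Subtracting cancels the identical background; only the Raman bands
# survive, as derivative-like features
diff = spec_a - spec_b
```

Real shifted-excitation algorithms go on to reconstruct a conventional-looking spectrum from this difference; the sketch only shows why the background drops out.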
In terms of the ALPHA “External Reflectance,” one of the big selling points is that there is no size restriction or need to balance the object on a stage. The trade-off in allowing data collection without physical sampling is that the spectra generated are in % reflectance. The majority of reference spectra available for free and through the Infrared and Raman Users Group (IRUG) are in absorbance or % transmittance (the two are related logarithmically). The Bruker software does offer the capability to convert the data using the Kramers-Kronig transformation. Francesca Casadio seemed to prefer to analyze the data in its original reflectance form. Characteristic peaks for bonds are slightly shifted from their locations in transmittance spectra, but at Casadio’s level of experience she is able to take these nuances into account with some ease. She was honest with the attendees, summarizing that this form of IR spectroscopy is “not like portable XRF; one needs to have experience and repetition for familiarity with interpreting spectra.”
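To see why reflectance data cannot simply be overlaid on an absorbance library, here is a minimal sketch of the crude log10(1/R) “pseudo-absorbance” conversion. This is my own illustration, not the Bruker software’s method; the Kramers-Kronig transformation proper is far more involved, since it integrates over the whole spectrum.

```python
import numpy as np

def pseudo_absorbance(reflectance):
    """Crude stand-in for a proper Kramers-Kronig conversion:
    apparent absorbance = log10(1 / R), with R as a fraction (0-1].
    Band positions may still sit slightly off from true
    transmittance spectra, as Casadio cautioned."""
    r = np.clip(np.asarray(reflectance, dtype=float), 1e-6, None)
    return np.log10(1.0 / r)

# 100% reflectance -> 0 apparent absorbance; 10% -> 1; 1% -> 2
print(pseudo_absorbance([1.0, 0.10, 0.01]))
```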
For those interested in more on interpreting reflectance spectra of art objects, Casadio recommended the following publications from a research group in Perugia, Italy:
“Reflection infrared spectroscopy for the non-invasive in situ study of artists’ pigments.” C. Miliani, F. Rosi, A. Daveri & B. Brunetti, Appl. Phys. Mater. Sci. Process. 106, 295–307 (2012) (http://dx.doi.org/10.1007/s00339-011-6708-2)
“In Situ Noninvasive Study of Artworks: The MOLAB Multitechnique Approach.” C. Miliani, F. Rosi, B.G. Brunetti & A. Sgamellotti, Acc. Chem. Res. 43, 728–738 (2010) (http://dx.doi.org/10.1021/ar100010t)
“Non-invasive identification of metal-oxalate complexes on polychrome artwork surfaces by reflection mid-infrared spectroscopy.” L. Monico, F. Rosi, C. Miliani, A. Daveri & B.G. Brunetti, Spectrochim. Acta Part A: Mol. Biomol. Spectrosc. 116, 270–280 (2013)
“In-situ identification of copper-based green pigments on paintings and manuscripts by reflection FTIR.” D. Buti, F. Rosi, B.G. Brunetti & C. Miliani, Anal. Bioanal. Chem. 405, 2699–2711 (2013)
It is important to keep in mind the basis of data collection in order to understand the limits of what can be analyzed with the ALPHA “External Reflectance” on a given object. For example, with a varnished painting, the specular reflectance off the varnish will typically allow only the varnish itself to be detected (with some exceptions depending on the thickness of the varnish and the underlying pigment composition). Similar reflective material properties make plastics easily detectable with this technique. Matte objects are still good candidates for analysis with the ALPHA, but the data will be collected via diffuse reflection. The ALPHA does not seem like an appropriate technique for discerning between individual layers within a given structure unless coupled with other techniques.
One of the ALPHAs at the workshop was supplied by Casadio from the Art Institute’s lab, and she has extensive experience using the instrument. Her presentation was more about working knowledge of the instrumentation. She polled the attendees and focused on case studies mainly of pigment analysis and identification of plastics. Casadio emphasized the benefit of the ALPHA as a mapping tool that does not require sampling: perhaps one or two samples could be taken from a work of art and confidently characterized with bench top FTIR and/or GC-MS, and then the use of specific materials could be mapped without additional sampling using the ALPHA. Casadio’s case studies often combined multiple analytical techniques. She finds the ALPHA to be a nice complement to XRF. Overall, Casadio has found the ALPHA very useful for characterizing different plastics and also good at detecting surface deterioration products (e.g. zinc soaps), especially with modern and contemporary collections. She noted that the ALPHA gives very strong signals and peaks for waxes and PVA coatings. Casadio has also been able to use the ALPHA for collaborations with other institutions and collections, another boon of its portability.
I was disappointed that Casadio had not had previous experience with the BRAVO Handheld Raman. At the Art Institute she has a bench top Raman unit. She seemed skeptical about the BRAVO’s capabilities and some of the claims that Tague was making that it could “see” indigo and other organic pigments without surface enhanced Raman spectroscopy (SERS). Casadio stated that in her personal opinion with Raman it is better to bring the art to the unit than the other way around. By the end of the workshop she did seem impressed with the quality of spectra the BRAVO was generating, but there was not enough time to have further discussion and to tease out Casadio’s candid opinion on the instrument.
I was most excited for the practical demonstration with the instruments, especially because I had come armed with over 10 samples. I was anticipating that I might not get to analyze even one sample, so I was very pleased to be able to look at 7 samples with the BRAVO portable Raman. This much time with the instrument was due in part to many participants not bringing samples.
If a similar workshop is organized in the future, it might be good to have participants sign up ahead of time for slots with the instrument if they are interested in analyzing a specific sample. It was a fairly large group – about 18 participants. Attendees that did not bring samples were still interested in watching the process of collecting data and interpreting the spectra. This was challenging; even with three instruments there tended to be 5-7 people crowding around a laptop screen. Dividing us into smaller groups, having the laptops hooked up to a projection screen, or further limiting the number of participants may be additional considerations for future workshops.
It seemed like the majority of participants were conservators rather than conservation scientists. I personally do not work with spectroscopic techniques on a regular enough basis to be able to confidently interpret spectra on the fly. Francesca Casadio was able to offer her expertise and interpretation while working with samples from the participants, but neither Tom Tague nor his Bruker colleague could offer specialized interpretation. Some of the participants seemed frustrated that the instruments were not connected to an art materials database for instant gratification and matching.
Both Tague and Casadio strongly emphasized the importance of each institution building its own reference database specific to the collection. The IRUG database was promoted, but as a supplement to an institution’s own reference database. Neither of the instructors felt that the database that comes with the Bruker software was appropriate for art materials.
My personal goal during the workshop was to pit these portable instruments against their stationary counterparts and the two complementary techniques against each other. Therefore, I brought known samples from my institution’s reference collection of traditional paints. All the paints were oil-based and mixed with some degree of lead white. The reference pigments I chose were mostly organics (indigo, madder, cochineal). Colonial Williamsburg has had the opportunity to partner with the College of William and Mary to perform SERS on objects in the paintings collection, and my colleagues and I were curious to see how this portable unit compared to spectra produced with SERS. With the minimal time available, I chose to focus on the BRAVO because our institution already has a bench top FTIR.
Tom Tague was set-up at the BRAVO “station” during the practical session, and as I stated previously he was not comfortable offering any interpretation of the data. I was excited to review the spectra we collected back at my home institution (Colonial Williamsburg Foundation/CWF) alongside Kirsten Travers Moffitt, the conservator in charge of our materials analysis lab. Moffitt performs a lot of FTIR analysis on our collection, but has less experience with Raman.
All the organic paint spectra from the BRAVO were certainly “neater” than what I am used to seeing in terms of raw data from a bench top Raman with oil paint samples. I personally would attribute the quality of the spectra to the dual laser capability. I’m not sure how much impact the fluorescence mitigation had because the spectra were still pretty noisy and it was challenging even for Moffitt to distinguish significant peaks. It appears that the fluorescence of an oil binder is still problematic with the BRAVO. In Tague’s presentation he showed an example of indigo detection with the BRAVO, but this was on an illuminated manuscript, where fluorescence of the binding media would be less of an issue.
At CWF we only have a reference database for IR spectra, but looking at possible peaks in the indigo/lead white sample spectrum, the characteristic peaks for indigo that Tague mentioned (545, 1463, and 1578 cm-1) do not appear to be present. It seems that the lead white is dominant, with a strong peak around 1050 cm-1. In conclusion, Tague is partially right that the BRAVO can detect some organic pigments, but likely only if they are present in high enough concentrations (not mixed) and are not in highly fluorescent binding media (like oil).
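Our peak-hunting was done by eye, but the check amounts to something like the following sketch. The data here is synthetic, and the tolerance and prominence values are my own assumptions, not anything from the workshop:

```python
import numpy as np
from scipy.signal import find_peaks

def bands_present(wavenumbers, intensities, expected, tol=10.0):
    """Report which expected band positions (cm-1) have a detected
    peak within +/- tol cm-1 of them."""
    idx, _ = find_peaks(intensities, prominence=0.05)
    found = wavenumbers[idx]
    return {p: bool(np.any(np.abs(found - p) <= tol)) for p in expected}

# Synthetic stand-in for our spectrum: one strong band near 1050 cm-1
# (lead white), and nothing at the indigo positions
wn = np.linspace(400, 1800, 1400)
spectrum = np.exp(-((wn - 1050) / 8.0) ** 2)

result = bands_present(wn, spectrum, expected=[545, 1050, 1463, 1578])
# result: only the 1050 cm-1 entry is True
```

With a real reference database one would of course compare whole spectra, not just a handful of band positions.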
The other samples I looked at were reproduction wallpaper samples from Adelphi. I was curious to see if we could detect anything useful about the pigments on an object that would normally be challenging to sample and could not be brought to the lab if it were installed in a historic interior.
The resulting spectra were less noisy than those of the oil paint reference samples, again likely due to the non-oil binding medium on the wallpaper.
Despite the better quality of the spectra, we still did not have the resources (i.e. a good reference database for Raman and experience working with Raman spectra) to confidently characterize the pigments present. I am sharing this to illustrate Casadio’s point that the ALPHA and BRAVO require a certain level of expertise and do not provide instant answers.
One of the other participants, Ann Getts, a textile conservator at the De Young Museum in San Francisco, brought various sequins from a costume in storage with a suspicious vinegar odor. Getts had time to look at one of the sequins with both ALPHA modules, and her case study demonstrates some of the trade-offs with the non-contact “External Reflectance” module.
She began with the “External Reflectance” module, and the first hurdle was getting the instrument positioned at the appropriate working distance from the sample. Without an adjustable stand, we had to use trial and error to shim up the ALPHA so that the camera could focus on the sequin. The resulting spectrum suggested cellulose acetate (as Getts had initially suspected), but even Casadio was hesitant to draw any concrete conclusions from this spectrum alone. The sequin was then analyzed with the “Platinum ATR” module, and right away Casadio concluded that it was indeed cellulose acetate.
Each of these instruments has its advantages and disadvantages. Overall, the ALPHA seems like good bang for your buck given the duality of the modules. The price point is also pretty reasonable considering the portability.
The BRAVO is fairly new technology and the dual lasers seem promising, but at this point it does not seem like a must have for the average institution. I would encourage anyone thinking about purchasing any of these instruments to consult with both of the workshop leaders.
In general I would specifically recommend the ALPHA to:
Institutions that have a lot of sampling restrictions
Institutions with a lot of oversized works
Institutions that focus on modern and contemporary art (especially with plastics and large Color Field paintings)
Institutions with a conservation scientist on staff
In general I would specifically recommend the BRAVO to:
Institutions that have a lot of sampling restrictions
Institutions wanting to focus on analysis of paper-based art
Institutions with a lot of oversized works
Institutions that already have staff with Raman expertise
Institutions looking to purchase a Raman instrument
This blog represents my personal conclusions and understanding of the workshop. I would encourage any of the other participants and the instructors to post in the comments if they have differing opinions or think that I have misunderstood any of the technical aspects of the instrumentation.