43rd Annual Meeting – Book & Paper Session, May 15, "Heat-Set Tissue: Finding a Practical Solution of Adhesives by Lauren Varga and Jennifer K. Herrmann"

The National Archives & Records Administration (NARA) has been making heat-set mending tissue in house for many years.  Recent digitization initiatives have increased the need for efficient stabilization mending.  NARA prefers heat-set tissue for this type of mending for many reasons.  The tissue is flexible and easily reversible.  It requires no moisture for application and is easy to use.  The transparency of the tissue, which they can control by making the tissue in house, does not interfere with the digitization of text. To make the heat-set tissue, NARA starts with an appropriate weight of Japanese tissue, which is toned (if necessary) before application of the adhesive.  The tissue is wetted and smoothed out against silicone Mylar to remove bubbles.  A batch of the acrylic emulsion polymers Rhoplex AC 234 + AC 73 were mixed and applied through a screen onto the wet tissue.  The tissue was then allowed to dry on the Mylar until ready for use. Unfortunately, the Rhoplex adhesives they had been using for many years have been discontinued, and they had to search out a new blend of adhesives to continue making the tissue.  NARA tried two different blends of adhesives:  Avanse MV-100 + Plextol B500 and Avanse MV-100 + Rhoplex M200.  NARA settled on a 4 : 1 : 1 ratio of water : Avanse MV-100 : Plextol B500 for their new mix. PROS:

  • FTIR analysis showed that the adhesive, when applied through a screen, does not sink through the Japanese tissue.
  • Blocking tests also showed that the tissue is safe to use on documents stored in multiple layers.
  • The mixture passed the Photographic Activity Test (PAT) for use on photographs.

CONS:

  • Avanse MV-100 contains optical brighteners, which is something of a concern.  Accelerated aging tests showed, however, that the optical brighteners did not migrate into the mended documents, so the adhesive was deemed acceptable for use.
  • The tissue also picks up a high sheen from the silicone Mylar, which can be objectionable to some clients.  The sheen is not pronounced enough to cause problems for digitization, however, and it can be removed with a swab of alcohol if necessary.
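For anyone who wants to scale NARA's reported 4 : 1 : 1 mix to a particular batch size, here is a minimal sketch; the batch weight and the gram-based measurement are illustrative assumptions on my part, not details from the talk.

```python
# Sketch: scale the reported 4:1:1 (water : Avanse MV-100 : Plextol B500)
# adhesive ratio to an arbitrary batch size. The 300 g batch below is an
# assumed example value, not a figure from the talk.

RATIO = {"water": 4, "Avanse MV-100": 1, "Plextol B500": 1}

def batch_amounts(total_grams: float) -> dict[str, float]:
    """Return grams of each component for the requested total batch weight."""
    parts = sum(RATIO.values())
    return {name: total_grams * share / parts for name, share in RATIO.items()}

if __name__ == "__main__":
    for component, grams in batch_amounts(300).items():
        print(f"{component}: {grams:.1f} g")
```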

43rd Annual Meeting – Sustainability (Track B) General Session, May 15, "Sustaining Georgia's Historical Records: NEH Sustaining Cultural Heritage Collections Implementation Grant at the Georgia Archives" by Kim Norman and Adam Parnell

Georgia Archives Conservator Kim Norman and Assistant Director of Operations Adam Parnell shared data from the Georgia Archives’ successful NEH Sustaining Cultural Heritage Collections Implementation Grant project in order to support and encourage other institutions seeking to justify implementing similar environmental strategies.  Kim Norman started off with a brief history of the Georgia Archives to set the context of the project.
In 2003, the Georgia Archives opened in its current facility, which was designed to meet the highest archival standards of the time, prioritizing security and environmental protection for the collections. The complex, multi-zoned mechanical system made it possible to monitor environmental conditions closely, but proved to be unwieldy and costly to operate. The NEH SCHC Implementation Grant project aimed to reduce energy consumption while simultaneously continuing to uphold best practices for the preservation of collection materials.
Refusing to let laryngitis derail his commitment to sharing this project, Adam Parnell whispered his way through the talk. The audience’s patience and encouragement served as testament to their interest in hearing what he had to say. The Georgia Archives essentially transitioned from a “run all the equipment all the time” model to a “run equipment only as needed” model. The original HVAC system was run 24 hours a day, 365 days a year, drawing about 700 kW and incurring electricity costs of over $30,000 per month. Dehumidifiers were run constantly, even when the outside air was within an acceptable range. Heating and cooling units were also run constantly, at the same time, stressing the system, which needed constant monitoring and repair.
The new model relieved stress on the system and made use of passive environmental conditions whenever possible. The environmental standard was set to 55-60 degrees F with a 35-40% RH set point. The new system includes a “weather station” with “adaptation intelligence,” so, for example, when it is raining, the draw of outside air is reduced to a minimum to avoid increasing the indoor RH. The system can shut down the cooling units when the outside air dips below 40 degrees Fahrenheit. Likewise, the system turns off the dehumidifiers when the outside RH is below 50%. The heating boilers are now run at 140 degrees F instead of the former 180 degrees, and they are turned off altogether when the outside air temperature spikes above 90 degrees.
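For readers who like to see the logic spelled out, here is a minimal sketch of the control rules as Parnell described them; the function, data structure, and example values are my own illustration, not the Georgia Archives' actual building-automation code.

```python
# Sketch of the rule-based equipment control described for the new model.
# The thresholds come from the talk; everything else (names, structure,
# example inputs) is illustrative only.

def equipment_settings(outside_temp_f: float, outside_rh: float, raining: bool) -> dict:
    """Return on/off and setpoint decisions for the major HVAC components."""
    return {
        # When it is raining, reduce the outside-air draw to a minimum
        # so the indoor RH is not pushed upward.
        "outside_air_intake": "minimum" if raining else "normal",
        # Cooling units shut down when outside air dips below 40 F.
        "cooling_units_on": outside_temp_f >= 40,
        # Dehumidifiers turn off when the outside RH is below 50%.
        "dehumidifiers_on": outside_rh >= 50,
        # Boilers now run at 140 F (down from 180 F) and are shut off
        # entirely when the outside temperature spikes above 90 F.
        "boiler_setpoint_f": None if outside_temp_f > 90 else 140,
    }

# Example: a rainy 45 F day with 70% outside RH
print(equipment_settings(outside_temp_f=45, outside_rh=70, raining=True))
```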
Using the new model, electricity demand has dropped from about 700 kW to 365 kW, decreasing the monthly electric bill by nearly 40% to about $18,000.  Additional savings are also expected from reduced gas consumption and plant water usage.

43rd Annual Meeting – Book and Paper Session, May 16, "The Effects of MPLP on Archives: 10 Years Later" Panel Discussion Moderated by Andrea Knowlton

Introduction
Andrea Knowlton, Assistant Conservator for Special Collections at UNC Chapel Hill and moderator for this panel discussion, began the session with a brief introduction about the origins of MPLP, short for “More Product, Less Process”, and its impact on archives collections over the past decade.  The concept of MPLP originated with a 2005 article in the American Archivist, entitled “More Product, Less Process: Revamping Traditional Archival Processing.”  Authors Greene and Meissner sought to address the massive processing backlogs which were, and are, a common concern and source of inefficiency in archives collections.
In order to receive maximum benefits from this blog post, I strongly encourage you to review the article, which may be found at: http://www.archivists.org/prof-education/pre-readings/IMPLP/AA68.2.MeissnerGreene.pdf
As Andrea explained, this article encouraged a reduction of arrangement and description activities, as well as a reduction in the initial time and resources invested in preservation activities, such as refoldering, rehousing, and removing staples, in order to facilitate access to collections.  She explained that this approach to processing was initially controversial in the archives field but is now widely accepted and practiced.  As Andrea pointed out, though MPLP is a major topic in archives, its impact has not been widely discussed in conservation.  I personally can vouch for this; since volunteering to write this blog post, I have explained the concept of MPLP to several conservator friends, so if this is new to you, you are in good company!  As someone who is currently working with an archives collection, I was truly looking forward to this panel discussion.
Laura McCann, Conservation Librarian at NYU Libraries, “Partnering for Preservation and Access.”
After Andrea’s introduction, each of the panelists gave short presentations about the impact of MPLP on conservation in their institutions.  Laura McCann, Conservation Librarian at NYU Libraries, was the first speaker, and said that their experience with MPLP has been “a happy story.”  NYU Libraries have archival materials held in 3 separate repositories, and these archives had 3 separate management policies until recently.  Their policies have become consolidated and streamlined largely thanks to MPLP.  Laura explained that MPLP allowed them to rethink their core values, to refocus on ways to be more user-centered, and to better understand their resources in order to plan and manage more responsibly and sustainably.  She pointed to three main areas in which MPLP has impacted their institution:
- Organizational changes: a new Archival Collections Management Department was formed, headed by Chela Weber, which included a new position for a Preservation Archivist, Fletcher Durant, who functions as a preventive conservator and liaison between conservation and archives.
- Workflow changes: there was a shift in the type of materials treated, with higher emphasis on materials that were being actively used for teaching, exhibitions, and loans.  This in turn has led to a better understanding of how conservation work increases access.
- Methods/Materials changes: efforts were made to house and store items in a more efficient manner.  Instead of creating custom housings, they decided to move toward modification of standard sized boxes because they found that this saves space.
Laura also mentioned that she had recently published an article on the impact of MPLP, and suggested this resource for those who were interested in learning more:
Laura McCann. “Preservation as Obstacle or Opportunity? Rethinking the Preservation-Access Model in the Age of MPLP.” Journal of Archival Organization 11, 1-2 (2013): 23-48.
Michael Smith, Collections Manager at Library and Archives Canada, “Acquisition, Preservation and Immediacy – A Different Approach to Balancing the Demands of Making Archival Material Quickly Accessible.”
The second panelist, Michael Smith, a Collections Manager at Library and Archives Canada, discussed two examples of the impact of MPLP in his talk.  The first example Michael described was a major project involving The Sir John Coape Sherbrooke Collection, which includes 37 notebooks, 79 maps, paintings, and other documents and artifacts.  They were faced with the challenge of making these items digitally available by a tight deadline, and this required a streamlined approach to processing, treatment, and digitization.  Treatment and description activities were carried out concurrently, with archivists working side by side with conservators during treatment.  The materials were tracked using temporary numbers during processing so that they could be processed efficiently, and once the materials were described, they were digitized, bar coded, and stored.  Michael emphasized that collaboration between archivists and conservators was an essential part of this project.
The second example Michael described was their First World War Records Digitization project.  The records in this collection included medical history documents, pay sheets, casualty forms, etc.  Processing this collection involved the removal of every imaginable type of fastener, and Michael included a great image of a large bin full of fasteners.  In total there were 3.5 kilometers of documents which needed to be digitized.  This differed greatly from their usual digitization workflow, in which, Michael described, items are usually digitized as requested by clients.  Prior to digitization, they carried out “material triage,” or minor repairs, and a Banctec, or high volume, scanner was used. While this scanner is not normally used for archives documents, they found it was needed for this project and could be slowed down and used safely.  This project also required both archivists and conservators to rethink and modify their previous workflow model for processing, treatment, and digitization, and consequently required archivists and conservators to work together as a team.
Michael concluded by summarizing lessons learned, including the importance of clear communication, adaptability, and teamwork.
Kim Norman, Preservation Manager/Conservator, Georgia Archives, “MPLP and Conservation at the Georgia Archives.”
Kim began her talk by questioning if MPLP is to archives what phase treatment is to conservation.  She went on to describe some conservators’ concern that phase treatment often results in simple, quick fixes, after which the objects are returned to storage and their greater needs are forgotten.  Kim emphasized that the size of unprocessed collections often makes full treatment of every individual item too overwhelming, and that treating in phases allows materials to be accessed sooner.
She then described examples from her institution of how their workflow has been adapted to better suit the goals of MPLP.  In the Georgia Archives, archivists are trained in some minor preservation and treatment techniques, such as making custom enclosures and sleeves.  She discussed how, while conservators might want to remove fasteners and complete minor repairs, archivists feel these steps are not usually high priorities, and the overarching goal is to ensure access quickly. She provided an example of a group of courthouse documents which were arranged and described but received only minor treatment, including humidification and flattening, so that they could be accessed in a timely manner.
Open Discussion
After the panelists’ presentations were completed, members of the audience were invited to ask questions and to comment on their experiences with the impact of MPLP.  The major discussion points are described below:
1) Laura was asked to speak more about the initiative for rehousing odd-shaped items.  She explained that this practice was started about 2 years ago, due to a combination of factors including a major renovation, new staff, and policy changes.  They are still dealing with rehousing items in the off-site storage, and are slowly calling back odd-sized boxes to replace them with standardized boxes, but items that are not housed at all are their first priority.
2) Laura was also asked to elaborate on the impact of the goals of being data driven and making collections quickly accessible.  She was asked if items that receive minimal attention and rehousing during preprocessing are coming back later to conservation.  Laura replied that all items have a small amount of preservation initially, after which they track use of the items and then enhance description and preservation as necessary.  She emphasized that if they notice an item is being used frequently, then it may be identified for further treatment later on.  Kim mentioned that in her institution items do not come back frequently and treatment is generally need-based.  She gave an example of a large group of fire-damaged courthouse documents that were treated because they needed to be immediately accessible.
3) A point about audio/visual materials in archives was raised, and it was mentioned that these materials pose a major processing challenge: they are being sent to high density storage with minimal processing and little expectation of reformatting or use, yet they are decaying quickly.
4) An audience member from a small National Park Service site commented that MPLP has created a feeling of going from maximum to minimum in terms of processing, and as a small institution they are faced with the challenge of finding a middle ground where they can address their inherent problems while also balancing their resources in a thoughtful and efficient manner.  Laura emphasized the value of collecting data and defining goals.  She suggested starting with fairly low, sustainable goals, and progressing from there.  Michael commented on the challenge of keeping up with a processing backlog while more material is constantly coming in.
5) As expected, the issue of fasteners reared its rusty head.  An audience member confessed that this issue keeps her up at night, and questioned if we should be disposing of these, because they are evidence of the history of archiving.  She suggested maintaining fasteners, or at least maintaining evidence of the original filing system.  Michael mentioned that they had considered melting down their giant box of fasteners and making something out of the metal.  Laura, on a more serious note, agreed that fasteners are great objects, and can tell a story, but often interfere with the larger goal of making materials accessible.
6) A private practice conservator who works with small institutions in the South brought up the great point that MPLP fundamentally assumes ideal climate control is already in place, especially in regard to leaving fasteners on documents. She asked for suggestions for how to advise local collections without adequate climate control as to how to implement MPLP.  Both Kim and Laura emphasized the importance of addressing the building envelope first while simultaneously considering how MPLP approaches should be adapted to best fit the needs of the individual institution.  Other audience members supported these suggestions.
7) The issue of mold was introduced, in terms of adding time or inefficiency to the processing workflow.  Michael discussed that mold remediation was included in their workflow from the beginning, and while it was definitely an extra step and caused slight delay, it fit in well with the rest of the workflow.  The option of using a vendor for mold remediation was discussed, although it was agreed that vendors were most cost effective when large amounts of materials were involved.  This segued into a discussion of Integrated Pest Management, and museumpests.net was suggested as a good resource for finding vendors.
8) The final topic of discussion was managing workflow schedules in terms of time, and managing the expectation that major processing/digitization projects need to be addressed as quickly as possible on top of other ongoing projects. The audience member who raised this point asked others to elaborate on who determines the work schedule, how they negotiate for more time, and how they deal with the pressure of these expectations.  Michael responded that, at his institution, they are generally not in the position to negotiate deadlines, but can generally negotiate in other areas, such as hiring extra staff or accepting high risk of damage, in order to better meet the deadline.  Kim commented that the needs of her institution are much more fluid and patron driven.  Laura also mentioned that the digitization initiatives at her institution are not as aggressive, but that having a preservation archivist working equally closely with archivists and conservators helps with scheduling major projects.  Other audience members reinforced Michael’s suggestion that, in cases where other parties have determined deadlines which are non-negotiable, other compromises should be suggested, such as stopping work on all other projects or hiring extra help.  It was also mentioned that this may be a good opportunity to point out how previous conservation work may have allowed digitization to be completed faster.
Conclusion
This blog post is a beast, but a necessary one.  As was emphasized by the panelists and audience members, MPLP has had a major impact on conservation workflows in archives, and both the theme of this conference and the 10-year anniversary of MPLP made this a great time for this discussion.  I thought the point about assumed climate control was an especially good one, as was the final point regarding the pressures of digitization on top of the many other responsibilities conservators have outside of treatment work. This is directly related to Julie Biggs and Yasmeen Khan’s talk “Subject and Object: Exploring the Conservator’s Changing Relationship with Collection Material.”  While it was great to hear that the effects of MPLP have been overwhelmingly positive, I would have liked a more in-depth discussion of why MPLP was controversial in the archives field, as well as whether we as conservators have noticed any of the negative effects that initially worried some archivists.

43rd Annual Meeting – Book and Paper Session, May 14, "The Brut Chronicle: Revived and Reconstructed" by Deborah Howe

In her talk about the treatment of Dartmouth College Library’s Brut manuscript, Collections Conservator Deborah Howe addressed the history of the manuscript, its condition and intended use, and the process involved in determining an appropriate binding structure. The major challenge she encountered was that the Brut was bound in a historic binding in poor condition that was not contemporary to the text block.

Dartmouth Brut Before Treatment

Deborah began her talk with an overview of the Brut text and its historic significance. According to Deborah, The Brut text is a chronicle of both English history and mythology, and covers the history of England from its settlement until 1461; it contains records of battles and histories of rulers, as well as tales of Merlin and King Lear.  While variants of the text were written in Latin, French, and Middle English, 181 of 240 existing Brut manuscripts are written in Middle English, including Dartmouth’s Brut manuscript.  Dartmouth’s Special Collections Library acquired their copy of the Brut with the intention that the manuscript would be used heavily for research and teaching.  Dartmouth’s Brut is of particular interest because it has a significant amount of marginalia which is now available for scholarly research.  Prior to being acquired by Dartmouth, the Brut was in a private collection, and was not available to scholars.
This manuscript was unusual in that it was bound in a stationers binding that was in poor condition and was no longer functional.  As Deborah explained, this created a dilemma, because while the binding dated to around 1600, it was not contemporary to the text block, which dates to about 1430.  In making her treatment decision, Deborah consulted with conservation colleagues who suggested stabilization of the stationers binding in conjunction with limited use.  Because this book was intended to be used frequently, Deborah felt that a different solution was necessary.
First, the Brut was disbound, surface cleaned, mended, and digitized.  In the process of disbinding Deborah found evidence of a previous binding, which she conjectured might have included wooden boards.  Prior to determining an appropriate new binding, Deborah created a model of the stationers binding.  She also had the opportunity to consult with a group of Brut scholars who were visiting Dartmouth for a conference, and asked for their opinions regarding binding possibilities.  Through this consultation, Deborah decided that a binding was needed that would reflect the history of the book but would also suit its current needs.
Deborah chose to resew the Brut on tanned leather supports which were left long.  She created new boards from multiple layers of handmade flax paper, and three slots were left in the boards where the supports could be inserted.  Later, in response to a question, Deborah also mentioned that she attached strips of parchment to the supports as stiffeners in order to facilitate putting the leather supports into the boards.  She then created a chemise of alum-tawed leather to cover the book as a whole.  This created a reversible binding; there are no linings or adhesive present on the spine, and the cover can be easily removed to show the sewing.  The stationers binding and sewing materials were saved and are stored with the Brut.  In conclusion, Deborah emphasized the practical nature of this solution, in that the new binding references historic materials while making the book accessible and stable.
Dartmouth Brut After Treatment: showing supports inserted into boards

Dartmouth Brut After Treatment

In the question-and-answer session, Deborah elaborated on how often Dartmouth’s Brut is used and how the new binding was holding up. She mentioned that it has been a few years since the treatment was completed and that the book is accessed, either for teaching or research, at least once per week.  Despite this frequent handling, the new binding is still in great condition, and functions well.  One audience member asked Deborah to elaborate on her collaboration with scholars, and Deborah emphasized that this opportunity was both rare and essential.  Another audience member asked about the impact of digitization on access, and Deborah responded that digitization has increased access greatly, but that the digitized manuscript is mainly accessed by scholars, while the physical book is frequently used for classes.
Deborah’s talk tied in nicely with the two talks that followed, including Evan Knight’s “Understanding and Preserving the Print Culture of the Confederacy” and Todd Pattison’s “The Book as Art: Conserving the Bible from Edward Kienholz’s The Minister,” in that all three speakers devoted time to in-depth discussions of their treatment rationale and their inner debates regarding a possible range of treatment options.  Many thanks to Deborah for providing the images!

43rd Annual Meeting – Book & Paper Session, May 14, "The Book as Art: Conserving the Bible from Edward Kienholz's The Minister by Todd Pattison"

Todd gave a thought provoking talk on the biases a conservator brings to treatment proposals. His primary point was that while conservators have a responsibility to bring their expertise and ethical considerations to every treatment they do, they must also be flexible and considerate of curators’ wishes. He contended that while there were always wrong treatment decisions that could be made, there was no one right treatment decision. Every book is a living object. Treatment should be as unique as the treated item and should be considered in context with the item’s purpose and environment. To support his argument, Todd shared four examples from the NEDCC’s experience.

Edward Kienholz’s The Minister

Example 1: Edward Kienholz’s The Minister
The Albright-Knox Art Gallery approached the NEDCC to treat a damaged bible. The bible was just one small part of a larger artwork by Edward Kienholz entitled The Minister. Like many of Kienholz’s artworks, The Minister was comprised of found objects, including the damaged bible. The NEDCC had been contacted because an overly enthusiastic patron of the gallery had accidentally separated the text block from the bible’s cover. Even before this catastrophic event, however, the bible had been damaged, dirty, and structurally weak. This evidence of use in the bible’s pre-artwork past was an integral component of The Minister. As such, the NEDCC’s proposal for a standard treatment was not acceptable because it would have altered the appearance of the bible (and thus The Minister as a whole) too much. Instead, the bible’s structure was stabilized while carefully retaining all of the original spine linings and visible signs of damage.
 
Example 2: Riviere binding of Ben Jonson’s Works
The NEDCC provided a treatment quote for a 17th-century copy of Ben Jonson’s Works, which had been rebound in the early 20th century by the Riviere Bindery. During the rebinding, the text block had been bleached, oversewn, and bound in a tight red morocco binding. There was absolutely no question that the binding was causing further damage to the text; however, the curator considered the piece to be a valuable teaching tool, not only for the original content of the text, but also as an example of an expensive personal possession from the early 20th century. It was important to the curator that the binding be preserved rather than replaced with a binding sympathetic to the century in which the volume was published, regardless of the fact that disbinding the volume to address the structural problems would have provided stronger protection for the weakened paper of the text block. As a result, the NEDCC repaired the Riviere binding and otherwise left the binding and sewing structure as they received it.
(For those interested, Princeton University Library has a lovely collection of Riviere bindings online.)
 
Example 3: A View of Antiquity by Jonathan Hamner, et al
The discussed copy of A View of Antiquity came to the NEDCC in beautiful disrepair. The binding had parted ways with the pastedowns, and the sewing thread was missing entirely. All in all, it could have served as a wonderful teaching tool on bookbinding structure of the 17th century. As such, the NEDCC’s first instinct was to quote nothing more than a box to protect the volume; however, this volume was central to the institution’s identity. The volume was an important marketing tool for the institution, and it needed to look the part, so the NEDCC did a thorough and aesthetically pleasing restoration of the volume.
 
Example 4: Battlefield Bible
Todd’s last example was a bible covered in mud to the point of textual illegibility. As a conservator, one’s first instinct would be to wash the text block, but that would have destroyed the history of the volume – for its provenance was that it had been recovered from the battlefield at Gettysburg.
This last example reminded me strongly of the recent Preserving the Evidence: The Ethics of Book Conservation Symposium held at the Newberry Library in April. Jeanne Drewes of the Library of Congress discussed a copy of Lincoln’s second inaugural speech that was found to have a fingerprint on it. They are currently doing DNA testing to find out if the fingerprint belonged to Lincoln himself. Had that document been cleaned, the evidence would have been destroyed.

AIC 43rd Annual Meeting – Book and Paper Group – Case Study: A Practical Approach to the Conservation & Restoration of a Pair of Large Diameter English Globes (Lorraine Bigrigg & Deborah LaCamera)

This talk presented the multi-disciplinary treatment involved in conserving two English globes, one celestial and one terrestrial. Overall it took 1400 studio hours! That is no typo! Deborah has kindly forwarded some screenshots of the PowerPoint, which you will find below.
1. Title slide
The globes were made between 1845 and 1851 by Malby & Co (see the “Malby” entry at http://www.georgeglazer.com/globes/globeref/globemakers.html#malby) and they were acquired in 1851 by the University of Deseret, the university founded by the Church of Latter-day Saints that is now the University of Utah.
The structure of these globes goes back to the early 16th century. A globe is essentially two hemispheres molded over a form and joined at the equator with an adhesive. The globe is then covered with plaster and paper gores (a gore is one of the printed sections of paper that carry the informational content of the globe), and the whole is burnished and varnished.
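To make the geometry of a gore a bit more concrete, here is a rough sketch of how a gore's flat width tapers with latitude; the simple cosine scaling, the twelve-gore count, and the globe radius are illustrative assumptions, not figures from the talk.

```python
# Rough sketch of gore geometry: a globe's surface is printed as N
# lens-shaped paper "gores" that taper from the equator to the poles.
# This approximation simply scales the flat width of a gore by the
# cosine of the latitude; it ignores the distortions real gore
# projections correct for, and the 12-gore count and 450 mm radius
# are example values only.
import math

def gore_width_mm(radius_mm: float, num_gores: int, latitude_deg: float) -> float:
    """Approximate flat width of one gore at a given latitude."""
    circumference = 2 * math.pi * radius_mm * math.cos(math.radians(latitude_deg))
    return circumference / num_gores

for lat in (0, 30, 60, 85):
    width = gore_width_mm(radius_mm=450, num_gores=12, latitude_deg=lat)
    print(f"latitude {lat:2d} deg: about {width:6.1f} mm wide")
```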
The Malby globes of the University of Utah were in poor condition, with cracks and losses and discolored varnish. The speakers considered the options for treatment of the two globes and decided they needed to treat them differently, since the terrestrial globe was so damaged that all of the gores needed to be removed and the hemispheres realigned, while the celestial globe had only small areas of damage so it did not need to be completely disassembled.

2. Terrestrial Sphere Condition

3. Celestial Sphere Condition

The treatment involved removing the varnish, removing the paper gores with a hand-held steamer, realigning the terrestrial globe’s hemispheres, cleaning the gores, mending, filling losses, reattaching the gores, burnishing, then varnishing.
Gore Removal

Filling Holes

Where there were areas of loss to the information on the gores, the TKM studio found gore reference sheets from Malby at the Royal Geographical Society (RGS) in London. These were copied and then printed with a pigmented ink-jet printer for the celestial globe. Gores from another globe, also at the RGS in London, were used as a reference for the terrestrial globe’s replacement gores. The reproductions were inserted as fills in the specific areas of loss in the cartography. Since this treatment, carried out in 2007, new techniques have become available, and the TKM studio has been using Pronto plates (http://www.nontoxicprint.com/polyesterplatelitho.htm) for the past year or so. These plates use traditional printer’s ink, which is light-, solvent-, and heat-stable.
The filled gores were registered and reattached to the hemispheres using a wheat starch paste-methyl cellulose mix.
Remounting Gores

After mounting the gores, the globes were burnished, then sized with a 3% gelatin solution. After in-painting, the globes were varnished with dammar containing Tinuvin. The authors stressed that the entire project was multi-disciplinary, as the stand was also repaired and the metalwork was cast and engraved to complete the object.
The treatment is published in the Journal of the Institute of Conservation, vol. 38, no. 1 (2015).

AIC 43rd Annual Meeting – Book and Paper Group: Foxing and Reverse Foxing: Condition Problems in Modern Paper and the Role of Inorganic Additives (Sarah Bertalan)

This talk was given by Sarah Bertalan as the culmination of a career spent observing foxing and reverse foxing in her conservation practice. It is such an interesting topic that there was too much information to squeeze into 30 minutes, and Bertalan left us hanging without a conclusion, but I will provide links at the end of this summary to articles that she mentioned, as well as to the conservation wiki entry on foxing. I hope that she will publish in the BPG Annual Postprints, as I’m sure the majority of the attendees of her talk would agree! It is a topic that I find so interesting that I even compiled my own literature review of foxing while in graduate school!
Bertalan’s observation of 19th and 20th century papers has led her to propose that foxing, the “reddish, brownish, roundish stains that occur in a random pattern,” is not caused by mold or by metal inclusions, but rather by inorganic additives in the paper. It is widely reported that treating foxing stains can frustratingly lead to their reappearance within a relatively short amount of time.  Bertalan also considers the capacity of 19th and 20th century papers to discolor over their entire surface, not just in the staining we associate with foxing. In such cases, the stains may be extensive but superficial, and the condition would be due to contact with catalyzing, acidic materials rather than migration of degradation products, as is seen with matburn. Reverse foxing is a term that remains undefined, and its cause is unknown. White spots, like negative images of the dendrite-like reddish brown foxing spots, appear and are visible in normal light, often after a paper has been treated. Reverse foxing has been identified frequently on Van Gelder Zonen papers.
Inorganic additives were added to papers in the 19th and 20th centuries to achieve specific results. Additives such as minerals and metal oxides were added to modify the surface and texture, to act as fillers, opacifiers, and brighteners, and to aid in ink retention. The additives are extremely reactive and act as salts, catalyzing acid-base reactions. While foxing was not visible immediately after the paper was made, elevated humidity, changing pH, and daylight would provide the environment for foxing to form. Contemporaries were aware of the effect of humidity on papers, notably the appearance of foxing stains.
Bertalan observes that the sensitivity of papers coated or immersed in metal salts to light is well documented, as the earliest photographs were made with paper coated in metal salts. The supporting evidence Bertalan presents for inorganic particles causing foxing is the presence of opaque zones that correspond to foxing stains when paper is seen in transmitted light. Furthermore, when foxing is not visible in normal light, opaque dendrite-like inclusions in the paper can be seen in UV light as well as transmitted light. Even when the reddish-brown stains have been washed out of the paper, the opaque regions still remain when viewed in UV light.
Resources:
http://www.conservation-wiki.com/wiki/Foxing_(PCC)
Soyeon Choi Literature Review on Foxing (you must use your AIC sign-in to access the article) http://www.maneyonline.com/doi/pdfplus/10.1179/019713607806112378
Browning, B.L. Analysis of Paper 1977.

43rd Annual Meeting-Book and Paper Session, May 15, 2015, "16-17th Century Italian Chiaroscuro Woodcuts: Instrumental Analysis, Degradation and Conservation" by Linda Stiber Morenus, Charlotte Eng, Naoko Takahatake, and Diana Rambaldi

The presenter, Linda Stiber Morenus, began her discussion of these complex prints with a description of the printing process. Chiaroscuro woodcuts were intended to emulate chiaroscuro drawings, which were composed of black chalk shadows and white chalk highlights on colored paper. Colored oil-based printing inks were first used to print textiles in the 14th century and were being used on paper by the mid-15th century. Chiaroscuro woodblock prints required two to five separate woodblocks, inked with different shades lighter and darker than the midtone colored paper.
In order to better characterize the media, Morenus collaborated with art historian Takahatake and conservation scientists Eng and Rambaldi from the Los Angeles County Museum of Art (LACMA). In addition to prints at LACMA, the team studied prints from the British Museum and the Library of Congress. Out of over 2000 surveyed woodcuts, 72 were studied in depth, with X-ray Fluorescence (XRF), Fiber Optic Reflectance Spectroscopy (FORS), and Raman spectroscopy. Inorganic compounds were indicated by XRF analysis. FORS was especially helpful for detection of indigo. Raman spectroscopy provided additional information about organic colorants.
Renaissance artists’ manuals, such as Cennino Cennini’s Libro dell’Arte guided the research by providing information on the most likely colorants for printing inks. Inorganic pigments included lamp black, lead white, ochres, vermillion, verdigris, and orpiment. Organic pigments included indigo and a variety of lake pigments.
After providing background information, the presenter began to focus on deterioration and conservation of the chiaroscuro prints. The prints from the Niccolo Vicentino workshop had a high lead content. The inks typically had a low vehicle-to-pigment ratio and tended to turn gray around the edges due to the formation of lead sulphide. Verdigris corrosion was also a common problem, as found on “Christ Healing the Paralytic Man” by Giuseppe Niccolo Vicentino, as well as on 13 other prints from the same workshop. Typical copper-induced paper degradation included yellow-brown halos around inked areas and cracks in the paper.
Fading and discoloration were major problems for the organic colorants, such as indigo and the yellow lakes. Morenus compared copies of Ugo da Carpi’s “Sybil Reading a Book” in the British Museum and the Library of Congress, finding clear evidence that the indigo in the British copy had faded. The British Museum had confirmed the presence of indigo through Raman spectroscopy. At least 8 of the prints were found through XRF to have high levels of calcium in the same areas where indigo had been identified, suggesting the presence of chalk-based lakes. Organic greens had shifted to blue or brown where organic yellows had faded or become discolored.
The presenter concluded with suggestions and caveats for conservation treatment. First, she advised conservators to exercise caution in aqueous treatment, in order to preserve the topography of the prints. The woodblock creates a relief impression in the paper, and the layering of the inks adds another level of texture that might be altered by humidification, flattening, washing, or lining treatments. The low binder content also makes the inks more vulnerable to saponification and loss during alkaline water washing. Morenus warned that the hydrogen peroxide color reversion treatment for darkened lead white would be particularly risky, because the white lead sulphate end product has a lower refractive index than the original basic lead carbonate pigment. This means that treated lead white becomes more translucent, and the lower “hiding power” shifts the tonal balance of the print, making it appear darker overall.
For exhibit recommendations, Morenus suggested that we should always expect to find fugitive organic colorants in chiaroscuro prints, so exhibit rotations should be planned accordingly. Maximum exhibit conditions should be 5 foot-candles (50 lux) of visible light for 12 weeks of exposure, no more often than every three years. She also indicated that overmatting should be avoided to reduce the risk of differential discoloration.
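As a back-of-the-envelope check on what that exhibit guideline adds up to, here is a minimal sketch of the cumulative light dose per rotation; the eight-hour daily display schedule is my own assumption, since the talk specified only the illuminance and the duration.

```python
# Cumulative light dose for one exhibit rotation under the suggested
# limit of 50 lux for 12 weeks. The daily display hours are an assumed
# value for illustration; the talk specified only illuminance and duration.

LUX = 50                 # suggested maximum illuminance
WEEKS = 12               # suggested maximum length of one rotation
HOURS_PER_DAY = 8        # assumption: gallery lighting hours per day

lux_hours = LUX * WEEKS * 7 * HOURS_PER_DAY
print(f"One rotation is roughly {lux_hours:,} lux-hours of exposure")  # 33,600 lux-hours
```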
During the Question and Answer period, Morenus clarified the color order used in printing. Some prints were inked from dark to light, but most were printed with the lightest color first.
I thoroughly enjoyed learning about these beautiful prints, but I think that the discussion of the lead white conversion treatment-induced refractive index shift was the most important “take-away” from the presentation.

42nd Annual Meeting – Book and Paper Group Session (BPG), May 30, “Salvage of Paper Materials from the Flooding of São Luiz do Paraitinga” by Fernanda Mokdessi Auada

On Friday, May 30th, Ms. Fernanda Mokdessi Auada presented an account of the joint salvage effort undertaken by the Nucleus for Conservation of the Public Archives of São Paulo (APESP) and the Edson Motta Nucleus of Restoration-Conservation, a laboratory of the National Service for Industrial Apprenticeship (NUCLEM-SENAI), following the 2010 flooding of São Luiz do Paraitinga, Brazil. Collective gasps went up from the audience as Auada showed photographs of the devastated city. Among the images were shots of the city all but subsumed by the Paraitinga River and of devastating structural damage to the city’s principal church (São Luiz de Tolosa) and its municipal library.
 

During the flood of 2010, the fall of the city’s principal church

 

Thousands of documents, over 15 linear meters in total, were immersed in the flood waters for over 20 days. The papers related primarily to the population’s citizenship and legal identity, making it vital for conservators to save the information contained in the wet and moldy files. Despite the grave condition of the documents–and the challenge of having virtually no money or trained support staff–the overall salvage was a success, Ms. Auada said.

The documents arrived for salvage in three allotments. The first two allotments were treated manually, using traditional flood damage salvage procedures. First, the documents were separated and air dried flat on top of absorbent paper. The documents were then individually documented and inventoried during dry cleaning, these steps carried out in a dedicated cleaning area. Documents that could not be separated mechanically after drying were separated while immersed in an aqueous bath. Papers soiled with heavy accretions of dirt and mud were washed to recover legibility. The papers were then mended, flattened and rehoused in paper folders and corrugated polypropylene boxes. Incredibly, 95% of the documents in the first and second allotments were recovered.

The third allotment, from the Public Ministry, proved to be more problematic, calling for radical treatment. These documents arrived at the APESP three months after the flood, having been stored wet in garbage bags. Upon drying the materials, it was determined that the extensive mold damage would be impossible to treat using traditional methods. Representing a “worst-case” scenario, this allotment of 176 files was submitted to decontamination by gamma irradiation. The moldy documents were packed in corrugated cardboard boxes and sent to the Radiation Technology Centre of the Nuclear and Energy Research Institute (CTR-IPEN) at the University of São Paulo. While still within the cardboard storage boxes, the documents were dosed for disinfection (not sterilization) at 11 kGy. This was the first time this type of salvage procedure had been carried out in Brazil.

Following irradiation, the papers were separated and dry cleaned using brushes. The dry removal of the mold spores proved easier and faster than for the first two non-irradiated allotments, with sheets separating easily. Perhaps most importantly, the biohazard was eliminated, removing the need to quarantine the documents during documentation and dry cleaning. Ms. Auada described the costs of the treatment as acceptable, even within the project’s meager budget. The irradiated documents will be monitored for long-term effects of the radiation, with depolymerization of the cellulose being of primary concern.

42nd Annual Meeting – Book and Paper Session, May 29, 2014, "Digital Rubbings. Monitoring Bookbindings with the Portable Mini-Dome (RICH)" by Lieve Watteeuw

In this talk, Lieve Watteeuw showed images produced by the Reflectance Imaging for Cultural Heritage (RICH) project and demonstrated the functionality of the Mini-Dome module. I was excited to see this presentation after reading advertisements for the “New Bownde” Conference at the Folger Shakespeare Library last year, in which the RICH project and Mini-Dome were featured. Fortunately, for those unable to attend the presentation, extensive documentation about the project is available online through the project’s webpage and blog.

Downloaded from: http://portablelightdome.files.wordpress.com/2013/11/foto-1.jpg
The Mini-Dome module

The Mini-Dome module is a small, hemispherical imaging device that is tethered to a laptop. The module creates dynamic digital images through polynomial texture mapping, a technique commonly referred to as Reflectance Transformation Imaging (RTI). This technique involves taking a series of images from a fixed camera position, while changing the angle of lighting, in order to reveal the surface of an object. The original dome was created in 2005 at Katholieke Universiteit Leuven for reading cuneiform tablets, but has since been used to image bindings, illumination, wax seals, cuir bouilli, and other cultural objects. The current module is equipped with a single 28 MP digital camera and 260 white LED lights to capture a total of 260 images in approximately four minutes. Watteeuw showed a video of the Mini-Dome in action during her presentation, but readers can view a similar video here.
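For readers curious about what polynomial texture mapping actually computes, here is a minimal per-pixel fitting sketch using the standard six-term PTM basis (a biquadratic function of the light direction); the synthetic data, array shapes, and function names are illustrative assumptions and not the RICH project's processing code.

```python
# Minimal sketch of per-pixel polynomial texture mapping (PTM), the
# technique behind RTI: for each pixel, fit luminance as a biquadratic
# function of the light direction (lu, lv) across all captured images.
# The random "captures" below stand in for the Mini-Dome's 260 photos.
import numpy as np

def fit_ptm(luminance: np.ndarray, light_dirs: np.ndarray) -> np.ndarray:
    """luminance: (n_images, height, width); light_dirs: (n_images, 2) of (lu, lv).
    Returns six PTM coefficients per pixel, shape (6, height, width)."""
    lu, lv = light_dirs[:, 0], light_dirs[:, 1]
    # Design matrix of the biquadratic terms: lu^2, lv^2, lu*lv, lu, lv, 1
    A = np.stack([lu**2, lv**2, lu * lv, lu, lv, np.ones_like(lu)], axis=1)
    n, h, w = luminance.shape
    coeffs, *_ = np.linalg.lstsq(A, luminance.reshape(n, h * w), rcond=None)
    return coeffs.reshape(6, h, w)

def relight(coeffs: np.ndarray, lu: float, lv: float) -> np.ndarray:
    """Re-render the surface under a new (virtual) light direction."""
    basis = np.array([lu**2, lv**2, lu * lv, lu, lv, 1.0])
    return np.tensordot(basis, coeffs, axes=1)

# Tiny synthetic example: 260 captures of a 4x4 pixel patch
rng = np.random.default_rng(0)
dirs = rng.uniform(-1, 1, size=(260, 2))
images = rng.uniform(0, 1, size=(260, 4, 4))
c = fit_ptm(images, dirs)
print(relight(c, lu=0.3, lv=-0.5).shape)  # (4, 4)
```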
Example of images and filters. KU Leuven, Maurits Sabbe Library, second quarter 16th century, SPES Binding, panel stamp on brown calf leather.

Watteeuw then described how the captured images are processed by software and interpreted through seven dynamic filters. She demonstrated some of these filters, including sharpen, shading, generate models, line drawings, and sketch.  Using the mouse or interface buttons, the user can zoom in, drag the image, or change the direction of lighting in real time. For those who would like to experience the software interface, a web viewer is available here. Watteeuw reports that the software suite can also export to 3D shaded or rendered modes.

Watteeuw’s presentation included a demonstration of a measurement tool built into the image processing software. The tool can be used to measure the distance between two points or generate a height map for a portion of the object. This blog post includes an image of a height map created on the blind-tooled surface of a leather binding. Watteeuw explained that by scaling the image, the measurements can be accurate to 10 microns.

Watteeuw’s presentation included several examples of how the Mini-Dome could be used to learn more about the production of a binding. Images of a late 15th century book of hours were manipulated with filters to show tool marks on the uncovered wooden boards, providing evidence of how the boards were shaped and the lacing of the sewing supports. A second example showed a 16th century book (pictured above and described here), in which the binder scored a vertical line in the leather to align a large, central impression. Watteeuw described instances in which previously unknown marks or designs were revealed by manipulating the filters or direction of the lighting, such as three leaves emerging from an emblem design, or shallow impressions from a decorative roll being more clearly defined. This tool could be quite useful for identifying individual finishing tools and documenting how they changed or became damaged over years of use.

In addition to leather bindings, Watteeuw shared images of Belgian damask silk, remains of a ribbon, and embroidered bindings from the Folger Shakespeare Library. Once again, the dynamic filters in the software suite were applied in order to enhance details of the objects. The “sketch” tool provides clear images of weaving and embroidery patterns and could be very useful to textile historians and conservators. The measurement tool could also be used to gather data on the thickness of threads or cord used to construct the object.

The Micro-Dome module fitted to a copy stand.

The RICH project will continue until 2015 and additional investigations are already underway. Watteeuw reports that the topographic data from an object is exportable into spreadsheet form. Engineers on the team are currently exporting high points of objects scanned to create a large data set for further analysis. Additional projects include the development of optical character recognition (OCR) for specific tool shapes or patterns on bindings. Testing is proceeding on a smaller “micro-dome” (pictured above) that  is constructed in two pieces so that it can be placed inside the opening of a book to capture images of the gutter or surface of a page. Watteeuw described a student research project currently in progress to measure sewing in manuscript textblocks.

Two questions were asked by audience members following the presentation. The first individual asked if a database of the existing images is available. Watteeuw answered that an open access database of all images captured would be ideal; however, since this is a research project using prototypes, the team is collaborating with institutions to link with existing databases. A second audience member asked if any attempts had been made to identify tools of various workshops. Watteeuw replied that a corpus was needed before any comparisons could be made.

Advanced imaging technologies, such as RTI, offer tremendous opportunities for the study of cultural objects and for digital libraries in general. The RICH project has produced a suite of tools that could be used by scholars and practicing conservators to gain a better understanding of an object’s composition and production. Wider use of devices such as the Mini-Dome in imaging collections of note, and greater access to the software suite, are required in order to exploit the full potential of the technology.