42nd Annual Meeting – Paintings Session, May 30, "A Hangover, Part III: Thomas Couture's Supper After the Masked Ball"

Conservators are often faced with objects that have undergone extensive past treatments. While undertaken with the best intentions, some treatments have resulted in aesthetically jarring effects and in the loss of original information embedded in the construction of the work. Fiona Beckett explored these decision-making challenges in her treatment of Thomas Couture’s Supper After the Masked Ball (1855).
The large painting depicts a scene in the Maison d’Or in Paris following a party in the infamous hangout for artists and writers. The hungover revelers acted as vehicles for Couture’s commentary on the degradation of society’s morals. Although the composition was originally intended as a wallpaper design, Couture seemed to have a soft spot for this scene, and the finished painted version was kept in his studio, as evidenced by its numerous appearances in drawings and depictions of the studio space.

Thomas Couture’s Supper After the Masked Ball (1855)
Courtesy of the National Gallery of Canada

Supper After the Masked Ball had undergone two linings and at least two cleaning treatments in the past, and had been relegated to storage for the past 90 years because of its problems. One lining was done with glue paste and the second with wax resin, resulting in an uneven combination of the two residues on the verso of the canvas. Ms. Beckett described the factors that had to be considered before removing the lining. Effects of the lining treatments included wax residue stains, shrinkage of the canvas and compression tenting from the glue paste, and flattening caused by the irons. Additionally, Couture’s habit of testing tints of colors on the versos of his paintings was obscured by the lining’s presence. The lining had already begun to separate fairly easily from the original canvas, and after determining that it was not appreciably stabilizing the painting, the decision was made to remove it. After removal, the color tints were indeed visible on the verso of the canvas. Another interesting aspect of Ms. Beckett’s treatment was her use of gellan gum to locally moisten and soften the glue residues on the verso prior to mechanical removal with a spatula.
The decision not to re-line Supper After the Masked Ball followed the broader trend away from lining, but it was also informed by factors specific to the painting. The original canvas was in good condition after the lining removal, and the previous linings appeared not to have been necessary in the first place. The residual glue and wax also seemed to have added strength to the canvas. Lastly, the absence of a lining allowed easy viewing of the brush marks on the verso.
Final steps in the treatment included a spray application of B-72 to the verso, strip lining with Lascaux P110 fabric and BEVA, and building up the face of the stretchers to an even surface with the addition of mat board and a felt-like non-woven polyester.
Supper After the Masked Ball was an excellent case study of the decision-making processes conservators must use when approaching prior extensive treatments. Ms. Beckett made the astute observation that it is quite easy for us to criticize past treatments, but we must acknowledge that they were carried out with the intention of preserving and stabilizing the work, using the most advanced technology available at the time. Often these linings really did have a positive effect on the preservation of the pictorial surface, even if such measures sometimes need to be undone today, when less invasive and more effective processes are available.

42nd Annual Meeting – Book and Paper Group Session, May 30, “Conserving the Iraqi Jewish Archive for Digitization” by Katherine Kelly and Anna Friedman

Katherine Kelly and Anna Friedman presented on a two-year project funded by the Department of State and carried out at the National Archives and Records Administration (NARA) to conserve and digitize the Iraqi Jewish Archive. This is not an archive that was collected in the traditional sense, but rather materials taken from the Jewish community over many years; the collections were discovered in the flooded basement of the Iraqi Intelligence Headquarters in Baghdad in 2003.
National Archives conservators Doris Hamburg and Mary Lynn Ritzenthaler traveled to Iraq shortly after the discovery to advise on recovery and preservation of the collection. The damaged materials were frozen and flown to the US, where they were vacuum freeze-dried. Following a smaller-scale project in 2006 to assess the collection, the hard work of cleaning, stabilizing, and digitizing the heavily damaged and moldy collections was carried out during the two-year project that was the focus of this presentation.
I am always amazed at the sheer scale of projects undertaken at NARA and the organization required to tackle the work within a limited timeframe. Katherine and Anna’s presentation included discussion of adaptations of the usual National Archives workflows to increase the efficiency of the project and to aid conservators in their work. For most materials, the first step in stabilization was to remove inactive mold. Distorted items were humidified and flattened, and tears were mended. Items that had originally been attached to documents with water-soluble adhesive, like stamps and some photographs, had often released due to the flood waters and subsequent humidity; these items were repositioned and reattached whenever possible. Once stabilized, materials could be rehoused, catalogued, and digitized. Through every step of the process, materials were tracked through the workflow using SharePoint software.
The culmination of the project is a digital collection of all 3,846 items, which makes the materials available to everyone. An exhibition featuring highlights of the collection was shown both at the National Archives in DC and at the Museum of Jewish Heritage in New York. Another component of the project was the creation of a website with detailed information about the collection and its history, documentation of procedures, and an online version of the exhibit. I particularly enjoyed the short video describing the history of the project, featuring many of the conservators who were involved over the years.
I often listen to NPR while working in the lab, and last November I was excited to hear my former classmate Katherine Kelly in a feature on All Things Considered. If you missed Katherine and Anna’s presentation in San Francisco, I highly recommend a visit not only to the project website, but also to the NPR feature to learn more about the important work to preserve this collection and make it accessible.

42nd Annual Meeting – Electronic Media Group Luncheon, May 30, “Sustainably Designing the First Digital Repository for Museum Collections”

Panelists:
Jim Coddington, Chief Conservator, The Museum of Modern Art
Ben Fino-Radin, Digital Repository Manager, The Museum of Modern Art
Dan Gillean, AtoM Product Manager, Artefactual Systems
Kara Van Malssen, Adjunct Professor, NYU MIAP, Senior Consultant, AudioVisual Preservation Solutions (AVPreserve)
This informative and engaging panel session provided an overview of The Museum of Modern Art’s development of a digital repository for their museum collections (DRMC) and gave attendees a sneak peek at the beta version of the system. The project is nearing the end of its second phase of development, and the DRMC will be released later this summer. The panelists did an excellent job outlining the successes and challenges of their process and offered practical suggestions for institutions considering a similar approach. They emphasized the importance of collaboration, communication, and flexibility at every stage of the process, and as Kara Van Malssen stated towards the end of the session, “there is no ‘done’ in digital preservation”; it requires an inherently sustainable approach to be successful.
This presentation was chock-full of good information and insight, most of which I’ve just barely touched on in this post (especially the more technical bits), so I encourage the panelists and my fellow luncheon attendees to contribute to the conversation with additions and corrections in the comments section.
Jim Coddington began with a brief origin story of the digital repository, citing MoMA’s involvement with the Matters in Media Art project and Glenn Wharton’s brainstorming sessions with the museum’s media working group. Kara, who began working with Glenn in 2010 on early prototyping of the repository, offered a more detailed history and walked through some of the steps that preceded software development.
Develop your business case: To make the case for creating a digital repository, the team calculated the total volume of data (in GB) the museum was acquiring annually. With large and ever-growing quantities of data, it was necessary to design a system in which many of the processes (ingest, fixity checks, migration, and so on) could be automated. They used the OAIS (Open Archival Information System) reference model (ISO 14721:2012), adapting it for a fine art museum environment.
Involve all stakeholders: Team members had initial conversations with five museum departments: conservation, collections technologies, imaging, IT applications and infrastructure, and AV. Kara referenced the opening session talk on LEED certification, in which we were cautioned against choosing an architect based on reputation or on how their other buildings look. The same goes for choosing software and/or a software developer for your repository project: what works for another museum won’t necessarily work for you, so it’s critical to articulate your institution’s specific needs and find or develop a system that will best serve those needs.
Determine system scope: Stakeholder conversations helped the MoMA DRMC team determine both the content scope – will the repository include just fine arts or also archival materials? – and the system scope – what should it do and how will it work with other systems already in place?
Define your requirements: Specifically, functional requirements. The DRMC team worked through scenarios representing a variety of different stages of the process in order to determine all of the functions the system is required to perform. A few of these functions include: staging, ingest, storage, description & access, conservation, and administration.
Articulate your use cases: Use cases describe interactions and help to outline the steps you might take in using a repository. The DRMC team worked through 22 different use cases, including search & browse, adding versions, and risk assessment. By defining their requirements and articulating use cases, the team was able to assess what systems they already had in place and what gaps would need to be filled with the new system.
At this point, Kara turned the mic over to Ben Fino-Radin, who was brought on as project manager for the development phase in mid-2012.
RFPs were issued for the project in April 2013; three drastically different vendors responded – the large vendor (LV), the small vendor (SV), and the very small vendor (VSV).
Vetting the vendors: The conversation about choosing the right vendor was, in this blogger’s opinion, one of the most important and interesting parts of the session. The LV, with an international team of thousands and extremely polished project management skills, was appealing in many ways. MoMA had worked with this particular vendor before, though not extensively on preservation or archives projects. The SV and VSV, on the other hand, did have preservation and archives domain expertise, which the DRMC team ultimately decided was one of the most important factors in choosing a vendor. So, in the end, MoMA, a very big institution, hired Artefactual Systems, the very small vendor. Ben acknowledged that this choice seemed risky at first, since the small, relatively new vendor was unproven in this particular kind of project, but the pitch meeting sold MoMA on the idea that Artefactual Systems would be a good fit. Reiterating Kara’s earlier point that you have to choose a software product and developer based on your own specific project needs, Ben pointed out that choosing a good software vendor wasn’t enough; choosing a vendor with domain expertise allowed for a shared vocabulary and a more nimble process and design.
Dan Gillean spoke next, offering background on Artefactual Systems and their approach to developing the DRMC.
Know your vendor: Artefactual Systems, which was founded in 2001 and employs 17 staff members, has two core products: AtoM and Archivematica. In addition to domain expertise in preservation and archives, Artefactual is committed to standards-based solutions and open source development. Dan highlighted the team’s use of agile development methodology, which involves a series of short-term goals and concrete deliverables; agile development requires constant assessment, allowing for ongoing change and improvement.
Expect to be involved: One of the advantages of an agile approach, with its constant testing, feedback, and evolution, is that there are daily discussions among developers as well as frequent check-ins with the user/client. This was the first truly agile project Artefactual had done, so the process was beneficial to them as well as to MoMA. As development progressed, the team conducted usability testing and convened various advisory groups; in late 2013 and early 2014, members of cultural heritage institutions and digital preservation experts were brought in to test and provide feedback on the DRMC.
Prepare for challenges: One challenge the team faced was learning how to avoid “scope creep.” They spent a lot of time developing one of the central features of the site – the context browser – but recognized that not every feature could go through so many iterations before the final project deadline. They had to keep their focus on the big picture, developing the building blocks now and allowing refinement to happen later.
At this point in the luncheon, the DRMC had its first public demo. Ben walked us through the various widgets on the dashboard as well as the context browser feature, highlighting the variety and depth of information available and the user-friendly interface.
Know your standards: Kara wrapped up the panel with a discussion of ‘trustworthiness’ and noted some tools available for assessing and auditing digital repositories, including the NDSA Levels of Digital Preservation and the Audit and Certification of Trustworthy Digital Repositories (ISO 16363:2012). MoMA is using these assessment tools as planning tools for the next phases of the DRMC project, which may include more software development as well as policy development.
Development of the DRMC is scheduled to be complete in June of this year and an open source version of the code will be available after July.

42nd Annual Meeting – "We Can Fix It But Should We? Take 2: Part Two – The Treatment of Mr. Chips" by Tad Fallon

As the title indicates, this paper is the second part of a treatment that was discussed at last year’s session and subsequently implemented during the past year. It brings up some fascinating and controversial issues, and I admire Tad’s courage and boldness in presenting it to the profession. In short, he totally refinished and recolored a work of art-furniture by a living artist.
I strongly encourage everyone, whether WAG or not, to read the final paper, because the detailed rationale for the treatment is beyond what a humble first-time blogger is capable of reiterating. The treatment was not casual, and it is the thought and consideration behind it that enlivens the context and makes this such a stimulating paper.
Tad’s treatment has created a different work of art. It is not the same as the original. The artist-applied colors and varnishes were first destroyed by UV light, and the remnants were removed by Tad. Tad then applied new colors, working with the artist. Although the resulting furniture is not exactly the same, it could be as close as anybody is ever going to get. He interviewed the artist and the artist’s assistant, he gathered the exact same materials, he learned the techniques of application (from the artist and otherwise), and he documented everything. He also researched the effect of the treatment on the value of the piece, something few other furniture conservators can add to a treatment. He consulted not only with the owner, of course, but also with dealers about the effect his treatment would have on value.
It is context that makes this paper so interesting. I would probably never do this treatment because, in my lab and with the goals of my institution, I would not be justified in such a radical intervention. But I am not able to talk to John Townsend, or Lockwood DeForest, or any of the hundreds of anonymous workers who made the furniture that I have treated. Even if I could, I doubt that I would listen to their advice without falling into a professional funk.
I treat furniture to be placed in a historic house that is interpreted for the early 21st-century viewer. That is my context. An example that occurred to me after Tad’s presentation was an 18th-century French commode that I recently returned to a 19th-century Gilded Age setting. Like any piece of marquetry furniture, it had been stripped and sanded on a probably routine basis to restore the colors. (Not an option for Tad.) At some point the veneers became too thin for this practice to continue and, probably not coincidentally, collectors’ standards changed to accept the patina of age. It has been French polished to a whore’s shine, but still almost everyone who sees it thinks it is beautiful. It has sat in the same corner with everything else in the room for over 70 years. For me to attempt even 50% of Tad’s intervention with this piece would be ridiculous.
Tad’s paper stimulated me to look at my own context and the assumptions that I bring to any treatment, especially of wooden furniture. “It should suggest the artist’s intent but still show its age” – what exactly does that mean? It all depends on context. Now if I could just tone down some of these upholstery fabrics a little bit . . . Why is the context for a chair different from that for an upholstered chair seat?

42nd Annual Meeting – Architecture Session, 31 May, "Lime-Metakaolin Grouts for Conservation" by Norman Weiss

This technical talk discussed a new method and material for architectural conservation. Norman Weiss began by addressing the problems of lime grouts and the nature of metakaolin. The problem with lime grouts is that the areas where the lime is applied are cut off from air, while lime requires atmospheric carbon dioxide for the setting phase of the lime cycle. Metakaolin is a Class N pozzolan, a ‘thermally activated clay’ in which dehydroxylation is accompanied by a loss of crystal structure. In terms of particle size, metakaolin falls between fumed silica (0.3 µm) and Portland cement (5 µm). Metakaolin is very quickly and specifically dehydroxylated between 500 and 530 degrees Celsius, a process which Weiss noted you “don’t have to finesse”.
Weiss continued by addressing the purpose of pozzolans within architectural conservation. To be the most desirable pozzolan possible, the material must densify and fill gaps, must strengthen the mix in order to create bonds, must reduce the amount of cement needed, and must reduce the amount of ‘bleed water’. Metakaolin has the potential to achieve all of these goals, if used in a specific way.
The material has received some previous attention: the first study was completed in 1993, and the material was introduced commercially in 1994. Weiss also noted that there has been a good deal of research into metakaolin as a grout in Portugal.
Metakaolin has a high water demand and on its own is not great as a grout, as it does not set well; in order to function, it requires a superplasticizer. Weiss and his colleagues have experimented with the reaction between lime and metakaolinite, which forms strätlingite. What seemed to be the most important part of this talk was the discussion of the new methodology that Weiss and his colleagues have developed and patented using this material: the wall is mechanically stabilized, and the void is filled through a tube. This approach gives the slow-strengthening grout the time it needs to achieve adequate strength.
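For readers who want the chemistry spelled out, the two setting routes can be written in simplified form (my own gloss in standard cement-chemistry shorthand, not equations from the talk). Carbonation of lime, which stalls without access to air:
Ca(OH)2 + CO2 → CaCO3 + H2O
An idealized version of the pozzolanic reaction of metakaolin with lime:
AS2 + 3CH + 6H → C2ASH8 + CSH
Here A = Al2O3, S = SiO2, C = CaO, and H = H2O, so AS2 is metakaolin, CH is hydrated lime, C2ASH8 is strätlingite, and CSH is calcium silicate hydrate. Because the pozzolanic route consumes no atmospheric CO2, it can harden in voids that are cut off from air.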
It will be interesting to see in the future the results of further study and case studies of completed projects using this material and methodology.

42nd Annual Meeting – Textiles Session, May 30, “In Consideration of the Thangka” by Denise Migdail

Any talk with the word “thangka” in the title is one I’m sure to attend.  I’ve been hooked on these incredible graphic pieces since seeing one entitled “Protectress Riding a Zombie”.  Because who couldn’t like an art form that depicts riding a zombie?  So I was very happy to hear that Denise Migdail of San Francisco’s Asian Art Museum was giving a talk entitled, “In Consideration of the Thangka”.  The meat of the presentation revolved around the method the Asian Art Museum has developed to store and display the 154 thangkas in their collection, which, I have to say, is very clever.  But more about that later.
Denise began with a quick overview of what a thangka is: a Buddhist image used for meditation and/or teaching.  Although most are painted, they can also be appliquéd, embroidered or even woven.  Denise’s colleague, Jeff Durham, assistant curator of Himalayan Art, says that “thangka” translates to the highly technical term “flat thingy”. However, any conservator who has worked on one will tell you that’s an unfortunate misnomer.
The idea to revamp the storage system came with the 2000-2003 move of the museum from its former home in Golden Gate Park to its current home in the Civic Center. The museum received an NEH storage grant and decided to eschew its previous hanging storage in favor of flat storage because of a) low-binder paint, b) fragile silks, and c) wooden hanging dowels. Although this project started before Denise’s arrival, she has been instrumental in its development since she came on staff in 2006. She found from experience that the beautiful glass cases installed in the new museum were incredibly hard to access. Only one pane of glass could be moved at a time, allowing relatively small access points for objects that can get really, really big. The staff realized that the boards the thangkas were stored on would aid significantly in getting them into the cases. So hey, why not keep them on the boards during display as well as storage?
Many different types of mounting boards were experimented with, including Tycore (takes up a lot of space and is pretty expensive), Coroplast (sharp edges and flexes a lot), and blue board (still flexes, especially at large sizes). D-Lite boards were ultimately deemed the best option. Navy velveteen was originally selected as a show fabric for both its tooth and its complementary color. The thangkas themselves were variously tied, pinned, or stitched to the boards. Eventually the decision was made to start using standard-sized boards, because reusing is a great way to go green and to save money. Unfortunately, this meant that piercing the boards to tie the thangkas to them customized them too much. Since other rotations in the Asian’s galleries were then being mounted with the aid of rare earth magnets, it was decided they would be a good solution for the thangkas too. Kimi Taira, employed at the Asian and writing an entry on magnet mounts for the AIC objects wiki at the time (http://www.conservation-wiki.com/wiki/Magnet_Mounts), contributed much. The D-Lite boards translated well to magnet mounts, since they were rigid enough to support the steel that attracts the magnets. And since the gallery display was concurrently being updated, the navy velveteen was replaced with cotton flannel and a show-fabric surround. When a thangka had a bottom dowel, L and U hooks held to the board via magnets offered support.
Denise finished up her presentation by talking about some specific treatments performed on certain thangkas. The one I found most interesting was the recasting of a missing dowel knob. They made an RTV (room-temperature-vulcanizing rubber) mold using the remaining dowel knob and cast a new one in resin, which was then painted. The end result was quite impressive.
At this time, the Asian Art Museum has three standard board sizes, with minor variations. Many thanks to Denise Migdail for sharing this great green solution with us! Visit the Asian’s website for pictures and a great video clip: http://www.asianart.org/collections/magnet-mounts

42nd Annual Meeting – Paintings Session, May 30, "Aspects of Painting Techniques in 'The Virgin and Child with Saint Anne' Attributed to Andrea Salai" – Sue Ann Chui and Alan Phenix.

This paper, presented by Sue Ann Chui, intrigued us and left us wanting more. She noted at the beginning that the title had changed to “Leonardo’s Obsession: A Workshop Variant of his ‘Virgin and Child with Saint Anne’ from the Hammer Museum, UCLA.” This is a pertinent point to keep in mind in the broader scope of the day’s PSG talks.
Leonardo da Vinci spent fifteen years working on the painting “Virgin and Child with Saint Anne” (now at the Louvre), keeping it in his possession and leaving it unfinished at the time of his death. While he continued to work on it in his studio, other variants were being created in the workshop. It was noted that the Hammer painting is in remarkable condition (both structurally and aesthetically) and that the panel is virtually unaltered.
The oil-on-wood-panel painting, in storage for many years and thought to be an early copy, was attributed to Salai (Gian Giacomo Caprotti da Oreno, 1480-1524). The panel support, estimated to be poplar with coniferous wood battens, tangentially cut and not thinned, is remarkably close to Leonardo’s original (the “Louvre” panel), with similar tool marks and dowels. In addition to these similarities, the panel’s thickness (2-2.8 cm) suggests that both wood panels came from the same workshop in northern Italy.
Analysis revealed the ground to be calcium sulfate and glue with an imprimatura of lead white. Compositional changes can be seen in the underdrawing (via infrared imaging) at Saint Anne’s left foot and in several other areas. Walnut oil was characterized as the binding medium in other samples. Pigments were characterized as lead white, carbon black, vermilion, lead tin yellow, red iron oxides, natural ultramarine, azurite, and orpiment, with transparent glazes of copper green and red lake.
The Virgin’s mantle, with its complex stratigraphy, presents some interesting questions. Does the stratigraphy represent an original sequence or changes by the artist? Analysis of the blue mantle reveals three applications of grey along with ultramarine, and two applications of red lake glazes on top of the imprimatura and below the grey layers. Is a thinly applied transparent glaze as a preliminary layer, similar to Leonardo’s technique, intentional? The purple-toned sleeve of Saint Anne, composed of reds, red lake, and layers of what appears to be retouching varnish, has changed from a red-brown to a purple similar to the color found in the Louvre painting.
Two interesting finds in the Hammer Museum’s panel were imprints from fabric and fingerprints. Historical references mention the use of a textile to even out a glaze, as seen in an area of blue on the panel, and the use of the palm of the hand to spread a glaze uniformly (leaving fingerprints in the paint; who might those fingerprints belong to?). Differing paint application in the scene’s plant foliage hints that the passages may be by two different hands. Fine brushstrokes in the face of Saint Anne suggest a very accomplished artist, leaving us to wonder if perhaps the master provided some assistance to workshop apprentices. It would seem the Hammer panel was almost certainly created in da Vinci’s studio.
The change in the title of the presentation tied in nicely with Elise Effmann Clifford’s presentation “The Reconsideration of a Reattribution: Pierre-Edouard Baranowski by Amedeo Modigliani.” In her talk, Elise pointed out the biases and prejudices we all carry and need to be aware of. We must look at each work afresh and let the findings of technical analysis and provenance, along with curatorial knowledge and instinct, inform how we approach artworks, while staying mindful of our own biases.
As for my personal bias regarding the analysis of the Hammer panel, I must admit that, like many in the attentive audience, I was hoping for a surprise ending announcing that the Hammer painting would, in fact, be declared to be by the hand of the master. The session was packed full of high-quality technical analysis (including a peek into workshop practices), deeper questions, and the paint geek’s favorite: paint cross-sections!
————
Additional articles you may be interested in (be cognizant of biases, the writer’s and your own!):
LA Times article on Hammer St. Anne:
http://articles.latimes.com/2013/feb/05/entertainment/la-et-cm-leonardo-getty-20130206
Recent article in The Art Tribune mentions the Armand Hammer, UCLA panel:
http://www.thearttribune.com/Saint-Anne-Leonardo-Da-Vinci-s.html
Guardian article on over cleaning of panel:
http://www.theguardian.com/artanddesign/2011/dec/28/louvre-leonardo-overcleaned-art-experts
ArtWatch article:
http://artwatchuk.wordpress.com/tag/leonardos-virgin-and-child-with-st-anne/

Workshop of Leonardo da Vinci, The Virgin and Child with Saint Anne, c. 1508-1513, oil on panel. University of California, Hammer Museum, Willitts J. Hole Art Collection, Los Angeles
Credit: via Tumblr from WTF Art History

42nd Annual Meeting, Textiles Session, May 29, "Analysis of Organic Dyes in Textiles by Direct Analysis in Real Time–Time-of-Flight Mass Spectrometry" by Cathy Selvius-DeRoo and Ruth Ann Armitage

Direct Analysis in Real Time – Time of Flight Mass Spectrometry (DART-TOF) was shown to be a viable method of organic dye analysis in the presentation by Cathy Selvius-DeRoo. The beauty of the technique is that it requires only a small fiber sample and no advance preparation, such as dye extraction, to get positive identification for a variety of dyes, both plant- and insect-based.
The project began with a grant to purchase the equipment. From there, various colorants from a dye sample book were tested in order to develop the protocol. The sample was placed in the ionizing gas stream (helium) and heated to a temperature of 350-500 degrees. The result was fast and accurate identification of several dye classes, such as quinones, tannins, and indigoids.
The presenter had a relaxed, personable style and shared some of her tips for success as well as lessons learned, including that better results were achieved at the higher temperature and with the addition of acid hydrolysis, the acid applied with an eyedropper just prior to putting the sample in the airstream. She also confessed that flavonoids could be difficult to discern because the spectra of the various components are very similar.
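To give a flavor of how a time-of-flight instrument turns a spectrum into an identification: compounds are recognized by matching measured exact masses against the known masses of dye marker compounds. The sketch below is my own toy illustration of that matching logic, not the presenters’ software; the reference values are standard monoisotopic [M+H]+ masses, and the peak list and tolerance are invented.

```python
# Toy illustration of exact-mass matching in TOF mass spectrometry.
# Not the presenters' workflow: reference values are standard monoisotopic
# [M+H]+ masses for a few well-known dye components; tolerance is arbitrary.
REFERENCE_IONS = {
    "indigotin (indigo)": 263.0815,
    "alizarin (madder)": 241.0495,
    "carminic acid (cochineal)": 493.0977,
    "juglone (walnut/butternut)": 175.0390,
}

def match_peaks(observed_mz, tol_ppm=20.0):
    """Match observed m/z peaks to reference dye ions within a ppm window."""
    hits = []
    for mz in observed_mz:
        for name, ref_mz in REFERENCE_IONS.items():
            ppm_error = abs(mz - ref_mz) / ref_mz * 1e6
            if ppm_error <= tol_ppm:
                hits.append((mz, name, round(ppm_error, 1)))
    return hits

# Hypothetical peak list from a fiber sample
for mz, name, err in match_peaks([263.082, 175.040, 391.284]):
    print(f"m/z {mz}: {name} ({err} ppm)")
```

This framing also makes the flavonoid problem visible: many flavonoids are isomers with identical formulas, and therefore identical exact masses, so mass alone cannot tell them apart.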
After the method proved reliable, the technique was tested on textiles with undocumented dyes. The most satisfying result was substantiating family lore about a Civil War coat. The story was that the mother of a soldier dyed a Union-issued coat to resemble a Confederate coat. Analysis revealed that the indigo was overdyed with walnut (also referred to as butternut). Cool.
Full disclosure – I signed up for blogging this talk because I’m a bit of a science junkie.  I don’t always understand it, and in a small private practice, I certainly don’t have a Mass Spectrometer in the studio, but I appreciate knowing how to solve problems and who to go to for help.

42nd Annual Meeting – Case Studies in Sustainable Collection Care, May 30, “Becoming ‘Fit for Purpose’: A Sustainable and Viable Conservation Department at the British Library,” by Dr. Cordelia Rogerson

In this presentation, Dr. Cordelia Rogerson spoke about radical changes in the approach to treatment decision-making at the British Library under her direction as Head of Conservation. The changes were sparked by deep cuts to the Library’s budget, which reduced the number of conservators working in the lab by half, from 70 to 35. At the same time, there was increased demand for conservation work in the busy library, where collections are constantly in use. These cuts forced conservators to evaluate the fundamental nature and purpose of their work to determine whether they could do less treatment without compromising use of the Library’s collection.
In response, the British Library conservation department adopted a “fit for purpose” model to govern how much treatment to do for materials sent to the conservation lab. Items are evaluated to determine what treatment is absolutely necessary for the immediate projected use of the item, and only this necessary treatment is undertaken. “Vulnerable damage” (such as a long tear across a page) is likely to be repaired, while “stable damage” (such as a loss at the corner of a leaf that does not interfere with safe handling) will be left untreated in many cases. This represented a shift away from a previous emphasis on full (or more complete) treatment for the majority of the materials coming to the lab. The new model does still allow for high priority items and items selected for exhibition to receive treatment beyond stabilization.
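To make the decision rule concrete, here is a toy sketch of the triage logic as I understood it from the talk; the category names and structure are my own paraphrase, not the British Library’s actual tool.

```python
# A toy paraphrase of the "fit for purpose" triage logic described in the
# talk -- category names and structure are my own, not the Library's tool.
def treatment_decision(damage, high_priority=False, for_exhibition=False):
    """Return a treatment level for an item entering the lab.

    damage: "vulnerable" (e.g., a long tear likely to worsen in use) or
            "stable" (e.g., a corner loss that doesn't affect safe handling)
    """
    if high_priority or for_exhibition:
        return "treat beyond stabilization"
    if damage == "vulnerable":
        return "stabilize for projected use only"
    return "leave untreated"

print(treatment_decision("vulnerable"))                   # stabilize for projected use only
print(treatment_decision("stable"))                       # leave untreated
print(treatment_decision("stable", for_exhibition=True))  # treat beyond stabilization
```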
After applying “fit for purpose” to seven discrete projects with positive results, the model was adopted as the guiding principle for all work in the lab. By doing only the work deemed necessary, the lab has greatly increased the number of repairs completed each year. At the same time, the efficiencies gained have actually allowed the conservators to devote more treatment time to high priority collections.
As one of only two conservators working in a lab serving medium-sized special collections, I found that many of the challenges, decisions, and compromises of the changing operations at the British Library sounded familiar. I appreciated hearing how a “fit for purpose” decision-making structure works in one of the largest institutions of its kind and the dramatic impact it can have in cost savings and efficiencies at that scale.
In the future, I would be interested to hear more discussion of “fit for purpose” decision-making in the conservation of library and archival collections, digging deeper into the diverse interpretations that might emerge for a range of materials in varied contexts.

42nd Annual Meeting, Paintings & Wooden Artifacts Joint Session, May 31, "Long Term Hygromechanical Monitoring of Panel Paintings," by Paolo Dionisi Vici

As a conservation student entering my first year of graduate study this fall, I was at first intimidated by the topic at hand and the thought of relaying this information to the conservation community, potentially including research scientists, techs, and seasoned conservators who may have a jump-start on understanding these concepts, their implementation, and design execution. However, Paolo Dionisi Vici’s presentation not only made the material pertinent and compelling, but also accessible to a layman like myself. I only hope I can do justice to the complexities of the issue.
The talk abstract provides a great summary as to the ‘why’ of hygromechanical monitoring of panel paintings. Mathematical models and theoretical systems regarding the short- and long-term effects of environmental conditions on objects need to be substantiated by real-life data sets in order to move forward with our understanding of the impact of microclimates (and their fluctuations) on objects. This topic is in direct dialogue with the conference theme, Conscientious Conservation: Sustainable Choices in Collection Care, and harkens back to the opening session talks broadly titled Exploring Sustainable Preservation Environments, in which the generally accepted environmental parameters of the museum were discussed, questioned, and at times directly challenged. Data-logging by experimental measures, as exemplified by Vici’s talk, is paramount to the future of this conversation.
Vici posed an excellent question at the beginning of his talk: “What does stability really mean?” As an example of the potential complexity of this issue, he referred to the localized monitoring of one of the Viking ships in Oslo, in which different data responses were logged depending on the instrumentation’s location on the ship. The abstract aptly states, “Due to the specificity of each artwork… the analysis of an artifact’s response… can supply useful information about its “individual” sensitivity to the exhibition microclimate….” As the Viking ship demonstrates, the complexity of individual responses can vary even within a single (albeit enormous, in this case) object.
Now to get to the nitty gritty of the talk, and the part where I formally apologize for my unavoidable oversimplification (of what I suspect Vici had already drastically simplified) of the sophisticated instrumentation used to monitor panel paintings. I should mention that while this instrumentation can be used to monitor a variety of wooden objects (such as the Viking ship), the abstract notes that “panel paintings are useful in representing the complexity of possible reactions.”
The monitoring system, the Deformometric Kit (DK), employs two displacement transducers attached perpendicular to the grain of the wooden panel. Linear deformations in the panel can be measured from the proportional changes in length of the transducers and subsequent trigonometric calculations. The transducers can be mounted on the back of a panel in different configurations and are not visible while the object is on exhibition.
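To give a feel for those trigonometric calculations, here is a linearized toy model of my own devising, not the published DK equations: picture both transducers spanning the same stretch of panel at different standoff heights, so that the transducer farther from the surface picks up more of any cupping rotation, and the pair of readings can then be separated into in-plane movement and cupping.

```python
import math

# Toy, small-angle model of separating shrinkage from cupping with two
# transducers at different standoff heights (h1, h2) above the panel back.
# My own sketch of the general idea, NOT the published DK equations.
def panel_deformation(d1, d2, h1, h2):
    """d1, d2: measured length changes (mm) at heights h1, h2 (mm), h1 != h2.

    Returns (surface_length_change_mm, relative_end_rotation_deg).
    """
    theta = (d1 - d2) / (h1 - h2)   # relative rotation of the ends, radians
    s = d1 - h1 * theta             # length change extrapolated to the surface
    return s, math.degrees(theta)

# Hypothetical readings: both transducers shorten, the higher one more,
# suggesting shrinkage across the grain combined with cupping.
s, theta_deg = panel_deformation(d1=-0.060, d2=-0.100, h1=10.0, h2=30.0)
print(f"surface shrinkage: {s:.3f} mm, cupping rotation: {theta_deg:.2f} deg")
```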
The DK has undergone several design modifications to improve the specificity of the data being collected and its practicality in a museum context. In earlier models, the transducers were screwed directly into the panels. This complicated the data, because the specifics of what was being measured (surface vs. interior deformations and fluctuations) could not be determined. Improvements were tested at the Metropolitan Museum of Art, and the transducers were eventually glued to the surface of the panel. According to Vici, minor shifts in the mounting glue would not skew the recorded data, because the information being gathered between the two vertical elements reflects general, averaged fluctuations. A further improvement was made when the base of the system was split, with ‘clips’ being glued to the surface of the panel and the transducers then attached to these clips, making the transducers removable for transportation of the panel.
Vici provided several examples of the DK in action. Simulations of the potential asymmetry of a panel’s surfaces were conducted by connecting transducers to both sides of test panels. The effects of the movement of moisture as it reached equilibrium within the panel could then be monitored. The data Vici shared with us from these trials spanned hundreds of days, and the applicability of this system to monitoring both short- and long-term condition fluctuations should not go unmentioned. The DK also assisted in informing conservators about the appropriate crosspieces needed to provide auxiliary support for a long crack running through The Annunciation (oil on wood, Peter Candid, 1585), helping to determine how rigid the crosspieces needed to be and what kind of connection to the panel would be most appropriate.
I would like to reiterate that Vici did an incredible job engaging the audience with what could have been a very esoteric topic. And, yes, while it could be said that this is AIC, and perhaps only we could be ‘enthusiastic about dust’ (a group I am proud to be among), I felt the room earnestly abuzz after his talk. One of the most important thoughts I took away was the importance of empirical validation of theoretical modeling. It is this sort of empirical validation that will inform our decisions as conservators and museum specialists as we move forward with the care of our collections.