Katherine Kelly and Anna Friedman presented on a two-year project funded by the Department of State and carried out at the National Archives and Records Administration (NARA) to conserve and digitize the Iraqi Jewish Archive. This is not an archive that was collected in the traditional sense, but rather materials taken from the Jewish community over many years; the collections were discovered in the flooded basement of the Iraqi Intelligence Headquarters in Baghdad in 2003.
National Archives conservators Doris Hamburg and Mary Lynn Ritzenthaler traveled to Iraq shortly after the discovery to advise on recovery and preservation of the collection. The damaged materials were frozen and flown to the US, where they were vacuum freeze-dried. Following a smaller-scale project in 2006 to assess the collection, the hard work to clean, stabilize, and digitize the heavily damaged and moldy collections was carried out during the two-year project that was the focus of this presentation.
I am always amazed at the sheer scale of projects undertaken at NARA and the organization required to tackle the work within a limited timeframe. Katherine and Anna’s presentation included discussion of adaptations of the usual National Archives workflows to increase the efficiency of the project and to aid conservators in their work. For most materials, the first step in stabilization was to remove inactive mold. Distorted items were humidified and flattened, and tears were mended. Items that had originally been attached to documents with water-soluble adhesive, like stamps and some photographs, had often released due to the flood waters and subsequent humidity; these items were repositioned and reattached whenever possible. Once stabilized, materials could be rehoused, catalogued, and digitized. At every step, materials were tracked through the workflow using SharePoint software.
The culmination of the project is a digital collection of all 3846 items, which allows the materials to be made available to everyone. An exhibition featuring highlights of the collection was shown both at the National Archives in DC and at the Museum of Jewish Heritage in New York. Another component of the project was the creation of a website with detailed information about the collection and its history, documentation of procedures, and an online version of the exhibit. I particularly enjoyed the short video describing the history of the project, featuring many of the conservators who were involved over the years.
I often listen to NPR while working in the lab, and last November I was excited to hear my former classmate Katherine Kelly in a feature on All Things Considered. If you missed Katherine and Anna’s presentation in San Francisco, I highly recommend a visit not only to the project website, but also to the NPR feature to learn more about the important work to preserve this collection and make it accessible.
42nd Annual Meeting – Electronic Media Group Luncheon, May 30, “Sustainably Designing the First Digital Repository for Museum Collections”
Panelists:
Jim Coddington, Chief Conservator, The Museum of Modern Art
Ben Fino-Radin, Digital Repository Manager, The Museum of Modern Art
Dan Gillean, AtoM Product Manager, Artefactual Systems
Kara Van Malssen, Adjunct Professor, NYU MIAP, Senior Consultant, AudioVisual Preservation Solutions (AVPreserve)
This informative and engaging panel session provided an overview of The Museum of Modern Art’s development of a digital repository for their museum collections (DRMC) and gave attendees a sneak peek at the beta version of the system. The project is nearing the end of the second phase of development and the DRMC will be released later this summer. The panelists did an excellent job outlining the successes and challenges of their process and offered practical suggestions for institutions considering a similar approach. They emphasized the importance of collaboration, communication, and flexibility at every stage of the process, and as Kara Van Malssen stated towards the end of the session, “there is no ‘done’ in digital preservation”; it requires an inherently sustainable approach to be successful.
This presentation was chock-full of good information and insight, most of which I’ve just barely touched on in this post (especially the more technical bits), so I encourage the panelists and my fellow luncheon attendees to contribute to the conversation with additions and corrections in the comments section.
Jim Coddington began with a brief origin story of the digital repository, citing MoMA’s involvement with the Matters in Media Art project and Glenn Wharton’s brainstorming sessions with the museum’s media working group. Kara, who began working with Glenn in 2010 on early prototyping of the repository, offered a more detailed history of the process and walked through considerations of some of the pre-software development steps of the process.
Develop your business case: In order to make the case for creating a digital repository, they calculated the total number of gigabytes the museum was acquiring annually. With large and ever-growing quantities of data, it was necessary to design a system in which many of the processes – like ingest, fixity checks, migration, etc. – could be automated. They used the OAIS (Open Archival Information System) reference model (ISO 14721:2012), adapting it for a fine art museum environment.
Involve all stakeholders: Team members had initial conversations with five museum departments: conservation, collections technologies, imaging, IT applications and infrastructure, and AV. Kara referenced the opening session talk on LEED certification, in which we were cautioned against choosing an architect based on their reputation or how their other buildings look. The same goes for choosing software and/or a software developer for your repository project – what works for another museum won’t necessarily work for you, so it’s critical to articulate your institution’s specific needs and find or develop a system that will best serve those needs.
Determine system scope: Stakeholder conversations helped the MoMA DRMC team determine both the content scope – will the repository include just fine arts or also archival materials? – and the system scope – what should it do and how will it work with other systems already in place?
Define your requirements: Specifically, functional requirements. The DRMC team worked through scenarios representing a variety of different stages of the process in order to determine all of the functions the system is required to perform. A few of these functions include: staging, ingest, storage, description & access, conservation, and administration.
Articulate your use cases: Use cases describe interactions and help to outline the steps you might take in using a repository. The DRMC team worked through 22 different use cases, including search & browse, adding versions, and risk assessment. By defining their requirements and articulating use cases, the team was able to assess what systems they already had in place and what gaps would need to be filled with the new system.
At this point, Kara turned the mic over to Ben Fino-Radin, who was brought on as project manager for the development phase in mid-2012.
RFPs were issued for the project in April 2013; three drastically different vendors responded – the large vendor (LV), the small vendor (SV), and the very small vendor (VSV).
Vetting the vendors: The conversation about choosing the right vendor was, in this blogger’s opinion, one of the most important and interesting parts of the session. The LV, with an international team of thousands and extremely polished project management skills, was appealing in many ways. MoMA had worked with this particular vendor before, though not extensively on preservation or archives projects. The SV and VSV, on the other hand, did have preservation and archives domain expertise, which the DRMC team ultimately decided was one of the most important factors in choosing a vendor. So, in the end, MoMA, a very big institution, hired Artefactual Systems, the very small vendor. Ben acknowledged that this choice seemed risky at first, since the small, relatively new vendor was unproven in this particular kind of project, but the pitch meeting sold MoMA on the idea that Artefactual Systems would be a good fit. Reiterating Kara’s point from earlier, that you have to choose a software product/developer based on your own specific project needs, Ben pointed out that choosing a good software vendor wasn’t enough; choosing a vendor with domain expertise allowed for a shared vocabulary and a more nimble process and design.
Dan Gillean spoke next, offering background on Artefactual Systems and their approach to developing the DRMC.
Know your vendor: Artefactual Systems, which was founded in 2001 and employs 17 staff members, has two core products: AtoM and Archivematica. In addition to domain expertise in preservation and archives, Artefactual is committed to standards-based solutions and open source development. Dan highlighted the team’s use of agile development methodology, which involves a series of short term goals and concrete deliverables; agile development requires constant assessment, allowing for ongoing change and improvement.
Expect to be involved: One of the advantages of an agile approach, with its constant testing, feedback, and evolution, is that there are daily discussions among developers as well as frequent check-ins with the user/client. This was the first truly agile project Artefactual has done, so the process has been beneficial to them as well as to MoMA. As development progressed, the team conducted usability testing and convened various advisory groups; in late 2013 and early 2014, members of cultural heritage institutions and digital preservation experts were brought in to test and provide feedback on the DRMC.
Prepare for challenges: One challenge the team faced was learning how to avoid “scope creep.” They spent a lot of time developing one of the central features of the site – the context browser – but recognized that not every feature could go through so many iterations before the final project deadline. They had to keep their focus on the big picture, developing the building blocks now and allowing refinement to happen later.
At this point in the luncheon, the DRMC had its first public demo. Ben walked us through the various widgets on the dashboard as well as the context browser feature, highlighting the variety and depth of information available and the user-friendly interface.
Know your standards: Kara wrapped up the panel with a discussion of ‘trustworthiness’ and noted some tools available for assessing and auditing digital repositories, including the NDSA Levels of Digital Preservation and the Audit and Certification of Trustworthy Digital Repositories (ISO 16363:2012). MoMA is using these assessment tools as planning tools for the next phases of the DRMC project, which may include more software development as well as policy development.
Development of the DRMC is scheduled to be complete in June of this year and an open source version of the code will be available after July.
42nd Annual Meeting: Health and Safety Session, ‘Solvents, Scents and Sensibility: Swapping – Solvent Substitution Strategies’ by Chris Stavroudis
Part I of ‘Solvents, Scents, and Sensibility: Sequestering and Minimizing’ was presented on Friday and encouraged the use of Pemulen TR-2 in cleaning as an alternative to solvents or as a vehicle for solvents.
The topic of Part II was substituting safer solvents for more hazardous ones. Chris Stavroudis began the talk with a warning: There is no perfect substitute for Xylenes. He did, however, address some alternatives later in his talk.
Some of the harmful solvents that Chris suggested replacing were:
Benzene (a carcinogen) – can be replaced with xylene or toluene (although these alternatives are also hazardous)
n-Hexane (a neurotoxin) – can be replaced with n-Heptane
DMF – replace with n-methyl-2-pyrrolidone (NMP), although this may also be hazardous
Methanol – replace with Ethanol
Cellosolve and Cellosolve Acetate – just don’t use them! May be able to substitute butyl Cellosolve
Chlorinated Solvents – don’t use them. 1,1,1 trichloroethane is the least of the evils, but is terrible for the environment
Xylenes (a mixture of isomers containing varying levels of ethylbenzene) – it may be safer to use a single xylene isomer, but this hasn’t been adequately tested.
Stavroudis stressed the fact that there is a difference between a safe solvent and an untested solvent. The two should not be confused, and proper safety precautions must be taken. He gave multiple examples of solvents that were once considered to be safe and that we now know can be hazardous (ex: d-limonene).
The use of silicone solvents was encouraged because they are versatile, as they can be cyclic or linear, and have a very low polarity. Silicone solvents may be safer than alternative solvents. They are found in make-up and are practically odorless, although the lack of odor makes exposure difficult to gauge.
Another safer solvent that Chris mentioned was benzyl alcohol, which has aromatic and alcohol functionality, although it is toxic to the eyes.
Chris ended his talk with a review and discussion of solubility theory, including the Hildebrand and Hansen solubility parameters and the TEAS diagram. This review focused on the problem of finding a replacement for xylene: a solvent that would have the same solubility characteristics. Chris’ Modular Cleaning Program is a greener and healthier technique/tool and incorporates Hildebrand, Hansen, and TEAS solubility theories. Using these theories, the solvent mix that most closely matches the solubility characteristics of xylene is a mixture of nonane and benzyl alcohol. There is more experimentation to be done, and the next version of the MCP can help you experiment with solvent mixtures and solubilities.
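As a rough illustration of the kind of matching problem Chris described, here is a small Python sketch that scans nonane/benzyl alcohol blends for the one whose Hansen parameters come closest to xylene’s. The delta values are approximate figures from published Hansen tables, not numbers from the talk, and the linear volume-mixing rule is a simplifying assumption.

```python
# Hansen solubility parameters (dispersion, polar, hydrogen-bonding), in MPa^0.5.
# Approximate handbook values -- verify against a current reference before use.
XYLENE = (17.8, 1.0, 3.1)
NONANE = (15.7, 0.0, 0.0)
BENZYL_ALCOHOL = (18.4, 6.3, 13.7)

def hansen_distance(a, b):
    """Hansen Ra distance; the dispersion term is conventionally weighted by 4."""
    return (4 * (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2) ** 0.5

def blend(solvent1, solvent2, frac2):
    """Mixture parameters by a linear (volume-weighted) mixing rule."""
    return tuple((1 - frac2) * p1 + frac2 * p2 for p1, p2 in zip(solvent1, solvent2))

# Scan benzyl alcohol fractions from 0% to 100% and keep the closest match.
best_frac, best_ra = min(
    ((f / 100, hansen_distance(blend(NONANE, BENZYL_ALCOHOL, f / 100), XYLENE))
     for f in range(101)),
    key=lambda pair: pair[1],
)
print(f"Closest match: {best_frac:.0%} benzyl alcohol, Ra = {best_ra:.2f}")
```

With these assumed values, the best blend lands somewhere between pure nonane and pure benzyl alcohol; the point is only to show how a nonpolar/polar pair can bracket xylene’s solubility character, not to prescribe a working mixture.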
42nd Annual Meeting – Collections Care Specialty Session, May 29, 2014, “Simple Method for Monitoring Dust Accumulation in Indoor Collections” by Bill Wei
“Simple Method for Monitoring Dust Accumulation in Indoor Collections,” by Bill Wei was the first session in the Collections Care specialty section given on Thursday afternoon. As a museum technician in Preventive Conservation, dust is something I deal with on an almost daily basis. I thought that Bill’s talk could lend some valuable insight to my work, and I wasn’t wrong. Bill Wei is a Senior Conservation Scientist at the Rijksdienst voor het Cultureel Erfgoed, and in his session he presented a simple and easily implemented way for a museum to monitor how fast dust accumulates in an indoor collections space. He used the Museum de Gevangenpoort and the Galerij Prins Willem V to demonstrate the method.
The talk started off with a humorous introduction by Bill about views on dust in museum spaces: how some people, museum professionals in particular, can take a defensive stance on dust, as if it implies we aren’t doing our jobs, while for other individuals dust adds an element of age that seems appropriate. He also mentioned that when the words “dusty museum” are googled, the result is over 12,000 hits. Apparently more than just museum professionals see dust. Bill brought up the fact that dust is not only an aesthetic issue in museums; it can present chemical and health issues, and it can be costly and time-consuming to remove. The two sites were then introduced, both of which house collections and are historic buildings. Construction was being done near the sites, and there was a concern about how much more dust accumulation this might cause, so they provided a good case study. Bill then introduced the question: how do you monitor dust?
Bill explained that dust on the surface of an object causes light to bounce off at many different angles, as opposed to at a single angle, which makes the surface look matte. The resulting matte surface can then be considered to have lost gloss. This loss of gloss is something that can be measured using a glossmeter; the glossmeter used for this test was made by Sheen. Bill was careful to point out that this test doesn’t measure how much dust you have, but how quickly it accumulates. For this run of the test Bill used microscope glass slides, because they are cheap, reusable, and glossy. The steps of the test are as follows:
- Using the glossmeter, measure a clean slide on a white background (copy paper is suitable. This should be the same background used throughout testing.)
- Put slides out at the various locations you wish to test, remembering that the more slides you put out, the more work you will have to do. The slides should be placed in out-of-the-way locations, and staff should be told about them.
- After a predetermined amount of time (e.g., one month), use the glossmeter to measure the slide on the same background that you used in step 1.
- Clean the slide, and reuse, starting over at step 1.
The rate of dust accumulation over the time period is then calculated as:
Fraction change = (dusty slide measurement after 1 month – clean slide measurement) / (clean slide measurement)
Multiply by 100 to get the percentage.
Bill explained that for every month that you take a glossmeter measurement, you add the value of the new measurement to the previous total; since this is cumulative, you will go over 100% at some point. You can then plot these values on a graph over time.
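The monthly bookkeeping described above can be sketched in a few lines of Python. The gloss readings below are hypothetical placeholder values, not data from Bill’s study; note that the per-month change is negative when gloss is lost, so the running total grows in magnitude over time.

```python
def percent_change(clean, dusty):
    """Percent gloss change for one monitoring period (negative when gloss is lost)."""
    return (dusty - clean) / clean * 100

# (clean-slide baseline, dusty reading after one month) -- hypothetical gloss units.
# The slide is cleaned and re-measured at the start of each new period.
monthly_readings = [(92.0, 88.3), (91.5, 86.9), (92.3, 85.4)]

cumulative = []
running_total = 0.0
for clean, dusty in monthly_readings:
    running_total += percent_change(clean, dusty)  # monthly changes are summed
    cumulative.append(running_total)

for month, total in enumerate(cumulative, start=1):
    print(f"Month {month}: cumulative gloss change {total:.1f}%")
```

Plotting `cumulative` against month number gives the accumulation graph Bill described: steeper slopes mean faster dust deposition at that slide’s location.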
If you wanted to test the dust samples, to find out where the dust was coming from and what it was made of, you could incorporate small conductive carbon stickers on the slides. Since this talk focused on the accumulation, not the source of the dust, this topic was not discussed in detail.
The slides were at one point placed both vertically and horizontally. The vertical placement was done to mimic how much dust a painting might accumulate. However, the vertically placed slides needed a much longer period of time to really show a loss in gloss, so it was not considered necessary to run both types of slide placement.
When it came to analyzing the results of this test, one finding was that the slide nearest the entry had the most dust: when its results were plotted onto a graph, it produced the steepest slope over time. The more visitors a museum has, the more dust accumulation occurs. During peak tourist times there was a correlating peak in dust accumulation. It was also noticed at the Museum de Gevangenpoort that dust accumulation rose during construction periods. The results confirmed a long-held thought that visitors are one of the main sources of dust in museums.
Bill then talked briefly about the chemistry of dust. When the dust was analyzed, it was found to contain salts, iron, chalk, sand, clay, and concrete, among other things. Looking at the makeup of the dust made it possible to notice trends; for example, during the winter months, February in particular, there was a noticeable rise in the amount of salts found. Looking at what the dust was comprised of could allow scientists to identify its source.
Bill pointed out that the idea of too much dust isn’t really something that is definable in terms of science; it’s defined more by people’s perception of it. Different surface types can be just as dusty as one another, but if the dust is more visible on one type of surface, say plexi, the viewer reads that surface as being less clean.
In discussing an action plan for dust monitoring, Bill said you have to determine why you are doing it (e.g., to see if your new HVAC system is producing better results), and it’s important to define “too much dust” as a difference in gloss.
The questions asked after Bill’s presentation included how many gloss measurements should be taken and at what angle, to which Bill answered that one measurement at 85 degrees was sufficient. He was also asked how often one should take measurements; he said that every three to four weeks at most will produce good results, since if you measure too soon a change won’t be seen.
Bill’s presentation was informative and lively. He presented a system for testing dust accumulation that could easily be implemented and followed. Thanks to Bill for a great talk!
42nd Annual Meeting – “We Can Fix It But Should We? Take 2: Part Two – The Treatment of Mr. Chips” by Tad Fallon
As the title indicates, this paper is the second part of a treatment that was discussed at last year’s session and subsequently implemented during the past year. It brings up some fascinating and controversial issues, and I admire Tad’s courage and boldness in presenting it to the profession. In short, he totally refinished and recolored a work of art-furniture by a living artist.
I strongly encourage everyone, whether in WAG or not, to read the final paper, because the detailed rationale for the treatment is beyond what a humble first-time blogger is capable of reiterating. The treatment was not casual, and it is the thought and consideration behind it that enlivens the context and makes this such a stimulating paper.
Tad’s treatment has created a different work of art. It is not the same as the original. The artist-applied colors and varnishes were first destroyed by UV light and the remnants removed by Tad. Tad then applied new colors, working with the artist. Although the resulting furniture is not exactly the same, it could be as close as anybody is ever going to get. He interviewed the artist and the artist’s assistant, he gathered the exact same materials, he learned the techniques of application (from the artist and otherwise), and he documented everything. He also researched the effect of the treatment on the value of the piece, which is something he can add to a treatment that few other furniture conservators can provide. He consulted not only with the owner, of course, but also with dealers as to the effect his treatment would have on value.
It is context that makes this paper so interesting. I would probably never do this treatment because, in my lab and with the goals of my institution, I would not be justified in such a radical intervention. But I am not allowed to talk to John Townsend, or Lockwood DeForest, or any of the hundreds of anonymous workers that made the furniture that I have treated. Even if I could, I doubt that I would listen to their advice without falling into a professional funk.
I treat furniture to be placed in a historic house that is interpreted for the early 21st-century viewer. That is my context. An example that occurred to me after Tad’s presentation was an 18th-century French commode that I recently returned to a 19th-century Gilded Age setting. Like any piece of marquetry furniture, it had been stripped and sanded on a probably routine basis to restore the colors. (Not an option for Tad.) At some point the veneers became too thin for this practice to continue and, probably not coincidentally, the standards of collectors changed to accept the patina of age. It has been French polished to a whore’s shine, but still almost everyone that sees it thinks it is beautiful. It has sat in the same corner with everything else in the room for over 70 years. For me to even approach 50% of Tad’s intervention with this piece would be ridiculous.
Tad’s paper stimulated me to look at my own context and the assumptions that I bring to any treatment, especially of wooden furniture. “It should suggest the artist’s intent but still show its age” – what exactly does that mean? It all depends on context. Now if I could just tone down some of these upholstery fabrics a little bit . . . Why is the context for a chair different than for an upholstered chair seat?
42nd Annual Meeting – Architecture Session, 31 May, "Lime-Metakaolin Grouts for Conservation" by Norman Weiss
This technical talk discussed a new method and material for architectural conservation. Norman Weiss began by addressing the problems of lime grouts and the nature of metakaolin. The problem with lime grouts is the anaerobic nature of the area the lime is being applied to, as the lime requires carbon dioxide from air for the setting phase of the lime cycle. Metakaolin is a Class N pozzolan, a ‘thermally activated clay’, in which dehydroxylation is accompanied by a loss of crystal structure. Metakaolin is between fumed silica (0.3 µm) and Portland cement (5 µm) in terms of particle size. Metakaolin is very quickly and specifically dehydroxylated between 500 and 530 degrees Celsius, a process which Weiss noted you “don’t have to finesse”.
Weiss continued by addressing the purpose of pozzolans within architectural conservation. The material must densify and fill gaps, must strengthen in order to create bonds, must reduce the amount of cement needed, and must reduce the amount of ‘bleed water’ in order to be the most desirable pozzolan possible. Metakaolin has the potential to achieve all of these goals, if used in a specific way.
This material has been studied before: the first study was completed in 1993, and the material was introduced commercially in 1994. Weiss also noted that there has been a lot of research into metakaolin as a grout in Portugal.
Metakaolin has a high water demand and does not make a great grout on its own, as it does not set well; in order to function, it requires a superplasticizer. Weiss and his colleagues have experimented with the reaction between lime and metakaolinite, which forms strätlingite. What seemed to be the most important part of this talk was the discussion of the new methodology that Weiss and his colleagues have developed and patented using this material. The wall is mechanically stabilized, and the void is filled through a tube. This method gives the slow-strengthening material the time necessary to achieve the required strength.
It will be interesting to see in the future the results of further study and case studies of completed projects using this material and methodology.
42nd Annual Meeting – Textiles Session, May 30, “In Consideration of the Thangka” by Denise Migdail
Any talk with the word “thangka” in the title is one I’m sure to attend. I’ve been hooked on these incredible graphic pieces since seeing one entitled “Protectress Riding a Zombie”. Because who couldn’t like an art form that depicts riding a zombie? So I was very happy to hear that Denise Migdail of San Francisco’s Asian Art Museum was giving a talk entitled, “In Consideration of the Thangka”. The meat of the presentation revolved around the method the Asian Art Museum has developed to store and display the 154 thangkas in their collection, which, I have to say, is very clever. But more about that later.
Denise began with a quick overview of what a thangka is: a Buddhist image used for meditation and/or teaching. Although most are painted, they can also be appliquéd, embroidered or even woven. Denise’s colleague, Jeff Durham, assistant curator of Himalayan Art, says that “thangka” translates to the highly technical term “flat thingy”. However, any conservator who has worked on one will tell you that’s an unfortunate misnomer.
The idea to revamp the storage system came with the 2000-2003 move of the museum from its former home in Golden Gate Park to its current home in the Civic Center. The museum received an NEH storage grant and decided to eschew the previous hanging storage in favor of flat storage because of a) low-binder paint, b) fragile silks, and c) wooden hanging dowels. Although this project started before Denise’s arrival, she has been instrumental in its development since she came on staff in 2006. She found from experience that the beautiful glass cases installed in the new museum were incredibly hard to access. Only one pane of glass could be moved at a time, allowing relatively small access points for objects that can get really, really big. The staff realized that the boards the thangkas were stored on would aid significantly in getting them into the case. So hey, why not keep them on the boards during display as well as storage?
Many different types of mounting boards were experimented with, including Tycore (takes up a lot of space and is pretty expensive), Coroplast (sharp edges and flexes a lot), and blue board (still flexes, especially at large sizes). D-Lite boards were ultimately deemed the best option. Navy velveteen was originally selected as a show fabric for both its tooth and complementary color. The thangkas themselves were variously tied, pinned, or stitched to the boards. Eventually the decision was made to start using standard-sized boards because reusing is a great way to go green, and also to save money. Unfortunately, this meant that piercing the boards by tying the thangkas to them customized them too much. Since other rotations in the Asian’s galleries were currently being mounted with the aid of rare earth magnets, it was decided they would be a good solution for the thangkas too. Kimi Taira, employed at the Asian and writing an entry for the AIC objects wiki at the time (http://www.conservation-wiki.com/wiki/Magnet_Mounts), contributed much.
The D-Lite boards translated well to magnet mounts, since they were rigid enough to support steel to attract the magnets. And since the gallery display was concurrently being updated, the navy velveteen was replaced with cotton flannel and a show-fabric surround. When a thangka had a bottom dowel, L and U hooks held to the board via magnets offered support.
Denise finished up her presentation by talking about some specific treatments they did on certain thangkas. The one I found most interesting was the recasting of a missing dowel knob. They made an RTV (room temperature vulcanized rubber) mold using the remaining dowel and cast a new one in resin, which was then painted. The end result was quite impressive.
At this time, the Asian Art Museum has three standard board sizes, with minor variations. Many thanks to Denise Migdail for sharing this great green solution with us! Look at this link to the Asian’s website for pictures and a great video clip: http://www.asianart.org/collections/magnet-mounts
42nd Annual Meeting – Paintings Session, May 30, "Aspects of Painting Techniques in 'The Virgin and Child with Saint Anne' Attributed to Andrea Salai" – Sue Ann Chui and Alan Phenix.
This paper, presented by Sue Ann Chui, intrigued and enticed us to want more. She noted at the beginning that the title had changed to “Leonardo’s Obsession: A Workshop Variant of his ‘Virgin and Child with Saint Anne’ from the Hammer Museum, UCLA.” This is a pertinent point to keep in mind in the broader scope of the day’s PSG talks.
Leonardo da Vinci spent fifteen years working on the painting of “Virgin and Child with Saint Anne” (now at the Louvre), keeping it in his possession, leaving it unfinished at the time of his death. While continuing to work in his studio, other variants were being created in the workshop. It was noted that the Hammer painting is in remarkable condition (both structurally and aesthetically) and that the panel is virtually unaltered.
The oil on wood panel painting, in storage for many years and thought to be an early copy, was attributed to Salai (Gian Giacomo Caprotti da Oreno, 1480-1524). The panel support, estimated to be poplar with coniferous wood battens, tangentially cut and not thinned, is remarkably close to Leonardo’s original panel (the “Louvre” panel), with similar tool marks and dowels. In addition to these similarities, the panel’s thickness (2-2.8 cm) would suggest that both wood panels came from the same workshop in northern Italy.
Analysis revealed the ground to be calcium sulfate and glue with an imprimatura of lead white. Compositional changes can be seen in the underdrawing (revealed by infrared imaging) of Saint Anne’s left foot and several other areas. Walnut oil was characterized as the binding medium in other samples. Pigments were characterized as lead white, carbon black, vermilion, lead tin yellow, red iron oxides, natural ultramarine, azurite, orpiment, and transparent glazes of copper green and red lake.
The Virgin’s mantle, with a complex stratigraphy, presents some interesting questions. Does the stratigraphy represent an original sequence or changes by the artist? Analysis of the blue mantle reveals three applications of grey, along with ultramarine, and two applications of red lake glazes on top of the imprimatura and below the grey layers. Is the thinly applied transparent glaze used as a preliminary layer, similar to Leonardo’s technique, intentional? The purple-toned sleeve of Saint Anne, composed of reds, red lake, and layers of what appears to be retouching varnish, has shifted from a red-brown to a purple similar to that found in the Louvre painting.
Two interesting finds in the Hammer Museum’s panel were imprints from fabric and fingerprints. Historical references mention the use of a textile to even out a glaze, as seen in an area of blue on the panel, and the use of the palm of the hand to spread a glaze uniformly (leaving fingerprints in the paint – whose fingerprints might those be?). Differing paint application in the scene’s plant foliage hints that the passages may be by two different hands. Fine brushstrokes in the face of Saint Anne suggest a very accomplished artist, leaving us to wonder whether the master provided some assistance to workshop apprentices. It would seem the Hammer panel was almost certainly created in da Vinci’s studio.
The change in the title of the presentation tied in nicely with Elise Effmann Clifford’s presentation “The Reconsideration of a Reattribution: Pierre-Édouard Baranowski by Amedeo Modigliani.” In her talk, Elise pointed out the biases and prejudices we all carry and need to be aware of. We must look at each work afresh, considering all the findings of technical analysis and provenance along with curatorial knowledge and instinct, while remaining mindful of our own biases.
As for my personal bias regarding the analysis of the Hammer panel I must admit that, like many in the attentive audience, I was hoping for a surprise ending that announced the Hammer painting would, in fact, be declared to be by the hand of the master. The session was packed full of high quality technical analysis (including a peek into workshop practices) suggesting deeper questions and the paint geek’s favorite, paint cross-sections!
————
Additional articles you may be interested in (remaining cognizant of biases, the writer’s and your own!):
LA Times article on Hammer St. Anne:
http://articles.latimes.com/2013/feb/05/entertainment/la-et-cm-leonardo-getty-20130206
Recent article in The Art Tribune mentions the Armand Hammer, UCLA panel:
http://www.thearttribune.com/Saint-Anne-Leonardo-Da-Vinci-s.html
Guardian article on over cleaning of panel:
http://www.theguardian.com/artanddesign/2011/dec/28/louvre-leonardo-overcleaned-art-experts
ArtWatch article:
http://artwatchuk.wordpress.com/tag/leonardos-virgin-and-child-with-st-anne/
42nd Annual Meeting – Track A: Case Studies in Sustainable Collections Care, May 30, “Boxes Inside of Boxes: Preventative Conservation Practices” by Robin P. Croskery Howard
Robin P. Croskery Howard, Objects Conservator at the Ah-Tah-Thi-Ki Museum, focused on how custom housing, in concert with climate control, can serve as effective preventative conservation. Three case studies highlighting specific housing solutions for different collection materials were shown.
Case Study #1: The Long Road Home/Speck Collection
Some housings need to provide safety for both travel and long-term storage. The Ah-Tah-Thi-Ki Museum makes it a priority to repatriate any collection items that are not Seminole in origin, and these items are returned untreated. The two housings used for this purpose are either stacked layers of Volara cutouts contoured to fit the object, or Ethafoam cavities lined with acid-free tissue.
Case Study #2: The Doll with the Broken Neck
The museum has a number of dolls made of palmetto fibers. These fibers deteriorate over time, and the limbs and necks of the dolls often detach. Any treatment would produce only temporary results as the dolls continue to age and break down, so custom pillows and supports are used instead to relieve stress on their joints.
Case Study #3: Leaning Baskets
An oversized modern sweetgrass basket that had partially collapsed under its own weight was restored using an adaptive housing. The basket was placed in a box with twill ties holding it in place; the ties were gradually tightened over several weeks to support and lift the basket, allowing it to gently regain its shape over time. Other modern baskets are stored with Ethafoam supports.
These were great, practical solutions for caring for objects, using housings to prevent or control damage. I realized while writing this post how much this session falls in line with Cordelia Rogerson’s “Fit for Purpose” talk: all of the items showcased here were cared for, but in a manner and at a level appropriate for the long view of their “life” at the museum.
42nd Annual Meeting, Textiles Session, May 29th: Analysis of Organic Dyes in Textiles by Direct Analysis in Real Time–Time-of-Flight Mass Spectrometry by Cathy Selvius-DeRoo, Ruth Ann Armitage
Direct Analysis in Real Time – Time of Flight Mass Spectrometry (DART-TOF) was shown to be a viable method of organic dye analysis in the presentation by Cathy Selvius-DeRoo. The beauty of the technique is that it requires only a small fiber sample, and no advance preparation such as dye extraction, in order to get positive identifications for a variety of dyes, both plant- and insect-based.
The project began with a grant to purchase the equipment. From there, various colorants from a dye sample book were tested in order to develop the protocol. The sample was placed in the ionizing gas stream (helium), heated to 350–500 degrees. The result was fast and accurate identification of several dye classes, such as quinones, tannins, and indigoids.
The presenter had a relaxed, personable style and shared some of her tips for success as well as lessons learned: better results were achieved at the higher temperature and with the addition of acid hydrolysis, which could be applied with an eyedropper just before placing the sample in the airstream. She also confessed that flavonoids could be difficult to discern because the spectra of the various components are very similar.
After the method proved reliable, the technique was tested on textiles with undocumented dyes. The most satisfying result was substantiating family lore about a Civil War coat: the story was that the mother of a soldier dyed a Union-issued coat to resemble a Confederate coat. Analysis revealed that the indigo was overdyed with walnut (also referred to as butternut). Cool.
Full disclosure – I signed up to blog this talk because I’m a bit of a science junkie. I don’t always understand it, and in a small private practice I certainly don’t have a mass spectrometer in the studio, but I appreciate knowing how to solve problems and who to go to for help.