NYC’s Museum of Modern Art owns sixteen Piet Mondrian oil paintings, the most comprehensive collection in North America. From this starting point, conservator Cynthia Albertson and research scientist Ana Martins embarked on an impressive project, both in breadth and in consequence—an in-depth technical examination across all sixteen Mondrians. All examined paintings are fully documented, and the primary preservation goal is returning the artwork to the artist’s intended state. Paint instability in the artist’s later paintings will also be treated with insight from the technical examination.
The initial scope of the project focused on nondestructive analysis of MoMA’s sixteen oil paintings. As more questions arose, other collections and museum conservators were called upon to provide information on their Mondrians. Over 200 other paintings were consulted over the course of the project. Of special importance to the conservators were untreated Mondrians, as they could help answer questions about the artist’s original varnish choices and artist-modified frames. Mondrian’s technique of reworking areas of his own paintings was also under scrutiny, as it called into question whether newer paint on a canvas was his, or a restorer’s overpaint. Fortunately, the MoMA research team had a variety of technology at their disposal: X-Radiography, Reflectance Transformation Imaging, and X-ray Fluorescence (XRF) spectroscopy and XRF mapping were all tools referenced in the presentation.
The lecture discussed three paintings to illustrate how preservation issues were addressed and how the research process revealed information about unstable paint layers in later Mondrian paintings. The paintings were Tableau no. 2 / Composition no. V (1914), Composition with Color Planes 5 (1917), and Composition C (1920), but for demonstration’s sake only the analysis of the earliest painting will be used as an example here. Tableau no. 2 / Composition no. V (1914) was mounted on an overly thick stretcher, had been wax-lined and covered in a thick, glossy varnish, and had corrosion products along the tacking edges. Research identified the corrosion as accretions from a gold frame that the artist added for an exhibition. The painting has some obviously reworked areas, distinguished by dramatic variations in texture, and a painted-over signature; these changes are visible in the technical analysis. The same research that identified the source of the corrosion also explained that Mondrian reworked and resigned the painting for the exhibition. XRF mapping of the pigments, fillers, and additives provided an early baseline of materials against which later works could be compared, as the paint here did not exhibit the cracking of later examples. Ultimately, the restorer’s varnish was removed to return the paint surface to its intended matte appearance, and the wax lining was mechanically separated from the canvas with a specially produced Teflon spatula. Composition no. V (1914) was then strip-lined and re-stretched onto a more appropriately sized stretcher.
It is possible to create a timeline of Mondrian’s working methods from the information gleaned through the technical examination of all three paintings. His technique evolved from an overall matte surface to variations in varnish glossiness between painted areas. XRF analysis demonstrated a shift in his palette, with the addition of vermilion, cobalt, and cadmium red in his later works. XRF also revealed that the artist used registration lines of zinc and lead whites, both mixed together and on their own. Knowing the chemical composition of Mondrian’s paint is vital to understanding the nature of the cracking medium and identifying techniques to preserve it.
The underpinning of all this research is documentation. This means both accounting for un-documented or poorly documented past restorations, as well as elaborating upon existing references. Many of the MoMA paintings had minimal photographic documentation, which hinders the ability of conservators to identify changes to the work over time. The wealth of information gathered by the conservation and research team remains within the museum’s internal database, but there are plans to expand access to the project’s data. Having already worked in collaboration with many Dutch museums for access to their Mondrian collections, it’s clear to the MoMA team how a compiled database of all their research and documentation would be groundbreaking for the conservation and art history fields.
In the spring of 2013, San Franciscans were outraged to discover that a cherished Maxfield Parrish wall painting had been removed from its home in the Palace Hotel and sent to New York to be sold. Prior to auction, it was to be cleaned of the hundred-plus years of accumulated grime and accretions it had been subjected to while hanging in The Pied Piper Bar. Thus, even after the Palace Hotel had acquiesced to public sentiment and agreed to return it to San Francisco, the painting remained in New York to be treated.
Harriet Irgang Alden, of Rustin Levenson Art Conservation Associates, had experience with other Parrish wall paintings, and knew the treatment concerns inherent to his working methods. The artist alternated thin transparent glazes of brilliant, unmixed pigments with saturating layers of varnish. This made the removal of a restorer’s varnish on a Parrish painting a fraught process that is typically not undertaken, because of the likelihood of disrupting the original layers. The planned treatment therefore focused only on grime removal. What was immediately unique about this Parrish wall painting was the details of its construction. Despite its substantial size of 5 feet by 16 feet, the Pied Piper was not painted in sections, as Parrish’s other wall paintings were. The painting appeared to have been shipped rolled from the artist’s studio to San Francisco, where a stretcher was constructed for it—possibly of redwood, given the incredible length of the members. Additionally, the back of the original canvas remained visible, and displayed a ticking pattern similar to the canvas used for an 1895 Old King Cole painting. The unlined canvas, as well as the unique stretcher, provides new material evidence of Parrish’s working methods.
Unlike in previous Parrish treatments, grime removal on the Pied Piper revealed a broken varnish layer. Apart from thick brush drips and a pockmarked appearance, there were passages of flaking, which curiously did not reveal dull, unvarnished paint beneath. Instead, beneath the discolored upper varnish there appeared to be a clear, glossy layer of a different varnish, and beneath that were the brilliant blues typical of Parrish’s paintings. FTIR analysis at the Museum of Modern Art in New York verified that there were two distinct varnishes: the crumbling upper layer was an alkyd, and the lower a decolorized shellac. Alkyds like this alcohol-acid polymer were not produced prior to the 1920s, so the layer could not have been original to Parrish’s 1909 Pied Piper. The decolorized shellac was stable and still firmly adhered to the paint beneath. Both original layers had actually been protected from UV and bar-patron damage by the alkyd addition.
After an aqueous cleaning removed the grime layer, the conservators were faced with an exciting prospect: could they remove the restorer’s varnish, and in doing so, reveal a pristine Maxfield Parrish painting? Solvents would penetrate through both layers and affect the pigment. A more complex process was tested: methyl cellulose in water was applied, and removed after five to ten minutes, to soften the alkyd layer. Though in initial attempts a scalpel was used, the conservators found that the softened alkyd varnish would lift easily and safely by being pulled up with tape using the ‘Texas Strappo’ method. This technique was successful, and revealed a brilliant and unharmed original varnish layer, but it was also incredibly time consuming.
The Palace Hotel declined to extend the treatment of the Pied Piper to include a months-long varnish removal. The alkyd removal test area was toned to blend back in, the painting was varnished with Regalrez, and the Pied Piper returned home. The non-original alkyd varnish remains, still degrading, but it continues to protect the pristine painting and original varnish beneath. In the future, it will be possible to remove the new Regalrez varnish with naphtha, which does not affect the original shellac varnish. It will also be possible to remove the alkyd layer with the solvent and mechanical methods outlined in the test, and to revarnish with Regalrez, possibly with a UV stabilizer. Maxfield Parrish’s vibrant original may not be fully unveiled, but until then, the beloved painting is safely on display.
The topic of sustainability was on everyone’s minds at the AIC 42nd Annual Meeting, and an evaluation of the sustainability of our own profession and its educational path was part of the program. Having recently crossed the threshold into an art conservation graduate program, I was particularly interested in hearing Paul Himmelstein, a private practice conservator and partner at Appelbaum & Himmelstein since 1972, assess the sustainability of such programs. Recap:
In order to better understand how the graduate programs have changed over time, Himmelstein opened his talk with summaries of answers to a questionnaire he had distributed to the nine members of the Association of North American Graduate Programs in the Conservation of Cultural Property (ANAGPIC). From the responses collected, he reported the following:
– Most applicants today are female, whereas earlier applicant pools were closer to 50% female and 50% male.
– The requirements for admission have increased, both in the number of required pre-program hours of conservation experience and in the number of pre-requisite courses.
– All programs require two years of General Chemistry and Organic Chemistry.
– All programs are cost-free regardless of need.
– Most applicants apply twice before acceptance.
– Approximately 80 students apply per year.
– The number of accepted students in each program has remained the same.
Himmelstein attributed these changes to a list of reasons. He surmised that the decreased number of male applicants is a result of the increased number of academic requirements and pre-program hours of experience. Men, he said, are more deterred by the extra years needed to complete these requirements, as they are still driven by the “provider” mentality. He also noted that although AIC is currently 66% female, the majority of conservation leadership positions at major fine-arts institutions are held by men. He also pointed out that the majority of our demographic is white and middle-class. Regarding the full-ride fellowships, Himmelstein predicted that the expense of supporting all students every year is not sustainable, given the number of students accepted.
Himmelstein continued by offering a list of proposed solutions. He suggested changing the grants to a need-based system. He also suggested adopting an admissions approach that simply rejects or accepts with no option for reapplying, as in medical and law schools. As a counterpoint to the fact that our profession is losing men, he added that more men are entering the field of nursing, another female-dominated profession.
After stating that 50% of AIC members are in private practice, he advocated for a business-management component at the graduate level, in which conservators in private practice could share their experiences and provide mentorship at the post-graduate level. He said that new graduates “just aren’t ready” to begin careers in private practice. He also advocated for Kress scholarships for textbooks.
His solutions list continued to broaden outside the graduate school realm and included general suggestions for advocacy and outreach. According to Himmelstein, “Met[ropolitan Museum of Art] conservation projects are boring” and “conservation is hidden.” He feels that conservators are not working as important colleagues with other museum professionals; they also need to play a larger role in the fields of art history and archeology. He suggested presenting conservation treatment projects online, as in plastic surgery “before” and “after” shots. Viewers could scroll over the artifacts to watch them change. Himmelstein suggested that the public “expects us to be wizards,” and concluded with the statement, “We are not on a sustainable track, but I think we can be.” Response:
Assessing the sustainability of our profession, especially in our current economic climate, is imperative. I agree that we must reexamine the number of students graduating each year to reduce expenses and to help control the job market, but not by selectively limiting funding or reducing a person’s chances for acceptance. Limiting funding at the graduate level would create an impossible financial position for most students. The demands of graduate school are such that no one is able, or even allowed, to work while in school. Unless a student is independently wealthy, then everyone falls into the “needs funding” category. According to Himmelstein’s report, average conservation students are not independently wealthy. Many internships at the graduate level are also still unpaid or partially paid, and students rely on their stipends to compensate. The current post-graduate income can also not sustain significant student loans. The “one strike you’re out” formula is also flawed. Many talented individuals who have made great contributions to our profession would not have become conservators if they did not get another chance to apply. Those who reapply show tenacity and dedication and our profession is shaped by those who participate.
I believe the decrease in male applicants is related to other factors and not to the program requirements. Nursing is likely attracting more men because it has lost some of the “stigma” of a woman’s profession while providing a relatively secure and well-paying job market. Conservation wages have fallen over time, and the number of men in the field likely reflects this trend. In another life I pursued a degree in nursing and can attest that the increase in the number of men is not because less time is needed to get into school. On the contrary, regardless of whether a student earns a bachelor of science in nursing or an associate’s degree in nursing, many hours of volunteer experience are required, and many programs now require that a student become a certified nursing assistant before admission. This certification takes two months of full-time work or six months of part-time work in order to qualify for the state board exams. This work, in addition to the pre-requisites needed to apply, takes most individuals at least one year before they can apply to a nursing program. Some of the struggles we fight in conservation are not unique, but we are feeling the growing pains of a smaller and much newer profession, one that needs continuous advocacy in order to earn a living wage.
I agree that continuous outreach, both to the public and to colleagues in the humanities and sciences, is essential. Himmelstein touched on disseminating information to appropriate departments within schools. This is a particularly important task for me as a current graduate student, and a great way to continue advocacy for our profession. I was made fully aware of how important it can be to connect with other graduate students in the two weeks that followed AIC. From June 2-13, three classmates and I participated in the Delaware Public Humanities Institute (DelPHI). Applications to the course were open to all University of Delaware graduate students who work with material culture. Those two weeks were packed with learning important skills, such as navigating social media and presenting a project in concise and interesting language, and with investigating what interdepartmental collaboration could mean for each of our disciplines. Plans to attend one another’s lectures and to share our research in one another’s classrooms are already underway for the 2014-2015 school year. I would like to hear other examples of these types of collaborations, because I am sure other wonderful ideas are being implemented.
The sustainability of art conservation is indeed an important discussion and I hope it is one in which conservators at all stages of their careers will participate.
What digital tools and resources do conservators use and create?
Who are the audiences for conservation content?
How can this content be delivered to these groups by digital means?
What kinds of digital tools, resources, and platforms will be needed as the profession continues to grow?
It is with the above questions that “Charting the Digital Landscape of the Conservation Profession,” a project of the Foundation of the American Institute for Conservation (FAIC), interrogates our profession’s origin, its role in this particular technological moment, and its propagation into the future with the aid of technology. As all AIC members have been made aware with the recent mailing, funding from the Mellon, Kress, and Getty Foundations is supporting FAIC in its investigation into the so-called “digital landscape” of the profession. This will help develop a baseline report on the discipline’s use of digital resources in order to better understand its breadth and complexity, and to identify areas critical to the community both now and into the future.
This session was the first in a series of planned forums designed to both map the digital landscape of the profession and to contextualize the data gleaned from the recent survey by discussing the tools currently used and their possible development in the future. An expert panel was brought together for brief presentations, after which there was a lengthy, free-form discussion amongst all attendees.
Please note: this post will err on the side of being longer. Although a report on the survey results will be published by FAIC, this interest session, which put so many experienced professionals and stakeholders in dialogue, is unlikely to be published as delivered. Additionally, many attendees voiced concern that the session was scheduled against many other specialty events, preventing stakeholders from attending to hear more about the project or to voice their concerns about the digital future of the discipline.
To those who are interested in the intimate details: Read on!
To those who would prefer to skim: Know that the FAIC’s report is expected in December 2014, and stay tuned for future forums in the “Digital Landscape” series.
Introducing the session, Eric Pourchot, the FAIC Institutional Advancement Director, began by discussing the project and the initial survey findings. FAIC’s investigation, he said, seeks to identify the critical issues surrounding the digital tools and resources used to shape both the questions and answers concerning urgent need, target audience, and content delivery methods.
He outlined the components of the project:
A review of existing resources
A survey of creators of digital resources as well as of the end users
Meetings (and phone interviews) with key stake holders
Formulation of recommendations, priorities, and conclusions
Although I balked a bit at all of this business-speak about timelines and budgets and reports and endgames, I was curious about the initial results of the survey, which I did take. Additionally, the survey’s goal of identifying the major ways in which digital resources are created, used, and shared, both now and in the future, gets at interesting problems and questions we should all ask ourselves.
By the date of the presentation, 560 responses to the professionally designed survey had been completed, so, Eric emphasized, the data is still very preliminary. More international participation will be sought before the survey closes and the data is analyzed for accuracy and for various statistical “cross-tabs” by the contracted company.
When asked about the success rate of finding information on a certain topic, those searching for information on preventive conservation, environmental guidelines, material suppliers, and disaster planning were successful more than half the time. Unsurprisingly, when it was treatment information that was sought, more than half of the users were unsuccessful. To qualify the lack of “success” of a search, 70% of users cited the lack of information specific to their exact needs; 49% were concerned that the information was not up-to-date; 43% cited concerns about reliability; and 32% were dismayed by the time it took to find the information.
Eric expressed surprise that an archive of treatments topped the list of enhancements desired by the respondents. I do not remember if this was a fill-in question or what I personally responded, but this result did not necessarily strike me as surprising. Rather, I see it being in line with the lack of information on treatment procedures—both historic and current—that was noted in the above section of the survey.
From among the list of digital tools used most often, Eric noted the absence of collaborative spaces, such as Basecamp and Dropbox, from the list of image and document management tools, but suggested that some respondents may simply have forgotten to list these oft-used programs, as they are not conservation-specific.
Finally, respondents identified the policy issues of most concern to them as obstacles to creating, sharing, and accessing content: Copyright/IP (Getty), institutional/repository policies, time (?), and standards/terminology ranked high. It was unclear at first what was meant by the latter, but David Bloom’s talk (below) did a good deal to illuminate its importance.
Eric concluded by noting that although a web-survey platform does self-select for respondents with certain habits, sympathies, and concerns (i.e., those who access the internet regularly and seek to use it as a professional tool), the data represents a good range of age and experience. These groups can be correlated to certain responses; for example, 45-65 year-olds are more likely to search for collections information and are more interested in faster internet access and better online communication. Younger stakeholders are searching more for professional information and jobs.
Again, be reminded that this data is very preliminary. A final report can be expected by December 2014.
In America, the Mellon began considering supporting digitization efforts and moving conservation documentation online: in 2009 it funded the design phase of ConservationSpace.org to begin imagining online, inclusive, and sustainable routes for sharing. Merv Richard of the National Gallery led 100 conservators in the development of its structure, its priorities, and its breadth, presenting a discussion session at AIC’s 41st Annual Meeting in Indianapolis.
Important observations are being made in the study of potential models, notably the similar ways in which the National Park Service, libraries, natural science collections, and others handle networked information. Although there were necessarily different emphases on workflow and information, there were also large intersections.
In the meantime, CoOL shows its age. Its long history has necessitated a few migrations across hosts and models—from Stanford Libraries to AIC, and from Gopher to WAIS to W3. It is still, however, based on a library-catalogue model, in which everything is represented to the user as a hypertext (hypermedia) object. In such a system, there are only two options available: to follow a link or to send a query to a server. As important as this resource has been for our professional communication and for the development of our discipline, it lacks the tools for collaboration over networked content. Having become a legacy resource, it is discontinuous from other infrastructures, such as Wikipedia, HathiTrust, Shared Digital Future, and Google Books, all of which point to a more expansive set of technological opportunities, such as indexing, semantic resource discovery, and linking to related fields.
Our discipline does not exist in a vacuum, and the structuring of our online resources should not suggest otherwise. Additionally, we need to be able to identify trustworthy information, and this is not a unique problem: we have to open ourselves up to the solutions that other disciplines have implemented.
Ken encourages us to think of accessible data as infrastructure, which forces the creator to think about applications of the data. A web platform should be more than just switches and networks! It should support collaborative research, annotation, sharing, and publication. This platform should increase our ability to contribute to, extract from, and recombine a harmonized infrastructure that we feel represents us.
Planning for the full extent of our needs, and building to meet them, is not beyond a shared professional effort. We will find it to have been worth it.
3. SPEAKER: Nancie Ravenel
Nancie Ravenel, Conservator at the Shelburne Museum, former Chair of Publications and Board Director of Communications, works very hard to create and disseminate information about digital tools and their use to conservators. She is continuously defining the digital cutting-edge, at once “demystifying” conservation through outreach, embodying the essential competencies, and articulating the value of this profession. Her segment of the session provided an overview of key resources she uses as a conservator, noting how the inaccessibility of certain resources (e.g. ARTstor, ILL, and other resources requiring an institutional subscription) changes how she locates and navigates information.
“What does Nancie do in the digital landscape?” Ravenel asked. She makes stuff. She finds stuff. She uses and organizes what she makes and finds. And she shares what she’s learned.
Nancie divided her presentation of each function into four sections:
◦ Key resources she uses as a conservator
◦ Expectations of these resources
◦ What is missing
◦ What remains problematic
In our capacity as makers of stuff, many of us, like Nancie, have begun to experiment with, or are already proficient at, using Photoshop for image processing and analysis, experimenting with 3D images and printing, gleaning information from CT scans, producing video, and generating reports.
Where making stuff is concerned, further development is needed in the area of best practices and standards for creating, processing, and preserving digital assets! We need to pay attention to how assets are created so that they can be easily shared, compared, and preserved. Of great concern to Ravenel is the fact that Adobe’s new licensing model increases the expense of doing this work.
On the frontier of finding stuff, certain resources get more use from researchers like Nancie, perhaps for their ease-of-use. Ravenel identifies CoOL/CoOL DistList, jurn.org, AATA, JSTOR, Google Scholar/Books/Images/Art Project/Patent, CAMEO, Digital Public Library of America (dp.la), WorldCat, Internet Archive, SIRIS, any number of other art museum collections and databases (such as Yale University Art Museum or Rhode Island Furniture Archive) and other conservation-related websites, such as MuseumPests.net.
The pseudo-faceted search offered by Google Scholar, which collates different versions, pulls from CoOL, and provides links to all, is noted as being a big plus!
There is, however, lots of what Nancie terms “grey literature” in our field—which is not published in a formal peer-reviewed manner (such as listserv or post-print content, as well as newsletters, blogs, or video content). The profusion of places where content is available, the inconsistent terminology, and the inconsistent metadata or keywords (that which is read by reference management or that which facilitates search) applied to some resources are the most problematic when finding stuff.
As Richard McCoy has always insisted to us, “if you can’t ‘google’ it, it doesn’t exist,” and Nancie reiterates a similar concern: if you can’t find it and access it after a reasonable search period, it might as well not exist. By way of a list of what is harder to find and access, she provides the following areas in need:
• AIC Specialty Group Postprints that are not digitized, that are inconsistently abstracted within AATA, or whose manner of distribution makes access challenging.
• Posts in AIC Specialty Group electronic mailing list archives are difficult to access due to the lack of keyword search.
• Conservation papers within archives often have skeletal finding aids; and information is needed about which archives will take conservation records.
• ARTstor does not provide images of comparative objects that aren’t fine art.
Any effort to wrangle these new ways of assembling and mining information using technology needs to consider using linked resources, combining resources, employing a more faceted search engine, and deploying better search options for finding related objects. Research on the changing search habits of everyone from chemists to art historians should help us along the way.
In her capacity as a user and organizer of what she makes and finds, Nancie knows that not every tool works for everyone. However, she highlights digital tools such as Bamboo DiRT, which, as a compendium of digital-humanities research tools, works and syncs across platforms, browsers, and devices, allows for exporting and sharing, and can allow you to look at your research practices in new and different ways. Practices to be analyzed include note taking, note management, reference management, image and document annotation, image analysis, and time tracking. Databases such as these offer structure for documenting and analyzing workflow; and if used systematically, they can greatly increase the scientific validity of any project over a merely anecdotal approach. For a large cleaning project, such as that undertaken with the Shelburne carousel horses, this is indispensable.
What is missing or problematic? A digital lab notebook is not ideal around liquids but is very well suited to logging details and organizing image captures. These methods cannot measure the results of treatments using computational methods. Also missing are good tools for comparing, annotating, and adding metadata to images on mobile devices, as well as for improved cooperation between tools.
And after all of this analysis of one’s use of digital tools, how is it best to share what one has learned? The AIC Code of Ethics reminds us that:
“the conservation professional shall contribute to the evolution and growth of the profession…This contribution may be made by such means as continuing development of personal skills and knowledge, sharing of information and experience with colleagues, adding to the profession’s written body of knowledge, and providing and promoting educational opportunities in the field.”
The self-reflexive exercise that Nancie Ravenel modeled in her talk—of analyzing personal use of digital tools and how personal needs and goals may reflect and inform those of others—will not only be indispensable to the future development of digital tools that meet this call to share, but it contains in itself a call to share: Nancie asks, what do you use to share and collaborate with your colleagues? How might these systems serve as a model for further infrastructure?
Email, listservs, and forums; the AIC Wiki; research blogs, and project wikis enabling collaboration and peer review; document repositories like ResearchGate.net and Academia.edu; shared bibliographies on reference management systems like Zotero.org and Mendeley.com; collaboration and document-sharing software like Basecamp, Google Drive, and Dropbox; and social-media platforms allowing for real-time interaction like Google Hangouts are all good examples of tools finding use now.
Missing or problematic factors in our attempts to share with colleagues include the lack of streamlined ways of finding and sharing treatment histories and images of specific artworks and artifacts; the lack of archives that will accept conservation records from private practices; and the persistent problem of antiquated IP legislation, which is often confusing.
In addition to sharing information with other conservators, we must also consider our obligation to share with the public. Here, better, more interactive tools for the display of complex information are needed. As media platforms are ever-changing, these tools must be adaptable and provide for some evaluation of how well the effort suits the application.
4. SPEAKER: David Bloom
Described by Eric Pourchot as a “professional museophile,” David Bloom was a seeming non sequitur to the flow of the event. However, as coordinator of VertNet, an NSF-funded collaborative project making biodiversity data freely available online, he spoke very eloquently about the importance of and the opportunities offered by data-sharing and online collaboration. He addressed issues of community engagement in digital projects, interdisciplinary collaborations, and sustaining efforts and applicability throughout these projects. As argued in the other short talks, conservation is yet another “data-sharing community” which can learn from the challenges met by other disciplines.
As described by Bloom, VertNet is a scalable, searchable, cloud-hosted, taxa-based network containing millions of records pertaining to vertebrate biodiversity. It has evolved (pun intended) from the first networked-information system built in 1999 and has grown over various revisions, as well as by simple economies of scale, as the addition of new data fields became necessary. It is used by researchers, educators, students, and policy-makers, to name a few. As the network is a compilation of data from multiple institutions, it is maintained for the benefit of the community, and decisions are made with multiple stakeholders under consideration.
Amongst the considerable technical challenges through all of its iterations, VertNet has struggled to establish cloud-based aggregation, caching and indexing, and search and download infrastructure, and to rein in all associated costs.
Additionally, intellectual property considerations must be mentioned: even though the data is factual (and facts cannot be copyrighted), the data “belongs” to the host institution, as they are its historical keepers. As a trust, VertNet does not come to own the data directly. This made a distributed network with star-shaped sub-networks necessary, even though it was expensive to maintain, especially for a small institution, requiring many servers with many possible points of failure. When one point failed, the failure was difficult to locate. At about $200k per year, this was an expensive system to maintain, and although it was still the best and most secure way to structure the network, it was not as inclusive as it could have been for its expense.
There are always social challenges to building such “socio-technical networks,” and this is something that the FAIC is discovering by simply attempting to poll its membership. It doesn’t work if people don’t want to play. What ensue are knowledge gaps, variable reliability, and a lack of resources. To speak more broadly, any entity entrusted with indexing information needs people to get over their fear of sharing in order to learn the benefits and acquire the skills associated with being connected (e.g., social-media privacy controversies). All the knowledge and time needed to meet everyone where they are technologically and bring them along in a respectful manner does not exist in one place, so priorities must be defined for the best investment of time and funds to bring the discipline forward.
Bloom found that disparate data hosts could not communicate with each other—they either had different names for similar data fields which needed to be streamlined or they did not maintain consistent terminology, either globally or internally.
This problem had already been solved in a number of ways. For example, the Darwin Core standard extends Dublin Core; ABCD is the European standard; and both are maintained by Biodiversity Information Standards (TDWG). Darwin Core defines 186 fields, with a standardized vocabulary in as many of them as possible. These standards are community-ratified and community-maintained so that they cannot be easily or unnecessarily changed. This allows for easy importation by mapping incoming data sets to the Darwin Core standard; all the data is optimized for searchability and discoverability; and publication and citation tools are hence streamlined.
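The import step Bloom describes—mapping each institution’s local field names onto shared Darwin Core terms—can be sketched in a few lines. This is only an illustration: the local field names (`sci_name`, `collected_on`, etc.) are invented for the example, while the target terms (`scientificName`, `eventDate`, and so on) are actual Darwin Core terms.

```python
# Sketch: renaming a hypothetical institution's local field names to
# Darwin Core terms so records from different hosts become comparable.
# Local names here are invented; the target terms are real Darwin Core terms.

LOCAL_TO_DWC = {
    "sci_name": "scientificName",
    "collected_on": "eventDate",
    "lat": "decimalLatitude",
    "lon": "decimalLongitude",
    "cat_no": "catalogNumber",
}

def to_darwin_core(record):
    """Rename known fields; collect unmapped fields under a flagged key
    so nothing is silently dropped during import."""
    mapped, unmapped = {}, {}
    for key, value in record.items():
        if key in LOCAL_TO_DWC:
            mapped[LOCAL_TO_DWC[key]] = value
        else:
            unmapped[key] = value
    if unmapped:
        mapped["_unmapped"] = unmapped  # left for a human to review
    return mapped

example = {"sci_name": "Peromyscus maniculatus",
           "collected_on": "1999-06-14",
           "cat_no": "MVZ:Mamm:12345",
           "drawer": "B-7"}
print(to_darwin_core(example))
```

Keeping unmapped fields visible, rather than discarding them, mirrors the point that terminology must be streamlined deliberately rather than lost in translation.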
This type of study of the state of the art, necessary when designing new database infrastructure, can serve as a model for the field of conservation. At the foundation of a successful system will be a serious study of what has been done in other fields and of what is most useful to prioritize for this one.
As VertNet is based entirely on voluntary participation, it is critical that participants understand the benefits of submitting their data to the trust. The staff at VertNet makes themselves available to help the host institution through any technical difficulties encountered in the data exportation and importation process. Backups of this data are scrupulously maintained throughout the migration process. A major benefit to the exporting institution is VertNet’s data-quality checks which will complete, clean up, and streamline fields and then will send back a report so that the client can update their own databases. This brings local data-maintenance standards in-line with those maintained by the global database.
Additionally, the NSF grant has made training workshops, the development of analytical tools, and certain instances of impromptu instruction possible for clients. This has led to VertNet’s exponential growth without advertising. The repository now represents 176 institutions with 488 collections, and many, many more want in from the waiting list. All these institutions are voluntarily submitting their data despite historical concerns about “ownership.” All these institutions realize the benefit of membership for themselves, for researchers, and for the state of the discipline.
Unfortunately, however, this “traditional” (eek) model of procuring NSF (or NEH, IMLS, etc.) funding to cover operating costs is becoming unsustainable. Support for these services is desperately needed now that their utility is established. The value-add model is difficult, even if VertNet does believe in “free data.”
The associated cost does not change; however, the database was built as a community tool. So even though the common perception is an unchanging status quo, the community will have to support the project insofar as they find the resource valuable and important. A common misconception propagated by recalcitrant host institutions is that “we can do it ourselves.” The fact is, however, that most stewards of data can’t—and even more won’t—turn around and make these records available to the community for revision, maintenance, reference, or analysis.
The audience then exploded with responses:
Pamela Hatchfield (Head of Objects Conservation at the Museum of Fine Arts Boston and AIC Board President) began by reminding those who had been romanced by visions of star-shaped networks that concerns about maintaining privacy are still driven by private funding. Although there is now a conservation module in TMS, and terminological standardization is a frequently cited concern, this data is clearly not intended for the public. Historically, private institutions maintain the attitude that data should be tightly held. There is a huge revenue stream from images at the MFA, and as such it is difficult even for staff to obtain publication rights. Terry Drayman-Weisser (Director of Conservation and Technical Research at the Walters Art Museum) pointed out that the Walters walks the middle path by providing a judiciously selected summary of the conservation record associated with an object. Not all of the information is published.
Certain institutions, such as the British Museum, have an obligation to make these records public, unless the object falls into certain categories. The 2007 Mellon “Issues in Conservation Documentation” meeting at the National Gallery, London, provides a summary of the participants’ approaches to public access at the time of publication.
I did have time to ask a question about the privacy concerns attendant on a biodiversity database. Why does it seem that there is less hesitancy at the prospect of sharing? In reality, these institutions do overcome certain hurdles when deciding what to make publicly available: It turns out that certain data about endangered species should not be shared. Although he did not have time to elaborate, I was curious how this “species privacy” might compare to “object privacy.”
VertNet, it turns out, cannot even find protection under the “Sweat-of-the-Brow” doctrine, as this factual information cannot be copyrighted. What about those portions of conservation documentation which are markedly drawn from speculation, interpretation, and original research? This information can be copyrighted, as per each institution’s policies, but our culture is changing. “We don’t train students to cite resources properly,” he noted, “and then we wonder why we don’t get cited.”
The time allotted for the session was drawing to a close, and everyone expressed their regrets that the conversation could not go on for longer and that more people could have attended.
I would personally like to thank FAIC, the speakers, the Mellon, Kress, and Getty Foundations, and all of the participants for their part in a very thought-provoking discussion. I hope and trust that it will continue in future fora.
Alex Carlisle presented a fascinating and detailed treatment of the pulpit in Fort Herkimer Church, German Flatts, New York (http://fortherkimerchurch.org/7.html). The church has a long history; the current structure dates to 1767, with many additions and expansion in war and peacetime. The pulpit was added in the early 19th century, and seems to be completely unique; it is made from white pine, but nothing is known about the workshop.
During a recent, major renovation of the church, white paint coating the pulpit was partially sanded off and discovered to be covering polychrome decoration. At this point, Carlisle was asked to work on the project, to remove the remaining white overpaint and preserve the original polychrome layer. At least one coat of white paint was lead-based, and very intractable; the majority of this was mechanically removed. Fortunately an older resin coating layer was present, and the lead white paint tended to cleave off at the interface.
Once the white overpaint was removed, the remaining original surfaces were consolidated and coated with a barrier layer. Losses in the polychrome ornament were inpainted to re-create the original decorative effect. So far the base and main section of the pulpit have successfully been treated; the canopy awaits funding to complete the project (keep an eye out for part 3!).
Nolley and Gillis treated a 17th-century Pennsylvania German shrank, a rare example with a surviving original painted finish, including faux burl-wood graining and colorful decorative ornaments.
Shrank is a German word for wardrobe; many such cabinets were made in America by immigrants, using locally available woods. As with other types of furniture, these would sometimes have been faux painted to imitate a fancier wood with more elaborate carving or decoration; grain painting was a common decorative technique. Due to their utilitarian nature, original finishes on early examples seldom survive.
Cross-section analysis showed that the Chipstone shrank did have original paint, but with large areas compromised by fire damage and wear from use. This led to the initial overpainting in the early 19th century, followed by several consecutive paint treatments over the years, including an opaque, gray-blue colored casein based paint. This gray-blue layer proved to be very intractable, particularly over areas that were burned or highly worn. Cleaning solutions with chelators were able to remove the majority; agar gel was used for local cleaning around sensitive areas. Older oil-based coating layers actually acted as a resist to prevent the cleaning from going too far.
Completed with varnishing, waxing, and selective inpainting, the treatment was able to successfully expose original decoration and give a sense of the shrank’s intended appearance.
Catherine Coueignoux presented an exciting treatment of the Augustus Rex (c.1750) writing cabinet in the collection of the Victoria & Albert Museum (W.63-1977 http://collections.vam.ac.uk/item/O74665/writing-cabinet-kimmel-michael/# )
The elaborate ormolu mounts had been previously re-gilded. Before treatment they were coated with a thick layer of dirt and dust over a shoe-polish-like wax treatment, which was possibly added to dull the appearance of the bright new gilding. All other metal components were corroded, and the wood and marquetry had all been stripped and refinished. Curators wished the treatment to result in a bright, nearly-new appearance, as the cabinet may have looked when newly restored (the previous refinishing and regilding probably occurred while it was owned by the Rothschild family).
Spotty corrosion on metal components that could not be removed was treated locally where possible. EDTA and BCA gels were tested but proved unsatisfactory, either cleaning too little or too much. Coueignoux was able to use rottenstone to spot-clean dark areas, leaving a layer of light corrosion sympathetic to surrounding areas. In some places, the corrosion spots were left untreated.
The removable ormolu mounts were cleaned using dry ice pellets, a new method for the lab. Their system uses a block of CO2 dry ice which is shaved into pellets and sprayed onto the surface of the object using an air compressor with a custom nozzle. The CO2 pellets expand on contact, providing a gentle mechanical cleaning. By moving quickly along the surface, they were able to avoid excessive cooling that would result in condensation. Acetone and a hairdryer were on hand to remove any condensation that did form. Other labs using CO2 cleaning include the Getty and the Smithsonian.
In the case of the ormolu mounts, CO2 cleaning was fast, safe, and effective at removing the unwanted wax and dirt: 150 mounts were cleaned in only seven hours! Obviously this method is not appropriate for many objects and materials, but it may be a convenient choice for more conservators in the future.
With her excellent talk, British conservator Pierrette Squires showed that it is possible to do a major collections move project while still being economically and environmentally conscientious. Of course, doing so required an enormous amount of careful planning, creativity, and hard work, which Squires outlined.
Situated in northwest England, an area hard hit by the recession, the Bolton Library and Museum Services (http://www.boltonmuseums.org.uk/) sold the textile mill which previously housed its collections storage. The staff then had to move and rehouse the collection of over 40,000 objects, ranging from fluid specimens to industrial machines, to a new location in two years and with a tight budget of $1.4 million. A large part of the success of the project resulted from the conservation team being included from almost the very beginning. Because of their involvement, the move was inspired by the green values of “Reduce, Reuse, and Recycle,” values which contributed not only to environmental sustainability but economic sustainability as well.
The location chosen for the new collections storage was another old factory. Despite some pollution and asbestos, the building was in good shape structurally. Working closely with the mechanical engineers, the museum did careful environmental monitoring of the space. The museum made the unorthodox decision not to install air conditioning, which would be expensive, but instead to use large amounts of insulation. Other green features of the building renovation included the installation of solar power panels and of Power Perfectors (voltage optimization devices), which save money by buffering energy draw. Adjustments like these resulted in a 50% reduction in energy costs.
Less expensive alternatives for outfitting the storage area were also sought out. Rather than using an expensive system designed for museums, cheaper compact storage intended for use in other industries was selected. Used metal racks and wooden pallets were chosen for storage of larger objects. In all, 65% of the storage furniture was second hand, saving money and keeping things out of landfills.
The arrangement of collections within the storage area was also carefully planned to maximize the environmental conditions of the building. For example, more stable objects like geological specimens were placed in areas against exterior walls, while textiles and archaeological materials were placed in areas farther away from the loading dock and thus most protected from temperature and humidity swings. Fluid preserved specimens were placed in the northern and thus cooler part of the building.
The actual move of the collection continued the theme of sustainability. Local transport companies were hired to do the actual moving, which saved on gas and contributed to the local economy. Storage and packing materials were reused as often as possible. When no longer usable, materials were recycled.
In conclusion, the move was a very successful project. Although not all the choices made in the project are applicable to every museum (one wonders about the risk of pollutants from used and wooden storage furniture), the ideas presented in this talk were interesting and thought-provoking. The talk proved that environmental sustainability and economic sustainability are not opposites but can go hand in hand.
A problem encountered in the study of paintings is distinguishing the medium in which they were created, and delineating layers which may include different media or mixtures of media. This was the subject of a paper presented at the Research and Technical Studies session.
It is not easy to distinguish between oil and tempera (egg-based) paintings by eye, or using many analytical methods. The authors discussed the benefits and drawbacks of three main types of analysis used within paintings conservation: cross-sectioning, Fourier-transform infrared (FTIR) spectroscopy, and thin-layer chromatography (TLC). FTIR, for example, cannot distinguish between egg proteins and glue, and the results can be masked by pigments or colorants. None of these methods, as discussed, can be definitive when it comes to mixtures of media such as tempera grassa.
The authors also considered the effectiveness of other common methods, such as GC/MS (gas chromatography–mass spectrometry). The main drawback here is that results cannot be compared across different experiments if the methodology varies even slightly.
The combination of these drawbacks in common methodologies led the authors to pursue Time of Flight Secondary Ion Mass Spectrometry (ToF-SIMS), a high-resolution technique that is better at separating and identifying fragments which are different but have similar masses. It also allows for the presence of specific compounds to be ‘mapped’, giving a helpful visual of layers and levels. Using this method, they were able to map for amino acids, identifying the presence of animal glue in a mixture. Practically, this was shown to differentiate between a gesso-size ground and the glue layer which was determined to have been purposefully added.
The talk concluded with a reminder that this technology, as with most, works best in conjunction with other methodologies. While this is an important point to remember, the potentials of this technique are exciting. I’m very interested to see the potential that this technique has for three-dimensional objects with multiple painted or gilded layers. I hope that someone pursues this, and that the technique is able to be harnessed across conservation disciplines.
Glass disease, weeping glass, glass deterioration, funky glass* (*author’s description)—just a few of the many names used to describe the degradation of glass beads that museums have observed as a white precipitate/cloudy appearance and/or cracking and splitting. If you’ve observed this in your collection, take notice: Mellon Fellow in Objects Conservation Robin O’Hern is on the case.
O’Hern has taken advantage of the history of glass disease detection at the National Museum of the American Indian (NMAI) and begun evaluating how the different cleaning methods have fared over the years. In 1999, Kelly McHugh (research supervisor and co-author) and Scott Carrlee performed a condition survey of the NMAI collection. The collection was moved into a state-of-the-art storage facility after the survey, where the RH has remained constant, but at a higher level than recommended for glass pieces. (The beads are present on composite pieces with hide, bone, shell, feather, hair, etc., and therefore the environmental controls must address as many materials as possible, not just glass.) Some of the pieces were treated at that time, and others have been treated in the intervening years. Using the museum database, O’Hern found that 25% of the condition records that list glass beads as a material also list glass disease. O’Hern has performed another survey, this time seeking to observe condition changes over the past 15 years in a selection of objects from the 1999 survey, to assess treatment technique (i.e., which solvents worked best to reduce glass disease), and to discover susceptibility trends (which beads are the worst culprits).
To understand the beads, O’Hern provided background on history of use and manufacture.
Glass beads arrived after contact with Europeans in 1492
Pony beads were introduced after 1675
Wound beads were introduced after the late 17th century
Seed beads were introduced 1710-1840
Red beads were colored with copper in the 17th century, ruby red in the early 18th century, and selenium in the 1890s
Blue beads were colored from copper or cobalt, but from 1640-1700, they were tin-rich
Beads can be made by pulling the heated glass, called “drawn,” or by winding heated glass around a rod, called “wound”
Glass is made from silica, alkali (which lowers the melting point but also makes the glass water soluble), and calcium carbonate (which turns to lime; it is added to help stabilize the glass against the effects of the alkali)
There are several explanations for the cause of glass disease. Too little or too much lime (part of the bead’s composition) may allow ions to leach out of the glass matrix in the presence of water and form salts on the surface of the bead. Environmental conditions, such as fluctuations in RH, or materials in proximity, such as semi-tanned hide, may accelerate glass disease. As seen from the list above, the beads were manufactured over a range of time, in different ways, and in different places.
As you can tell, there are many factors to research when evaluating glass disease. O’Hern addressed as many as possible while still managing the scope of the project.
Survey Results
Condition Change: By comparing condition of the beads today to past condition/treatment reports, 16% of the beads have more deterioration now than in 1999. Measuring pH was used in addition to visual examination to determine condition. Some beads that did not look bad had a higher pH (above 7), signaling glass disease. Some beads that looked hazy did not have a higher pH, meaning no glass disease (perhaps hazy from manufacture).
Differing Manufacturing Techniques: Wound beads have it worse than drawn beads–95% of wound beads have glass disease. This could be because they have a compositional percentage of lime that is less stable.
Differing Colors: Black, red, and blue are the most disease-ridden. O’Hern looked through the museum database and found that the entries with the most “glass disease” indicated had blue beads. Blue beads are very clearly the “winner” of the glass disease competition, followed by red and black.
Treatment Techniques: Here’s where it gets even more interesting. The conservation literature and posts on the Objects Specialty Group listserv debate the use of three solvents to remove the salts of glass disease: water alone, ethanol alone, and a 1:1 water:ethanol mix. By comparing the 1999 survey to her own results, O’Hern capitalized on real-time aging to observe how each solvent mixture fares over time. Water-cleaned beads had a 50% rate of glass disease return; water:ethanol-cleaned beads had a slightly higher than 50% rate of return; ethanol-cleaned beads had the least return, at just under 50%. However, when looking at the beads cleaned with ethanol over the same time period as those cleaned with 1:1 water:ethanol (removing the very oldest treatments), the rate of return for glass disease falls to 40%.
(Note: Acetone has also been listed as a solvent for cleaning glass beads, but since the NMAI doesn’t use acetone, it was not included in this research.)
Other Observations:
1. Measuring pH is essential because beads may look like they don’t have glass disease but are actually more alkaline. Measuring pH is also quick and easy: cut your pH strip to a small piece, slightly dampen it in deionized water, press it onto the bead for 3 seconds, and then determine any color change in the strip.
2. The most affected beads were those sewn onto hide, but the disease was present when beads were in contact with many other materials as well.
3. Although cleaning with ethanol is a better choice for long-term disease prevention, the solvent chosen should still depend on the substrate around the bead.
Advice from O’Hern:
1. Record treatment materials when removing glass disease.
2. Take before-treatment (BT) and after-treatment (AT) details of beads so you can easily compare condition changes in the future.
3. Measure the pH of the beads… and RECORD THE RESULTS.
4. Monitor glass disease consistently.
As an audience member, it’s always exciting to see a project that has results, especially on a topic that persists more widely than it has been studied. This is definitely a postprint worth visiting for more details and results.
For other examples (and some “good” photographic examples), visit Ellen Carrlee’s project “What’s that White Stuff?” that she and (then WUDPAC graduate intern) Christa Pack reported on in Ellen’s blog: http://alaskawhitestuffid.wordpress.com/2011/08/09/glass/