Jennifer McGlinchey Sexton, Conservator of Photographs at Paul Messier, LLC, presented on the testing of reference cards and the development of new imaging protocols that are sorely needed in our field to increase the standardization and comparability of photographs of UV-induced visible fluorescence phenomena. The project, started by private photograph conservator Paul Messier in 2006 under the service mark UV Innovations (SM), was later taken over by Jiuan-Jiuan Chen, Assistant Professor of Conservation Imaging, Technical Examination, and Documentation at Buffalo State College. Sexton has directed development of the Target-UV™ and UV-Grey™ products since 2012.
Many a visual examination is followed by technical imaging, including both Ultraviolet Fluorescence (UV-FL) and Visible-Induced Luminescence (VIL), and Sexton's talk first reiterated why observing cultural material under carefully selected wavelengths of light is important: it is non-invasive, relatively inexpensive, accessible, and (largely) commercially available. As a surface technique, UV-induced fluorescence probes the outer layers: coatings, optical brighteners, mold, tidelines, and organic-glaze pigments lying above the bulk pictorial film. Although it is a technique we rely on for the large majority of condition assessments and technical studies, our documentation remains unstandardized and, essentially, unscientific. With so much to gain by standardizing our capture and color-balancing process, as well as by taking careful notes on the equipment used, the prospect of the Target-UV™ and UV-Grey™ UV-Vis fluorescence standards is certainly an exciting one.
UV-FL images are unique in that they contain diagnostic color information, hence the need for standardization, which would enable cross-comparison between colleagues and between before- and after-treatment documentation. The beta testing of the UV target, which has been carried out for two years, has attempted to account for the most significant variables in the production of UV-FL images. The talk evidenced the enormous amount of collaboration and communication needed to streamline the significant aspects of equipment choice, the optimization of acquisition, and the documentation of post-processing methods. The goal was to increase reproducibility and comparability. Sexton's presentation showed that the beta testing of the product achieved demonstrable results in terms of uniformity of output.
Development of the UV target began in collaboration with Golden to produce stable fluorescent pigments of known color values and known neutral-gray values (the grays were evidently produced by mixing the red, green, and blue fluorescent pigments). Neutral gray was defined as a gray that was interpreted as neutral by many viewers and that performed similarly under many different conditions. Including such color swatches within a photograph, for the purposes of color balancing and correcting variation across the red, green, and blue channels of each pixel, is a very familiar principle from visible-light photography.
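To make that familiar principle concrete, here is a minimal sketch of gray-patch white balancing. It is my own illustration rather than the UV Innovations workflow, and it assumes a floating-point RGB image array plus a crop containing the neutral swatch:

```python
import numpy as np

def white_balance_from_gray(image, gray_patch):
    """Scale each channel so the sampled gray patch becomes neutral.

    image      -- float RGB array, shape (H, W, 3), values 0-1
    gray_patch -- float RGB array cropped from the neutral swatch
    """
    # Mean R, G, B of the patch; a truly neutral gray has equal channels.
    patch_mean = gray_patch.reshape(-1, 3).mean(axis=0)
    # Per-channel gain that maps the patch to its own average luminance.
    gains = patch_mean.mean() / patch_mean
    return np.clip(image * gains, 0.0, 1.0)

# Usage (hypothetical crop coordinates):
# balanced = white_balance_from_gray(img, img[1200:1250, 800:850])
```

The same idea underlies the UV-Grey card: once a patch of known neutrality is in the frame, channel gains can be set consistently rather than "by eye."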
A second consideration made for the round-robin testing was that of intensity, a variable somewhat unique to UV-FL photography. The nature of the emissive source must be noted for purposes of calibration and exposure, especially as all light sources currently used in fluorescence photography lack stability over long periods. The output of a lamp will fluctuate over time, and this makes the relative intensities of materials illuminated with some lamp types very difficult to determine. Even when this particular factor is taken into account, other variables, such as the distance of the lamp to the subject and the wattage of the lamp, will affect intensity. It is also possible that multiple emitting sources could be present. These factors should be included in the metadata for the exposure.
To control for this intensity factor, beta testers were to divide their source, distance-to-subject, and wattage parameters into three intensity levels, each best matched to certain analyses: "Ultra" was beta-tested for analysis of optical brighteners and other products produced specifically to fluoresce; "High" was best for the analysis of natural and thicker fluorescence, such as a zinc white paint film, some feathers (see Ellen Pearlstein's talk from this year, "Ultraviolet Induced Visible Fluorescence and Chemical Analysis as Tools for Examining Featherwork"), and uranium glass colorants; and "Low" was used to image thin applications of resins, varnish, and sizing films.
A third variable was camera sensitivity, which varies with manufacturer (proprietary internal filtration and software), camera type (DSLR or digital back), and sensor type (CCD or CMOS, modified or unmodified). Different filters were tested (the Kodak Wratten 2e pale yellow filter, the PECA 918, and an internal blue-green IR-blocking (BG-38) filter). Internal filtration of this kind is typical on digital cameras: it blocks IR and some red light to bring the camera output closer to the photopic curve of the eye and more closely mimic human vision. The 2e filters out UV radiation and a small amount of the blue light commonly emitted by UV lamps, while the PECA 918 is used for IR blocking.
The fourth variable tested was the source type. Those tested included low-pressure mercury, high-pressure mercury arc, and metal halide arc lamps. Although LEDs were used at some institutions, many of these have a peak emission at 398 nm, which is barely in the ultraviolet range. Greg Smith at the IMA analyzed the Inova X5 UV LED and found that it does emit in the UV, but it is more expensive. Other products show large differences in emission peaks that often cannot be accommodated by a simple white-balancing operation. Testing therefore limited the peak emission to the most common types, emitting between 360 and 370 nm.
The last variables analyzed were post-processing procedures and software, and user perception and needs. A problematic paradigm identified over the testing period was that of an image being readable or resolvable vis-à-vis a particular argument versus the image being strictly accurate and well calibrated. A photograph may accurately render the intensity of the fluorescence yet be so underexposed as to be unreadable.
Testing showed that, despite these difficulties of calibration and subjective experience, the workflow incorporating the UV Innovations standard produced a marked increase in standardization. Round-robin testing was completed by eight institutions in the US and Europe in May 2013. Fluorescent object sets were shipped along with the UV standard and filters. Each test site collected two image sets: one named "a," using the lab's current UV documentation protocol with color balance and exposure set "by eye," and the other named "b," using the UV Innovations protocol. The increased control provided by the standard was evidenced by the average delta E of L*a*b* data points as well as the average standard deviation of RGB data points for the a and b sets at each institution. By way of example, the "Low-a" set showed a delta E of 18.8, which improved to 4.9 in the "Low-b" set; the average standard deviation between these two sets improved from 32.8 to 6.2!
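For readers unfamiliar with the metrics, the improvement was reported as the average delta E between measured and reference L*a*b* values and as the spread of repeated RGB readings. Below is a minimal sketch of how such figures could be computed; it uses the simple CIE76 (Euclidean) delta E, since the talk did not specify which delta E formula was applied:

```python
import numpy as np

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two CIELAB values (CIE76 delta E)."""
    return float(np.linalg.norm(np.asarray(lab1, dtype=float) - np.asarray(lab2, dtype=float)))

def average_delta_e(measured, reference):
    """Mean delta E over paired lists of measured and reference L*a*b* values."""
    return float(np.mean([delta_e_cie76(m, r) for m, r in zip(measured, reference)]))

def rgb_spread(readings):
    """Average per-channel standard deviation across repeated RGB readings
    of the same patch, e.g. one reading per institution."""
    readings = np.asarray(readings, dtype=float)  # shape (n_readings, 3)
    return float(readings.std(axis=0).mean())

# Example with invented numbers:
# average_delta_e([(52, 1, -3)], [(50, 0, 0)])  -> ~3.9
# rgb_spread([(120, 118, 119), (131, 126, 128)]) -> spread across the two readings
```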
The presentation went into depth about how this data was collected, how variables were controlled for, and how the data was analyzed, and it showed convincingly that, despite the high variability of current workflows, the UV Innovations UV-Grey card and Target-UV standard, in conjunction with standardization of UV source and filtration, can markedly reduce the variability of UV-FL photography.
One variable in "extra-spectral" imaging that was not addressed in this talk was the spatial inhomogeneity of the light source, that is, the gradient that results from the use of uneven illumination. This could be especially problematic when using UV-FL photography for condition imaging, and "flat-fielding" should be considered as a possible augmentation to the ideal image-acquisition protocol.
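For context, flat-fielding divides each frame by an image of a uniform target shot under the same illumination, so the source's spatial gradient cancels out. The following is a minimal sketch of the idea, my own illustration rather than anything proposed in the talk, assuming numpy arrays for the raw frame, the flat frame, and an optional dark frame:

```python
import numpy as np

def flat_field_correct(raw, flat, dark=None):
    """Divide out the illumination gradient recorded in a flat-field frame.

    raw  -- image of the object under the (uneven) light source
    flat -- image of a uniform board photographed under the same source
    dark -- optional dark frame (sensor offset) subtracted from both
    """
    raw = raw.astype(float)
    flat = flat.astype(float)
    if dark is not None:
        raw = raw - dark
        flat = flat - dark
    # Normalize the flat so the correction preserves overall exposure.
    flat = flat / flat.mean()
    return raw / np.clip(flat, 1e-6, None)
```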
There is still further research to be done before this product hits the market. A fourth intensity level will be added to increase the flexibility of the product. The current prototype features two intensity levels on the front and two on the back. Notably, artificial aging must be done to determine when the product should be replaced. As this current standard only operates over UV-A and UV-B, UV Innovations looks forward to developing a UV-C standard, as well as a larger format target.
The prototypes of the Target-UV and UV-Grey cards were handmade, but the company hopes to overcome the challenges of large-scale production and distribution by Fall 2014.
Author: Kate Brugioni
42nd Annual Meeting – Paintings (Joint with Wooden Artifacts), May 31, “Painted Totem Poles at the American Museum of Natural History: Treatment Challenges and Solutions” by Samantha Alderson, Judith Levinson, Gabrielle Tieu, and Karl Knauer
Those who have beheld the Hall of Northwest Coast Indians at the American Museum of Natural History and its extraordinary “totem poles” will instantly recognize the potential scope of any study or treatment of such massive artifacts.
These objects are housed in the earliest wing of the museum, curated at its inception by Franz Boas, “the father of American Anthropology”, who organized the early acquisitions of the museum according to a revolutionary argument: that of “cultural relativism” in opposition to a chauvinistic, social-Darwinist organization that put “primitive” peoples at the bottom of an evolutionary tree, the pinnacle of which was white America. Today, this hall holds a landmarked status and remains relatively unchanged, as the poles are very hard to move.
Ten years ago, a renovation of the hall was proposed. Although the recession thwarted those plans, the objects were still in need of stabilization and aesthetic improvement. Because this project, from its inception through the research, testing, and execution stages, was so expansive, Samantha Alderson reminded her audience that her talk could only represent an overview of a four-year process. Those interested in a specific aspect of the project can look forward to in-depth, forthcoming publications.
One of the more important aspects of the research phase, and a professional obligation that is indispensable to the curation and conservation of native materials, was the consideration of ethical issues and provenance information. Most of these pieces entered the collection between the 1880s and the 1920s, and the majority have been on continual, open display since their arrival. Their presence in AMNH's collection is widely acknowledged to be ethically complicated in itself, representing an era of unscrupulous dealing in Northwest Coast artifacts. (To read more about "Indians and about their procurable culture" (p. xi), consult Douglas Cole's "Captured Heritage: The Scramble for Northwest Coast Artifacts," on the coincidence of a taste for these native artifacts and the establishment of many of the country's foremost natural history collections.)
The carvings, including the carved columns most commonly described as "totem poles," would have had numerous functions within their originating cultures: house frontal poles holding entry portals to buildings, interior house posts, welcome figures, memorial poles, and mortuary posts. [For a technical study on these types of carvings, please consult Melissa H. Carr, "A Conservation Perspective on Wooden Carvings of the Pacific Northwest Coast," Wooden Artifacts Group Postprints, 1993.]
To further hone their understanding of provenance, the 2009 CCI "Caring for Totem Poles" workshop in Alert Bay, British Columbia, allowed the authors to travel through British Columbia with curatorial consultants, native carvers, and native caretakers in order to study the techniques of manufacture. It was also important to keep abreast of the expectations of the native communities that might be borne out over the course of any treatment intervention or re-installation campaign.
The original aim of this project was to provide structural stability to those carvings exhibiting highly deteriorated surfaces caused by weathering and biodeterioration in their original environment. These instabilities were often exacerbated by inappropriate environmental conditions and restoration interventions in the museum. The most significant issues requiring treatment were wood rot, insects, and biological growth, which began in the original environment and had continued to run their course.
Although climate control was installed in 1995, soot from the age of coal heaters and lamps still blanketed the inaccessible areas of the objects. Dust from visitor traffic also dulled them, as the hall is adjacent to the entrance to the IMAX theatre. Routine and well-intentioned cleaning was ineffective against a century of accumulated grime and dust and was causing surface loss.
As there is no barrier between the objects and the visitor, touching has caused burnishing and scratching. The unfinished wood readily absorbs skin oils; and graffiti and adhered chewing gum had also become a most-unfortunate problem.
Early interventions after acquisition had caused condition problems of their own, as old fills had a hardness or density inappropriate for soft, weathered wood. These fill materials were only becoming uglier, more unstable, crumbly, and cracked with age.
All of these factors, taken together, provided a huge impetus for treatment.
To begin the treatment-planning stage, the conservators at AMNH performed examinations under visible and UV radiation and mapped the observed conditions and materials using a streamlined iPad-based documentation protocol. In some cases the restoration materials observed provided evidence of institutional and condition history. Although there were almost no previous treatment records of these objects, comparison with archival photographs of many of the objects showed the rate of deterioration since acquisition and provided clues as to dates of interventions and installation history.
In summary of the object-treatment stage, vacuums and sponges were first used in an attempt to reduce some of the dinginess of the surface and to increase the legibility of the painted designs. The many resinous and waxy coatings had trapped so much dust, however, that this treatment did not always have a satisfactory result.
The question of solvent toxicity held sway in all aspects of treatment, as operations were completed in makeshift spaces outside of the lab, due to the size of the objects; these areas had no fume-extraction infrastructure. Luckily, plaster fills could be softened with a warm-water-and-ethanol mixture and carved out.
Butvar B-98 and Paraloid B-72 were selected as potential consolidants and adhesives. A 5-10% Butvar B-98 solution in ethanol (i.e. without the toluene component for safety concerns) was used for surface stabilization, and Paraloid B-72 in acetone was used for adhesion of splinters and detached fragments.
Fills were designed using different materials depending on the location on the object. These were intended to reduce damage during installation, display, and regular maintenance. Where a fill would not be visible, shapes were cut from Volara, beveled, and adhered in place with Paraloid B-72 along the edges; these were often necessary on the tops of the poles to cover the deep voids of deteriorated wood. Some losses were back-filled with tinted glass micro-balloon mixtures of different grades and different resin-to-balloon ratios where appropriate. As some paints were solvent-sensitive, certain fills required the use of Paraloid B-67. The final fill type was a removable epoxy-bulked fill to compensate for deep losses in visible areas. These areas were first filled with polyethylene foam to prevent the fill from locking in. The edges of the fill area to be cast were protected by tamping down Teflon (plumber's) tape, which conforms nicely to the wooden surface. West System 105 Epoxy Resin, with "fast" 205, "slow" 206, or "extra-slow" 209 hardeners, was used in different proportions with 3M glass microspheres and pigments to give fill materials with various hardnesses, curing times, textures, and colors (see Knauer's upcoming publication in ICOM-CC Warsaw 2013 for more details). This method is notable for its invisibility, its reversibility, and its rejection of phenolic micro-balloons, which are unstable and unsuitable and were historically used for such wood fills merely for their brown color. Once cured, the bulked epoxy (and the plumber's tape) was removed, and the fills were then tacked into place with B-72 to produce an aesthetically pleasing and protective cap.
Many losses that had previously been filled were left unfilled, as would have been the case if they had been collected and treated today. Crack fills were incised so as to retain the appearance of a (smaller) crack.
Once the surface and structure were stabilized with the consolidation and filling operations, the team turned their attention to the various paint films to be cleaned. Many of these were proteinaceous, but some were more similar to house paints. This was consistent with the ethnographic findings and with current native practice. No preparatory layers were used, and the pigment layers were often very lean.
PLM, XRF, and SEM-EDS, as well as UV-FL imaging, thin sections, and FTIR analysis, were undertaken. Some binder analysis was also possible, but this was complicated by historical treatments. Interpretation of epi-fluorescence microscopy results was also thwarted by the presence of multiple coatings and the interpenetration, dissolution, and bleed-through of layers. As many as four different types of coatings were identified, and understanding and addressing the condition issues caused by these coatings became a primary concern. Cellulose nitrate was often applied to carvings in the early 20th century; whether it was meant to refurbish or to protect, it has developed into a dark-brown layer, alternately hazy and glossy, that obscured the original surface appearance. Lower regions evidenced PVA or PVAc on top of the cellulose nitrate. Shellac and dammar are present in isolated locations, as is an orange resin that eluded identification (even when analyzed with GC-MS).
Although identification of these coatings was attempted, removal was not originally planned, given the difficulties of designing a solvent system for their reduction, the variation in sensitivities, the interpenetration of the layers, and the unknown condition of the original paint films beneath. This plan changed when the poles were deinstalled for construction.
The treatment design was largely aided by the isolation of four house posts in the collection made by Kwakwaka’wakw artist Arthur Shaughnessy.
Commissioned by AMNH in 1923, these had never been installed outdoors but had been coated in the same manner and exhibited in the same space. This allowed for the development of controlled methods for coating reduction.
A Teas table (or Teas chart) was used to identify potential solvents or solvent mixtures, which were tested over every color and monitored for any leaching or swelling. These initial tests were deemed unsuccessful.
In areas without paint, film reformation with acetone reduced haziness or glossiness. Where the coating was completely removed, the wood was often left with an over-cleaned appearance which necessitated some coating redistribution with MBK, MEK, and propylene glycol. Wherever possible, gels were used to reduce the exposure to toxic solvents. In painted areas, the large variation in solvent sensitivity, the inconsistency of media binders, the varying porosity of the wood, and the changing direction of the wood grain required that the conservators work inch-by-inch. DMSO, a component of “safe” stripper, and NMP were controllable over certain colors but caused considerable swelling.
In February 2012, the museum saw the reinstallation of the Shaughnessy poles, marking the effective conclusion of the testing period and the successful management of a challenging triage situation by conservation staff.
It was Kwakwaka‘wakw artists like Arthur Shaughnessy who kept carving traditions active when the Canadian government prohibited the potlatch ceremony in 1885. The ban was lifted in 1951, after AMNH’s acquisition of the house posts.
The completion of treatment represents an important opportunity to educate the public: although these monumental carvings are exhibited in a historic wing of the museum, we need to dust them off and remember that they represent vital, active traditional practices and communities.
There is still the need to develop more systematic solvent strategies, as well as to consult with a paintings conservator. But it is clear that these objects stand to look much improved after the grime and coatings are removed or reduced and the objects are thoughtfully reintegrated with a well-designed fill system. Thanks to the remarkable talents of the AMNH team, these stately creations are finally commanding the respect they deserve.
___
Resources:
Hall of Northwest Coast Indians :: AMNH
From the Bench: These Face Lifts Require Heavy Lifting :: IMLS
Arthur Shaughnessy house post carvings reinstalled following conservation treatment (February 2012) :: AMNH
Changing Approaches to the Conservation of Northwest Coast Totem Poles :: Reed College
Andrew Todd (1998). “Painted Memory, Painted Totems,” In Dorge, Valerie and F. Carey Howlett (eds.), Painted Wood: History and Conservation (pp. 400-411). Proceedings of a symposium organized by the Wooden Artifacts Group of the American Institute for Conservation of Historic and Artistic Works and the Foundation of the AIC, Colonial Williamsburg Foundation, 1994. Los Angeles: J. Paul Getty Trust.
A Brief History of the Jesup North Pacific Expedition :: AMNH
42nd Annual Meeting – Digital Resources & Conservation Interest Session, May 31, "Charting the Digital Landscape of the Conservation Profession" by FAIC
What digital tools and resources do conservators use and create?
Who are the audiences for conservation content?
How can this content be delivered to these groups by digital means?
What kinds of digital tools, resources, and platforms will be needed as the profession continues to grow?
It is with the above questions that “Charting the Digital Landscape of the Conservation Profession,” a project of the Foundation of the American Institute for Conservation (FAIC), interrogates our profession’s origin, its role in this particular technological moment, and its propagation into the future with the aid of technology. As all AIC members have been made aware with the recent mailing, funding from the Mellon, Kress, and Getty Foundations is supporting FAIC in its investigation into the so-called “digital landscape” of the profession. This will help develop a baseline report on the discipline’s use of digital resources in order to better understand its breadth and complexity, and to identify areas critical to the community both now and into the future.
This session was the first in a series of planned forums designed to both map the digital landscape of the profession and to contextualize the data gleaned from the recent survey by discussing the tools currently used and their possible development in the future. An expert panel was brought together for brief presentations, after which there was a lengthy, free-form discussion amongst all attendees.
Please note: This post will err on the side of being longer: Although a report on the survey results will be published by FAIC, this interest session, which put so many experienced professionals and stake-holders in dialogue, is unlikely to be published as delivered. Additionally, many attendees voiced concern that the session was scheduled over many other specialty events, preventing stakeholders from attending to hear more about the project or to voice their concerns about the digital future of the discipline.
To those who are interested in the intimate details: Read on!
To those who would prefer to skim: Know that the FAIC’s report is expected in December 2014, and stay tuned for future forums in the “Digital Landscape” series.
And it goes without saying: If you have not yet participated in the survey, now would be a good time. Our research habits are changing. Help Plan the Digital Future of Conservation and Preservation!
1. Introduction
2. Speaker: Ken Hamma (Consultant and Representative of the Mellon Foundation)
3. Speaker: Nancie Ravenel (Conservator at the Shelburne Museum)
4. Speaker: David Bloom (Coordinator of VertNet)
5. Discussion
1. INTRODUCTION
Introducing the session, Eric Pourchot, the FAIC Institutional Advancement Director, began by discussing the project and the initial survey findings. FAIC's investigation, he said, seeks to identify the critical issues surrounding the digital tools and resources used to shape both the questions and answers concerning urgent need, target audience, and content delivery methods.
He began by outlining the components of the project:
- A review of existing resources
- A survey of creators of digital resources as well as of the end users
- Meetings (and phone interviews) with key stake holders
- Formulation of recommendations, priorities, and conclusions
Although I halted a bit at all of this business-speak about timeline and budget and reports and endgames, I was curious as to the initial results of the survey, which I did take. Additionally, the survey goal of identifying the major ways in which digital resources are created, used, and shared both now and in the future, gets at interesting problems and questions we should all ask ourselves.
560 responses to the professionally-designed survey had been completed by the date of the presentation, so, Eric emphasized, the data is still very preliminary. More international participation will be sought before the survey closes and the data is analyzed for accuracy and for various statistical “cross-tabs” by the contracted company.
Of the population queried, two-thirds go online regularly, and one-third logs on daily. When asked to list the sites most consulted, 30% listed CoOL/DistList as their primary resource, 30% listed Google, and 13% named AIC/JAIC. AATA/Getty, CAMEO, CCI, JSTOR, BCIN, NPS, Wikipedia, and AIC Specialty Groups were present in three-fourths of the fill-in responses.
When asked for the success rate of finding information on a certain topic, those searching for information on preventive conservation, environmental guidelines, material suppliers, and disaster planning were successful more than half the time. Unsurprisingly, when it was treatment information that was sought, more than half of the users were unsuccessful. To qualify the lack of "success" of a search, 70% of users cited the lack of information specific to their exact needs, 49% were concerned that the information was not up to date, 43% cited concerns about reliability, and 32% were dismayed by the time it took to find the information.
Eric expressed surprise that an archive of treatments topped the list of enhancements desired by the respondents. I do not remember if this was a fill-in question or what I personally responded, but this result did not necessarily strike me as surprising. Rather, I see it being in line with the lack of information on treatment procedures—both historic and current—that was noted in the above section of the survey.
From among the list of Digital Tools used most often, Eric noted the absence of collaborative spaces, such as Basecamp and Dropbox, from the list of image and document management tools, but suggested that maybe some forgot to list these oft-used programs, as they are not conservation-specific.
Finally, respondents identified the policy issues of most concern to them as obstacles to creating, sharing, and accessing content: Copyright/IP (Getty), institutional/repository policies, time (?), and standards/terminology ranked high. It was unclear at first what was meant by the latter, but David Bloom's talk (below) did a good deal to illuminate its importance.
Eric concluded by noting that although a web-survey platform does self-select for respondents with certain habits, sympathies, and concerns (i.e., those who access the internet regularly and seek to use it as a professional tool), the data represents a good range of age and experience. These groups can be correlated to certain responses; for example, 45-65 year-olds are more likely to search for collections information and are more interested in faster internet access and better online communication. Younger stakeholders are searching more for professional information and jobs.
Again, be reminded that this data is very preliminary. A final report can be expected by December 2014.
2. SPEAKER: Ken Hamma
Ken Hamma then discussed the Mellon Foundation’s efforts in the areas of conservation and digitization, the goals and directions of these efforts, and their relationship to larger movements in the Digital Humanities.
An immensely appropriate choice to speak at this session, Ken Hamma is at once a consultant for the Yale Center for British Art, the Office of Digital Assets and Infrastructure (ODAI) at Yale, ResearchSpace, and the Museums and Art Conservation Program at the Andrew W. Mellon Foundation. He is a former executive director for Digital Policy and Initiatives at the J. Paul Getty Trust and has also served as a member of the Steering Committee of the Coalition for Networked Information (CNI), a member of the Research Libraries Group (RLG) Programs Council of OCLC, and a member of the At-Large Advisory Committee of the Internet Corporation for Assigned Names and Numbers (ICANN).
In 2003, Hamma began his advocacy for the use of digital tools in conservation documentation, when a meeting was convened between a select number of institutional heads and conservators to feel out expectations of the Mellon in these matters: how best it should invest in the digitization of treatment records, how and whether these should be accessible, and by what audiences. This initial meeting was followed by the Issues in Conservation Documentation series, with a meeting in New York City in 2006 and one in London in 2007. As the respective directors and heads of conservation of each host institution were present, this represented a recognition of the importance of institutional policy to what are fundamentally institutional records. Outcomes of these meetings were mixed, with European institutions being more comfortable with an open-access approach, perhaps due to the national status of their museums and the corresponding legal requirements for access. This was exemplified in the response of the National Gallery, London: the Raphael Project includes full scans of all conservation dossiers. Even National Gallery staff were surprised this became public! (More pilot projects resulting from this Mellon initiative are listed here.)
In America, the Mellon began considering supporting digitization efforts and moving conservation documentation online: in 2009 it funded the design phase of ConservationSpace.org to begin imagining online, inclusive, and sustainable routes for sharing. Merv Richard of the National Gallery led 100 conservators in the development of its structure, its priorities, and its breadth, presenting a discussion session at AIC's 41st Annual Meeting in Indianapolis.
Important observations are being made when studying potential models, notably the similarities in which the National Park Service, libraries, natural science collections, etc. handle networked information. Although there were necessarily different emphases on workflow and information, there were also large intersections.
In the meantime, CoOL shows its age. Its long history has necessitated a few migrations across hosts and models: from Stanford Libraries to AIC, and from Gopher to WAIS to W3. It is still, however, based on a library-catalogue model, in which everything is represented to the user as a hypertext (hypermedia) object. In such a system, there are only two options available: to follow a link or to send a query to a server. As important as this resource has been for our professional communication and for the development of our discipline, it lacks the tools for collaboration over networked content. Having become a legacy resource, it is discontinuous from other infrastructures, such as Wikipedia (pdf), HathiTrust, Shared Digital Future, and Google Books, all of which point to a more expansive set of technological opportunities, such as indexing, semantic resource discovery, and linking to related fields.
Our discipline does not exist in a vacuum, and the structuring of our online resources should not suggest otherwise. Additionally, we need to be able to identify trustworthy information, and this is not a unique problem: we have to open ourselves up to the solutions that other disciplines have come to implement.
Ken encourages us to think of accessible data as infrastructure, which forces the creator to think about applications of the data. A web platform should be more than just switches and networks! It should support collaborative research, annotation, sharing, and publication. This platform should increase our ability to contribute to, extract from, and recombine a harmonized infrastructure that we feel represents us.
Planning for the extent of our needs and building it is not beyond a shared professional effort. We will find it to have been worth it.
3. SPEAKER: Nancie Ravenel
Nancie Ravenel, Conservator at the Shelburne Museum, former Chair of Publications and Board Director of Communications, works very hard to create and disseminate information about digital tools and their use to conservators. She is continuously defining the digital cutting-edge, at once “demystifying” conservation through outreach, embodying the essential competencies, and articulating the value of this profession. Her segment of the session provided an overview of key resources she uses as a conservator, noting how the inaccessibility of certain resources (e.g. ARTstor, ILL, and other resources requiring an institutional subscription) changes how she locates and navigates information.
“What does Nancie do in the digital landscape?,“ Ravenel asked. She makes stuff. She finds stuff. She uses and organizes what she makes and finds. And she shares what she’s learned.
Nancie divided her presentation of each function into four sections:
◦ Key resources she uses as a conservator
◦ Expectations of these resources
◦ What is missing
◦ and What remains problematic
In our capacity as makers of stuff, many of us, like Nancie, have begun to experiment with, or are already proficient at, using Photoshop for image processing and analysis, experimenting with 3D imaging and printing, gleaning information from CT scans, producing video, and generating reports.
Where making stuff is concerned, further development is needed in the area of best practices and standards for creating, processing, and preserving digital assets! We need to pay attention to how assets are created so that they can be easily shared, compared, and preserved. Of great concern to Ravenel is the fact that Adobe's new licensing model increases the expense of doing this work.
On the frontier of finding stuff, certain resources get more use from researchers like Nancie, perhaps for their ease of use. Ravenel identifies CoOL/CoOL DistList, jurn.org, AATA, JSTOR, Google Scholar/Books/Images/Art Project/Patent, CAMEO, Digital Public Library of America (dp.la), WorldCat, Internet Archive, SIRIS, any number of other art museum collections and databases (such as the Yale University Art Gallery or the Rhode Island Furniture Archive), and other conservation-related websites, such as MuseumPests.net.
The pseudo-faceted search offered by Google Scholar, which collates different versions, pulls from CoOL, and provides links to all, is noted as being a big plus!
There is, however, a lot of what Nancie terms "grey literature" in our field, material not published in a formal, peer-reviewed manner (such as listserv or post-print content, as well as newsletters, blogs, or video content). The profusion of places where content is available, the inconsistent terminology, and the inconsistent metadata or keywords (those read by reference-management tools or those that facilitate search) applied to some resources are the most problematic issues when finding stuff.
As Richard McCoy has always insisted to us, "if you can't 'google' it, it doesn't exist," and Nancie reiterates a similar concern: if you can't find it and access it after a reasonable search period, it might as well not exist. By way of a list of what is harder to find and access, she provided the following areas of need:
• AIC Specialty Group Postprints that are not digitized, that are inconsistently abstracted within AATA, or whose manner of distribution makes access challenging.
• Posts in AIC Specialty Group electronic mailing list archives are difficult to access due to lack of keyword search
• Conservation papers within archives often have skeletal finding aids; and information is needed about which archives will take conservation records.
• ARTstor does not provide images of comparative objects that aren’t fine art.
Any effort to wrangle these new ways of assembling and mining information using technology needs to consider using linked resources, combining resources, employing a more faceted search engine, and deploying better search options for finding related objects. Research on the changing search habits of everyone from chemists to art historians should help us along the way.
In her capacity as a user and organizer of what she makes and finds, Nancie knows that not every tool works for everyone. However, she highlights resources such as Bamboo DiRT, a compendium of digital-humanities research tools that work and sync across platforms, browsers, and devices, allow for exporting and sharing, and can let you look at your research practices in new and different ways. Practices to be analyzed include note taking, note management, reference management, image and document annotation, image analysis, and time tracking. Databases such as these offer structure for documenting and analyzing workflow; used systematically, they can greatly increase the scientific validity of a project over a merely anecdotal approach. For a large cleaning project, such as that undertaken with the Shelburne carousel horses, this is indispensable.
What is missing or problematic? A digital lab notebook is not ideal around liquids but is very well suited to logging details and organizing image captures. Current methods cannot measure the results of treatments computationally. Also missing are good tools for comparing, annotating, and adding metadata to images on mobile devices, as well as better interoperability between tools.
And after all of this analysis of one’s use of digital tools, how is it best to share what one has learned? The AIC Code of Ethics reminds us that:
“the conservation professional shall contribute to the evolution and growth of the profession…This contribution may be made by such means as continuing development of personal skills and knowledge, sharing of information and experience with colleagues, adding to the profession’s written body of knowledge, and providing and promoting educational opportunities in the field.”
The self-reflexive exercise that Nancie Ravenel modeled in her talk, analyzing her personal use of digital tools and how personal needs and goals may reflect and inform those of others, will not only be indispensable to the future development of digital tools that meet this call to share, but it contains in itself a call to share: Nancie asks, what do you use to share and collaborate with your colleagues? How might these systems serve as a model for further infrastructure?
Email, listservs, and forums; the AIC Wiki; research blogs, and project wikis enabling collaboration and peer review; document repositories like ResearchGate.net and Academia.edu; shared bibliographies on reference management systems like Zotero.org and Mendeley.com; collaboration and document-sharing software like Basecamp, Google Drive, and Dropbox; and social-media platforms allowing for real-time interaction like Google Hangouts are all good examples of tools finding use now.
Missing or problematic factors in our attempts to share with colleagues include the lack of streamlined ways of finding and sharing treatment histories and images of specific artworks and artifacts; the lack of archives that will accept conservation records from private practices; and the persistent problem of antiquated, often confusing IP legislation.
In addition to sharing information with other conservators, we must also consider our obligation to share with the public. Here, better, more interactive tools for the display of complex information are needed. As media platforms are ever-changing, these tools must be adaptable and provide for some evaluation of the suitability of the effort to the application.
4. SPEAKER: David Bloom
Described by Eric Pourchot as a "professional museophile," David Bloom seemed a non sequitur in the flow of the event. However, as coordinator of VertNet, an NSF-funded collaborative project making biodiversity data freely available online, he spoke very eloquently about the importance of, and the opportunities offered by, data sharing and online collaboration. He addressed issues of community engagement in digital projects, interdisciplinary collaborations, and sustaining effort and applicability throughout these projects. As argued in the other short talks, conservation is yet another "data-sharing community" that can learn from the challenges met by other disciplines.
As described by Bloom, VertNet is a scalable, searchable, cloud-hosted, taxa-based network containing millions of records pertaining to vertebrate biodiversity. It has evolved (pun intended) from the first networked-information system built in 1999 and has grown over various revisions as well as by simple economies of scale, as the addition of new data fields became necessary. It is used by researchers, educators, students, and policy-makers, to name a few. As the network is a compilation of data from multiple institutions, it is maintained for the benefit of the community, and decisions are made with multiple stakeholders under consideration.
Amongst the considerable technical challenges through all of its iterations, VertNet has struggled to establish cloud-based aggregation, to cache and index, to build search and download infrastructure, and to rein in all the associated costs.
Additionally, intellectual property considerations must be mentioned: even though the data is factual (and facts cannot be copyrighted), the data "belongs" to the host institutions, as they are its historical keepers. As a trust, VertNet does not come to own the data directly. This made a distributed network with star-shaped sub-networks necessary, even though it was expensive to maintain, especially for a small institution, requiring many servers with many possible points of failure; once a point failed, it was difficult to locate. Costing about $200k per year, this was an expensive system to maintain, and although it was still the best and most secure way to structure the network, it was not as inclusive as it could have been for its expense.
There are always social challenges to building such "socio-technical networks," and this is something that the FAIC is discovering by simply attempting to poll its membership. It doesn't work if people don't want to play. What ensue are knowledge gaps, variable reliability, and a lack of resources. To speak more broadly, any entity entrusted with indexing information needs people to get over their fear of sharing, learn the benefits, and acquire the skills associated with being connected (cf. social-media privacy controversies). All the knowledge and time needed to meet everyone where they are technologically and bring them along in a respectful manner does not exist in one place, so priorities must be defined for the best investment of time and funds to bring the discipline forward.
Bloom found that disparate data hosts could not communicate with each other: they either had different names for similar data fields, which needed to be reconciled, or they did not maintain consistent terminology, either globally or internally.
This problem had already been solved in a number of ways. For example, the Darwin Core standard builds on Dublin Core and is maintained, like the European ABCD standard, by Biodiversity Information Standards (TDWG). Darwin Core defines 186 fields, with a standardized vocabulary in as many of them as possible. These standards are community-ratified and community-maintained so that they cannot be easily or unnecessarily changed. This allows for easy importation by mapping incoming data sets to the Darwin Core standard; all the data is optimized for searchability and discoverability; and publication and citation tools are thereby streamlined.
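To make the idea of mapping incoming data sets onto a shared standard concrete, here is a minimal, hypothetical sketch; the local field names on the left are invented for illustration, while the target terms (catalogNumber, scientificName, eventDate, recordedBy) are genuine Darwin Core terms:

```python
# Hypothetical local column headers mapped to Darwin Core terms.
LOCAL_TO_DWC = {
    "museum_number": "catalogNumber",
    "species_name": "scientificName",
    "collected_on": "eventDate",
    "collector": "recordedBy",
}

def to_darwin_core(record):
    """Rename known local fields to their standard terms; keep the rest as-is."""
    return {LOCAL_TO_DWC.get(key, key): value for key, value in record.items()}

# to_darwin_core({"museum_number": "AMNH-1234", "species_name": "Corvus corax"})
# -> {"catalogNumber": "AMNH-1234", "scientificName": "Corvus corax"}
```

The same pattern, agreeing on a ratified vocabulary and mapping each contributor's fields onto it once, is what makes aggregated records searchable and citable across institutions.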
This type of study of the state of the art, necessary when designing new database infrastructure, can serve as a model for the field of conservation. At the foundation of a successful system, will be a serious study of what has been done in other fields and of what is most useful to prioritize for this one.
As VertNet is based entirely on voluntary participation, it is critical that participants understand the benefits of submitting their data to the trust. The staff at VertNet makes themselves available to help the host institution through any technical difficulties encountered in the data exportation and importation process. Backups of this data are scrupulously maintained throughout the migration process. A major benefit to the exporting institution is VertNet’s data-quality checks which will complete, clean up, and streamline fields and then will send back a report so that the client can update their own databases. This brings local data-maintenance standards in-line with those maintained by the global database.
Additionally, the NSF grant has made training workshops, the development of analytical tools, and certain instances of impromptu instruction possible for clients. This has led to VertNet's exponential growth without advertising. The repository now represents 176 institutions with 488 collections, and many, many more want in from the waiting list. All of these institutions are voluntarily submitting their data despite historical concerns about "ownership"; all of them realize the benefit of membership for themselves, for researchers, and for the state of the discipline.
Unfortunately, however, this "traditional" (eek) model of procuring NSF (or NEH, IMLS, etc.) funding to cover costs is becoming unsustainable. Support for these services is desperately needed now that their utility is established. The value-add model is difficult even if VertNet does believe in "free data."
The associated cost does not change; however, the database was built as a community tool. So even though the common perception is of an unchanging status quo, the community will have to support the project insofar as it finds the resource valuable and important. A common misconception propagated by recalcitrant host institutions is that "we can do it ourselves." The fact is, however, that most stewards of data can't, and even more won't, turn around and make these records available to the community for revision, maintenance, reference, or analysis.
5. DISCUSSION
The audience then exploded with responses:
Pamela Hatchfield (Head of Objects Conservation at the Museum of Fine Arts Boston and AIC Board President) began by reminding those who had been romanced by visions of star-shaped networks that concerns about maintaining privacy are still driven by private funding. Although there is now a conservation module in TMS, and terminological standardization is a frequently cited concern, this data is clearly not intended for the public. Historically, private institutions maintain the attitude that data should be tightly held. There is a huge revenue stream from images at the MFA, and as such it is difficult even for staff to obtain publication rights.
Terry Drayman-Weisser (Director of Conservation and Technical Research at the Walters Art Museum) pointed out that the Walters walks a middle path by providing a judiciously selected summary of the conservation record associated with an object. Not all of the information is published.
Certain institutions, such as the British Museum, have an obligation to make these records public unless the object falls into certain categories. The 2007 Mellon "Issues in Conservation Documentation" meeting at the National Gallery, London, provides a summary of the participants' approaches to public access at the time of publication.
I did have time to ask a question about the privacy concerns attendant on a biodiversity database. Why does it seem that there is less hesitancy at the prospect of sharing? In reality, these institutions do overcome certain hurdles when deciding what to make publicly available: It turns out that certain data about endangered species should not be shared. Although he did not have time to elaborate, I was curious how this “species privacy” might compare to “object privacy.”
VertNet, it turns out, cannot even find protection under the “Sweat-of-the-Brow” doctrine, as this factual information cannot be copyrighted. What about those portions of conservation documentation which are markedly drawn from speculation, interpretation, and original research? This information can be copyrighted, as per each institution’s policies, but our culture is changing. “We don’t train students to cite resources properly,” he noted, “and then we wonder why we don’t get cited.”
The time allotted for the session was drawing to a close, and everyone expressed their regrets that the conversation could not go on for longer and that more people could have attended.
I would personally like to thank FAIC, the speakers, the Mellon, Kress, and Getty Foundations, and all of the participants for their part in a very thought-provoking discussion. I hope and trust that it will continue in future fora.