The Library of Congress, among its responsibilities, is digitizing the wealth of historic records it holds – both to preserve them for future generations, and to make them accessible to a larger audience online.
Tom Rieger, the manager of the library’s Digitization Services Section, said its contributions to the Federal Agencies Digital Guidelines Initiative (FADGI) for more than 20 years have helped federal institutions around the country figure out how to manage cultural heritage digitization.
“The challenge here is to do it correctly, but not foolishly — on too good a specification for what’s needed, or not enough of a specification for what’s needed. So we have to define exactly what that sweet spot is, and then figure out, for the rest of the community, just the best way to do that,” Rieger said in an interview.
While the COVID-19 pandemic has made it harder for researchers and the public to access the library’s resources in person, Rieger said the agency has an opportunity to make its records more accessible through digitization.
“If we can make this material available in good enough quality, and easy enough access for the world, we’ve done our job,” Rieger said.
“It’s fabulous, it’s one-of-a-kind, and it’s in danger,” Rieger said about this material. “We are spending enormous amounts of time and effort to save that for the very last time we can, at as high a quality as we possibly can.”
Rieger’s section of the library develops all the standards and metrics for records digitization across the organization, as well as for much of the cultural heritage imaging community.
“The amount of digitization that we are doing today, as opposed to what we were doing five, 10, 15 years ago, is just stunning. We’re now dealing with many, many millions of images a year that we’re adding to the collections online, and adding to internally accessible collections as well,” Rieger said. “But it’s an entirely different IT world today than it was back then. We have vastly better tools.”
Library records under consideration for digitization go through a rigorous process. Rieger said the library’s curatorial division first submits a proposal for a digitization project that goes to a committee of senior executives.
“They review a project proposal, which Digital Collections Management Services staff have helped them put together, to say, ‘OK why is this important? Why do we want to do this? What benefit is this to people?’” Rieger said.
If the committee approves a project for digitization, it then heads to a technical analysis group, which conducts a feasibility assessment.
“That’s where we get all the players in a room. Conservation will look at it and say, ‘What do we have to do to make this camera-ready, to where it’s safe to handle?’ The lawyers will look at it and say, ‘Do we have copyright clearance? Are there any other complications here?’ The metadata people will look at it and say, ‘OK, is the metadata ready for this or not?’ And if the answer is no, we’re not going to do it. The metadata has got to be ready in order for us to approve a project to go into production. Now, that doesn’t mean we’ll say, ‘No, go back to square one.’ We just say, ‘Come back to us when you’ve got this piece together,’ because if we image things without having the metadata, you’ll never find it,” Rieger said.
“The way we did it five years ago is not the way we’re going to be doing it going forward, because technology has changed in the last five years. And it’s our job to understand those changes in technology, and in some cases, actually make that technology adjustment, to actually build it here ourselves, or contract with experts around the country, around the world, who can actually help us improve the science and technology involved in this,” Rieger said.
Rieger said the evolution of FADGI guidance, the first version of which came out in 2010, reflects advancements in technology.
“If you look at the state of technology 12 years ago, we didn’t have anywhere near the sophistication of tools to work with. It’s the cameras. Back then, we were dealing with, at the very best, maybe 60 megapixel cameras. That’s not bad, that’s fine. Today, we’re dealing with 150 megapixel cameras that are vastly faster at capturing the image, transferring the image,” Rieger said.
The updated guidelines include a total rewrite of the metadata standards and a switch to a new method of measuring color, one that is universal no matter the application.
“When you put things in various forms, like you’re putting it on the web, or you’re putting it out for publication, you’re going to be in a different color space, as they call it. The results might not be what you might expect them to be, using the old methods of doing this. So we’re learning, we’re refining. These are not the way things could have been done 10 years ago, but they’re the way they should be done,” Rieger said.