An opportunity for this has presented itself to me anyway, so I thought I could share the results with you.
De Montfort University is a member of the UK LOCKSS Alliance and actively collects content from electronic journals, where this is permitted and where we have an interest in continuing access to the content. I intend to delegate the mechanics of this task to a colleague, and explaining it to someone else is always a good opportunity for finding out how something works.
First, explain why we are doing this
LOCKSS (Lots Of Copies Keeps Stuff Safe) is an application designed to ensure continuing access to articles published in electronic journals and held off-site. By downloading and storing content, and periodically checking its integrity, LOCKSS ensures that the University has access to this content should it ever become impossible to access from the publishers’ own web sites.
From time to time new Archival Units are released as candidates for archiving on our LOCKSS Box. These come as emails with headings like '747 additional archival units now available for preservation'. I pass these emails on to Subject Librarians who know more about the interests of people studying and researching in the university than I do. I am effectively asking them to guess which titles we would miss if, at some later date, we no longer had online access to them.
Second, explain how we go about doing this
Once we have a list of journals to archive we can begin activating them as content we want the LOCKSS Box to collect. The process goes like this:
- Point your browser at the LOCKSS Admin page and log in;
- Select the 'Add titles' link from the main menu;
- Select one or more publishers from the list and click on the 'Select Titles' button;
- Click on the checkbox for each Archival Unit of the journals we want to add to LOCKSS. An 'Archival Unit' is roughly equivalent to a printed journal volume.
- Confirm these additions by clicking on the 'Add selected AUs' button.
Is this as easy as it could be?
You might think so from the bare step-by-step guide above.
In practice we spotted a couple of glitches. For example, the list of journals that subject librarians wanted to preserve included some titles that we do not subscribe to, and so cannot access. By sending an unedited list of Archival Units to preserve, I raised expectations that interesting journals are available to us when they are not. Perhaps if we had checked the list against our Electronic Journals A-Z list we could have avoided this.
Then again, the subject librarians have selected titles that are not on the A-Z list, but maybe should be. One surprising benefit (to me) of the digital preservation workflow is that it highlights gaps in the coverage of the A-Z and OpenURL Resolver.
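A check like the one described above can be sketched as a simple set comparison. This is only an illustration: the titles are invented, and it assumes both the candidate Archival Unit list and the A-Z list can be exported as plain lists of journal titles.

```python
def normalise(title):
    """Lower-case and strip whitespace so trivial formatting
    differences do not hide a match."""
    return title.strip().lower()

def titles_needing_review(candidates, holdings):
    """Return candidate titles absent from the A-Z holdings list.

    These fall into the two cases discussed above: titles we do not
    subscribe to, and titles that perhaps should be on the A-Z list.
    Either way, a human needs to look at them before archiving.
    """
    held = {normalise(t) for t in holdings}
    return sorted(t for t in candidates if normalise(t) not in held)

# Illustrative data only -- real lists would come from the LOCKSS
# selections and the Electronic Journals A-Z export.
candidates = ["Journal of Widget Studies", "Annals of Examples", "Review of Things"]
holdings = ["journal of widget studies", "Review of Things"]

print(titles_needing_review(candidates, holdings))  # ['Annals of Examples']
```

The output is simply a review list, not a rejection list, since a missing title may mean either "do not archive" or "add it to the A-Z".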
Selection by 'Guesswork' or 'Science'?
At present the subject librarians making the selections have little more to go on than intuition. They might ask their teaching and researching colleagues for their views, but that would not necessarily reduce the amount of guesswork involved. It is worth looking for sources of data that could help to inform decision making in this process. One source would be the usage statistics for current use of electronic journals. In the UK there is a service that automatically collects and manages usage statistics: the Journal Usage Statistics Portal (JUSP), a JISC-sponsored service that, at present, collects data from the publishers involved in the NESLi deals. Obtaining usage statistics for journals available for archiving in LOCKSS could assist people making collection development decisions. It could also act as a way of evaluating the choices made by the selectors: have well-used journals been selected for archiving where they are available?
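That closing evaluation question can be sketched in the same spirit. All the figures below are invented for illustration; in practice the usage counts would come from a statistics service such as JUSP, and the threshold for "well used" is an arbitrary assumption.

```python
# Flag well-used journals that are available for archiving in LOCKSS
# but were not selected by the subject librarians.

usage = {  # hypothetical downloads per year
    "Journal of Widget Studies": 850,
    "Annals of Examples": 12,
    "Review of Things": 430,
}
available = {"Journal of Widget Studies", "Annals of Examples", "Review of Things"}
selected = {"Journal of Widget Studies"}

THRESHOLD = 100  # arbitrary cut-off for "well used"

overlooked = sorted(
    title for title in available - selected
    if usage.get(title, 0) >= THRESHOLD
)
print(overlooked)  # ['Review of Things']
```

A list like `overlooked` would not make the decision by itself, but it would give the selectors something more concrete than intuition to react to.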