JUSP Community Survey 2017

A little over a year ago I blogged the results of the 2016 annual survey that Evidence Base runs for JUSP, the Jisc resource that tells academic and research libraries about the use of the e-journals and e-books they have purchased or to which they subscribe.

We have now completed this year’s survey and the 2017 report. It has been an interesting year because JUSP has added more reports and other resources to its service since last February. As well as being able to see how many times library users have downloaded or viewed e-journals or e-books, library e-resource managers can now find those statistics for a selection of databases and aggregators. The JUSP team have also developed a set of data visualisations of the statistical reports, which library staff can use in their own reports and presentations or as an analytical tool.


These data visualisations have proven popular with the JUSP community. Although it is early days since their introduction in November, some institutions are already making good use of them and see potential for them to have a positive impact on their work. We will be monitoring their use and would like to hear about any interesting ways in which libraries are using the JUSP visualisations.

Overall, the JUSP community are happy with JUSP; they would find their work much harder without it. Special mention was made of the customer support provided by the JUSP team, who work tirelessly in the background to ensure that JUSP users can access their e-resource usage figures and to develop the online resource in the way that is most useful to the JUSP community.


Latest News from JUSP

I have two pieces of news from JUSP. The team has been very busy over the past few months and both items have now come to fruition. The first arose from the interviews we did earlier in the year about e-book statistics. You may recall that I blogged about “the trouble with ISBNs”; that post came out of my work on the e-book statistics project.

We wanted to know what challenges were faced by the teams and individuals whose roles include the collection and reporting of e-book usage statistics. We carried out case study interviews with a cross-section of publishers, librarians, aggregators and library consortia from the UK and other countries. We asked not only about the challenges, but also about how they overcame them and what recommendations they would give for the future collection of e-book statistics.

We discovered that one major problem was the lack of a standard for what is termed a section of a book. This means that if you are counting the number of times a book section has been downloaded, you cannot be sure whether that is a whole chapter, a single page, or even one dictionary entry. We also found, surprisingly, a lack of relevant common identifiers, hence my thoughts on ISBNs. And in this age of machine automation, many of the solutions to these challenges involved a great deal of manual work and manipulation.

The project and the recommendations that resulted from the work have been written up as an article in Insights and as a full report.

The second news item is that the e-book portal in JUSP will no longer be called the e-book portal. This is because, from 4th September, that portal will contain COUNTER reports for databases as well as e-book reports. The portal will be re-titled “Books and other”. The team are working towards including further reports on that portal as well, such as multimedia. As always, the team are speaking to many publishers, and with the addition of more COUNTER reports more publishers will be joining JUSP. You will find a little more information about this in the JUSP newsletter, and look out for further details as the team make the changes to the portal.

IRUS-UK Community Survey 2017

Another of Evidence Base’s regular tasks is to compile and send out the IRUS-UK user survey. IRUS-UK is a Jisc community-driven resource for institutional repositories in the UK. When a repository becomes a member of IRUS-UK, the IRUS team collects the usage data for items deposited in that repository and processes it into COUNTER-compliant statistics. I have promised to write a blog post about COUNTER and I apologise for not having done so yet; in the meantime, follow this link to find out more about the project and the code of practice. The benefit to repositories is that they then have validated usage data about the work that has been deposited, which can be compared with other standardised data.

The annual survey ensures that the IRUS team are developing IRUS-UK to suit the needs of the institutional repository community, and a great deal of attention is paid to the feedback that is received. Sometimes it is not possible to implement a suggestion, and sometimes it takes time to make a change, but each suggestion is considered thoughtfully and steps are taken to improve the website and the service.

This year we found out that IRUS-UK is mostly used for identifying trends and patterns of usage. It is used least for SCONUL reporting. Other uses for IRUS-UK include:

  • Awareness raising
  • Checking records
  • Tweeting about statistics
  • Advocacy with researchers

I particularly like the idea of tweeting about the statistics. It is good to boast on Twitter that a certain document has been accessed, especially as I have just found out through IRUS-UK that one of my articles has been downloaded 185 times.

The IRUS-UK community consider the most useful reports to be Repository Statistics and Item Reports. Most respondents thought that IRUS-UK provided value because of its reliable and accurate COUNTER-compliant statistics, because it helps with benchmarking, reporting and comparing data, and because it saves a lot of staff time.

Some of the best things about IRUS-UK are:

  • It is easy to use and access
  • It is good at archiving statistics
  • The community support is great

There were requests for some new features, such as quick guides to the reports, more case studies and bite-sized videos demonstrating how to use the reports. Members also wanted more data visualisations (sometimes only a picture will tell the story), and there was a request that IRUS-UK become mobile friendly.

These and other suggestions are now under consideration by the team. Keep looking at IRUS-UK to see how it develops. And thank you to the institutional repository staff out there who took the time to complete the survey.

JUSP Community Survey 2016


One of Evidence Base’s bread-and-butter jobs is to conduct the annual user survey for JUSP, which is the Jisc specialist resource for university and further education college librarians. JUSP tells them about the use of the online journals and e-books that they have purchased or to which they have subscribed. By using JUSP a librarian can find out whether a certain journal or e-book has been used and how many times. In the old days, when everything was paper, it was easy to work out whether books had been taken out of the library, and whether journals were looking well thumbed or were still in pristine condition because no one had looked at them. E-resources are different: with all sorts of metrics available, you can find out whether something has or hasn’t been downloaded, and from which IP address. Of course, it does mean that someone, somewhere has to collect and collate that information. Briefly, JUSP works by using SUSHI to harvest COUNTER statistics, and it then gives you reports about the resources used by your institution. Currently, JUSP has a “Journal” portal and an “e-book” portal. If you have no idea what SUSHI and COUNTER mean, don’t worry about it; they are simply a means to an end and I will devote whole posts to explaining them in the future. In the meantime, the purely illustrative sketch below gives a flavour of what a SUSHI harvest involves; then, back to the survey…
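A minimal sketch, assuming a COUNTER_SUSHI-style REST endpoint: the base URL, credentials, report name and JSON structure below are placeholders and assumptions rather than real JUSP or publisher values, and in practice JUSP does this kind of harvesting on libraries’ behalf so you never have to write it yourself.

```python
# Purely illustrative: fetch a journal usage report from a hypothetical
# COUNTER_SUSHI-style endpoint and total up the full-text requests per title.
# The endpoint, credentials and report name are placeholders, not real values.
import requests

BASE_URL = "https://sushi.example-publisher.org/counter"  # placeholder endpoint

params = {
    "customer_id": "YOUR-CUSTOMER-ID",    # issued by the publisher/platform
    "requestor_id": "YOUR-REQUESTOR-ID",
    "begin_date": "2016-01",
    "end_date": "2016-12",
}

response = requests.get(f"{BASE_URL}/reports/tr_j1", params=params, timeout=60)
response.raise_for_status()
report = response.json()

# Each report item describes one journal; its Performance entries hold the
# monthly counts, which are summed here into a usage total per title.
for item in report.get("Report_Items", []):
    total_requests = sum(
        instance["Count"]
        for performance in item.get("Performance", [])
        for instance in performance.get("Instance", [])
        if instance.get("Metric_Type") == "Total_Item_Requests"
    )
    print(item.get("Title"), total_requests)
```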


We finished the JUSP community survey at the end of last December, and the full report is now ready to read here: http://jusp.jisc.ac.uk/news/jusp-community-survey-2016-report.pdf. The results showed that JUSP is a much-wanted resource for university and college librarians that adds value to their service, provides them with reliable data and saves them time (somewhere in the region of seven hours’ work a month). One respondent commented that JUSP has a “hugely positive impact on staff time… freed up for analysis rather than finding and downloading data files”. Many respondents thought that JUSP was vital to their service because without it they would have to take staff away from other tasks; one even thought that they would not use usage statistics any more if JUSP did not exist.

Of the journal statistics that JUSP can calculate, the most vital appeared to be “Journal Report 1 (JR1)”, which tells you the number of requests there have been for a specific journal per month, over a period of time that you can select. Users of JUSP use the Journal portal statistics for a range of tasks:

  • Ad-hoc reporting
  • SCONUL reporting
  • Considering subscription renewals
  • Answering general enquiries
  • Finding evidence to prove access rights
  • General annual statistics
  • Evaluating deals with publishers
  • Benchmarking against other organisations
  • To put a list of “top ten journals” on the website
  • To search for anomalies in their data

The e-book portal is less used; it is still in development and does not yet contain statistics from many of the e-book suppliers used by the libraries, and librarians can get the statistics that they need directly from the publishers. However, respondents did think that e-book statistics are important, and those who do use JUSP’s e-book portal (just under half the respondents) use Book Reports 2 and 3 (BR2, BR3) to find out the number of times an e-book has been requested per month over a selected period of time (BR2) or the number of times access to an e-book has been denied (BR3). When a library user asks for a certain e-book only to discover that they cannot access it, this indicates that the library does not subscribe to, or has not purchased, that e-book. Librarians like to know this so that they can understand what their customers want. So, our respondents told us that they use e-book statistics for:

  • Collection development
  • Choosing which subscriptions to renew
  • To see how the full collection is being used
  • To make purchasing decisions
  • To calculate cost per download
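The last of these is simple arithmetic, but it may help to spell it out. A toy illustration only: the titles, counts and costs below are invented, and a real calculation would combine a BR2 export with the library’s own subscription costs.

```python
# Toy cost-per-download calculation (all figures invented for illustration).
usage = {"E-book A": 350, "E-book B": 12}               # BR2-style requests over the chosen period
annual_cost = {"E-book A": 500.00, "E-book B": 500.00}  # from the library's own records, not JUSP

for title, requests in usage.items():
    if requests == 0:
        print(f"{title}: no recorded use")
        continue
    print(f"{title}: £{annual_cost[title] / requests:.2f} per download")
```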

Overall, JUSP users were satisfied with the service it provides, and they praised the JUSP team for the support given to users. There are some very appreciative clients there, so any university and college librarians out there who are not using JUSP should take a look at it; you may benefit from its use.

CERIF in Action Workshop


Evidence Base is currently working with the JISC-funded Digital Infrastructure programme. One of the areas of the programme is Research Information Management (RIM), which includes the adoption of the Common European Research Information Format (CERIF) by the UK higher education community. As one of the steps in data gathering, we attended the CERIF in Action project workshop on the 19th of October. The workshop brought together nearly 60 professionals working on research information management from the higher education, library and commercial sectors.

The workshop centred on discussing the current situation and future prospects of research information management from different perspectives, including those of the Research Outcomes System (ROS), ResearchFish, Gateway to Research, the UK Research Councils and others. Several surveys presented at the event demonstrated the prevalence of, and demand for, bulk submission of research information, that is, institutions submitting large amounts of research information to ROS, in contrast to manual submission where principal investigators submit to research councils themselves. Such bulk submission reduces the community cost per submission by roughly half, and it is possible that CERIF can decrease these costs significantly further. The new RIM programme manager, Verena Weigert, estimated that savings of 20 to 30% could be made if CERIF were made compatible with Current Research Information Systems (CRIS).

During the break-out sessions, the workshop also brought practitioners together to produce possible next steps for CERIF, extended use cases, ResearchFish/ROS and non-textual outputs. The main issue identified across all areas was the need to standardise terminology and definitions.

This event provided Evidence Base researchers with valuable information for their work. More information on the workshop including its materials will be made available on the CERIF in Action blog.

Mobile technologies in libraries – community support project outputs


Mobile technologies in libraries (photo from Kennedy Library on Flickr)

The JISC-funded mobile library community support project we’ve been working on is drawing to a close, with just the final survey report still to be published. I thought it would be useful to provide a summary of the outputs with relevant links.

Social media resources

We’ve been using a number of social media services to collate resources and support discussions:

Case studies

We collected a number of case studies throughout the project:

Pathways to Best Practice

We brought together the resources we had collated as well as information from conversations with practitioners to put together ten pathways to best practice guides. Each includes an explanation of the area, the benefit to the library, current state of maturity, examples of initiatives in libraries, lessons learned and useful contacts. You can view each online or download a PDF version.

Fact finding surveys

We ran two surveys during the project: one at the beginning (Nov-Dec 2011) and one at the end (Jul-Aug 2012). The surveys gave an idea of where libraries were in terms of implementing initiatives with mobile technologies, as well as examining barriers to implementation and considering potential solutions. The report for the first survey is available from Slideshare (you can download a copy). The report for the second will be available shortly from the documents section of our Slideshare account.

Continuing the discussion

I’m sure we will continue to keep abreast of developments in the area and will continue to share resources we find (using the tag mlibs). We’ll also still be blogging, though the blog is in the process of being moved to a new home on the JISC blogging platform. We’ve also set up a JISCMail M-LIBRARIES-GROUP discussion list, so please feel free to subscribe and share any news or ask questions via the mailing list.

Making the Most of JUSP – Birmingham

Last week, a group of eager representatives from some of the libraries participating in the JISC-funded Journal Usage Statistics Portal (JUSP) joined us in Birmingham for the first Making the Most of JUSP event. The aim of the event is to provide attendees with the opportunity to:

  • learn more about how JUSP can help you understand e-journal usage and be introduced to some of its more advanced features
  • exchange experiences with other members of the ever growing JUSP community
  • provide feedback to the JUSP team to inform development of the service

To start the day, we had a brief introduction to the JUSP team (JUSP is a consortium project involving Mimas, JISC Collections, Cranfield University and Evidence Base), and then we had three excellent presentations from participating libraries – University of Portsmouth, Open University, and University of Glasgow.

Sarah Weston, University of Portsmouth

Sarah’s presentation highlighted the progress the University of Portsmouth have made in collecting and analysing usage data. She talked us through the initial process, which was manual and labour-intensive, and then discussed how JUSP has helped streamline their processes and some of the ways they use it. She outlined their needs, including information over a long time period for comparison, all sources in one place, identifiers for titles within or outside deals, and the ability to add print usage.

The reports and features within JUSP that are particularly useful for University of Portsmouth include:

  • Titles vs deals (used to identify titles within deals, which is the baseline for analysis, and to separate out pre-existing subscribed titles in order to benchmark costs accurately)
  • Titles within deals over time (which gives at-a-glance information on how deal content has changed, to facilitate accurate reporting)
  • Downloadable CSV data for spreadsheet software (including a column of aggregator usage to include or exclude)
  • Publisher usage by title and year (valuable for benchmarking, eliminating a significant number of steps and providing an accurate time series onto which to import their own data)
  • Titles and usage range (to identify titles with high usage levels)
  • Stars visually highlighting subscribed titles across reports (once subscribed titles have been added)

Alison Brock, Open University

Alison discussed the unique challenges of being a distance learning provider, and the importance of journal usage statistics in identifying curriculum changes and informing future purchasing. She gave an outline of the process using JUSP:

  1. Collect JR1 and JR1a reports from JUSP
  2. Export these to Excel for each NESLi2 publisher
  3. Sort by high to low use
  4. Remove monthly columns and others not needed
  5. Add in cost data
  6. Calculate cost per use
  7. Highlight and colour code titles (high use non-subscribed titles and low use subscribed titles)
  8. Add details of faculty for subscribed titles
  9. Recommend swaps/cancellations to faculty representatives
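As a hedged sketch only, the spreadsheet steps above might look something like this in scripted form; the column names, file names and thresholds are assumptions made for illustration, not the actual JUSP export layout or the Open University’s exact criteria.

```python
# Illustrative pandas version of the spreadsheet workflow described above.
# Column names and file names are assumptions, not JUSP's real export format.
import pandas as pd

# Steps 1-2: load a JR1 export for one NESLi2 publisher, downloaded as CSV.
jr1 = pd.read_csv("jr1_publisher.csv")

# Step 4: keep the title and the reporting-period total; drop monthly columns.
usage = jr1[["Title", "Reporting Period Total"]].rename(
    columns={"Reporting Period Total": "Requests"}
)

# Step 3: sort from high to low use.
usage = usage.sort_values("Requests", ascending=False)

# Steps 5-6: add locally held cost data and calculate cost per use.
costs = pd.read_csv("subscription_costs.csv")  # columns: Title, Cost, Subscribed, Faculty
merged = usage.merge(costs, on="Title", how="left")
merged["Cost per use"] = merged["Cost"] / merged["Requests"].replace(0, pd.NA)

# Step 7: flag high-use non-subscribed and low-use subscribed titles (thresholds invented).
merged["Flag"] = ""
merged.loc[(merged["Subscribed"] == False) & (merged["Requests"] >= 200), "Flag"] = "high use, not subscribed"
merged.loc[(merged["Subscribed"] == True) & (merged["Requests"] < 20), "Flag"] = "low use, subscribed"

# Steps 8-9: faculty details come from the cost file; save the sheet for faculty reps.
merged.to_csv("review_for_faculty.csv", index=False)
```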

Alison highlighted the fact that getting the data via JUSP meant it was easier to collate and compare, and that the JUSP checking processes ensure its reliability.

Jacqui Dowd, University of Glasgow

Jacqui’s presentation covered the importance of journal usage data in supporting the University of Glasgow’s KPIs (key performance indicators). JUSP has helped the University of Glasgow by reducing the need for some of the analysis that was previously done manually. They use the JR1 and JR1a reports (including gateway and intermediary use), as well as usage over specified date ranges and the number of titles in different usage ranges (nil, low, medium and high usage).

Using these reports has saved the University of Glasgow a great deal of time: it has reduced the number of providers they need to download data from by around 16% and reduced the need to calculate JR1 minus JR1a by title by around 75%. They also appreciate that JUSP data is checked by both the JUSP team and other participating libraries (improving its reliability), and that JUSP will chase publishers for data, ensuring up-to-date usage information is available.
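For anyone unfamiliar with the reports, JR1a counts full-text requests served from archive content, so “JR1 minus JR1a” per title simply gives the usage of current (non-archive) content. A toy illustration with invented figures:

```python
# Toy example: current (non-archive) usage per title = JR1 total - JR1a archive requests.
# Titles and counts are invented for illustration.
jr1 = {"Journal of Examples": 420, "Annals of Placeholders": 130}   # total full-text requests
jr1a = {"Journal of Examples": 60, "Annals of Placeholders": 5}     # requests from archive content

current_usage = {title: jr1[title] - jr1a.get(title, 0) for title in jr1}
print(current_usage)  # {'Journal of Examples': 360, 'Annals of Placeholders': 125}
```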

Workshop activities

The attendees were then split into four groups, each doing one of two practical exercises devised by the JUSP team. The exercises encouraged them to explore JUSP and its reports, and to consider how they might be used. Many commented that this had helped them consider uses of JUSP for their own institutions. We hope to refine these exercises over the series of workshops and then add them to our online support materials for people to work through at their own pace. Feedback from the activities has also helped us consider the titles of the reports, their functionality, and how they are accessed, which is really useful.

JUSP enhancements – Angela Conyers, Evidence Base

Angela introducing the JUSP enhancements

After lunch, Angela presented on the JUSP enhancements which were officially launched at the workshop and have now been enabled for all libraries.

The enhancements cover two areas of work:

  1. Deals and titles – data on the titles included in deals from JISC Collections, with the ability to view these deals within JUSP and see how they change over time.
  2. Subscribed titles – the ability to mark up your subscribed titles (i.e. those outside a deal) so that they are highlighted with a star in all reports.

More information on the JUSP enhancements and support materials on how to use them are being produced.

Panel discussion and future priorities

A panel discussion with representatives from Mimas, JISC Collections and Evidence Base followed Angela’s presentation. Each gave brief updates on key topics such as COUNTER 4, publisher updates and community support. The discussion also led to suggestions from participants about what they would like JUSP to focus on next. The main areas of agreement were:

  • Critical mass of journal publishers
  • Community of practice for libraries to share resources for using JUSP data

We hope to be able to take these areas forward as the service continues to progress.