JUSP Community Survey 2016

JUSP logo

One of Evidence Base’s bread-and-butter jobs is to conduct the annual user survey for JUSP, the Jisc specialist resource for university and further education college librarians. JUSP tells them about the use of the online journals and e-books that they have purchased or subscribed to. Using JUSP, a librarian can find out whether a certain journal or e-book has been used, and how many times. In the old days, when everything was paper, it was easy to work out whether books had been taken out of the library, and whether journals were looking well thumbed or were still in pristine condition because no one had looked at them. E-resources are different: with all sorts of metrics available, you can find out whether something has or hasn’t been downloaded, and from which IP address. Of course, that does mean that someone, somewhere has to collect and collate the information. Briefly, JUSP works by using SUSHI to harvest COUNTER statistics, and it will then give you reports about the resources used by your institution. Currently, JUSP has a “Journal” portal and an “e-book” portal. If you have no idea what SUSHI and COUNTER mean, don’t worry: they are simply a means to an end, and I will devote whole posts to explaining them in the future. Meanwhile, back to the survey…


We finished the JUSP community survey at the end of last December, and the full report is now ready to read here: http://jusp.jisc.ac.uk/news/jusp-community-survey-2016-report.pdf. The results showed that JUSP is a much-valued resource for university and college librarians: it adds value to their service, provides them with reliable data and saves them time (somewhere in the region of seven hours of work a month). One respondent commented that JUSP has a “hugely positive impact on staff time… freed up for analysis rather than finding and downloading data files”. Many respondents thought that JUSP was vital to their service because without it they would have to take staff away from other tasks; one even thought that they would stop using usage statistics altogether if JUSP did not exist.

Of the journal statistics that JUSP can provide, the most vital appeared to be “Journal Report 1 (JR1)”, which tells you the number of requests there have been for a specific journal per month, over a period of time that you can select. Users of the Journal portal put its statistics to a range of tasks:

  • Ad-hoc reporting
  • SCONUL reporting
  • Considering subscription renewals
  • Answering general enquiries
  • Finding evidence to prove access rights
  • General annual statistics
  • Evaluating deals with publishers
  • Benchmarking against other organisations
  • Publishing a “top ten journals” list on the website
  • Searching for anomalies in their data
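To make the JR1 idea concrete, here is a minimal sketch in Python of totalling full-text requests for one journal over a selected run of months. The journal titles and figures are invented, and real JR1 reports carry more columns (publisher, ISSN, a year-to-date total, and so on); this only illustrates the shape of the calculation.

```python
import csv
import io

# Hypothetical JR1-style data: journal title plus monthly
# full-text request counts.
JR1_CSV = """\
Journal,Jan-2016,Feb-2016,Mar-2016
Journal of Imaginary Studies,12,7,31
Annals of Hypothetical Research,0,0,2
"""

def monthly_requests(csv_text, journal, months):
    """Total the full-text requests for one journal over selected months."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["Journal"] == journal:
            return sum(int(row[month]) for month in months)
    return None  # journal not found in the report

total = monthly_requests(JR1_CSV, "Journal of Imaginary Studies",
                         ["Jan-2016", "Feb-2016"])
print(total)  # 19
```

The point of JR1 is exactly this: once the monthly counts are in one place, questions like “how often was this title used last quarter?” become a one-line sum rather than a trawl through publisher websites.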

The e-book portal is less used: it is still in development and does not yet contain statistics from many of the e-book suppliers that libraries use, so librarians often get the statistics they need directly from the publishers. However, respondents did think that e-book statistics are important, and those who do use JUSP’s e-book portal (just under half the respondents) use Book Reports 2 and 3 (BR2, BR3) to find out the number of times an e-book has been requested per month over a selected period (BR2) or the number of times access to an e-book has been denied (BR3). When a library user asks for a certain e-book only to discover that they cannot access it, that indicates that the library does not subscribe to, or has not purchased, that e-book. Librarians like to know this so that they can understand what their customers want. So, our respondents told us that they use e-book statistics for:

  • Collection development
  • Choosing which subscriptions to renew
  • Seeing how the full collection is being used
  • Making purchasing decisions
  • Calculating cost per download
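The last item on that list is simple arithmetic once you have BR2-style download counts to hand. A quick sketch, with entirely invented figures for one e-book package:

```python
# Hypothetical figures: annual subscription cost (GBP) and BR2-style
# monthly download counts for one e-book package.
annual_cost = 1200.00
monthly_downloads = [40, 35, 55, 20, 10, 5, 2, 8, 60, 75, 50, 40]

total_downloads = sum(monthly_downloads)
cost_per_download = annual_cost / total_downloads
print(f"{total_downloads} downloads, £{cost_per_download:.2f} per download")
```

With these made-up numbers the package works out at £3.00 per download; a librarian can compare that figure across packages, or against the cost of supplying the same titles another way, when deciding what to renew.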

Overall, JUSP users were satisfied with the service it provides, and they praised the JUSP team for the support it gives. Some very appreciative clients there; so, university and college librarians out there who are not yet using JUSP, take a look at it: you may benefit from it.


Will the Germany v Elsevier situation herald a new model for publishing?

It is currently popular to give something up in January. Here in the UK the charity Alcohol Concern has created the Dry January campaign to promote the health benefits of giving up alcohol for 31 days. This year I have seen a similar campaign circulating on Twitter and Facebook urging people to give up sugar for January. Personally, I just need to stop eating the lovely, rich, stodgy Christmas food before I start looking like a Christmas pudding.

Christmas pudding

However, research institutions across Germany are starting the new year by giving up something else in order to make a stand about open access to research. Around 60 institutions have chosen to cancel their contracts with Elsevier because, they claim, the publisher’s offer of a nationwide contract does “not comply with the principles of Open Access” and would contribute to rising prices for access to research articles. In consequence, all access to Elsevier content by the participating institutions ceased on 1st January 2017. The situation has arisen from a project that aims to negotiate deals with all the large scientific publishers for a Germany-wide licence, which would reduce institutional costs and increase access to scientific literature.

Project DEAL is negotiating on behalf of the Alliance of Science Organisations in Germany, which is attempting to ensure that German research institutions can provide current literature for teaching and research at a price they can afford. The Alliance believes that the large scientific publishers wield too much market power and earn too great a profit on the back of unpaid work done by academics: authoring, journal editing and peer review, for example. They are making a stand now because costs have recently risen so steeply that library acquisition budgets have not been able to keep up, and therefore have not met the needs of their researchers.

Negotiations with the publishers Springer Nature and Wiley are due to begin this month. It will be interesting to see how the situation plays out. The problem for German researchers will be lack of access to Elsevier-published papers until the dispute is resolved. One wonders how many researchers will turn to open access journals, both for information gathering and for publishing their own work. Alternatively, will they search open access institutional repositories to read pre-prints, or send emails around the globe to get information first hand from authors? I do hope that someone is doing research on this. (If not, Evidence Base staff are available at a moderate fee!)

In fact, is this all a portent that a different model of publishing is needed? Perhaps there is room for a dual approach to the dissemination of research: publishing the work in two forms, at least one of which is Open Access, and another in a top-rated journal, whether that journal is Open Access or not. In the UK there is already a requirement that any university-authored article published from April 2016 has its twin deposited in an institutional repository (normally a pre-print) to comply with Hefce (Higher Education Funding Council for England) policy for the 2021 REF, which relates to the funding of universities. Pushing the boundaries even further, one of the journals in the Public Library of Science (PLOS) group has adopted an unusual approach. PLOS Computational Biology is asking researchers to write a Wikipedia article, which is then converted into a review paper in the journal. The subject matter is dictated by a gap in Wikipedia’s coverage, such as a missing or underdeveloped article on an important topic, and the review paper is published under the Topic Pages section of the journal. Admittedly, this is only possible because PLOS Computational Biology is already Open Access and published under the same sort of Creative Commons licence as Wikipedia, and edits and contributions to the article in Wikipedia can be counted as a form of open peer review. You can read more about this here: Creating an efficient workflow for publishing scholarly papers on Wikipedia.

At present this is only conjecture on my part, but new models for publishing in the true sense, that is, putting your work out into the public domain, are developing: open access repositories, research blogs and websites, presentations on Figshare, and data repositories. The choice of dissemination routes is growing. Is Elsevier therefore being unwise to withhold access to its articles if researchers can find the information they need by another route? It really depends on how long the dispute continues and the lengths to which researchers are prepared to go to discover information and to publish their own work.


Working with IRUS-UK

Evidence Base works with IRUS-UK, a Jisc resource for Higher Education Institutions (HEIs): universities, colleges and other institutions that offer higher qualifications such as HNCs, HNDs, and graduate and postgraduate degrees.

Flag iris in a Victorian garden, Quex House, Birchington, Kent (by Acabashi, own work, CC BY-SA 4.0)

I said IRUS, not iris! Most UK universities maintain an institutional repository: an electronic archive of the work produced in the university, such as theses, research reports, data, presentations, images, and articles written by staff and students. Many of these institutional repositories are freely accessible to anyone who wants to read the work.

As you can imagine, institutional repository managers are keen to find out what work has been accessed. We are in an age of metrics, where counting everything is important because use and usefulness have to be justified and value for money demonstrated. But then, as a taxpayer, you will certainly want your money to be spent wisely. IRUS-UK can be used by institutional repository managers to find out what items have been downloaded from their repository. IRUS-UK sorts the downloads into lists, including by type of item, title and author. Sometimes it is not so easy to identify a specific author, because people use different variations of their names. For example, I could be M. Bamkin, M.R. Bamkin, Marianne Bamkin or Marianne Ruth Bamkin, or I could even decide to use my pre-marital surname, King, which makes another four variations. I could be listed as eight different people, so downloads of my work could appear to come from eight individuals, not one. With a name like Bamkin it is not so hard to work out that M. Bamkin is probably the same author as Marianne Bamkin, but just think about J. Smith. Is that John, Jane, James or Jamina Smith, and which one?

Now, I expect you are wondering why I am going on about different author names when I started talking about Institutional Repositories and IRUS-UK. Well, there is a link – ORCID identifiers.

I said ORCID, not orchid! ORCID (Open Researcher and Contributor ID) is one of those names that started as an acronym whose expansion everyone forgets, so it is now simply the word for the electronic identifier that can be attached to all your authored work to show that you are one person, whatever you decide to call yourself on the day for that piece of work.

IRUS-UK is working hard to incorporate ORCIDs into the system and has written a blog post about what Institutional Repositories can do to expose ORCIDs. This means that IRUS-UK can have a searchable list of authors with their individual identifiers, so you can be sure that you have found the right J. Smith.
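To illustrate why an identifier beats name matching, here is a small hypothetical sketch in Python (the names, ORCID iDs and counts are all invented, and this is not how IRUS-UK is actually implemented). Aggregating downloads by author name both splits one author across her variants and merges two different J. Smiths; aggregating by ORCID iD gets both cases right.

```python
from collections import defaultdict

# Hypothetical download records: (author name as it appears on the
# item, ORCID iD, download count).
records = [
    ("M. Bamkin",       "0000-0001-0000-0001", 10),
    ("Marianne Bamkin", "0000-0001-0000-0001", 5),
    ("M.R. Bamkin",     "0000-0001-0000-0001", 3),
    ("J. Smith",        "0000-0002-0000-0002", 7),
    ("J. Smith",        "0000-0003-0000-0003", 4),  # a different J. Smith
]

by_name = defaultdict(int)
by_orcid = defaultdict(int)
for name, orcid, downloads in records:
    by_name[name] += downloads    # one person looks like three authors
    by_orcid[orcid] += downloads  # and two people collapse into one name

print(len(by_name))   # 4 "authors" when grouped by name
print(len(by_orcid))  # 3 actual authors when grouped by ORCID iD
```

Grouped by name there appear to be four authors; grouped by ORCID iD there are the correct three, with all 18 of the Bamkin downloads credited to one person.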

Everything Changes

Things are happening at Birmingham City University, with new buildings being added to the Eastside campus. Evidence Base now has a new home in the Joseph Priestley Building, in a slick open-plan office. Here is a newspaper article about the building:


Here are some photos of our new home:


We also have a new member of staff: me, Dr Marianne Bamkin. I will be blogging about library and information matters, as well as keeping you updated about JUSP, IRUS-UK and our research. There is no newspaper article about me.


Planning and Facilitating Focus Groups

Focus groups are a great way to collect qualitative research data. They’re excellent for gaining an insight into the views and perceptions of people, and a really great way of understanding users. They can be used to get feedback on products or services, or to help plan future developments. They’re often used in market research but can be adapted for a number of different purposes. One way we have used focus groups at Evidence Base was to gain a deeper understanding of students’ views on e-books and print books: their preferences, their use, and the reasons behind these. This was incredibly useful for supporting future purchasing decisions and user support. We’ve facilitated focus groups for a number of different organisations, including many libraries, and if you’d like us to work with you on this, please contact us.

Focus group training workshop

However, we recognised there was a need for training to enable other people to plan and facilitate their own focus groups, or to improve the ones they already run. Following a number of requests for support, we spent time designing a training workshop on this topic. It started as a half-day workshop but has since expanded into a full-day workshop to allow time for theory, practice and planning.

The aims of the workshop are to help attendees:

  • Understand the value of focus groups and consider when they would be appropriate
  • Plan, organise and facilitate a focus group
  • Prepare activities for different types of focus groups to maximise the information collected
  • Report findings from focus groups to different stakeholders

The workshop includes a presentation covering the logistics of focus groups, techniques to attract attendees, techniques to improve facilitation, and how to record and share the results. It also includes a series of focus group activities, based around a neutral topic, so that participants experience the different types of activity they could include in focus groups and can consider how to adapt these for their own.

We ran the first full day workshop in September 2015, and had very positive feedback. Some comments included:

“Very thorough information and an excellent balance between theory and practical.”

“It has given me a theoretical and practical grounding for ideas and activities I have seen in use and participated in. I now have a clear idea of what sort of activities I would / wouldn’t use in different focus groups, and a good understanding of how people react differently to activities.”

“The activities were really useful to see how to make focus groups more interactive.”

“The handbook and individual exercises were laid out really clearly and will be really useful in future. The presentation was really clear and easy to follow.”

Book your place

Due to the success of the workshop, we’ve scheduled more dates and our latest is now ready to accept bookings – Tuesday 15th March 2016. Booking is now open so you can find out more at http://www.bcu.ac.uk/evidence-base/training-and-events/planning-and-facilitating-focus-groups and book your place at https://www.surveymonkey.com/s/focusgroupworkshop.

Hope to see some of you there!

11th Northumbria International Conference on Performance Measurement in Libraries and Information Services

Over the summer I attended the 11th Northumbria International Conference on Performance Measurement in Libraries and Information Services. It was the first time I had attended this conference, and I was really impressed with the content of the sessions. Attendance was fairly international, with delegates from the USA, Canada, Australia, New Zealand and South Africa (and that’s just the people I happened to meet during the breaks!). This meant the conference had real variety in terms of attendees and speakers, which helped provide a much broader context. Many of the presentations I attended were from academic libraries, though I also made sure to attend some from different sectors to see what we can learn from them.

One highlight for me was a session on piloting surveys using cognitive interviews, from Ithaka S+R, who administer large-scale international surveys of academic staff. The techniques discussed will be useful when we plan our surveys on behalf of Library and Learning Resources, though some of the methods would be difficult given our relatively small sample size.

I also attended a session from the University of Manchester, who use Tableau software to create a statistics dashboard that gives library management an overview of library usage statistics to support data-driven decision making.

I found it very interesting to hear about the different types of roles people at the conference have; indeed, one of the poster presentations shared research on this and on the training and support needs of those who work in this area. Some attendees have a role which focuses solely on the performance measurement of their library (sometimes even as part of a team), whilst others handle some element of performance measurement alongside another role (such as librarian, marketing officer, or senior manager). The focus of performance measurement for each organisation seemed slightly different, though there were many commonalities in terms of measuring usage of resources, space, and library staff expertise.

From talking to others, it seems there is a perception that all other libraries are ahead of their own! I heard many people (both in their presentations and when talking to them during the networking opportunities) apologise for the less than perfect approach they take to measuring performance. I think as a profession we need to cut ourselves some slack here – very little data collection (if any?!) is ever perfect, but collecting data and using it is infinitely better than not collecting it or not using it.

Many of the people I spoke to agreed that there is a need for more sharing within the profession, both in terms of performance measurement methods and strategies for communicating findings; this could be through events such as the Northumbria Conference and Library Assessment Conference (and their conference proceedings), or via professional networks such as mailing lists and social media.

I came away from the conference inspired and enthused and hope to be able to use some of the methods in our own research, both for BCU Library and Learning Resources, and for our external projects.

Library Research Priorities Survey

We would like to invite you to complete the Library Research Priorities Survey which is being conducted by Evidence Base at Birmingham City University in order to understand more about current practice and future plans for research in libraries and information services. It covers conducting and disseminating research, research skills, and support and guidance.

The survey is open to all LIS practitioners in any sector and country. It should take around 10-15 minutes of your time and all questions are optional. The survey will remain open until May 29th 2015 and is available at:


To thank you for your participation, we are offering entry into a prize draw to win one of two £50 Amazon vouchers. Details are on the final section of the survey.

For purposes of this study, we are defining research as:

The process of arriving at dependable solutions to problems/questions/hypotheses through the planned and systematic collection, analysis, and interpretation of data: it may be applied or theoretical in nature and use quantitative or qualitative methods. (This definition does not include library research that is limited to activities such as compiling bibliographies and searching catalogues.)

Please feel free to pass the details of the survey to other contacts who may be researching in library and information services, or would like to.

If you have any questions about the survey, please contact ebase@bcu.ac.uk.