IRUS-UK Community Survey 2017

Another of Evidence Base’s regular tasks is to compile and send out the IRUS-UK user survey. IRUS-UK is a Jisc community-driven resource for Institutional Repositories in the UK. When a repository becomes a member of IRUS-UK, the IRUS team collects the usage data of items that have been deposited in that repository and processes the data into COUNTER-compliant statistics. I have promised to write a blog post about COUNTER and I apologise for not having done it yet; in the meantime, follow this link to find out more about the project and the code of practice. The benefit to repositories is that they then have validated usage data about the work that has been deposited, which can be compared with other standardised data.
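
To give a flavour of what that processing involves (this is purely my own illustration, not the IRUS-UK code: the event data, field names and 30-second window below are invented for the example), turning raw download events into monthly, COUNTER-style counts looks roughly like this:

    from collections import defaultdict
    from datetime import datetime

    # Toy download log: (item_id, user_id, timestamp). Invented data purely
    # for illustration - real repositories report far richer events.
    raw_events = [
        ("item-42", "user-a", datetime(2017, 3, 1, 10, 0, 0)),
        ("item-42", "user-a", datetime(2017, 3, 1, 10, 0, 5)),  # rapid repeat click
        ("item-42", "user-b", datetime(2017, 3, 2, 14, 30, 0)),
        ("item-99", "user-a", datetime(2017, 4, 7, 9, 15, 0)),
    ]

    DOUBLE_CLICK_WINDOW = 30  # seconds; an illustrative value only

    def monthly_counts(events):
        """Filter out rapid repeat clicks, then count downloads per item per month."""
        counts = defaultdict(int)
        last_seen = {}  # (item, user) -> timestamp of the last counted download
        for item, user, ts in sorted(events, key=lambda e: (e[0], e[1], e[2])):
            prev = last_seen.get((item, user))
            if prev and (ts - prev).total_seconds() <= DOUBLE_CLICK_WINDOW:
                continue  # treat as a double-click on the same item
            last_seen[(item, user)] = ts
            counts[(item, ts.strftime("%Y-%m"))] += 1
        return dict(counts)

    print(monthly_counts(raw_events))
    # {('item-42', '2017-03'): 2, ('item-99', '2017-04'): 1}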

The annual survey ensures that the IRUS team are developing IRUS-UK to suit the needs of the Institutional Repository community, and a great deal of attention is paid to the feedback that is received. Sometimes it is not possible to implement a suggestion, and sometimes it takes time to make a change, but each suggestion is considered thoughtfully and steps are taken to improve the website and the service.

This year we found out that IRUS-UK is mostly used for identifying trends and patterns of usage. It is used least for SCONUL reporting. Other uses for IRUS-UK include:

  • Awareness raising
  • Checking records
  • Tweeting about statistics
  • Advocacy with researchers

I particularly like the idea of tweeting about the statistics. It is good to boast on Twitter that a certain document has been accessed, especially as I have just found out through IRUS-UK that one of my articles has been downloaded 185 times.

The IRUS-UK community consider that the most useful reports are Repository Statistics and Item Reports. Most people thought that IRUS-UK provided value because of its reliable and accurate COUNTER-compliant statistics, because it helps with benchmarking, reporting and comparing data, and because it saves a lot of staff time.

Some of the best things about IRUS-UK are:

  • It is easy to use and access
  • It is good at archiving statistics
  • The community support is great

There were some requests for new features, such as quick guides to the reports, more case studies and some bite-sized videos demonstrating how to use the reports. Members also wanted more data visualisations (sometimes only a picture will tell the story) and there was a request for IRUS-UK to become mobile friendly.

These and other suggestions are now under consideration by the team. Keep looking at IRUS-UK to see how it develops. And thank you to the institutional repository staff out there who took the time to complete the survey.

JUSP Community Survey 2016

One of Evidence Base’s bread and butter jobs is to conduct the annual user survey for JUSP, which is the Jisc specialist resource for university and further education college librarians. JUSP tells them about the use of the online journals and e-books that they have purchased or to which they have subscribed. By using JUSP a librarian can find out whether a certain journal or e-book has been used, and how many times. In the old days, when everything was paper, it was easy to work out if books had been taken out of the library, and whether journals were looking well thumbed or still in pristine condition because no one had looked at them. E-resources are different: with all sorts of metrics available, you can find out whether something has or has not been downloaded, and from which IP address. Of course, it does mean that someone, somewhere has to collect and collate that information. Briefly, JUSP works by using SUSHI to harvest COUNTER statistics, and it will then give you reports about the resources used by your institution. Currently, JUSP has a “Journal” portal and an “e-book” portal. If you have no idea what SUSHI and COUNTER mean, don’t worry about it; they are simply a means to an end and I will devote whole posts to explaining them in the future.
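
Purely as an illustration of the general idea (the endpoint, identifiers and response shape below are invented for this example and are not JUSP’s or any publisher’s real interface), a SUSHI-style harvest might look roughly like this:

    import requests  # any HTTP client would do

    # Hypothetical SUSHI-style endpoint - NOT a real publisher or JUSP URL.
    SUSHI_ENDPOINT = "https://stats.example-publisher.com/sushi/reports/jr1"

    PARAMS = {
        "customer_id": "my-institution-id",  # identifies the library
        "requestor_id": "jusp-harvester",    # identifies who is asking
        "begin_date": "2016-01-01",
        "end_date": "2016-12-31",
    }

    def harvest_jr1():
        """Fetch a year of JR1-style journal usage for one institution."""
        response = requests.get(SUSHI_ENDPOINT, params=PARAMS, timeout=30)
        response.raise_for_status()
        return response.json()  # monthly counts per journal, ready for reporting

Meanwhile, back to the survey…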

We finished the JUSP community survey at the end of last December, and the full report is now ready to read here: http://jusp.jisc.ac.uk/news/jusp-community-survey-2016-report.pdf. The results showed that JUSP is a much-wanted resource for university and college librarians which adds value to their service, provides them with reliable data and saves them time (somewhere in the region of 7 hours of work a month). One respondent commented that JUSP has a “hugely positive impact on staff time… freed up for analysis rather than finding and downloading data files”. Many respondents thought that JUSP was vital to their service because without it they would have to take staff away from other tasks; one even thought that they would not use usage statistics any more if JUSP did not exist.

Of the journal statistics that JUSP can calculate, the most vital one appeared to be “Journal Report 1 (JR1)”, which tells you the number of requests there have been for one specific journal per month, over a period of time that you can select (there is a rough sketch of the idea just after the list below). Users of JUSP use the Journal portal statistics for a range of tasks:

  • Ad-hoc reporting
  • SCONUL reporting
  • Considering subscription renewals
  • Answering general enquiries
  • Finding evidence to prove access rights
  • General annual statistics
  • Evaluating deals with publishers
  • Benchmarking against other organisations
  • Putting a list of “top ten journals” on the website
  • Searching for anomalies in their data
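
As a toy version of what JR1 gives you (the journal titles and figures below are made up for the example, not real JUSP data), the report boils down to something like this:

    from collections import defaultdict

    # Made-up usage records of the kind a JR1-style report summarises:
    # (journal title, month, full-text requests).
    usage = [
        ("Journal of Examples", "2016-01", 12),
        ("Journal of Examples", "2016-02", 7),
        ("Journal of Examples", "2016-03", 19),
        ("Annals of Placeholders", "2016-01", 3),
    ]

    def jr1(records, journal, first_month, last_month):
        """Monthly full-text requests for one journal over a chosen period."""
        months = defaultdict(int)
        for title, month, requests in records:
            if title == journal and first_month <= month <= last_month:
                months[month] += requests
        return dict(sorted(months.items()))

    print(jr1(usage, "Journal of Examples", "2016-01", "2016-03"))
    # {'2016-01': 12, '2016-02': 7, '2016-03': 19}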

The e-book portal is less used; it is still in development and does not yet contain statistics from many of the e-book suppliers that the libraries use, and librarians can get the statistics that they need directly from the publishers. However, respondents did think that e-book statistics are important, and the ones who do use JUSP’s e-book portal (just under half of the respondents) use Book Reports 2 and 3 (BR2, BR3) to find out the number of times an e-book has been requested per month over a selected period of time (BR2) or the number of times access has been denied to an e-book (BR3). When a library user asks for a certain e-book only to discover that they cannot access it, this usually indicates that the library does not subscribe to, or has not purchased, that e-book. Librarians like to know this so that they can understand what their customers want. So, our respondents told us that they use e-book statistics for the following (a toy sketch of the BR3 idea follows the list):

  • Collection development
  • Choosing which subscriptions to renew
  • To see how the full collection is being used
  • To make purchasing decisions
  • To calculate cost per download
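
And as a small illustration of BR3-style “access denied” counts (again, invented titles and events, not real JUSP data):

    from collections import Counter

    # Made-up access-denied events: (e-book title, month).
    denials = [
        ("Handbook of Hypothetical Things", "2016-05"),
        ("Handbook of Hypothetical Things", "2016-05"),
        ("Handbook of Hypothetical Things", "2016-06"),
        ("Imaginary Methods in Practice", "2016-06"),
    ]

    def br3(events):
        """Access-denied counts per e-book per month, hinting at unmet demand."""
        return Counter(events)

    for (title, month), count in sorted(br3(denials).items()):
        print(f"{month}  {title}: {count} turnaways")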

Overall, the JUSP users were satisfied with the service that it provides and they praised the JUSP team for the support that it gives to users. Some very appreciative clients there. So, if you are a university or college librarian who is not yet using JUSP, take a look at it; you may well benefit from it.

IRUS-UK new users survey feedback

Evidence Base is responsible for community engagement on the IRUS-UK project, a Jisc-funded project developing a statistics service for repositories in UK Further and Higher Education institutions. As part of the community engagement, we survey new joiners to the service to gather their first impressions, collect ideas and suggestions, and discover the areas they need more support with. Their feedback informs the technical development and the guidance and support aspects of the project:

  • Technical development – we keep a technical wishlist based on user feedback, and review this at monthly team meetings to prioritise development work.
  • Guidance and support materials – we work with other members of the IRUS-UK team to provide relevant help and guidance for using IRUS. User feedback helps us focus our efforts on guidance for the areas that need it most.

As we move into the second year of IRUS-UK, we have produced a summary of the key themes from the user surveys so far. This includes benefits and challenges, the ways people use IRUS-UK and other repository statistics, their views on the open data approach, benchmarking, and feedback on specific features of IRUS-UK. The table below shows some of the ways we have responded to feedback from these surveys:

IRUS-UK response to user feedback

Community input to IRUS-UK is something which is highly valued, and we will continue to collect feedback from the community on a regular basis. The summary report is now available from the News page of the IRUS website.

Report on m-library activity

As part of the JISC-funded mobile library community support project, we ran two fact-finding surveys: one at the beginning of the project and one at the end. We have now published the final report for the end-of-project survey (data collected July-August 2012) as well as a series of summary blog posts.

The full report is available online as a PDF.

CERIF in Action Workshop

CERIF in Action Workshop 2012 10 19

Evidence Base is currently working with the JISC-funded Digital Infrastructure programme. One of the areas of the programme is Research Information Management (RIM), which includes the adoption of the Common European Research Information Format (CERIF) by the UK higher education community. As one of the steps in our data gathering, we attended the CERIF in Action project workshop on 19 October. The workshop brought together nearly 60 professionals working on research information management from the higher education, library and commercial sectors.

The workshop was centred on discussing the current situation and future prospects of research information management from different perspectives, including those of the Research Outcomes System (ROS), ResearchFish, Gateway to Research, the UK Research Councils and others. Several surveys presented at the event demonstrated the prevalence of, and demand for, bulk submission of research information, that is, institutions submitting large amounts of research information to ROS, in contrast to manual submission, where principal investigators submit to research councils themselves. Such bulk submission reduces the community costs per submission by roughly half, and it is possible that CERIF could decrease these costs significantly further. The new RIM programme manager, Verena Weigert, estimated that savings of 20-30% could be made if CERIF were made compatible with Current Research Information Systems (CRIS).

During the break-out sessions, the workshop also brought practitioners together to map out possible next steps for CERIF, extended use cases, ResearchFish/ROS and non-textual outputs. The main issue identified across all areas was the need to standardise terminology and definitions.

This event provided Evidence Base researchers with valuable information for their work. More information on the workshop including its materials will be made available on the CERIF in Action blog.

Mobile technologies in libraries – end of project survey

The m-libraries support project (managed by Evidence Base and Owen Stephens Consulting) is part of JISC’s Mobile Infrastructure for Libraries programme running from November 2011 until September 2012.

The project aims to build a collection of useful resources and case studies based on current developments using mobile technologies in libraries, and to foster a community for those working in the m-library area or interested in learning more.

At the beginning of the project we ran a survey to gather information, to discover what was needed to help libraries decide on a way forward, and to begin to understand what an m-libraries community could offer to help (full report available). It’s now time to revisit these areas to see how things have changed.

Please answer the following few questions – they should only take 5-10 minutes and all questions are optional.

This is an open survey – please pass the survey link on to anyone else you think might be interested via email or social media: http://svy.mk/mlibs2

DevCSI stakeholder analysis survey 2011-12

Evidence Base at Birmingham City University has been commissioned to undertake a survey of stakeholders on behalf of DevCSI, the Developer Community Supporting Innovation project (http://devcsi.ukoln.ac.uk). DevCSI aims to build a community of developers working or studying in UK education and to investigate the value and impact it can make on technical innovation in the wider educational community and at an organisational level. DevCSI is managed by the Innovation Support Centre, UKOLN at the University of Bath, and funded by JISC. The broad topics of this survey include: benchmarking developers across the sector; examining stakeholders’ views of software development; discovering examples of local innovation; and gathering suggestions about the ongoing development of a developer community.

The survey is currently available for developers, managers of developers, senior managers, funders, vendors/suppliers and users (academics/researchers/librarians) at: http://www.surveymonkey.com/s/devcsi2011-12

Each respondent will be able to enter a prize draw to win a £200 Amazon voucher or one of four £50 vouchers. If you would like to enter for your chance to win, please follow instructions at the end of the survey.

The survey should take approximately 10-15 minutes of your time. Questions marked * are compulsory. Please be assured that all data will be anonymised during analysis.

In addition to the survey responses, the DevCSI team are looking for people who would be willing to provide further in-depth case study data to support the project. There will be an option towards the end of the survey to supply your contact details if you are interested in finding out more about this. Please note that this is not a compulsory element of the survey.

If you have any queries about this survey, please contact Evidence Base: ebase@bcu.ac.uk

Thanks for your help – we really value your feedback.