
NISO/NASIG Joint Webinar:
Playing the Numbers: Best Practices in Acquiring, Interpreting, and Applying Usage Statistics

May 21, 2014
1:00 p.m. - 2:30 p.m. (Eastern Time)

Can't make it on the webinar day? Register now and gain access to the archive for one year.

System Requirements:

  • NISO has developed a quick tutorial, How to Participate in a NISO Web Event. Please view this recording for an overview of the web conferencing system; it answers the most commonly asked questions about participating in an online WebEx event.
  • You will need a computer for the presentation and Q&A.
  • Audio is available through the computer (broadcast) and by telephone. We recommend having telephone audio available as a back-up even if you plan to use the broadcast audio, as voice over Internet is not always 100% reliable.
  • Please check your system in advance to make sure it meets the Cisco WebEx requirements. It is your responsibility to ensure that your system is properly set up before each webinar begins.

About the Webinar

In a time of shrinking budgets and growing reliance on electronic resources, the collection and analysis of usage statistics has become a staple of the library world. But while usage statistics may be ubiquitous, many librarians still struggle with the best methods of interpreting the data. The ability to effectively understand and apply usage data is an important skill for librarians to master as they attempt to analyze their collections and justify their expenses to administrations.

This webinar will highlight the ins and outs of COUNTER, as well as discuss the process of analyzing the data once harvested.



Todd Carpenter, Executive Director, NISO

Todd Enoch, Head, Serials and Electronic Resources, University of North Texas Libraries;
Chair of the Continuing Education Committee, NASIG

* * * * * * *

COUNTER Update: Release 4 of the COUNTER Code of Practice for e-Resources
Peter Shepherd, Project Director, COUNTER

In a career spanning over three decades, Peter Shepherd has become intimately acquainted with most aspects of STM journal, book and database publishing, having worked as a publisher with Wiley, Pergamon, Elsevier and Harcourt. Since 2002, Dr. Shepherd has been Project Director of COUNTER, the not-for-profit international organization whose mission is to improve the quality and reliability of online usage statistics. Since 2011, COUNTER has also had overall responsibility for the development and management of the Usage Factor project.

Peter Shepherd received a PhD in chemistry in 1978 from St. Andrews University and joined the publishing industry in 1980, following a post-doctoral research fellowship at the University of California, Santa Barbara.

* * * * * * * *

Integrating COUNTER Statistics within the Information Workflow
Oliver Pesch, Chief Product Strategist and Senior Vice President, EBSCO Information Services

Collecting COUNTER statistics is just the first step in effective collection analysis. Usage data often needs to be combined with cost details, holdings information and sometimes other attributes of a title (such as Impact Factor) for the analysis to be effective. Collecting this additional data can be a challenge as can presenting the analysis in the right context so that the results are meaningful. As an example, there are some interesting nuances to the “cost” data for e-journals and e-packages subscriptions that can be particularly troublesome. This presentation will look at some of these challenges and provide some suggestions for overcoming them.

Oliver Pesch works as chief product strategist for EBSCO Information Services, where he helps set direction for EBSCO’s e-resource products and services, including EBSCO’s Usage Consolidation & Analytics products. Oliver is a strong supporter of standards and is very involved in the development of standards related to usage. He is currently co-chair of the NISO SUSHI Standing Committee; he is a member of the Executive Committee for Project COUNTER, where he also serves as chair of the Technical Advisory Group; and he is a regular contributor to Serials Librarian through his Spotlight on Serials Standards column.

* * * * * * * * 

Usage in the Eye of the Beholder: Developing Academic Library Usage Reports that Meet the Needs of Your Institution
Jill Emery, Collection Development Librarian, Portland State University Library

The presentation will focus on which questions to ask to develop the usage information most needed by your institution, examine some of the ways to use current standards and vendor systems to gather resource usage data, and outline ways to develop different reports for different audiences and to depict usage information for maximum impact.

Jill Emery is the collection development librarian at Portland State University Library and has close to 20 years of academic library experience from various higher education institutions within the United States of America. She is a past-president of the North American Serials Interest Group (NASIG) and the social media specialist for the Electronic Resources & Libraries, LLC. Jill serves as a current member of the Charleston Advisor editorial board and is the columnist for “Heard on the Net,” and is also on the editorial board of Insights: the UKSG journal.

Event Q&A

Could you address why sessions were removed from the database reports?

Oliver Pesch (OP): Peter addressed this later during his talk. Because of advances in technology, including federated search, discovery services, and mobile devices, the notion of a “session” is no longer a meaningful metric and can no longer be used to represent the equivalent of a virtual “gate count”. For example, a federated search engine or mobile device may operate in a stateless manner, meaning that every action, whether a search or a retrieval, is technically a new “session”. Alternatively, some federated search engines may open a single session with a given content provider through which they manage searches from multiple users.

Do the mobile reports cover usage through an app, through the resource's website, or both?

(OP) There are two aspects of mobile use being captured. The “mobile” versions of the Journal and Title reports are intended to capture use by mobile devices, as detected through the browser's user-agent or by virtue of a mobile application being used; they are about the device being used, not the format being retrieved. But COUNTER has also added some additional metric types that can be included in JR3 to reflect retrieval of the mobile-formatted versions of the full text. To summarize, the “JR1mobile” report looks at activity by mobile devices regardless of website or format, and JR3 can include the number of times mobile formats of the full text were retrieved, regardless of device.

Database reports still seem to report the same number across subject packages. For example, ProQuest shows the same usage number for PQ ABI Inform, newspapers, PQ Environmental, and Social Sciences. Could you explain why?

(OP) It is not possible to answer this without seeing the reports, the metric type in question, and the environment under which the activities are taking place. For example, if the institution is set up to search using a federated or multiple-database search, then every database may be searched for every user query, and thus the search counts would be the same.

For the usage factor, how will gaming be addressed in the COUNTER standard? Will usage providers need to have the capability to identify and scrub out usage deemed to be coming from gaming?

Peter Shepherd (PS): As part of the Usage Factor project we investigated different gaming scenarios, and we are currently working to develop a COUNTER protocol/tool to enable vendors to identify and eliminate 'rogue' usage.

(OP): Also, since the Usage Factor calls for the median usage of articles in a journal to be used, it is less susceptible to gaming where an individual article is targeted.
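The robustness of a median-based metric can be illustrated with a quick sketch (the download counts below are invented for illustration, not taken from any COUNTER report): inflating a single article's usage shifts the mean dramatically but leaves the median untouched.

```python
from statistics import mean, median

# Hypothetical annual download counts for the articles in one journal
normal = [14, 9, 22, 11, 30, 7, 18]

# The same journal after one article's usage is artificially inflated ("gamed")
gamed = [14, 9, 22, 11, 30, 7, 5000]

print(median(normal), median(gamed))  # median is unchanged: 14 vs 14
print(mean(normal), mean(gamed))      # mean jumps from ~15.9 to ~727.6
```

Targeting a single article therefore has almost no effect on a median-based Usage Factor, which is the point of the design choice described above.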

It's still not entirely clear what the difference would be between Record Views and Result Clicks; it sounds like they are both supposed to measure clicks on content that comes up as a result of a search?

(OP) A “Record View” is a very specific transaction where the user has viewed the detailed metadata record from a given database, e.g. to see the abstract, subject headings, etc. A record view could occur from a user clicking on the result list, navigating detailed records, or even linking in from another site; it is about viewing the detailed record. A “Result Click” is about the activity that happens on the search result list: it tracks any click the user might have made on a result from a given database. Some examples of Result Clicks are: a user clicks the “findIt@my library” link; the user clicks to view full text; the user clicks to see the full record; the user clicks to “request article via ILL”; etc. In each case, the user has expressed interest in the result by clicking on something, and since that expression of interest can be attributed to the database that contained the metadata, it also serves to indicate the value of the database.

So the Ebsco renewal information is available to all Ebsco customers?

(OP) Any subscription customer that licenses our Usage Consolidation product can see usage in EBSCONET Renewals, Order, etc. at no additional charge.

Will the COUNTER Code of Practice for Articles be broken out by subject discipline?

(OP) No, the Article Report 1 defines a format for exchanging article-level usage – it does not prescribe how that usage is to be analyzed.

So, will Usage Consolidation eventually be able to "bring in" or upload data from other sources (e.g. impact factor)?

(OP) EBSCO has plans for adding more details to the titles in its global knowledge base… including peer review indicators, open access designations and various impact scores. Some metrics, such as the ISI Impact Factor are protected by copyright; therefore, agreements need to be put in place before that data can be loaded and shared.

What is the role of data visualization for usage statistics?

(OP) Visualization is important when presenting usage. Jill’s presentation showed several good examples of this.

How do you report database usage?

Jill Emery (JE): We use the DB1 report for the majority of databases and count the number of searches performed and calculate cost per use on this figure. We review but do not report on mobile usage at this time.

Follow-up to the last question: If JR1 does not list usage by database, how do you use JR1 to calculate cost per use for a database?

(JE): We use DB1 for database searches and do not use JR1 for database access to content. We only look at journal usage directly from the publisher, noting whether there is also coverage within an aggregated database. We consider use directly from the provider of the journal content to be the primary use; if that use is low, cancellation is even more justified when aggregator coverage is available.

Do you have any real-life scenario on analyzing the stats?

(JE): My best example is the John Wiley journal package. We analyze down to the title level for subject liaisons to make cancellation decisions, and these spreadsheets include impact factors, SNIP, and notation for editorial work done by faculty on campus. Then we also report to the campus the overall usage of the package, on both subscribed and non-subscribed (enhanced) content, to show that paying the premium for the additional content coverage is worthwhile.


If paying by credit card, register online.

If paying by check, please use this PDF form.

Registration closes on May 21, 2014 at 12:00 p.m. (ET)

Registration Costs

  • NISO Member

    • $95.00 (US and Canada)
    • $109.00 (International)
  • NASIG Member

    • $95.00
  • Non-Member

    • $125.00 (US and Canada)
    • $149.00 (International)
  • Student

    • $49.00

Additional Information

  • Registration closes at 12:00 p.m. (ET) on May 21, 2014. Cancellations made by May 14, 2014 will receive a refund, less a $25 cancellation fee. After that date, there are no refunds.

  • Registrants will receive detailed instructions about accessing the webinar via e-mail the Friday prior to the event. (Anyone registering between Monday and the close of registration will receive the message shortly after the registration is received, within normal business hours.) Due to the widespread use of spam blockers, filters, out of office messages, etc., it is your responsibility to contact the NISO office if you do not receive login instructions before the start of the webinar.
  • If you have not received your Login Instruction email by 10:00 a.m. (ET) on the Tuesday before the webinar, please contact the NISO office or email Juliana Wood, Educational Programs Manager, at jwood@niso.org for immediate assistance.

  • Registration is per site (access for one computer) and includes access to the online recorded archive of the webinar. You may have as many people as you like from the registrant's organization view the webinar from that one connection. If you need additional connections, you will need to enter a separate registration for each connection needed.
  • If you are registering someone else from your organization, either use that person's e-mail address when registering or contact the NISO office to provide alternate contact information.
  • Library Standards Alliance (LSA) members will need to register at the member rates. Joint NISO/NASIG webinars, as well as other partner events, are not included in the free webinar package (just NISO-only webinars).
  • Webinar presentation slides and Q&A will be posted to the site following the live webinar.
  • Registrants and LSA member webinar contacts will receive an e-mail message containing access information to the archived webinar recording within 48 hours after the event. This recording access is only to be used by the registrant's or member's organization.