2015-09-24

NASIG at 30: Building the Digital Future

Note: A summary of this article appeared in the September 2015 issue of Against The Grain v27 #4 on Page 72.

By Steve Oberg  (Assistant Professor, Electronic Resources and Serials, Wheaton College, Wheaton, IL)   <soberg@wheaton.edu>

Column Editor: Donald T. Hawkins  (Freelance Conference Blogger and Editor)  <dthawkins@verizon.net>

The 30th anniversary meeting of the North American Serials Interest Group (NASIG) was held May 27-30 in Arlington, VA.  One of the highlights of the meeting was a joint session on May 27 with the Society for Scholarly Publishing (SSP) that featured five speakers discussing information policy issues: open access (OA), grant funder submission and publication requirements, management and preservation of data sets, access for the print disabled, intellectual property, copyright law, and fair use.

Joint NASIG-SSP Session

“You can’t herd cats, but you can move the cat food!” was the most memorable quote from this all-day session, which drew about 150 librarians, publishers, and vendors. October Ivins, Principal of Ivins eContent Solutions and the overall coordinator of the joint program, was an excellent choice given her extensive history with both organizations and her knowledge of the information landscape.

The session began with a publisher perspective by Jayne Marks, Vice President of Publishing, Wolters Kluwer, who noted that:

Print is not dead;

People do not want one format (e.g. print) or another (e.g. electronic); they want multiple formats;

Adoption of (and attitudes toward) OA vary across disciplines;

Publishers face enormous pressures on costs and revenues while at the same time demand for content in multiple formats is increasing;

Publishers are not sure what is expected of them in the area of data management, especially in the healthcare market, because of regulatory issues.

Marks closed by describing ways publishers are responding to changes in the information landscape, including conducting extensive market research to gather feedback and take the pulse of users; working on new business models and services, including changes in peer review and the editorial process; and doing a great deal of experimentation. She felt it important for publishers to listen, respond, be engaged, question, experiment, be user focused, and adapt. In the Q&A session that followed, audience members asked questions such as:

How do we better describe and champion OA?

How do traditional publishers respond to new publishers (some of whom may be predatory)?

How sustainable is long-term preservation of journal content by publishers? (Marks noted that this issue is particularly worrisome in the case of OA content.)

Scott Plutchak, Director, Digital Data Curation Strategy, University of Alabama at Birmingham (UAB), offered a librarian’s view on information policy, with a twist. Previously director of UAB’s medical library for many years, Plutchak oriented his audience to his current (and somewhat unique) position outside the library context, helping to lead discussions around innovation and data management, which involves bringing together “data wranglers” from all parts of his institution. Declaring that “data is the new bacon,” Plutchak asked his amused audience to think about how infrastructure, policy, and services can best be marshalled to manage research data. In his view, research data management is just as important as OA and is perhaps a more complex problem to solve. He reviewed current initiatives and resources, including the impressive Data Management Planning Tool developed by the California Digital Library (https://dmp.cdlib.org/), CHORUS (http://www.chorusaccess.org), and SHARE (http://www.share-research.org/). He said he is thankful that CHORUS and SHARE, initially seen as competing initiatives, are now actively collaborating. He also pointed out that libraries are taking the lead in providing research data management guidance and highlighted the Journal of eScience Librarianship (http://escholarship.umassmed.edu/jeslib/) as an excellent librarian-led venue for discussion and research on this topic. He finished by quoting Rex Sanders, a scientist at the U.S. Geological Survey:

“If the data you need still exists;

If you found the data you need;

If you understand the data you found;

If you trust the data you understand;

If you can use the data you trust;

Someone did a good job of data management.”

Questions to Plutchak from the audience included,

Are there other people at the provost level taking on something similar to what you are doing? (Answer: not many).

How can we manage something as malleable as research data since research itself is incremental? (He responded that we are still in the “Wild West” and concrete solutions are not fully developed. But we need to track provenance of data over time and understand researcher concerns about how their data will be used.)

Caitlin Trasande, Head, Research Policy, Digital Science, and Nature Senior Strategy Editor, introduced us to a vendor perspective with a summary of Digital Science: its beginnings, its tool set, and its outlook on the whole research lifecycle. She outlined the lifecycle in an interesting, simplified way: track research, view funding, read about discoveries, plan experiments, conduct experiments, manage data, publish discoveries, share data, and measure attention. These facets are what drive the services Digital Science developed, e.g. Altmetric (http://www.digital-science.com/products/altmetric/) addresses the need to measure attention, ReadCube (http://www.digital-science.com/products/readcube/) is a tool to read about discoveries, and so on. She defined research information management as “the capture, linking, and dissemination of information associated with the research lifecycle, usually with an institutional focus.” It is very challenging to do and to resource properly, and it was in this context that Trasande declared, “You can’t herd cats but you can move the cat food!” Digital Science charges for its services but strongly supports OA. As she noted, information may want to be free but humans want (need!) to understand. Open is the first step, not the end goal. One audience question that stood out was, How do you distinguish between a passing trend and what is here to stay? Trasande responded that you simply need to find out where the pain points are and focus on them by listening to and consulting with customers.

Following these speakers there was a panel discussion on intellectual property and copyright moderated by October Ivins. The panel featured Peter Jaszi, Professor of Law, American University Washington College of Law, and Michael J. Remington, a lawyer at Drinker Biddle & Reath, LLP. Their talk was entitled “The Importance of Constructive Cooperation in the Copyright Policy Process,” and their wide-ranging discussion addressed the following points: international first sale; fair use, licensing, and mass digitization; the implications of the Georgia State University (GSU) decision[1]; library exceptions (possible revisions of Section 108); and accessibility and copyright. Jaszi began by bemoaning the tenor of current copyright policy discussions and expressed a desire for conversations to be more oriented toward finding common ground. Remington pointed out that the costs of copying are decreasing, but the costs of enforcement are increasing. Both panelists discussed the Kirtsaeng case (Kirtsaeng v. John Wiley & Sons, Inc.[2]) extensively, noting surprise that there has been no legislation yet as a result; they expected the impetus for it would come from the publisher community (which lost the case). Turning to the GSU case, both panelists believe that the existing circuit court ruling solves nothing because it is too provisional and causes copyright owners to incur enormous transaction costs. They believe the case will not have significant precedential effect on other circuit courts and suggested that the ultimate solution should be a best practices approach like the one published by the Association of Research Libraries (ARL) for course reserves, rather than waiting for a definitive court decision, which they felt was many years away. Regarding the future of Section 108 (reproduction by libraries and archives), the panelists felt that it no longer has much relevance given today’s realities. They also noted that Section 108’s own language states that nothing in it affects the right of fair use under Section 107, which they feel explains the decline in library focus on Section 108.

The session concluded with a recap moderated by Bob Boissy, Manager, Account Development & Strategic Alliances at Springer, posing questions to the panelists.  Here is an edited transcript:

What constitutes an author’s “best effort” in finding orphan works?

Remington: It’s unclear. Some best practices have been issued, but in the photo industry, there has been no litigation when they were followed, which is very good news. Not many judges will criticize someone who engages in sound business practices. The orphan works issue seems to have been subsumed into the mass digitization issue.

What is the real goal of an institution’s data curation effort? How will it advance the mission of the institution?

Plutchak: For UAB, at the most basic level the goal is compliance with funding mandates. More generally, though, we believe curation is a social good, and we get involved because of all the things we care about. Figure out what keeps the person in charge awake, then try to bring something to the table to help them sleep better, and that is what will get done.

Is the OA world starting now a softer, kinder, gentler world?

Marks: Do you care about your business model when you are looking for content? The mission of an editor is to get good content.  OA journals can compete with other journals, and they are all competing for the work of the same author.  Every editor wants the best for their journal, so competition will not be weakened by OA.

Are universities worried about filling their freshman classes? Are institutions marketing themselves?

Trasande: The City University of New York (CUNY) is one of the largest public institutions in the US; with 70,000 students, it has a lot of competition for freshmen and provides a vital pipeline for the middle class in New York City. It is also involved in research. Many universities have an opportunity to have deep roots with their host city.  With the advent of MOOCs, universities need to demonstrate “why us?” or “why here?”  There is a very good opportunity for smaller universities, but it requires selling themselves.  Tools are available to help them to do this.

Is all the data produced by faculty members preservable? What do the federal mandates require us to do? What are we obligating ourselves to do?

Plutchak: It is an insoluble problem! Federal mandates require that grantees have a data management plan. They will have to describe what data they are collecting, where it is going to be stored, and how it might be shared later. None of that obligates the library to be the manager. We need to look at our institutional priorities, determine the key research components, and focus efforts on high-priority areas. This will mean saying “no” to some people (which libraries hate to do).

What happens during a regulatory review when two big publishers merge?

Remington: They hire antitrust lawyers.

Jaszi: There are concerns about the preservation of the competitive environment.

How do libraries, publishers, and the academic community work together on the issue of digital preservation?

Marks: We store everything in one place once and then everybody can access it.

Jaszi: Libraries are likely to take the lead in the active digitization of unique special collections. We do not yet have a business model to spread costs for more extensive forms of digitization across the library sector.

Plutchak: Publications are easy compared to dealing with data! No single entity can figure out how to preserve digital data over centuries. What needs do the different components have? It will take some time; the immediate problems are consuming all of our attention. Nobody is thinking about what happens when somebody goes to an article 20 years from now and clicks on a link.  Will the data pop up in a format they can read?

What is the status of international ILL?

Jaszi: A structural legal problem interferes: the laws of nations regarding the limitations of copyright can be radically different. As long as countries have different legal standards, there will be problems. Is there any possibility of improving the harmonization of the laws of different nations? Affirmative protections have been addressed and small initial steps have been taken (such as with the Marrakesh VIP Treaty that facilitates access to publications for disabled people—see http://www.wipo.int/treaties/en/ip/marrakesh), but that seems to be the only area where harmonization has been attempted.

Remington: Professors are being told not to publish before checking with their technology transfer office, which can hinder their research efforts.

Has the reward system changed in the context of shrinking resources and changing expectations about how people publish?

Plutchak: There are some glimmers of change and an increasing interest in altmetrics. The notion that we are looking at other ways to measure impact is useful. There is an interest in looking more broadly at how you do evaluations. Some of these elements are being embraced by a new generation of scholars.

Trasande: You see more professors practicing applied research at major universities. Altmetrics have become more mainstream, and there is a general desire to measure the broader impact of scholarship. These are different incentives to which young graduate students can devote their talents. It is a combination of publishing and measuring impact.

Plutchak: We have spent just over 300 years in developing the scholarly publishing world, and we have been trying to reform it for about 30 years. It is a long process and patience is a virtue!

In closing, Ivins noted that SSP and NASIG have many good things in common, including a desire to provide a neutral place for discussing mutual problems and issues among all parties. The last time these professional organizations held a joint session was at the University of Illinois at Chicago in 1992, which I was fortunate enough to also have attended. Hopefully the future will see NASIG and SSP build on these joint programming experiences to their mutual benefit and the benefit of publishers, vendors, and librarians everywhere.

Somewhere to Run, Nowhere to Hide

Stephen Rhind-Tutt, President and co-founder of Alexander Street Press (ASP), began the second day of the NASIG conference with his vision session entitled “Somewhere to Run to, Nowhere to Hide.” Rhind-Tutt’s presentation gave his insights into the future, and although his perspective comes from primary sources, streaming video, and audio, he had many interesting and relevant observations about serials and academic publishing. In researching this talk, he found predictions that by 2020 the Web will contain:

90% of works published prior to 1923,

The majority of all works published up to 2020,

More than one trillion photos, and

More than 30 million audio files.

In this context, referring to a famous scene in the movie Jaws, Rhind-Tutt declared, “We are going to need bigger boats.” We need to think about information and content on a wholly different scale.

So what does this have to do with academic publishing? He shared results of a review of NASIG’s website, in particular the topics covered in past conferences. Noting that topics covered 25 years ago are remarkably similar to what we’re doing now, Rhind-Tutt highlighted the fact that they focus on fears of publishers as well as fears of librarians: everyone fears being made redundant. Many are worried about the future, while desperately trying to hold onto the past.

His advice is to focus on what the user wants, and he referenced Karen Schneider’s blog post on Free Range Librarian entitled “The User Is Not Broken: A Meme Masquerading as a Manifesto” (http://freerangelibrarian.com/2006/06/03/the-user-is-not-broken-a-meme-masquerading-as-a-manifesto). “We need this now more than ever,” he said. We are in the process of evolution, not revolution. We are also transitioning from data, to information, to knowledge, and finally, to wisdom. He believes we should focus on function rather than form. For example, reading journals is not the function; discovering and interpreting new ideas is the function. For video, the underlying purpose is not simply to watch the video but rather to get the needed information it contains.

Rhind-Tutt believes we can predict the future with reasonable certainty, and he illustrated his point with some entertaining French comics published around the turn of the 20th century that predicted such things as books being transformed from print into a format that can be imported directly into our brains (i.e. digitized). We keep trying to grasp new things by framing them in the past (e.g. “e-journals”), and he demonstrated this by showing his audience slides about the evolution of the automobile, which at first was called the horseless carriage before the completely new term, automobile, became widely accepted.

Declaring that “the future is clear enough to act on,” he believes, for example, that information will indeed eventually be free, i.e. OA. As a publisher, he sees better and better content becoming available as OA. There is no way for the commercial sector to avoid giving customers what they want, and they want free. He also stressed the vital importance of interlinking of content and felt that too few people recognize how important the work of linking technologies really is and will continue to be. (I thought immediately of OpenURL–http://en.wikipedia.org/wiki/OpenURL.) Although there are many challenges to linking, most of them are philosophical, not technical. Linking speeds research and learning, lowers costs, maximizes usage, and increases functionality. He encouraged everyone to think about content at the atomic level, rather than thinking of it in a linear or packaged fashion (articles, rather than journal issues).
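To make the role of linking technologies a bit more concrete, here is a minimal sketch of how a link to an article might be constructed under the OpenURL 1.0 standard (ANSI/NISO Z39.88-2004) mentioned above. The resolver address and the citation data are hypothetical placeholders, not a real service:

    from urllib.parse import urlencode

    # Hypothetical link resolver endpoint; each library runs its own.
    RESOLVER_BASE = "https://resolver.example.edu/openurl"

    def build_openurl(atitle, jtitle, volume, spage, issn):
        """Build an OpenURL 1.0 (Z39.88-2004) query for a journal article."""
        params = {
            "url_ver": "Z39.88-2004",                       # OpenURL version
            "ctx_ver": "Z39.88-2004",                       # context object version
            "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",  # journal metadata profile
            "rft.atitle": atitle,  # article title
            "rft.jtitle": jtitle,  # journal title
            "rft.volume": volume,
            "rft.spage": spage,    # start page
            "rft.issn": issn,
        }
        return RESOLVER_BASE + "?" + urlencode(params)

    # Placeholder citation data, for illustration only.
    print(build_openurl("An Example Article", "Journal of Examples",
                        "42", "101", "1234-5678"))

A resolver receiving such a URL checks the library’s holdings and routes the user to an appropriate copy of the article itself, which is exactly the kind of atomic, content-level linking Rhind-Tutt advocated.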

Finally, Rhind-Tutt described what ASP is doing in light of these predictions. For example, they are developing an “Open Music Library” that will be fully OA, because they believe that interactions with music academics will be infinitely richer because of this openness, as compared to what they could get if their product were behind a paywall. He also spoke a bit about Digital Science as an impressive pioneer and someone to watch, because “the process we are all engaged in is discovery.”

A Q&A time followed and here is some of what was discussed:

How do you expect ASP to make money if everything is going to be OA?

ASP can make money by being hybrid free/for-fee, e.g. in music, where many things are copyrighted. He gave an example of how ASP might make money as a service even though the content is not theirs, pointing to an initiative in which Tufts is loading its own content onto ASP’s infrastructure for a fee.

What about clarity on business models, especially because the hybrid environment is such a mess for libraries, and seems unsustainable?

Rhind-Tutt contends that some business models are working and reiterated his belief that publishers can build fee-based services on top of OA.

Slides from all presentations are available on NASIG’s Slideshare site: http://www.slideshare.net/NASIG/tag/nasig2015.  The 2016 NASIG conference will be in Albuquerque, NM on June 9-12.

Steve Oberg is Assistant Professor, Electronic Resources and Serials at Wheaton College in Wheaton, IL. A past president of NASIG, Steve has written and presented extensively on technology, electronic resources, and serials issues for the past 25 years. He also teaches courses on technical services, e-resources, and serials management at the University of Illinois Graduate School of Library and Information Science as well as Dominican University’s Graduate School of Library and Information Science. He has worked in a wide variety of settings including a large academic research library, library systems vendor, liberal arts college libraries, and a Fortune 100 healthcare company. His M.S.L.I.S. and undergraduate degrees are from the University of Illinois at Urbana-Champaign. Connect with him on Twitter (@TechSvcsLib), his blog (Family Man Librarian – http://familymanlibrarian.blogspot.com), or via his Flipboard magazine, Family Man Librarian Daily (http://flip.it/JBYzc).

[1] The case was about fair use of electronic documents in the university’s e-reserve system. (See http://libguides.law.gsu.edu/gsucopyrightcase)
[2] A Supreme Court decision in favor of Kirtsaeng, in which the Court held that the first-sale doctrine applies to copyrighted works imported from other countries. See http://en.wikipedia.org/wiki/Kirtsaeng_v._John_Wiley_%26_Sons,_Inc..

The New Big Picture: Connecting Diverse Perspectives—The 2015 SSP Meeting

By Donald T. Hawkins   (Freelance Conference Reporter and Blogger)  <dthawkins@verizon.net>



Howard Ratner, SSP President, Opens the Meeting

The Society for Scholarly Publishing (SSP) met in Arlington, VA on May 27-29, 2015 for its 37th annual meeting, which drew a near-record attendance of 910. Its theme was “The New Big Picture: Connecting Diverse Perspectives”. The meeting featured the traditional mix of plenary keynote and concurrent sessions, a vibrant exhibit hall, as well as a new event: a joint session with attendees at the NASIG (formerly known as the North American Serials Interest Group) meeting, which took place concurrently with SSP at a nearby hotel. (See the following article by guest columnist Steve Oberg.)

Among the interesting features of the meeting were several large and engaging posters drawn on the spot by Greg Gersch, a freelance artist.




Photo reproduced by permission of Greg Gersch

The meeting opened with a special presentation by Amy Brand, Vice President of Academic & Research Relations at Digital Science, who documented some demographics of scholarly publishing and communication professionals (a complete analysis of the data will appear in a forthcoming issue of Learned Publishing). The data were obtained from a survey that received 828 responses from 33 countries. The majority of respondents were based in the U.S. or the U.K. Over half were women between 35 and 54 years old. They are well educated: 39% have Master’s degrees and 17% have a Ph.D. Nearly 75% work for non-profit organizations (such as societies), commercial publishers, or university presses; about half are in an editorial role.

Keynote Addresses

Charles Watkinson




The keynote address “Rethinking Book Publishing in the Digital Age” by Charles Watkinson, Associate University Librarian for Publishing at the University of Michigan and Director of the University of Michigan Press, was one of the highlights of the meeting. He began by noting that a previous SSP keynote address had coined the terms “pubrarians” and “liblishers” to describe the intersection of librarians and publishers as producers of content, and that John Thompson, author of Books in the Digital Age (Polity, 2005), wrote that publishing is a complex industry structured into fields, each with its own distinct properties. Watkinson illustrated this concept with a diagram and said that we live in small fields distinguished by market type or competition.

Fields of Publishing

Most of the time we graze in the middle of our own field, but by doing so, our outlook becomes narrow, causing us to miss some of the most interesting things happening at the junctions or edges of the fields. A recent article in the New York Times (“Life on the Edge,” Akiko Busch, December 26, 2013, http://www.nytimes.com/2013/12/27/opinion/life-on-the-edge.html) said that humans seem to have an appreciation for the edges and know that they are where the action is and where things get pushed for best results.

The “edge effect” is important in sharing innovation across fields. For example, monographs and journals share an edge, and there is an amazing persistence of format between them. Revenues from monographs have been gradually declining over the past 10 years, which has put a lot of pressure on sustainability. Approaching the academic monograph from the “edge” of journal publishing might stimulate new thinking. For example, new literature sources such as samplers or summaries of longer works may look like books (for example, Palgrave Pivot (http://www.palgrave.com/page/about-us-palgrave-pivot/) or MIT Press’s BITS (https://mitpress.mit.edu/BITS/index.html)), but they have different content and require publishers to increase the speed of publication.

Although monographs have long been recognized as a field of scholarly publishing, the emphasis has traditionally been on journals. Now, new technologies are being applied to book publishing, and the book field is receiving a new emphasis. There are now many more sales channels than previously, and open access (OA) business models for books are raising questions like these:

Are there aspects of using a book that require a different approach to design choices?

How much should the book processing charge be?

Where should the money come from?

How much of the charge should be allocated to authors?

Other interesting edges include journal and data repositories (Nature Scientific Data, http://www.nature.com/sdata/), professional books and journals (Morgan & Claypool’s Synthesis series), trade books and professional books (the Dummies series), juvenile and scholarly journals (Frontiers for Young Minds, http://www.frontiersin.org/), and textbooks and library publishing (Open SUNY Textbooks, http://textbooks.opensuny.org/).

Watkinson urged the audience to look across the edges and see what their neighbors are doing; it is well worthwhile. Our ecosystem is evolving, and organizations are experimenting with new platforms. He closed with an exhortation to take advantage of the numerous edges that exist in scholarly publishing by branching out into new and unfamiliar areas: for example, by going to a session on an unknown topic, speaking to a vendor that your organization has never used, or becoming active on an SSP committee.

Ken Auletta


The second day keynote was structured as a conversation with Ken Auletta, a writer for The New Yorker and well-known author of Googled: The End of the World as We Know It (Penguin, 2009) and other books. He made the following points:

The publishing industry is going through a disruption similar to that of the TV industry when cable platforms emerged.

In reporting, personalities and the human factor are important; sometimes major business decisions are made for a simple human reason (and those stories usually do not get into the media).

The digital edition of The New Yorker has not affected authors’ writing. Articles are still edited and fact checked with great care, and we still need curators and intelligent agents to sort out the news that interests us.

Even though digital publishing has increased, print is being protected. Profits from newspapers and magazines still come from the print editions. For example, the average reader of the printed New York Times spends up to 35 minutes a day reading it, but the average online reader does not spend that much reading time in a month. Advertisers therefore are not willing to pay as much for online ads as for printed ones, so the online edition does not generate as much revenue for the publisher.

The average age of The New Yorker readers has been significantly lowered by introducing photos and digital articles appealing to younger readers. A major issue is whether the attention spans of these younger readers are long enough to be affected by ads.

Google has become both a technology and a media company, especially since it bought media organizations like YouTube and Zagat.

Jennifer Lawton


Jennifer Lawton, former CEO of MakerBot Industries, gave a talk entitled “Reflections on Leadership and Success,” in which she said that it is important to know who you are, what makes you happy, and what you want to do. MakerBot was a failure-driven company: after a failure, the staff would learn and go on to the next level of achievement. Here are the principles that have guided her in her career:

Do not let anyone tell you what you can and cannot do with your life. It’s your life and yours to try.

There is nothing to fear but fear itself. The worst thing anyone will say to you is “no”, and then it is in your control to take the right next step. You must be able to handle someone saying “no”.

Ask for help early and often. You get further faster, and you learn along the way.

Look. Listen. Slow and easy wins the race.

Once you climb a mountain, turn around and help the next person up. Sometimes the best success comes from not making it to the summit. Share your experiences, hopes, and dreams. Realize that when you get to the top, you got there with the help of many other people.

Everyone has their own path. Follow yours. Be kind and gentle to yourself. You deserve it.

Always network. Stay in touch with everything and keep your network active.

You will never know the answer if you do not ask.

If you are offered a seat on a rocket ship, get on and don’t ask what seat. If you want to go to the moon, you must get on the rocket ship.

Make sure you wake up every day looking forward to what is in front of you. If you don’t, change your direction. Make sure it is positive energy. It is all in your hands to make a change.

There are no easy answers. Life is work—hard work.  It’s your life, so work hard to make it what you want.

Closing Plenary: Lessons Learned in the Past Five Years

The following panel of society publishing professionals was asked to discuss their recent successes and failures.

Brandon Nordin, American Chemical Society
Nancy Rodnan, Endocrine Society
Angela Cochran, American Society of Civil Engineers
Stephen Welch, American College of Chest Physicians (CHEST)
Robert Harington, American Mathematical Society
Kenneth Heideman, American Meteorological Society

Here are some of the accomplishments they are most proud of and issues that were solved and worked particularly well:

Welch: The app for the iOS and Android platforms. When expected purchases did not materialize, the price was dropped and functionality was enhanced. As a result, the app will gross over $2 million this year and is frequently a top-selling medical app in Apple’s App Store.

Harington: Developed MathJax (https://www.mathjax.org/), a major open source project for displaying mathematics on the web, funded by professional societies. The AMS also published a children’s book, Really Big Numbers, which recently won an award from the Children’s Book Council.

Nordin: Managing the print-to-digital transition. In 2006, the electronic version of ACS’s journals was declared the version of record, and since 2008, all journals are electronic only (except in the Asian market, for which printing was outsourced to a local printer). Stopping the print removed a drag on profits, which allowed the society to enhance digital development and increase annual sales of its e-journals to nearly $100 million.

Rodnan: Working with people: assessing staff, figuring out technology needs, and hiring the proper people to meet them.

Heideman: Decreased production time of journals while maintaining quality. Consolidation of editorial assistants from 37 to 8. The success of the book publishing program has taught the society the benefits of outsourcing.

Cochran: Engaging with editors, asking their opinions, attending editorial board meetings, and creating an annual editors’ workshop has transformed the journals publishing program.

Here are some things that did not go well and the lessons learned from those experiences:

Cochran: The amount of time to get an article published was long, so moving to electronic submission and reporting helped to shorten the time. But the human issues were not addressed initially, which caused problems with the editors. They had to change their workflows and get assistance from the assistant editors.

Heideman: When data was being transferred to a new server, the process crashed. Lack of backups resulted in a loss of about 6 months of data, affecting 1,500 papers. Many could be restored from files in progress elsewhere (with reviewers, etc.). When customers and the community heard what happened, they rallied to help, and the staff contributed a significant amount of overtime work to recover.

Rodnan: “If you build it, they will come” does not always happen! It is important to assess ideas and make sure they will be relevant to customers. Recognize the amount of time, effort, and maintenance that is required in developing new systems.

Nordin: Understand how to control and value the ecosystem of text and data mining. Failing to recognize market trends led to several wasted years; it is necessary to take decisive action even if it is painful.

Harington: You need to have the experience of trying things, even if it means failing. Cultivate wonderful relationships with librarians. Attempting to sell e-books to individuals through Google Play was a major failure.

Welch: Declining to become a content provider for the UpToDate medical decision support system (http://www.uptodate.com/home) was a failure because that system became very widely used. It is important to discern market reactions to products. Another failure was an attempt to sell e-books through the e-journal platform.

Concurrent Sessions

Open Access Monographs from the Perspective of Publishers and Librarians

This session examined the business models that will make OA sustainable. Institutional repositories have not performed as expected, so the focus is now on monographs. Palgrave Macmillan was one of the first publishers to offer OA books and hybrid chapters. Authors are charged an Access Publishing Charge (APC) of $12,000 to $17,000, and all online editions of the book are OA. The decision to publish the book as OA is left up to the author. OA has been positively received; usage of OA books is significantly higher than that of non-OA books, but OA has had a negative effect on print sales.

Lessons learned:

It is important to clearly state license terms in the book.

OA titles must be easily found and available on a variety of platforms.

Funders and authors should be encouraged to share and review their books as widely as possible.

Funding, permissions (especially for cover designs), and production workflows are challenges.

Luminos (http://www.luminosoa.org/), the University of California Press’s open access publishing program, now has 12 titles committed to OA. The same standards for selection, review, approval, production, and marketing are used for both OA and printed books. An effort was made to preserve essential creation procedures for monographs whether they are OA or not. OA is an important author’s choice. The baseline publication costs are about $15,000; authors’ institutions are expected to contribute $7,500, and libraries and the UC Press subsidize the remaining costs. (More elaborate books with many illustrations or audio files result in increased APCs.) Luminos does not replace the traditional monograph program; it extends it.

Publishers Communication Group (PCG, http://www.pcgplus.com/) conducted a large market survey of several hundred librarians from the US, UK, and Western Europe and found that OA books are treated similarly to journals. Most libraries have very small collections of OA monographs (less than 1% of the total collection). Librarians hear about OA monographs by word of mouth, emails from publishers, or industry newsletters. Interestingly, library funding for OA monograph publishing frequently comes from new sources rather than from existing budgets. OA monograph publishing is still a small activity today, but it is growing, and there is a lot of opportunity moving forward. Librarians are embracing OA monographs, but no consensus on their exact role has emerged.

The Evaluation Gap: Using Altmetrics to Meet Changing Researcher Needs

The “Evaluation Gap” refers to the difference between using traditional metrics such as citation counts and Impact Factors (IFs) and alternative metrics (“altmetrics”) in the evaluation of scientific research output. This session featured four speakers addressing the issue from the perspective of publishers, librarians, and institutions.

Terri Teleen, Editorial Operations and Communications Director at John Wiley, reviewed a pilot project on six journals in which an “Altmetric Badge” (a visual representation developed by Altmetric, LLP, http://www.altmetric.com/, showing how much and what kind of attention an article has received—see below) was displayed for each article in the journals.

Altmetric Badge Example

67% of readers said that the displayed article metrics were helpful to them, and about 50% said they would be more likely to submit a paper to a journal that supported article-level metrics such as mentions in blog posts, tweets, Facebook posts, and national news media. The results of the survey were positive, so Wiley has begun displaying altmetric badges on all articles. A number of recent posts on Wiley’s blog (Wiley Exchanges, http://exchanges.wiley.com/blog/) have discussed various measures of article impact.
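For readers curious how such article-level attention data can be retrieved, here is a minimal sketch against Altmetric’s public v1 API, which returns attention counts for a DOI as JSON. The DOI below is a placeholder, the response field names follow Altmetric’s public documentation, and production use may be rate limited or require an API key:

    import json
    import urllib.request
    from urllib.error import HTTPError

    doi = "10.1002/example.12345"  # placeholder DOI; substitute a real one
    url = "https://api.altmetric.com/v1/doi/" + doi

    try:
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        print("Altmetric attention score:", data.get("score"))
        print("Tweets:", data.get("cited_by_tweeters_count", 0))
        print("News stories:", data.get("cited_by_msm_count", 0))
    except HTTPError as err:
        # The API returns 404 when no attention has been recorded for a DOI.
        print("No altmetric data found (HTTP %d)" % err.code)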

Wiley is also helping authors promote their works; 59% of them see themselves as primarily responsible for promoting their published research. A partnership with Kudos (https://www.growkudos.com/) is available to help them explain, enrich, and share articles for greater impact.  ORCID (http://orcid.org/) and ReadCube (https://www.readcube.com/) make it easier for researchers to discover, access, and interact with published work.  Wiley has created a self-promotion kit for authors; almost all of them said they would be likely to use it.

Cassidy Sugimoto, Assistant Professor at Indiana University, said that the best criteria we currently have for evaluating science are promotion and tenure documents, which are usually based on citation counts and IFs. (Citations and IFs suffer from the limitation of measuring only a person’s publication record.) Article-level metrics capture many other types of data and are beginning to be used by scientists in reputation-management systems; for example, Mendeley (https://www.mendeley.com/) measures what students, faculty, and researchers are reading, and ImpactStory (https://impactstory.org/) can be used by individual scientists to create a CV that includes metrics of their research. Some academic librarians have begun to support altmetrics by teaching their users how to use them to promote their research. The challenge of altmetrics is that they must be standardized and unified; just providing access to them does not mean that we can overcome disparities in the scientific workforce.

Colleen Willis, Senior Librarian at the National Academy of Sciences (NAS), agreed with Sugimoto and said that metrics are like breadcrumbs because they consist of data that inform a publisher’s staff what happened to the products that they have produced. The NAS library has created a class called “Motivational Metrics” which teaches authors and staff what the numbers mean and provides some examples of their use.

Jill Rodgers, Journals Marketing Manager at MIT Press, wondered how a publisher can determine how users are engaging with its content. Understanding the scope of engagement can help a publisher target its market and grow its business.  Altmetrics are not a replacement for citation counts and IFs; they augment them by measuring numbers of views, discussions, shares, and recommendations.  They allow publishers to listen and respond to researchers. The BATCHES service developed by MIT Press (https://www.facebook.com/mitpress/posts/10152418198014894) consists of collections of reader-selected articles on a single topic bundled for downloading to the Kindle e-reader. They have been well received by the market; sales of BATCHES are 2 to 4 times higher than sales of single issues of journals.

Other features and products have been launched using altmetric data; for example, social sharing buttons appear on all abstract pages, which allows the Press to collect data on readership and to select articles to offer in free trials or feature in a monthly newsletter. Other results of using altmetrics include increased ad revenue, enhanced reporting to sponsors, and a boost in staff morale.

Where to Find Growth in a Crowded Market

A standing-room-only audience heard three presentations on the growth outlook for the scholarly publishing market. Michael Clarke, President of Clarke and Company, a management consulting firm, said that the three engines of growth in the scholarly publishing industry from 2000 to 2015 were:

Site licenses used to establish journal sales in institutions,

The Big Deal of packages of journals, and

Global expansion, especially in China, India, and the Middle East.

Are these the end of the line?  There are few new markets to sell into, not many new institutions to support growth, and existing institutions are experiencing strong budgetary pressures. Selling new products and services appears to be the major avenue for growth; here are some promising approaches:

Re-establish an individual (“end user”) market. There will not be an awakening of a market for personal subscriptions to journals, but products like Figshare (http://figshare.com/), Papers Online (http://www.papersapp.com/online/), Overleaf (https://www.overleaf.com/), and NEJM Knowledge+ (http://knowledgeplus.nejm.org/) are all targeted at the individual researcher.

Develop new business models. Tap into revenues not from the library. Use the “freemium” business model (http://www.freemium.org/), in which a core product is given away to entice users to pay for value-added services such as à la carte options, traffic referrals, targeted ads, and analytics. We can learn from the Shazam service (http://www.shazam.com/company) in the music industry: most of their revenue comes from analytic data about who is listening to what and where.

Mergers and acquisitions are common growth strategies and are not limited to large commercial publishers; they can even happen at the association level.

Joe Esposito, a management consultant, said that we must begin to investigate the properties of digital media and rethink the basic editorial structure of what we do.  Current constraints on growth include the maturity of markets, library funding not being as robust as desired, necessary infrastructure investments that will lower market revenues, and pressures on margins caused by OA.  Digital media enables extensive data collection, but what will we do with the masses of numbers?  Database marketing has not been a significant activity of this community; perhaps we should employ companies in the database management business to analyze our data.  Much of the future of digital media is still unknown. It is not true that there is no end user data from print products; we have mailing addresses with Zip codes, but we have no information about actual uses, and print products are usually in silos with little connection to others.

User data can bring growth through direct sales to consumers and from packaging and selling metadata. Direct sales to consumers (D2C) are probably best suited to books. One of the problems is generating web traffic, and competition with large companies like Amazon will then occur. Article sales conflict with institutional journal subscriptions. Even though the D2C market is small, the data about it can be put to new uses by marketing departments. Monetizing metadata is an interesting approach because it is a way to reach new customers, but it requires huge scale. Anonymity is essential, and a rigorously enforced privacy policy is necessary to prevent challenges.

Dealing with end user data is a new area for publishers; where it will go is largely unknown at present. It is probably best to think of it as a promising property of digital networks and one in which investments in data collection should be made now. A new class of products will have to be designed and new imagination must be brought to the industry to stimulate growth, but that will not come from an extension of current markets and products which are largely not designed for purchase by end users.

David Lamb, President of Lamb Group LLC, a financial advisory firm serving publishers, looked at the outlook for mergers and acquisitions (M&As) in scholarly publishing.  The market is a worldwide industry that is financially consistent, attractive, and comparable across products.  It has a notable number of non-profit participants that are not normally seen in commercial publishing. There is plenty of scope for acquisitions because over 2,000 journal publishers in the market have the potential for growth by combining with others. A variety of participants in the market creates a very healthy environment for M&As because:

The economy is currently relatively healthy,

Private equity firms have a record $1.3 trillion in assets,

Interest rates are low and lenders are eager,

There is pent-up demand for strategic growth,

The role of digital data is established and well understood, and

Scientific research is growing worldwide.

Licensing is similar to M&A and should be considered as an alternate viable strategy for growth.

How Much Does It Cost? Vs. What Are You Getting For Or Doing With the Money?: The OA Business Model

This session considered the cost of publishing a journal article, particularly in an OA environment. Much of the discussion centered on the article processing charge (APC): what it is based on and whether authors are getting appropriate value for their payments. Robert Kiley, Head, Digital Services at the Wellcome Trust Library, presented an analysis of the Trust’s OA spending in 2013 and 2014 and noted that there has been a 20% increase in Trust-funded articles published as OA, while APCs have remained static. About 24% of the research was published in fully OA journals; the remainder was published in hybrid journals. The average APC of hybrid journals is 64% higher than that of fully OA journals, and 40% of the Trust’s APC spending goes to Elsevier and Wiley. Problems related to articles in hybrid journals included timing and processing issues. Kiley said that hybrid journals are expensive and some do not provide the level of service expected. Possible responses are to do nothing, to stop funding hybrids, to implement a funding cap, to withhold payment of some APCs until paid-for service levels are delivered, or to set a cap for APCs based on the value of services that a journal offers.

Rebecca Kennison, one of the principals at K|N Consultants (http://knconsultants.org/about/), listed three main pricing strategies for products: what the market will bear, gross margin targets, and most-significant-digit pricing (e.g. $29.99 instead of $30). For example, PLoS found that most authors can pay $3,000, so that is the price it set. A common pricing strategy in retail markets is to double the wholesale cost of the product, but that does not work for services because it does not account for their perceived value.

Here are some questions to consider in OA pricing decisions:

What kind of business is scholarly communication?

Is it different from other kinds of publishing because of the players or the economy?

Is OA publishing different from scholarly communication?

What do we mean by transparency?

What products and services are being considered?

Kennison suggested that a “buyer beware” strategy may apply to OA.

Peter Binfield, Founder of PeerJ (https://peerj.com/), a low-cost open article publishing system similar to PLoS, noted that OA customers typically buy articles, not journals, so he wondered whether we need journals at all. The marketplace is tied up with the concept of the journal as a brand. Many OA publishers still try to publish the “best” journals, so their APCs are high. Other publishing costs are also high because of inefficiencies at publishers and libraries and because of the additional players in the marketplace. In a fully OA world, publishers and libraries would be freed from many of the inefficiencies of the systems they currently maintain.

The dark side of the OA publishing model is that we assume the quality of every article is equal, which is not true. We also see predatory journals. Binfield suggested removing the gatekeeping role of peer review in favor of the PLoS ONE model, which is to publish all submitted articles after a technical review ensures that the data support the research conclusions.

Anyone paying to publish an article in a journal must ask if it is a good one, if they would be proud to put the reference on their CV, and what value they are getting as an author; such scrutiny would help to mitigate the actions of predatory publishers. Authors can do these types of investigations themselves with OA journals, but it is not always possible with subscription journals because their articles may not be readily accessible.

Does Data Fit Into Traditional Publication? Should It?

Mark Hahnel and Jennifer Lin

Funders of government-sponsored research have begun to require that data supporting the research be made publicly available after publication, which has caused some publishers to offer this capability to their authors. In this session, Mark Hahnel, CEO of Figshare, and Jennifer Lin, PLoS Senior Product Manager, debated what the role of publishers should be and how data can be made available. PLoS was one of the first publishers to establish an open data policy and to require authors to make their data available at publication; Figshare is a vendor of technology that helps publishers store and visualize data without encumbering their existing operations.

The debate addressed the following four issues from the viewpoints of data technologists and publishers:

Business Opportunities
Data technologist viewpoint: Publishers do not see business opportunities with data. Legacy publishing platforms are for sharing articles and are not suitable for data.
Publisher viewpoint: Nobody else has the resources and money to handle data. Publishers have big buildings where they can store servers.

Data Characteristics
Data technologist viewpoint: Data is everything.
Publisher viewpoint: Data is a second-class citizen that is only important for writing research articles.

The Publisher’s Role
Data technologist viewpoint: There is no need for publishers when it comes to disseminating data.
Publisher viewpoint: Publishers are the experts on disseminating academic content.

Trust
Data technologist viewpoint: Publishers cannot be trusted.
Publisher viewpoint: Academics cannot be trusted to store data persistently.

Here are some of the points discussed in the debate:

Business Opportunities

Only publishers have the infrastructure resources and money to handle data. They see opportunities because data represents a new revenue stream. The other side contends that publishers do not see business opportunities with data. (However, when asked for a show of hands, several publishers in the room indicated that they do see a business opportunity because they know how to handle citations and IFs, which are a form of data.)

Legacy platforms are for sharing articles and are not suitable for data. Data is a new frontier and has different characteristics than articles do. Who is going to pay for processing it?

Some emerging companies are developing services for publishers which can be integrated into the services they currently offer. It is important to have skills to properly manage data, and many publishers are developing their own tools.

Data Characteristics

Data is a “second-class citizen” that is only important for writing research articles, but it should be treated as a new first-class object. The contrary view is that research articles represent the result of years of work in the laboratory.

We might think that data are everything, but by the time an article is published, nobody cares about the underlying data. Many people see data as the currency of research, and all they want to do is to publish an article and put the data somewhere.

Some publishers have the capability to give authors a print “wrapper” for the data which gives authors two articles with double the impact and two APCs, resulting in more revenue for the publishers. The existing academic environment motivates this approach because it rewards researchers based on the number of articles they write.

The Publisher’s Role

Academics think that the article is king, but funders are now saying that the data must also be published.

There is no peer review of data, so articles are published without the data being checked and the data becomes an afterthought.

There is no need for publishers to disseminate data. One audience member wondered whether data needs to be disseminated at all, or whether it should instead be appropriately housed and curated, with individuals directed to it.

Metadata surrounding the data are important. The raw data is not valuable without the metadata. Publishers can provide some of those services.

Trust

Even if publishers can manage the data, should they? Data belongs in the academy.

There is a concern that publishers will take the data and sell it back again.

Some people think that academics cannot be trusted to store the data persistently; others think that publishers cannot either.

Should libraries be the disseminators of all this content? Data is a new area; maybe there is a role for institutions to play. The bigger question is: What is the role of publishers with respect to data? There is no right answer, but there are things publishers can do, and there are conversations going on outside of publishers with research data managers and funders.

A member of the audience pointed out that an example of a publisher disseminating data is found in the American Chemical Society’s Journal of Chemical & Engineering Data (http://pubs.acs.org/journal/jceaax), which has been in existence for about 60 years. According to its website, “The Journal of Chemical & Engineering Data is a monthly journal devoted to the publication of data obtained from both experiment and computation, which are viewed as complementary. It is the only American Chemical Society journal primarily concerned with articles containing data on the … properties of well-defined materials, including complex mixtures of known compositions.” Clearly, the journal has been successful in its mission, as evidenced by its long existence.

Jennifer Lin concluded the session with a list of recommendations for publishers to increase access to data[1]:

Establish and enforce a mandatory data availability policy.

Contribute to establishing community standards for data management and sharing.

Contribute to establishing community standards for data preservation in trusted repositories.

Provide formal channels to share data.

Work with repositories to streamline data submission (see the sketch following this list).

Require appropriate citations to all data associated with a publication.

Develop and report indicators that will support data as a first-class scholarly output.

Incentivize data sharing by promoting the value of data sharing.
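As an illustration of what streamlining data submission might look like in practice (the recommendation flagged above), here is a minimal sketch of creating a dataset record programmatically through Figshare’s v2 REST API. This is an assumption-laden sketch rather than a workflow endorsed by the panelists: the token and metadata values are placeholders, and the accepted fields are those in Figshare’s public API documentation (https://docs.figshare.com/):

    import json
    import urllib.request

    TOKEN = "YOUR_FIGSHARE_TOKEN"  # placeholder personal access token

    # Minimal metadata for a new, private dataset record.
    payload = {
        "title": "Example supporting dataset",
        "description": "Data underlying Figure 1 (placeholder metadata).",
        "defined_type": "dataset",
    }

    req = urllib.request.Request(
        "https://api.figshare.com/v2/account/articles",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": "token " + TOKEN,
            "Content-Type": "application/json",
        },
        method="POST",
    )

    with urllib.request.urlopen(req) as resp:
        # On success the API responds with the location of the new record.
        print("Created:", json.load(resp).get("location"))

A publisher or repository could wrap a call like this behind the manuscript submission form so that depositing data becomes a single step for the author rather than a separate chore.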

Donald T. Hawkins is an information industry freelance writer based in Pennsylvania. In addition to blogging and writing about conferences for Against the Grain, he blogs the Computers in Libraries and Internet Librarian conferences for Information Today, Inc. (ITI).
