Big Data and Small Steps at the Charleston Conference 2016

At this year’s Charleston Conference, held as always in lovely Charleston, SC, in early November, attendees seemed in a mood to focus on practical, incremental progress, with sessions on assessment packed with standing room only audiences while questions of where the field is going failed to pull the crowds.
The assessment panel, “Effective Methods for Library Assessment,” did not venture far into unknown territory. Most of the panelists referred to gold standard tools such as COUNTER reports and Google Analytics, supplemented by in-depth assessment of user behavior. These methods are used to tweak everything from reference hours to renewals to whether or not to participate in a Big Deal. For data visualization—which, panelists said, is becoming more important—Tableau and Highcharts Cloud were the tools of choice. The biggest assessment mistake? “Buzzword assessments,” in which a higher-up wants to assess some trendy topic on a short time frame and with no clear goal.

In the Big Tent

Launching Thursday’s conversation with “You Can’t Preserve What You Don’t Have—Or Can You?” plenary presenter Anja Smit, university librarian, Utrecht University, the Netherlands, made a case that libraries need to prioritize preservation of materials whether they own them or not—and not just in a dark archive. While acknowledging that in publisher negotiations it can be “hard to make the effective case for eternity,” she called on librarians to “make perpetual access…the top of our agenda.” She acknowledged that the global system of redundant hubs that would truly ensure no knowledge is lost is not realistic in a landscape of local funding. But, she said, we can “get there by developing local initiatives and using standards,” citing a French initiative that built a repository with access included in its framework as a local example for others to follow.

Smit was followed by Jim Neal, president-elect of the American Library Association, who offered a self-proclaimed “alarmist and strident” call for radical collaboration and innovation, with less focus on infrastructure and more on libraries’ role as conveners and enablers. He also called for less focus on ideas—stop strategic planning, he implored libraries, when by the time the plan is completed, it is obsolete—and more on action, balancing incremental and disruptive change to produce “measured transformation.” This, he said, will require messy and fluid organizational structures that in turn call for a new style of leadership. But for many in the audience, Neal’s most radical statement might have been his prediction that in the near future, the entire industry targeting products to the library market will disappear, replaced by one directed to the consumer and dominated by self-published works and niche technologies.
On Friday, Kalev Leetaru, senior fellow, Center for Cyber and Homeland Security, Georgetown University, kicked off the day with an eye-opening look at “Reimagining Our World at Planetary Scale: The Big Data Future of Our Libraries.” Fast-moving maps showed the scale of what we now know—and what we don’t. Among his top-level takeaways was how copyright is impacting digitization and data mining—said Leetaru, “We understand more of what happened 300 years ago than we do the last 70 years.” He also showed who we are hearing from—and who we’re not—when we study social media. Because Twitter use is heavily concentrated in certain areas, often we are only getting one side of the story. Data mining presents a similar issue, he said—because it is mostly carried out in English, non-English material is underrepresented, making preserving and translating it a high priority. “Even as the stories we tell get richer, the comprehensiveness is shrinking,” said Leetaru. “The tools and questions haven't scaled as the data has.” Unfortunately, Leetaru said, he has not gotten the level of support from libraries that one might wish—in fact, he described it as a “very adversarial relationship. Libraries have never gone out of their way to help me find datasets or connect me to publishers.” He called on libraries to emulate the “very welcoming arms” of the Internet Archive and to play facilitator, helping to “bridge those worlds” of researchers and publishers.

In the subsequent Hyde Park Debate, a recurring feature of the Charleston Conference, Alison Scott of the University of California, Riverside, attempted to convince the audience that article processing charges (APCs), as a means of funding open access (OA) publications, are antithetical to the values of librarianship; Michael Levine-Clark, University of Denver, contended that they were not. In some ways, they argued past one another, Scott assuming that the sustainability of OA is separable from the question of APCs, and Levine-Clark presuming the contrary. Among Scott’s arguments was that APCs have the effect of privatizing community resources by turning libraries’ attention from their communities to a smaller group of article producers, and that the green model of OA is still worse because during the embargo period scholars at poorer institutions, or those with no affiliation at all, have no access to the materials.

OA, Ebooks, and Containers, Oh My

Considering OA through a monograph lens, “Mapping the Free Ebook Supply Chain” featured a discussion of OA monographs, who reads them, and how they are distributed. Rupert Gatti, director, Open Book Publishers, shared data confirming that while OA ebooks are sold through traditional library distributors, that readership is “entirely swamped by free readers”—by more than 300 to one. The challenge, said Gatti, is that “the library doesn't sit in that free distribution network. How do we embed the library?” Bringing OA books into catalogs, developing library publishing programs, and uploading OA monographs into repositories might be the answer. Eric Hellman of unglue.it is working with the University of Michigan Press and Open Book Publishers to study results by both book and referrer, to drill down into exactly how users find OA books. Finally, Jill O’Neill of the National Information Standards Organization (NISO) presented a user journey—hers—in attempting to find a free ebook. Ultimately she was able to do so in 15 minutes, but she was presented with incomplete and contradictory information at every step of the way, and relied on field-specific know-how that the average lay reader wouldn’t have. Her conclusion: “We need complete, explicit, high-quality metadata”—and getting there requires total community involvement. That might be a challenge, however: a sparse crowd turned up, potentially indicating that attendees are either satisfied with the ebook status quo or less focused on monographs in general.
Despite its name, “Content as a Community Asset: What Happens When It Loses Its Traditional Container?” was really less about losing those traditional containers—journals, articles, monographs, reference works, primary sources, and data—and more about layering new types of containers on top of them, such as single repositories that can handle all of the above, though there was brief discussion of new types of information that don’t fit well into the traditional buckets, such as 3-D models. “None of us have systems that can just do whatever,” said one participant. “You need to be able to know what you have bought and how to get to it.” In addition, while users “don’t care about containers” when searching for content, that doesn’t mean they don’t care at any point in the process. Once they’ve found an object, said panelists, “containers provide context” to help signal how authoritative it is and what to expect from it: Is it an overview? Original research? The issue may be less about shortcomings of the containers themselves than about how users find them.

“The Evolution of Ebooks” covered related ground, with Mitchell Davis of BiblioLabs (LJ’s partner on the SELF-e project) asking for an “all-in-one experience” that includes video, databases, and more. Davis also focused on open educational resources and their potential to lower student outlay without incurring costs to the institution. More time, however, was devoted to the shortcomings of the ebook itself. “They aren't books, they are only moderately ‘e,’ and they just don't work very well,” summed up James O'Donnell, university librarian and professor, Arizona State University Libraries. Shortcomings noted included known issues with footnotes, maps, and illustrations, as well as publisher restrictions on printing and downloading. He called for fewer platforms and the most open possible standards, as well as distinguishing the solution for “the print book in e-form” from the born-digital ebook going forward.
David Durant, federal documents and social sciences librarian, East Carolina University, Greenville, NC, called for a model to support what he called a “biliterate” approach: long-form reading, which is best handled in print, and short, tabular reading, which is best done online.

Awarding, Rewarding

The first Cynthia Graham Hurd Memorial Scholarship, sponsored by Springer Nature, was presented to Sabrina Dyck, faculty librarian at Tallahassee Community College, FL, by Hurd’s brother Melvin, who also presented Dyck with an additional plaque from the family. Also new this year was the Fast Pitch competition, a Shark Tank–style event in which librarians with new project ideas pitch for two $2,500 grant awards, one determined by a panel of experts and one by the audience. The projects ranged from those well underway and already funded by other, larger grants to those still a gleam in their presenters’ eyes, and from small local efforts to ambitious collaborations. They included “Relax Map,” from Auraria Library in Denver, which would use bio-mapping devices to find and map places on a college campus to reduce stress for students (and make a toolkit for other libraries that want to do likewise); the University of Michigan’s Fulcrum Project/Lever Press for born-digital OA scholarship; Syracuse University’s LaunchPad program for student entrepreneurs, which, though already funded by Blackstone and the dean of the iSchool, sought the grant to place a collection in the space with titles crowdsourced by faculty; and a pitch from St. John Fisher College, Rochester, NY, for resource sharing tools for coordinated collection development across collaborating colleges, using existing data in a new package to reduce duplication. Ultimately, Syracuse took home the judges’ award and St. John Fisher the popular vote. In addition, the annual Vicky Speck ABC-CLIO Leadership Award was presented to Chuck Hamaker, special projects librarian at the University of North Carolina, Charlotte.