AI, ChatGPT Are Focus at Top Tech Trends Panel | ALA Annual 2023

OpenAI’s ChatGPT has been a hot topic ever since it debuted to the public seven months ago. So much so that the American Library Association’s (ALA) Core division decided to forgo its traditional wide-ranging approach to its Top Tech Trends panel and focus exclusively on the potential benefits and problems of generative artificial intelligence (AI) during the “Core Top Technology Trends: Libraries Take on ChatGPT” session at the ALA Annual Conference, held June 22–27 in Chicago.

Panelists agreed that there are already problems: AI-generated misinformation, a lack of transparency about how these tools work and where they source their information, ethical and privacy concerns, the potential for cheating in academic settings, and more. But they also agreed that the technology is here to stay, and that librarians should familiarize themselves with these tools soon so they can help patrons and students navigate their use in the near term and in the future.

“As generative AI starts to be introduced into society, I think there’s a lot of things that we should be thinking about, [and] not being necessarily afraid of,” said Jonathan McMichael, learning experience designer, Arizona State University. He added that in this early stage, libraries can help introduce the technology to people in ways that facilitate learning. “This is going to be a big part of a lot of students’ lives. It’s going to be a big part of our users’ lives…. Trying to downplay it will not work,” he added later.

McMichael was joined on the panel by Fernando Aragon, Learning Labs supervisor, Gwinnett County Public Library; Hannah Byrd Little, director of library and archives, the Webb School; and Trevor Watkins, teaching and outreach librarian, George Mason University Libraries. The panel was moderated by Kate DeLaney, outgoing chair of Core.

Citing a conversation that he had with another professor recently, Watkins noted that “you can’t have a house without a foundation…. What is the information literacy skillset of the students [whom] we’re going to be teaching?” Before working AI tools into a course, “go over the fundamentals of information literacy, because I think we can all agree that, in general, information literacy is pretty horrible right now.” Once that foundation is established, educators can better explain current issues and limitations of AI tools, Watkins said, noting, for example, that ChatGPT can’t parse the context of many user questions, and when it is asked to cite sources for information it generates, it sometimes creates authors that don’t exist.

Byrd Little agreed, adding that many students at her boarding school “are already way ahead of us” with AI tools. Many are not using platforms such as ChatGPT to cheat and write whole papers, she said. Instead, they’re using the tools to ask questions and then write based on the information they receive back, and/or using ChatGPT after they’ve written a paper to review an AI-generated counterpoint to their argument. Yet even for these uses, students should be aware of ways in which AI isn’t always reliable. “It’s a tool; it’s not a source,” she said. And people have been using related tools, such as text-to-speech and voice assistants, for years already. “Several years back, I asked my sixth graders to define a word, and they all pulled out their iPhones and said ‘Siri, how do you define so-and-so?’ They’ve grown up using AI,” she said.

McMichael said that he uses the Association of College and Research Libraries’ Framework for Information Literacy for Higher Education to cover information literacy topics with students, and that he’s been working on classroom exercises that illustrate ChatGPT and other AI tools generating contradictory or false information, “showing where it makes errors or…almost trying to break it in some way.” He added that demonstrating what current AI platforms do well and what they do poorly can help students better evaluate how to use these tools.

For librarians, panelists suggested online courses from Coursera, MOOCs from major universities, and other online resources for learning how the technology works, but also simply signing up for a free account and experimenting with ChatGPT and other new AI tools. Watkins suggested that joining a coworker or small group of peers while testing these tools could be helpful as well. “Have a ChatGPT listening party where you just [experiment with it], and everybody has a discussion,” he said.

Aragon said that probing these tools with questions and requests about topics that you have some degree of expertise in will help reveal the technology’s current flaws and strengths, and possibly even offer some insight into where it is sourcing information about a specific subject. “Whatever you feel like you’ve got a good knowledge base of, whatever topic, ask ChatGPT,” he said, and see how the results measure up.

“Asking yourself the question, ‘Why did it come up with that answer, or why did it produce that?’ [regarding] things you know a lot about reveals quite a bit,” McMichael agreed. “If you have some background knowledge in a particular subject, you could start to identify, ‘Oh, I’d bet it looked at this source and that source to get this answer.’”

Currently, these tools sometimes seem better suited to perform tasks than to give 100 percent accurate responses to questions. For example, Aragon teaches Python courses, and he experimented with asking ChatGPT to write a Python program that could take data files, clean the data, and then output the data into a spreadsheet. “It gave me code that was really good,” he said. “I didn’t ask for comments [explaining the code], but it included comments, and it really broke things down in terms of how it did it. And the code worked.” As an aside on academic integrity, he noted that if a student leaned too heavily on the platform “just to get ahead in your coding class, you’ve got to ask yourself why you’re getting into coding in the first place.”
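For readers curious what that kind of request yields, the following is a minimal sketch of a data-cleaning script along the lines Aragon describes, not the code ChatGPT actually generated in his experiment. The file paths, the column cleanup steps, and the reliance on the pandas library are illustrative assumptions.

    # Illustrative sketch only; not the code from Aragon's experiment.
    # Assumes pandas and openpyxl are installed; file paths are hypothetical.
    import glob

    import pandas as pd


    def clean_data_files(pattern: str, output_path: str) -> None:
        """Read every CSV matching `pattern`, clean it, and write one spreadsheet."""
        frames = []
        for path in glob.glob(pattern):
            df = pd.read_csv(path)
            df = df.drop_duplicates()          # remove duplicate rows
            df = df.dropna(how="all")          # drop rows that are entirely empty
            df.columns = [c.strip().lower() for c in df.columns]  # normalize headers
            frames.append(df)

        combined = pd.concat(frames, ignore_index=True)
        combined.to_excel(output_path, index=False)  # writing .xlsx requires openpyxl


    if __name__ == "__main__":
        clean_data_files("data/*.csv", "cleaned_output.xlsx")

Even a short script like this gives librarians something concrete to evaluate in the way the panelists recommend: whether the generated comments accurately describe the code, whether edge cases such as an empty folder are handled, and whether the output matches what was actually asked for.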

While the panelists did express some apprehension over the potential directions that commercial AI could take—with concerns surfacing throughout the conversation about academic integrity, dehumanization of work, and the lack of concern that AI corporations may demonstrate about its potential for misuse—the consensus was that librarians should be learning this technology, and that the field needs to take its spot at the table as commercial AI expands.

“There is individual incentive to use AI, because it can help you get ahead,” McMichael said. “But there is risk to all of us if we don’t take collective action, which is actually why I think libraries are a really good place to be talking about [AI].” In terms of the motivation for corporations to invite libraries to help steer the direction of these tools, McMichael later added: “We have something that’s really valuable to these companies right now, which is really good data—that’s been curated and has been touched by human beings—that we know is reliable and valid.”

Matt Enis

menis@mediasourceinc.com

@MatthewEnis

Matt Enis (matthewenis.com) is Senior Editor, Technology for Library Journal.
