NSF-Funded Team To Develop Community-Specific Information Literacy Tools


Graphic courtesy of Stanford History Education Group

With the help of a $750,000 National Science Foundation (NSF) grant awarded in September 2021, a team of researchers has launched “Adapting and Scaling Existing Educational Programs to Combat Inauthenticity and Instill Trust in Information,” a study created to understand the information literacy needs of populations usually overlooked in such work, and to test methods of improving information literacy among them. Starting with methods used in K–12 information literacy instruction, the project will investigate the processes several groups use to distinguish truth from untruth when making decisions, and then develop a new instructional toolkit for use outside the formal education system, with a strong focus on libraries.

Led by MIT Comparative Media Studies/Writing Associate Professor Justin Reich, the team’s co–principal investigators include Stanford University professor Sam Wineburg, who studies how people judge the credibility of online content; University of North Carolina at Chapel Hill School of Information Assistant Professor of Library Science Francesca Tripodi, whose research focuses on the sociological complexities of media literacy and how information systems are exploited for political gain; and Michael Caulfield of Washington State University Vancouver, where he directs blended and networked learning.

Phase one of the project will prototype and test interventions with populations not customarily part of information literacy education efforts, starting with low-income Americans without broadband access in rural Montana. Using these findings, the team will then apply for a $5 million phase two grant. Possibilities for phase two include targeted interventions among Indigenous communities, Latinx communities in urban centers (specifically targeting health misinformation), families on military bases, and older internet users. Team members are currently cultivating partnerships as they prepare to apply for the second grant.

“As we move forward, we’re thinking [about] how do each of these communities approach information and trust differently?” said Tripodi, “and how can we get a better understanding of these cultural complexities so that we can create tools that best match their needs, rather than assume there's a blanket tool that can just be used anywhere.” Partners such as Humanities Montana will help ensure that the communities the team identifies have input as well. The statewide National Endowment for the Humanities affiliate, which worked with Wineburg at Stanford, has a board that includes a range of small-town and rural residents, from agricultural workers to Native American tribal members. The project members plan to develop similar mechanisms for input if they move on to phase two.

The grants are part of NSF’s Convergence Accelerator program, which has awarded more than $21 million to 28 multidisciplinary groups in the first phase to dig into two complex research areas: Trust and Authenticity in Communications, the focus of this information literacy study, and the Networked Blue Economy (the sustainable use of ocean resources). Teams are drawn from academic, business, government, and nonprofit sectors and other communities of practice. During phase one they have nine months to develop their ideas into a proof of concept, identify team members and partners, and participate in NSF’s innovation curriculum, which offers tools for human-centered design, user discovery, team science, early-stage prototyping, storytelling, and pitch preparation.

 

FOLLOWING THE FACT CHECKERS

Wineburg’s earlier research provided the project’s starting point. In his study “Lateral Reading: Reading Less and Learning More When Evaluating Digital Information,” he found a striking difference in how groups analyze the credibility of websites. When he and his team at Stanford conducted a series of assessments of search skills among several groups—including undergraduate students, PhD historians, and professional fact checkers—the fact checkers consistently outperformed the others on online search literacy tasks.

Most of the study groups, Reich explained, evaluated a website’s trustworthiness by “vertical” scrolling: checking within the site to see if it has recent content, an About page, and citations, and whether it is professionally formatted—“all these things that people have been taught are markers of credibility,” Reich told LJ.

When fact checkers evaluate a site, however, they read “laterally,” leaving the site to hunt for supporting information elsewhere. “They find a couple of key search terms to look for and they use the web as a web—they figure out whether or not other people have commented on the site, and sources, and things like that,” said Reich.

In the case of minimumwage.com, for instance, the fact checkers discovered that the site was supported by the Employment Policies Institute, a conservative think tank created by a former lobbyist in favor of lowering the minimum wage. “The Stanford students, the tenured historians, they never get to that important sourcing information,” noted Reich, “not because they're not smart or lack critical thinking skills, but because they're not using techniques that work to search online.”

Lateral reading search skills, he added, are not difficult to teach or learn. Wineburg and his colleague Joel Breakstone built a Stanford-hosted instructional site, Civic Online Reasoning, which offers a free curriculum for teaching students to evaluate online information. Caulfield developed a curriculum for evaluating sources using the acronym SIFT: Stop; Investigate the Source; Find better coverage; Trace claims, quotes, and media back to the original context. Both curriculums were used successfully in middle school, high school, and college settings, and Reich used them to develop an information literacy MOOC (massive open online course).

With the NSF grant, “Our mission is to go from working in public education to educating the public, to finding all of the places in which people—especially in the United States, but around the world—go for information and searching tasks,” said Reich: “Libraries, healthcare settings, other kinds of schools, town halls, public spaces, and trying to make sure that they have access to resources to help them search effectively and expert mentors who can help them.”

 

TO MONTANA AND BEYOND

Libraries are at the top of their list. “We're trying to figure out how we can get into a space where it's not centered around an education model,” said Tripodi, whose research has found that people’s approaches to information are closely tied to their communities of origin. “We thought libraries were a great place for that.”

The team’s first stop was rural Montana, in areas where residents don’t have consistent broadband access and often use public libraries to go online. Humanities Montana helped the team identify libraries that fit the bill. (Institutional Review Board privacy protections prevented LJ from speaking with study participants.)

Humanities Montana originally brought in the project team to present its media literacy training curriculum at the Montana State Library, said Kim Anderson, Humanities Montana director of programs and grants. When the researchers began work on phase one of the NSF proposal, they suggested that the two organizations collaborate. “Our role, as the statewide humanities council and provider of grants and cultural programs around the state, was to be a connector for their researchers into these communities,” Anderson told LJ.

Tripodi spent two weeks talking with library employees and patrons about their communities’ needs. She took notes as library users conducted online searches and walked her through their steps, and also made fly-on-the-wall observations, watching them engage with information without offering suggestions.

She discovered a great need to address information literacy in a mobile context by identifying and developing quick touchpoints that can help users through their search processes. The patrons Tripodi observed had a range of needs, including housing and food insecurity, and limited time to access the resources they were looking for.

“A lot of the tools out there are focused on information seeking as though you’re sitting down at a computer, and you have a lot of time to look up things,” she said. “What we're finding is that, in particular in rural areas where internet access is not as widely available as it is in more urban settings, the role of mobile quick lookups is essential.”

Tripodi also noted how strongly bias figures into patrons’ search habits. She asked patrons to look up information on land use, a subject of interest to many area residents. “And we found that those who had very specific views on that—whether or not government should preserve land, or whether or not land should be cultivated for private ranching purposes—really did direct the kind of information that they then sought out in these live interactive searches.”

“We were able to give three dimensions to what can sometimes be a stereotypical view” of rural concerns, noted Anderson. “It's not clear cut, and we wanted to be a part of this so that we could give a fuller picture.”

 

DEVELOPING TOOLS

Going forward, “We'll create small interventions,” Tripodi told LJ. Possibilities include quizzes, available via QR codes on bookmarks, to help users gauge what kinds of help they need with their information-seeking habits rather than asking them to complete a long module—“which is what a lot of the [information literacy] courses look like now,” Tripodi noted. Tools could also be developed to help users identify biases that influence their search results.

“We’ll test these inside communities and see how well they’re working, or how well they're not, make modifications based on that feedback, and then create another set of tools to test again.”

Together with nonprofit news organization Retro Report, which produces mini-documentaries, the team is creating a series of “nudge videos” that encourage lateral reading; click restraint—a strategy that involves resisting the urge to immediately click on a result and instead scanning the page to make a more informed choice about where to click first—and awareness of the important role keywords play in the relevance of the information a search returns. Tripodi will then interview users to see how—or whether—the new information aligns with their own online practices.

“What we’re trying to teach more people to do is open up a new tab and search for more information about the source,” she explained, encouraging them to ask questions such as, “Who funded this project? Do these people have a Wikipedia page? How can we use the information about them on their Wikipedia page to determine what biases they might bring to the table, or if this is a trustworthy source of information?”

(Editor’s note: Because Wikipedia’s volunteer editors are not without their own biases, searchers may have to go further afield to find information on women and others who are underrepresented there.)

The project’s final product will be a toolkit for search strategy education that can be implemented by a range of organizations outside K–12 or higher education. And while the team is open to what that may look like, they want it to be, above all, simple and extensible. “Hopefully, by adapting and creating more specifically culturally centered media literacy, it will be more likely to be used,” said Tripodi.

“Because of the way the Internet is programmed, it's designed to best match the way we already see the world,” she added. “Google, DuckDuckGo, Bing, any kind of search engine, is actually much more likely to reflect our existing biases back to us. Something that I think librarians can really be useful in helping people think through [is] why their key terms are so important, and why those starting points are so important.”

“We don't want to make people do an hour-long course, but could we get them to watch a 30-second video and do a two-minute exercise? Can we make something that's optional as a pop-up [on library computers]?” Reich suggested. “What if there were tablets at every reference desk?”

Lisa Peet

lpeet@mediasourceinc.com

Lisa Peet is Executive Editor for Library Journal.

