Positive, Negative, or Somewhere In Between
What’s your immediate reaction? Those having a positive reaction may appreciate the value of using technology to help students succeed. That might be particularly important to academic librarians at community colleges or universities with large segments of low-income and first-generation students, where retention rates are low. To help more of those students persist to graduation, we might be willing to use any technology resources at our disposal, especially as the latest data indicates that the number of students who fail to persist after the first year is on the rise. Those who had a negative reaction, or felt some level of discomfort, no doubt worry about how such technologies invade and compromise personal privacy. There is also likely some middle ground where we can see the possibility for both good and bad outcomes. We want to give our students every advantage, but at what cost, and are we willing to pay the price?
Already There?
However you reacted, the reality is that high-tech performance monitoring systems will become commonplace in the not-too-distant future. From the cradle to the grave, parents, educators, and employers will leverage monitoring and assessment technology to help us perform at our best, guide us to make the right decisions, and protect us from the unseen and unknown dangers that could derail us from the tracks of security and success. The growing popularity of wearable fitness devices points to a willingness to subject ourselves to performance monitoring if we believe it will produce a desired outcome. Yet when the data from a day of activity is uploaded to the cloud, do we know exactly where it’s going, who can access it, or the degree to which its confidentiality is protected? Surveillance technology is also being applied in the workplace, mostly in the service and retail industries, to monitor and improve employee performance and to detect theft or unethical behavior. Research on 392 restaurants that installed these systems found that losses from theft declined moderately, but, more significantly, sales per server increased dramatically. Knowing they were being monitored, servers worked more diligently to sell extra appetizers, drinks, and desserts. Owners received more profits, servers received bigger tips, and customers received better service. Who else was a winner? The system owners, who also collected vast amounts of data to use in their restaurant consulting services. As the products and services we use become more “intelligent,” data will be collected about and from us in ways we can hardly imagine today. By 2025, every automobile manufacturer will produce “connected cars” that collect data about our driving habits, destinations, and system performance. Would you be opposed to getting a text message from your car pestering you to stop procrastinating on that oil change? Helpful, maybe, but at what cost to your privacy?
Asking the Right Questions
Why, now, do humans need to be monitored to help them become better students or workers? What’s wrong with allowing students to experience college without a safety net there to save them whenever they lose their balance? John Warner smartly tackles these questions in his essay “The Costs of Big Data.” Reacting to a piece by Anya Kamenetz about Course Signals, an analytical performance-tracking warning system at Purdue, Warner asks what it is we really want for our students when it comes to success. Just graduating? He writes, “What if we worry that their adult lives will not come with Course Signal warnings? And mostly, what if we worry that this institutional focus on capturing and employing data distracts us from what is most meaningful about the college experience…maybe tells students that they are a data point. Or maybe Course Signals becomes a crutch, substituting tips and tricks for in-depth human interaction, the kind we know alters lives.” We need critics to question the value or necessity of performance tracking and analytics systems, not only because of the mishandling of data and privacy intrusions but because of the unknown consequences these systems may have for our students. What if they help students survive college but not the real world of work, where no one is helping them avoid failure? Or will that world be one where constant monitoring, data collection, and analysis is simply the way of life?
Calming the Fears
Knowing that there are ways in which collecting and using student data could prove beneficial, perhaps we need to refrain from immediately writing off monitoring and preemptive warning systems as dangerous technologies. To that end, how student data is or will be used is a growing area of debate at all levels of the American education system. Repeated large-scale data mishandling and privacy intrusion incidents should rightfully have us questioning whether feeding student data into these systems is a practice worth considering at all. In his article “Reframing the Data Debate,” Steve Rappaport acknowledges this when he states, “Fears about misuses of student data feed into larger narratives about dangers to privacy and the security of data fueled by revelations about the NSA, Target, etc., and their fervor makes it impossible to dismiss them as ill-informed rants.” He goes on to remind us that progress in education at all levels has always depended on the collection of student data. Though he represents the interests of the educational technology firms that produce the learning products students consume, Rappaport writes that those firms must clean up their acts and demonstrate that they can calm these fears by making sure student data is secure and privacy rights are respected. That sounds good, but can we trust the EdTech industry to do the right thing?
Setting Limits and Sensible Choices
Perhaps digital tools for tracking, monitoring, and performance assessment, all intended to facilitate predictive analytics, are neither good nor bad. They are tools at our disposal that allow us to accomplish something helpful but could have unintended consequences leading to harmful results. It’s up to us to determine the level at which we implement and apply these tools and to understand fully the context for their use. A “Library That Learns You” service is interesting and builds on a growing trend toward personalized service in academic librarianship, but I’m personally uncertain about it. Some students would find it valuable, and it could possibly shift the odds of success in the student’s favor, yet it hardly seems like our preferred mode of interaction. Just because you could put a robot at your reference desk, would you do it? It may sound awful now, but in 25 to 30 years, when it’s technologically possible, perhaps it will be just one more user expectation, not unlike expecting to find a café in today’s library.
Learning From the Past
I don’t have the answers. What I do believe is that, over the next 20 or 30 years, our profession will be greatly challenged by this whole environment of student data. Some of the pressure to participate in these systems will come from our own academic administrations as they seek to improve student performance, lower student debt, and achieve the metrics required by emerging government standards. At one point in time, Taylorism was a respected method for improving the workplace and its outcomes. Looking back, we now know that imposing scientific management achieved great efficiencies but did so at the cost of destroying worker morale. We will need to be careful not to repeat the mistakes of the past: deploying technology with the good intention of helping our students achieve short-term results when it is not clear to us how, in the long run, it will truly affect them.