We live in a world where data is everywhere, and companies use it for everything from showing ads on social media to helping sway elections across the globe.
Now, that list also includes predicting student behavior at the University of Arizona, according to a recent press release featuring the UA’s resident expert on data, professor Sudha Ram.
Ram, who runs the Eller College of Management’s INSITE center for analytics, recently shared how her team was able to predict, with around 90 percent accuracy, which students would drop out and which would stay enrolled by using data collected and provided by UA Information Technology.
“By getting their digital traces, you can explore their patterns of movement, behavior and interactions and that tells you a great deal about them,” Ram said in the press release.
While the goal of keeping students on the path to a degree is certainly a noble one, this vast collection of data raises some concerns that the university must address.
First, students were seemingly given no notice of the fact that more than 800 data points, from financial aid status to D2L usage, would be monitored, tracked, collected and analyzed.
Although the university is currently using this information to determine student success rates, what is to stop it from using that data for other purposes, like predicting who will go to UA sporting events and then making extra efforts to sell tickets to those students?
While it’s extremely likely students signed their right to privacy away in exchange for a chance at an education, the fact that the UA is allowed to watch our every move, and would be doing so, should still have been made explicitly clear from the outset. No one likes to find out via the media that they are being tracked like a lab rat.
The university has said that the data is anonymized to ensure privacy, yet also stated that information is shared with advisers in order to help prevent students from leaving. This clearly means that at some point, someone is able to correlate the data to an individual.
If this data can be used to determine when a student is likely to succeed or fail in classes, it can also be put to many other, less utilitarian uses.
Those who are uncomfortable with their data being tracked should be given an easy way to opt out. This would help ensure that no one with privacy concerns feels exploited.
Second, the UA must be clear about what methods it is using to ensure our data is safe. Universities are among the most frequent targets for hackers, with systems under constant assault from state-sponsored actors and sophisticated criminal networks.
The UA administration must understand these risks, and is surely acting to protect against them. But as history has shown, anyone can be hacked, including credit companies, movie studios, political parties, government agencies and even the director of the CIA.
All it takes is one successful breach to expose thousands of students to potential risk. The amount of data the UA has collected is staggering, and it’s all just a few clicks away for those who are determined to get it.
Furthermore, INSITE wants to track UA Wi-Fi data to get an even fuller picture of what students are doing.
Given the ubiquitous use of Wi-Fi on smartphones and other devices, if that tracking becomes a reality, the exact locations and behaviors of everyone on campus will soon be up for grabs.
In the classic sci-fi short story “Minority Report,” predictive data is used to eliminate murder by arresting people before the crime is even committed. The story invokes the very question of free will and raises poignant themes of authoritarianism versus individuality.
In the story, the authorities are trying to control the future. While INSITE is not using our data to predict when students will commit crimes (although that would hypothetically be possible, given the amount of information collected), its goals are very similar to those of the Precrime Division.
“It’s all about thinking about the future,” Ram said in the press release. “It’s about planning for the future and making sure you’re doing things in a way that enables the future to happen the way you want it — for everyone’s benefit.”
With fears of how data can be exploited, and even weaponized, currently filling our screens, the university must do everything in its power to protect the information it has collected. It must also give students an opt-out option, and be open and honest with the community about how that data has been used, and will be used in the future.
Today, when the data is being used to help, there may be little to fear. But as the protagonist says at the end of “Minority Report” after being falsely accused of murder based on predictive models, “Better keep your eyes open. It might happen to you at any time.”
Editorials are determined by the Daily Wildcat Opinions Board and are written by its members. They are Editor-in-chief Courtney Talak, Opinions Editor Andrew Paxton, Content Editor Marissa Heffernan, Engagement Editor Saul Bookman and Arts & Life Editor Pascal Albright.