In my role as the library dean I have already been engaged with data-driven student success initiatives. We are still working to implement use of EAB on our campus. The company offers a set of tools that can track grades and registration (OK – we already do that). It also provides functionality that tracks visits to the tutoring center and the health and counseling centers, and can analyze how successful students are when they take specific classes at specific times. If you guessed that faculty are very wary of that last bit of analysis you’d be correct.
In a meeting to preview the product I asked how our data would potentially be used by the vendor, and I asked when and how students would be informed about data collection. I also asked whether students could opt out. It took until the next meeting to get some of the answers. I am fairly certain no one thought about including the students as participants who need to be informed and aware of what is being collected, by whom, and for what purposes. And for them to participate in decision-making about what is done with those data.
At the same time, I am aware that, in the library, we have collected data (statistics as we call them in our charmingly Luddite way). We count how many people come in and out of the building, we count how many times items have been used (and can separate out by groups, e.g., just undergraduates), how many searches and full-text downloads of online articles we have, how often study rooms are used, and that doesn’t even begin to include how much money we spend on a wide variety of resources.
There is something about all those data that I have thought (and continue to think) is important. At my library we do not connect any individual identifying information with all those data. We don’t swipe IDs (voluntarily or not) at the Research & Writing Help Desk. We don’t have people swipe coming in or out of the building. And we have not done any studies that tie online resource use to specific students or cohorts of students, their GPAs, or anything else.
It has been my great good fortune to have arrived at Keene State College at a time when we have strong and vocal advocates for open pedagogy. By now many readers will recognize names like Karen Cangialosi and Robin deRosa, and if you don’t, you should. Go read things they write! And put Sara Goldrick-Rab, John Warner, Maha Bali, Bonnie Stewart, and my new BFF Rajiv Jhangiani on the list (and so many more). I owe them and a much larger community a debt for educating me. There are too many blog posts to mention by a librarian I admire and respect, Barbara Fister. She is my conscience as I struggle to forge my way through these issues.
It was with all of this in mind that I attended a session at the recent ACRL conference. The panel discussed a report published in November 2018. The Institute of Museum and Library Services provided over $90,000 in funding for the Library Integration in Institutional Learning Analytics (LIILA) project. Megan Oakleaf from Syracuse University was the project lead. The final report, Library Integration in Institutional Learning Analytics, can be found here. The session provided a summary of the report.
I have not yet had time to read the full report. Ten pages are devoted to “obstacles” to library integration including privacy, confidentiality, risk mitigation, and organizational culture. And the presenters readily acknowledged that much of the conversation was likely to make people uncomfortable.
But what I found most disconcerting was what I did not hear. Among the obstacles to the library’s participation in using learning analytics I did not see or hear anything about the ethics of using library systems and services data in this way. I heard, “this is happening all over campus already, in the LMS, at the dining hall, in the gym.” That’s true – my ID is swiped every day when I go to the gym and I’m not sure if it’s just like a tick mark on paper or if it is capturing information on me (I suspect it’s the latter). Because my gym membership is now paid out of pocket by me and then reimbursed by my insurance company, at least they don’t seem to care if I use the gym or not. But someone does.
Still, I can’t get that saying about everyone jumping off a bridge out of my mind. I don’t think I am ready to get in line behind everyone just yet. One presenter mentioned now asking students who use the research help desk to swipe their ID and giving them the ability to opt out (they quickly offered that they have never had anyone refuse to swipe). If you were a first-year student and a reference librarian asked you to do something, would you feel comfortable saying no? And is anyone explaining what will happen to the data, or providing some reasons why a person might want to opt out?
So I am struggling to balance an outright knee-jerk rejection of this whole project (I am willing to be convinced) with the honest acknowledgment that (1) yes, we already collect and use data, (2) we are invested in helping students flourish, and (3) we need to be at the table for these conversations. I remain skeptical that the kinds of learning analytics we could collect will contribute to the “student success” conversation. I still think we should ask students early and often what we can do to help them succeed. And, as library manager, I recognize that I need to be able to justify or demonstrate the need for and efficacy of the considerable sums of money we spend on resources.
But I am also willing to admit that I am not sure this is a road I want to go down. I need to read the report in its entirety, think it through, and talk about it with my colleagues. But just as the discussions around assessment are finally capturing the attention of the assessment experts, I wonder if our appetite for learning analytics will ultimately prove to leave a bad taste in our mouths.
I am interested in your opinions and ideas.
Hi Celia,
Thanks for sharing your reflections and participating in this important conversation. I wanted to note to you that you are not alone in your concern. With my colleague Michael Perry, we presented at ACRL some of our initial findings on student perceptions of their privacy related to learning analytics practices (see: http://www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2019/InTheirOwnWords.pdf). We had a very engaged conversation with attendees during Q&A and after the session.
The team is dedicated to investigating these issues over the next 2.25 years (see: datadoubles.org). Please stay in touch!
Happy to talk more offline,
Kyle Jones
Thanks to all of y’all for doing this work.
I really appreciate this reflection. Personally, I am not compelled by the “everyone is doing it” argument. I am compelled by the reality that others on campus can also collect data on library resource use (e.g., all the Internet traffic in/out of campus is logged, and if your library resources are embedded in the LMS there is a record of use there too). And that’s leaving aside that the platforms we contract with are logging every click and keystroke. So … I’d rather be at the table and (attempting to be in a position of) directing how the data is captured and used than be absent while who knows what is happening with the data elsewhere on campus and with third-party platforms.
Thanks, Lisa,
I am all for being at the table. I am serving on a system-wide group writing an RFP for bookstore services now. Luckily we don’t have any library resources embedded in our LMS. My being at the table allowed me to ask questions about the EAB implementation, and I hope I am able to continue to listen, learn, and speak up.
I’m against collecting personal data from students for a couple of reasons: first, it’s contrary to the code of ethics of my profession. I don’t care how you spin your data protection stuff, you’re violating students’ privacy and our profession should absolutely stand up for privacy as a condition for intellectual (and every other kind of) freedom. Second, I don’t buy the argument that we have to do this to help students or we’re letting them down. It’s about helping the institution, and it’s stupid research design. You want to help students? Make sure they have a place to sleep tonight and enough to eat. Make sure they have human connections with people who care about them as people, not as data points or as subjects who can be shaped and corrected to fit some norm. Make sure your institution is kind and thoughtful and understanding about students’ actual lives. You want to help students? Talk to them. Find out what they need. Do ethical research if that helps you find out what they need. Don’t spy on them with card swipes – it’s not ethical, either professionally or from a research design perspective.
Also, thanks for the mention. The admiration and respect are coming your way from me.
Thanks, Barbara – yes to what you have written. I have really come to the position that what we can do to support students best is what you describe. The 100 bicycles that we loan students each semester (whether they use them for getting to a job or to just get out for a ride) might mean more than other things we do and that is really OK with me.
What a cool program!!
I wrote these two five years ago. We’re still engaged in the same debates and raising similar questions:
http://www.ala.org/acrl/publications/keeping_up_with/learning_analytics
https://www.libraryjournal.com/?detailStory=taylorism-comes-to-campus-from-the-bell-tower#_
Now I think our institutions can do some pretty amazing things with data in a way that is less likely to violate any student’s personal privacy but can help them succeed where they failed in the past.
At my own institution (lots of first-generation students, lots without adequate preparation for college, lots with considerable life challenges – and too few faculty, and too many adjuncts, to give personal attention to everyone), data science isolated the exact point in a high-enrollment course where students were most likely to fail. This required course had students failing and re-taking it three or four times. Can you imagine the impact that has on added debt and longer times to graduation – if students even persist? With that failure point discovered, the course was redesigned to help struggling students get extra help at the exact point where it was needed. The failure rate dropped considerably and more students succeeded while accumulating less debt. That sounds like a big win for everyone.
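For illustration only (this is not the actual analysis my campus ran), something as simple as the following sketch can surface a failure point. It assumes a hypothetical de-identified gradebook export, gradebook_deidentified.csv, with assessment and score columns and no student identifiers, and it simply ranks assessments by failure rate:

```python
# Hypothetical illustration only: rank a course's assessments by failure rate
# using a de-identified gradebook export (assumed columns: assessment, score).
import pandas as pd

PASSING_SCORE = 70  # assumed passing threshold for this example

# Load the export; no names or IDs, so the analysis stays at the course level.
grades = pd.read_csv("gradebook_deidentified.csv")

# Flag each recorded score as a pass or fail against the threshold.
grades["failed"] = grades["score"] < PASSING_SCORE

# Failure rate for each assessment, highest first.
failure_rates = (
    grades.groupby("assessment")["failed"]
          .mean()
          .sort_values(ascending=False)
)

# The top of this list is the likeliest "failure point" worth redesigning around.
print(failure_rates.head())
```

The point of working from a de-identified export is that the question being answered is about the course, not about any individual student.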
I was at that session at ACRL, and I questioned whether the collection of data was really necessary to support success in the way I describe it above or whether it was mostly to report data about library usage back to higher-level decision makers. While it would be helpful to have more information about who we’re not reaching so we could better connect with them, we do need to keep asking if it would be worth it and what cost we would ultimately pay in obtaining it.
We might be asking these same questions and having the same debates 5 years from now…if our institutions aren’t already just collecting and analyzing all this data as they please.