Last week’s Times Higher Education had a piece about big data and the university. Questions about the use of metrics in the university sector seem to be gaining importance. Wider forms of what Nigel Thrift called Knowing Capitalism are, it would seem, continuing to find their way into the academy. Masses of data are now available, and there is an increasing emphasis on using this by-product data to measure and guide practice. The suggestion in the Times Higher article is that we should use the accumulated ‘big data’ generated by routine engagements with universities to improve the performance of individuals and organisations. It is perhaps no surprise that colleagues across the sector are turning to the discourse of neoliberalism to try to understand these changes. Take this section from the THE article as an illustration of how such an ethos is now guiding the conception of the university:
Universities risk losing their competitive edge if they do not make better use of the information they are collecting from students and lecturers, experts have warned… Institutions automatically collect data on how students and staff interact with campus services such as the library or the finance department. By analysing these “big data” sets, they can uncover patterns that might help them to improve their performance.
I can see the obvious logic. The sense is that if we are working in a marketplace with competition and the like, then we need to be competitive. But I’m wondering what the use of ‘big data’ in universities will come to mean and what the consequences might be. It’s quite hard to imagine, but Roger Burrows offers some very thoughtful suggestions here (I’d really recommend this piece to anyone interested in these questions).