The recent development of a college-wide rankings strategy by Trinity and the creation of a Rankings Steering Group, chaired by the Provost, Patrick Prendergast, reveals the increased attention College is giving to rankings, and brings the debate about their importance, and how much energy any third-level institution should give to them, to the fore once again.
Much maligned, but unquestionably influential, university rankings have become an unavoidable part of the higher education landscape. Indeed, they are increasingly held up as the main target for universities desperate to attract staff and students in an increasingly competitive higher education sector.
The reason for this emphasis comes from the decline of Trinity’s position in both the QS and Times Higher Education rankings in recent years. For instance, this year’s Times Higher Education rankings placed Trinity 160th internationally, down 22 places from 2014. Indeed, Trinity’s position in 2015 also marks a significant drop from 2011, the first year of the Times Higher Education rankings, when it came in at 76th place.
According to the Rankings Editor of Times Higher Education, Phil Baty, it is not uncommon for universities to use their rankings as the basis for strategies and targets. Speaking to The University Times he explains: “One of the things we find, actually, is that people say to us, if they maybe do better in the QS rankings they’ll use that in their marketing, but actually in their strategic planning at institutional level, for the governing bodies, they use Times Higher Education.”
This focus is not just found in universities that are further down the rankings or facing a decline. As Baty tells The University Times, rankings matter to even the best universities in the world: “We find at the very highest end of the rankings, people are still working on that basis. Places like Massachusetts Institute of Technology (MIT) have very sophisticated data analysts, institutional researchers, and they use rankings as part of their benchmarking, as part of their strategic development.”
This obsession with rankings is a somewhat modern phenomenon, but for most institutions, improving their position in the rankings is now a crucial aim. “Right across the world you’re seeing universities quite actively targeting success in rankings, because I think, first of all, rankings, or certainly the rankings in Times Higher Education, do reflect real performance”, Baty says.
There is no doubt that rankings are incredibly useful sets of data. They provide universities with indicators of the prestige of their research and the strength of their teaching, and Times Higher Education even looks at how “international” a university is. All of this data can be used to develop strategic plans, and lead to concrete action that may ultimately produce an improvement in the rankings.
Trinity’s own focus on rankings is abundantly clear. The College’s 2014-2019 Strategic Plan targets a top 20 spot in Europe in at least one major world university ranking and a place in the “upper levels” of the top 100 in the world in at least one major world university ranking. Trinity is currently well outside this aim in the Times Higher Education rankings, and is ranked 78th in the QS rankings worldwide. It is also ranked 78th in Europe by Times Higher Education, well outside that coveted top 20 position.
The plan describes these goals as a “common mission for the college community”. Indeed, once you begin to dig deeper into the world of university rankings, it is clear that they are not just about prestige. Instead, they are arguably central to the core goals and aims of a struggling university like Trinity.
At first glance, Trinity’s strategy does seem to focus on improving the university’s performance in specific ranking metrics. The explicit focus on areas such as outputs, citations, funding levels, staff composition and reputation, and the development of a publications strategy, is a clear attempt to maximise Trinity’s success across the range of indicators used by rankings like Times Higher Education.
While universities can use the rankings data internally for whatever they want, Baty does caution about using this data to “dictate behaviour”, whether through maximising submissions to publications or interpreting data in a certain way or “changing your strategy to chase rankings goals rather than be true to yourself”.
One of Trinity’s key aims is to attract more international students and, in an increasingly competitive higher education world, rankings are one of the best ways of achieving this. As Baty emphasises, rankings “give visibility to universities to demonstrate to the world how strong they are, and they provide great visibility so they allow universities to recruit international faculty and international students”.
Speaking to The University Times, President of NUI Galway, Dr James Browne, agreed: “Staff who are interested in building a career, young staff, will seek a job, seek a post, in an institution they believe is doing well.”
This concern is echoed by Trinity’s Strategic Plan, which states that a “university that competes globally for talented students and staff, and for industry research contracts and strategic partnerships cannot but recognise the importance of international rankings”. Thirty per cent of the Times Higher Education rankings are based on research, and there is nothing more attractive to industry investors than a strong body of academic research and a highly ranked faculty. The recent partnership with the University of California, San Francisco for the Global Brain Health Institute would arguably not have come about if Trinity wasn’t highly ranked for this type of research.
It is clear that the twin concerns of funding a modern university and competing in rankings are not mutually exclusive. Indeed, in a press release in September, Trinity’s Dean of Research John Boland drew this link himself when he said that “sustained investment in the university sector” was needed for Trinity to “sustain” and “increase” its world ranking.
This correlation between strong funding and rankings position is something recognised by Baty: “Money talks, money’s important, and there is a sense, I think, where universities that we can see are starved of funding or lacking sufficient funding to stay competitive are suffering in rankings.”
In recent years, nearly all Irish universities have tumbled down the rankings, a fall that has been attributed time and again to a lack of sustainable funding. The urgent need for a new funding model is part of the reason the government’s higher education funding working group, chaired by Peter Cassells, was established, with one of the options recommended by the working group an income-contingent loan scheme.
Referencing the funding models in the UK and the US, Baty said: “I think universities need a diversity of funding, and increasingly they need to look to the private sector to get some of that cash, and whether it’s fees or other arrangements, that’s not for me to say, but clearly, the more money you have, the more likely you are to stay competitive.”
Often, it takes this kind of sustained focus and scrutiny from a national government to boost international rankings positions. For instance, Baty references the Russian government’s “5–100” project, whose aim is to get five universities into the top 100 of the world rankings by 2025, as well as similar schemes in India and Japan. So far, this prioritising of higher education by national governments has been lacking in Ireland.
This does not mean the drop in rankings hasn’t caused concern at national level. However, much of the anxiety around rankings is attributed to the increasing staff–student ratio in Irish universities. In op-eds written in advance of the general election for The University Times, both the leader of Fianna Fáil, Micheál Martin, and the leader of the Labour Party, Joan Burton, referenced the increase in staff–student ratios, with Martin emphasising that the “ratios have gone from amongst the best in the OECD in 2008, to amongst the worst today”.
Interestingly, however, the student-staff ratio is not something that the Times Higher Education rankings see as overly significant. The QS rankings, on the other hand, give such ratios a 20 per cent weighting, something Baty labels as “crazy”. “We think it’s important as part of the formula, and an important part of the overall picture, but we wouldn’t want to give it excessive weighting, we wouldn’t want to give it too much weight, because it’s a bit simplistic”, Baty says.
However, this lack of emphasis on the staff–student ratio is perhaps one of the reasons that Times Higher Education is often left open to the charge of not being much use to students trying to decide on their university.
Speaking about rankings generally, Browne is adamant that “the experience of a student in a university cannot be captured by a set of metrics like that. The student experience is much more rounded”. Listing the importance of access to staff, access to laboratories, and the general culture within the institution, Browne concedes that “they’re getting better all the time”, but rankings still cannot capture the full breadth of a university’s education.
Other rankings systems have sprung up in an attempt to replace the academic focus of Times Higher Education with something more student-based. Most notable is the Guardian’s University Ranking, established in 2010, which is primarily focused on the needs of students going to universities in the UK. Unlike the Times Higher Education rankings, which focus on the views of academics and university staff, the Guardian instead surveys students on their own perceptions of their institutions. In addition, the staff–student ratio is given a 16.25 per cent weighting in its metrics, which reveals its overt focus on the interests of students. This student-focused analysis produces a rankings system that, while not totally different to that of the more established rankings, is certainly a product of its more idiosyncratic approach.
The broader focus of Times Higher Education is down to its original aims. Baty is confident that their rankings, despite having a more academic focus, can still benefit students: “While our ranking wasn’t created with the student in the front of mind, we do provide a set of benchmarks and a set of tools that are much much better for students than any of the other rankings.”