News

How Australian universities spent $4.5b on research in four years

Assessing the research performance of universities is a difficult pursuit. 

Bianca Nogrady

27 October 2016

The Nature Index tracks research performance by assessing the quantity and quality of academic publications. Credit: venimo/Alamy Stock Photo



While scientists argue that no measure or metric can account for the complexities of research, governments and funding agencies want to see the results of their investments.

Research performance has been weighed in many ways; by the size of a university’s research workforce, by the amount of funding it obtains and by the quality and quantity of its scholarly publications.

The Nature Index is another tool. It tracks the contribution of institutions to 68 top-tier natural science journals, and can be considered a proxy for high-quality research output. It categorises research into four broad fields: life sciences, physical sciences, chemistry, and Earth and environmental sciences.

In Australia, Monash University in Melbourne had the highest contribution to the index over four years between 2012 and 2015, followed closely by the University of Queensland (UQ) and the Australian National University (ANU).

In specific fields, ANU led the country in physical sciences and Earth and environmental sciences, while UQ dominated the life sciences and Monash contributed the most to chemistry research.

Funding

To fund research, Australian universities received more than $4.5 billion of competitive grants between 2011 and 2014, the majority of which went to the larger Group of 8 universities. More than $3 billion of that was given to life sciences research.

The University of Melbourne was awarded the most money in that field, more than $534 million over four years, followed by Monash University ($494 million), the University of Sydney (more than $453 million) and UQ (more than $383 million).

In the physical sciences, the University of Sydney was awarded the most money over that period, more than $83 million, followed by Monash University ($81 million) and ANU (more than $50 million).

Competitive grant funding came mostly from the National Health and Medical Research Council (NHMRC) and the Australian Research Council (ARC). Funding data was provided by ÜberResearch, a sister company of Springer Nature, the publisher of the Nature Index.

Research efficiency

Combining index output and grant funding reveals yet another picture of the country’s research landscape. Dividing a university’s contribution to the index by the amount of funding it received indicates, in relative terms, how this money has been translated into output in index journals.
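
The arithmetic behind that comparison is straightforward. As a rough illustration only, the Python sketch below divides an index contribution by a funding total for two made-up universities; the names and numbers are placeholders, not the actual Nature Index or grant figures.

```python
# A minimal sketch of the "output per million dollars" comparison described
# above. The universities and figures below are hypothetical placeholders.

funding_millions = {      # competitive grant funding, in millions of dollars
    "University A": 80.0,
    "University B": 6.0,
}
index_contribution = {    # contribution to articles in index journals
    "University A": 40.0,
    "University B": 9.0,
}

# Research efficiency: index contribution divided by funding received.
for university, funding in funding_millions.items():
    efficiency = index_contribution[university] / funding
    print(f"{university}: {efficiency:.2f} index contribution per $1 million")
```

On these made-up numbers, University B would rank ahead of University A despite receiving a fraction of the funding, which is the pattern the analysis found for institutions such as RMIT and the University of Wollongong.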

While the Group of 8 universities received the lion’s share of funding, in some cases, smaller universities produced a greater portion of high-quality research with that money.

For instance, the University of Sydney received the most money for physical sciences research, $83.4 million between 2012 and 2014, but its contribution to articles in the index per million dollars was lower than that of 19 other universities.

In contrast, RMIT in Melbourne and the University of Wollongong received only $6.1 million and $4.1 million respectively for research in that field, but their contributions to high-quality papers per million dollars ranked second and fourth in Australia.

In chemistry, Monash University received three times more funding than the University of Adelaide, but their contribution to quality papers per million dollars of funding was almost identical.

WHAT THE FUNDERS SAID:

The director of research analysis at the Australian Research Council, Marcus Nicol, says the cost of research varies dramatically between fields and study designs.

In life sciences, the cost of running a preclinical animal model study of a new compound with therapeutic potential would be moderate compared to investing in a large, phase III clinical trial.

While the animal study would require investment in laboratory space and equipment and a researcher and one or two assistants, a large clinical trial may enrol several thousand patients across multiple centres, possibly even countries. It would require the time of many health professionals, sometimes over several years, as well as medical and pathology resources and administrative costs.

Nicol says larger institutions are better placed to tackle more challenging and expensive research.

But such programmes may yield the same number of scholarly publications as cheaper projects undertaken at smaller universities, making larger institutions appear less productive. “Bigger institutions are more likely to cover the full breadth [of research], from basic to applied, so they would have anything from modest costs to ridiculously expensive per publication,” Nicol says.

WHAT THE UNIVERSITIES SAID:

Calum Drummond, deputy vice-chancellor of research and innovation and vice-president at RMIT University, says smaller universities have to be more selective about the research assets they invest in, and use those assets as efficiently as possible.

“We have limited resources to conduct research so we wish to ensure that we are using them productively and producing high-quality outputs and outcomes,” Drummond says. Often this means negotiating access to scientific infrastructure hosted or owned by other institutions, rather than buying or building the equipment themselves.

While access to resources can explain differences in research performance, no measure can account for everything.

Richard Cook, manager of External Benchmarks at the University of Sydney, says index data has limitations. “A number of very high-quality journals such as New England Journal of Medicine and PLoS ONE are left out of this index, especially when multidisciplinary research is considered.”

Fractional counting, as used by the index, also penalises research collaboration, because institutions receive less credit for a publication when it is a collaborative effort, he says.

But David Swinbanks, managing director of the Nature Index, says fractional counting is used to fairly assign credit to all institutions that contributed to a paper. If the index only counted the articles an institution had authors on, without measuring the size of that contribution, it would unfairly reward institutions that had contributed little to many papers.
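
The difference between the two counting approaches can be made concrete with a short sketch. The Python snippet below computes both a simple article count and a fractional count for a handful of made-up papers, assuming every author contributes equally and has a single affiliation; it is a simplified illustration of the idea, not the index's actual implementation.

```python
# A simplified illustration of article count versus fractional count.
# Papers, authors and institutions below are hypothetical.
from collections import defaultdict

papers = [
    # institution of each author on the paper
    ["Uni X", "Uni X", "Uni Y", "Uni Z"],   # four-author collaboration
    ["Uni X"],                              # single-author paper
]

article_count = defaultdict(int)
fractional_count = defaultdict(float)

for author_institutions in papers:
    share = 1.0 / len(author_institutions)      # each paper is worth 1 credit in total
    for institution in set(author_institutions):
        article_count[institution] += 1         # any authorship earns a full article count
    for institution in author_institutions:
        fractional_count[institution] += share  # credit in proportion to authorship

print(dict(article_count))      # Uni X: 2, Uni Y: 1, Uni Z: 1
print(dict(fractional_count))   # Uni X: 1.5, Uni Y: 0.25, Uni Z: 0.25
```

Under a plain article count, an institution with one author on a large collaboration gets the same credit as one that supplied most of the authors; the fractional count spreads that single credit across everyone involved, which is the trade-off Cook and Swinbanks describe.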

Journals included in the index are selected by a panel of active scientists, independently of Nature Research. The publications represent about 4% of articles in the natural sciences, but account for close to 30% of total citations in these fields.

Cook also believes that focusing too much on publication metrics neglects the bigger picture contribution that scientific research makes to society. “There are many different kinds of return on investment in research including, but not limited to, advances in the discipline, increased citations, impact in public health and the community, innovation and intellectual property creation, better student outcomes, influence on public policy and the creation of new industries and services,” Cook says.

“Publication output is only one part of research outcomes.”

INTERACTIVE: explore the research connections in Melbourne & Sydney

READ MORE: how science in Australia and New Zealand stacks up