
The Impact of the Performance-Based Research Fund on the Research Productivity of New Zealand Universities

Warren Smart
PhD candidate
Department of Finance
Auckland University of Technology

Abstract

The introduction of the Performance-Based Research Fund (PBRF) has resulted in much greater scrutiny of the research activities of New Zealand universities. This study examines the impact of this greater scrutiny on the research productivity of the universities. The analysis shows that most universities exhibit a significant increase in productivity in the period following the introduction of the PBRF. This finding is corroborated through the use of a production function approach to model the research process in New Zealand universities. This showed that the number of research publications listed in the Web of Science1 by university researchers is significantly higher following the introduction of the PBRF. Analysis of total research output data shows that this increase in Web of Science research publications has not been at the expense of other forms of research output.


Introduction

Governments around the world are increasingly using performance-based funding to allocate resources for research at higher education institutions. New Zealand is no exception to this trend, with the introduction of the Performance-Based Research Fund (PBRF) in 2004 representing arguably the most significant change to the funding of tertiary institutions in New Zealand since the introduction of the equivalent full-time student (EFTS) funding system in 1991. It marks the first time that a substantial proportion of tertiary education funding from Vote Education2 has been allocated based on institutional performance.

Although the main objective of the PBRF is to raise the average quality of research through rewarding excellence (Tertiary Education Commission 2004), the increased scrutiny the PBRF places on research performance is likely to have increased the quantity of research output at New Zealand universities. This study attempts to quantify this effect by analysing the impact of the PBRF in terms of stimulating research output, in the form of journal articles and reviews3 listed in the Web of Science, at the eight New Zealand universities4.

To measure the impact of the PBRF on research productivity, this analysis uses a mix of quantitative approaches. Firstly, the number of articles and reviews listed in the Web of Science per full-time equivalent research staff is examined over a 10-year period to see if productivity increased in the period following the introduction of the PBRF. Then a production function approach is used to model the research process at the New Zealand universities. Within this framework, multiple regression analysis is applied to panel data for the eight New Zealand universities. An advantage of using regression analysis in this case is that it can control for other factors that influence research output, thereby helping to isolate the impact of the PBRF.

This article begins by outlining the history of performance-based funding of research in New Zealand's tertiary education sector and briefly describing how universities reported their response to the PBRF. The reasons for using the Web of Science to measure research output are discussed, along with the limitations of its coverage of total research activity. The productivity of university research staff is then examined, and the empirical model used to estimate the research production function is introduced. This is followed by a discussion of the results of the production function analysis, with a focus on the impact of the PBRF. Finally, some conclusions and future areas of analysis are presented.


Performance-based Funding of Research

The New Zealand government has expressed a desire to introduce performance-based funding of tertiary education research for more than a decade. In 1997 the publication of the Green Paper (Ministry of Education 1997) signalled the government's intention to change the system of funding research from one based on the number of student enrolments to one where a contestable fund would be used to distribute funding based on the quality of research at an institution. However, a change of government in late 1999, along with concerns expressed by the universities about the size of the contestable fund and the lack of operational detail, resulted in the proposed approach being deferred (Boston 2006).

The proposal for performance-based funding for research was revived by the Tertiary Education Advisory Commission (2001) report Shaping the Funding Framework. The report recommended the introduction of a "Performance-Based Research Fund", based on a mixed model of peer review and performance indicators,5 to assign funding based on the quality of research at an institution. The Cabinet signed off on the decision to go ahead with the PBRF in May 2002, with the operational detail of the PBRF being outlined in late 20026. The PBRF began allocating funding in 2004 as funding from enrolments-based research top-ups was phased out. In 2007, the year in which the transition from the research top-ups to the PBRF was completed, the PBRF is estimated to have allocated around $231 million in funding to participating tertiary institutions (Tertiary Education Commission 2007).

Sixty percent of the funding allocated via the PBRF is based on the results of the Quality Evaluation, which uses peer review to evaluate the quality of research by PBRF-eligible staff at participating tertiary institutions. Peer reviewers evaluate researcher performance across three dimensions: the quality of research output, the esteem in which the researcher is held by their peers, and their contribution to the research environment. In generating a final quality category, the quality of research output has the highest weighting.

The top quality category A is assigned to a researcher who produces research that is assessed as being of international quality. This is followed in order by B, C and R quality categories. Funding is only allocated to those researchers who receive a minimum of a C quality category, with A researchers receiving the highest weighting of funding.

Importantly, the results of the Quality Evaluation are published by the Tertiary Education Commission. Therefore, there is a strong incentive for universities to improve the quality of their research output, not only to maximise the funding received via the PBRF but also to maximise the positive impact from a high ranking.

An examination of university profiles published shortly after the release of the first PBRF Quality Evaluation results in 2004 illustrates how the PBRF has influenced the management of the research process at the universities. A number of universities stated that the results of the first PBRF Quality Evaluation would be used to identify areas of research strength and also those areas that required additional support to improve the quality of research (University of Auckland 2004, Massey University 2004, University of Canterbury 2004, Victoria University of Wellington 2004). Having identified those areas requiring additional support, the universities indicated that improvements would be made by helping research-inactive staff to improve their performance (Massey University 2004) and/or by recruiting research-active staff (Auckland University of Technology 2004, University of Auckland 2004).

The examination of the university profiles also shows that a number of universities used PBRF measures explicitly in setting goals. For example, the University of Waikato stated that it wanted to build on its 2003 Quality Evaluation results (University of Waikato 2004), while the University of Canterbury stated an explicit long-term goal of being New Zealand's top university for research quality, as measured by the PBRF (University of Canterbury 2004).

Given the response of the universities to the PBRF, it seems likely there would be an associated increase in the volume of research activity, especially by those researchers seeking to improve their quality category. International experience also suggests that the introduction of performance-based funding of research increases research activity. Liefner (2003) interviewed professors from a number of prestigious universities around the world on aspects of the funding of research. There was broad agreement that the introduction of performance-based funding leads to an increase in research activity, and hence quantity and quality, as a result of the increased scrutiny of performance. An evaluation of the impact of the United Kingdom's (UK) Research Assessment Exercise (RAE), which, like the PBRF, involves peer assessment of research quality, observed that researchers targeted journal publication as a means of raising their level of research quality (McNay 1998). This was based on a perception that publishing in highly cited journals would be viewed favourably by the review panels (Elkin 2001).

If New Zealand researchers responded in a similar fashion, this would show up as an increase in the number of journal articles and reviews published in the years following the decision to introduce the PBRF, and as a shift towards journals likely to be more highly cited.


Measuring Research Productivity

A key problem in measuring the research output of universities in New Zealand is a lack of consistency in the way the universities report their research output. Although universities have routinely reported counts of research output in their annual reports for several years, they use different categories and thresholds to report research output, and several have changed the way they report over time. Some have even ceased reporting the total number of research outputs in their annual reports. This makes a year-to-year comparison of research output at all eight universities problematic for any extended period of time7.

A number of recent studies have used bibliometric databases to analyse the research productivity of university departments and individuals within various academic disciplines. Dale and Goldfinch (2005) used data extracted from the Web of Science to measure the research output and impact of the research of political science units in Australasian universities over the period 1995 to 2001. The authors found that the political science department at the Australian National University produced the highest number of research articles per staff member overall, while the best-performing unit from New Zealand was at the University of Waikato.

In a similar vein, Macri and Sinha (2006) measured the quality and quantity of research by economists at New Zealand and Australian universities during the period 1988 to 2002 using data extracted from the ECONLIT8 database. The study found that over the period 1996 to 2002, economics staff at the University of Melbourne were the largest producers of journal articles per staff member overall. The most productive of New Zealand's economics departments was at the University of Otago.

At the institutional level, a report by the Ministry of Research, Science and Technology (2006) analysed the research output (in the form of articles and reviews) between 1997 and 2003 at each of the universities using data from Thomson's New Zealand National Scientific Database. This study showed a decline in the research productivity of New Zealand universities of 4.2% between 2000 and 2003. However, the window of analysis in the study ended before the impact of the PBRF could be observed.

These previous bibliometric studies focused on comparing the performance of research staff at different universities. In this study, the focus is not on comparing the performance of universities, but rather on analysing the performance of each individual university over time. To achieve this, the Web of Science is used to extract counts of research publications, in the form of journal articles and reviews, for each of the universities between 1997 and 20069. The Web of Science, published by Thomson Scientific, is an online searchable bibliometric database that contains the details of publications in over 9,000 peer-reviewed journals, most of them based in North America and Europe.

The key advantage of using the number of articles and reviews listed in the Web of Science to measure research output is that it allows research output to be measured consistently over time10. However, there are a number of important caveats relating to the use of these data. For one thing, the Web of Science has much better coverage of journals in the sciences and medical disciplines than in the social sciences and humanities. This introduces an element of bias into the publication counts, in that institutions that have relatively large science faculties and/or medical schools11 will tend to have a larger number of articles and reviews listed in the Web of Science than institutions that are more focused on the social sciences, humanities and creative arts. In addition, because there are few New Zealand-based journals in the Web of Science, research that is published locally will largely not be captured in its database. Finally, the Web of Science does not capture research published in the form of books, book chapters, exhibitions, conference papers and other creative works, which are important means of disseminating research for university researchers.

A comparison of the number of articles and reviews listed in the Web of Science with the total number of research outputs reported by universities in their annual reports gives an idea of the scale of coverage of the Web of Science. In 2004 the number of articles and reviews listed in the Web of Science ranged from a low of 6% of all reported research outputs at Auckland University of Technology (AUT) to a high of 25% for the University of Canterbury12.

Despite this relatively low coverage of research output, the Web of Science provides a consistent measure of research output over time, which is critical in assessing the impact of a policy instrument like the PBRF. In addition, as researchers in the relevant fields are likely to target higher-status journals in order to improve their quality category, the Web of Science should capture this trend.

Because of the limitations of the Web of Science, any direct comparison of the productivity of researchers between universities is not useful, as universities with a medical school and/or large science faculties will naturally have a higher number of articles and reviews listed in the Web of Science. However, the performance of an individual university can be examined to see if there has been a change in its productivity over time.

Although the PBRF did not begin allocating funding until 2004, the details of how it would work were widely available from 2002, so universities would have been able to begin responding to its introduction from that year onwards. Therefore, any "PBRF effect" should show up as an increase in the number of articles and reviews listed in the Web of Science after 200313.

Figure 1 shows the total number of articles and reviews listed in the Web of Science by researchers at New Zealand universities, by publication year, between 1997 and 2006. There is some evidence of a "PBRF effect", with total publications increasing by 35% between 2002 and 2006 following a period of relatively stable output between 1999 and 2002.

Figure 1 also displays the number of articles and reviews listed in the Web of Science on a per full-time equivalent (FTE) research staff basis to give an indication of changes in the productivity of research staff. To account for the lags that exist in the research publication process14, the articles and reviews in a particular publication year were divided by the number of FTE research staff15 in the previous year. For example, articles and reviews published in 2006 have been linked to the FTE research staff at the universities in 2005.
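As a simple illustration of this lag matching (not the author's code; the column names and figures below are hypothetical), the per-FTE measure divides each publication year's count by the previous year's FTE research staff:

```python
# Illustrative sketch: publications in year t divided by FTE research staff
# in year t-1, as described in the text. Data values are hypothetical.
import pandas as pd

data = pd.DataFrame({
    "year": [2003, 2004, 2005, 2006],
    "publications": [2100, 2300, 2500, 2800],        # Web of Science articles and reviews
    "fte_research_staff": [5100, 5200, 5300, 5400],  # academic plus research-only staff
})

# e.g. publications in 2006 are linked to FTE research staff in 2005
data["pubs_per_fte"] = data["publications"] / data["fte_research_staff"].shift(1)
print(data)
```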

There appear to be three distinct phases to research productivity in the universities between 1997 and 2006. After an initial increase in the number of articles and reviews per FTE researcher of 12% between 1997 and 2000, productivity stagnated. Between 2000 and 2003, productivity declined by 2% to reach 0.41 publications per FTE researcher in 2003. Since 2003, there has been a significant rise in productivity, coinciding with the introduction of the PBRF, with publications per researcher increasing by 21% between 2003 and 2006.

Figure 1 Web of Science Research Publications by New Zealand University Authors, by Publication Year


Source: Web of Science

The research productivity of individual universities is presented in Figure 2, where the focus is on how the productivity of an individual university has changed over time, rather than on a comparison of productivity between universities. A number of universities display a significant increase in productivity following the introduction of the PBRF. One of the more obvious examples is the University of Auckland: between 2002 and 2004 the number of articles and reviews listed in the Web of Science per FTE fell by 4%, but this was followed by a rise of 24% between 2004 and 2006.

The University of Canterbury is another university that exhibits a significant increase in the number of articles and reviews listed in the Web of Science per FTE following the decision to introduce the PBRF. Productivity at this university increased by 26% between 2003 and 2006, following a decrease of 16% between 2000 and 2003. The University of Waikato also exhibits an increase in the number of articles and reviews per FTE that coincides with the introduction of the PBRF and follows a period of relatively stagnant productivity. Between 2002 and 2006 publications per FTE increased by 31%, compared with a decrease of 25% between 1999 and 2002.

Although there is an increase in productivity of 36% at Victoria University of Wellington (VUW) between 2002 and 2006, the start of this upward trend appears to precede the PBRF. Also, VUW displays considerable variation in productivity during this time period, making it difficult to identify clear trends. As a result it is not clear how much of the increase in productivity exhibited by VUW in recent years is a result of the PBRF.

Massey University displays a long-term upward trend in productivity. However, there is a period of relative stagnation between 2000 and 2003, when productivity fell by 7%. This is followed by an increase in articles and reviews listed in the Web of Science per FTE of 30% between 2003 and 2006.

Lincoln University displays the greatest variation in research productivity, a reflection of its relatively small size. The evidence of a PBRF effect at Lincoln is not clear cut. Although there is an increase of 27% in the number of articles and reviews listed in the Web of Science per FTE in 2004 following a period of declining productivity, the publications per FTE fell by 12% between 2004 and 2006.

The steady increase in productivity at AUT reflects the continued maturing of research culture at this new university. Although there is a slight increase in the rate of growth in the number of articles and reviews listed in the Web of Science per FTE after 2003, separating the impact of the PBRF from the impact of a developing research culture at AUT is not easily achieved.

Figure 2 Web of Science Research Publications per Full-time Equivalent Research Staff, by Publication Year


Source: Web of Science

Although the number of articles and reviews listed in the Web of Science per FTE increased significantly at most New Zealand universities following the introduction of the PBRF, it is possible this was simply coincidental and part of a broader international trend. To test for this, Figure 3 compares the research output per FTE researcher at the two New Zealand universities with the largest number of Web of Science publications, Auckland and Otago, with the Universities of Melbourne and Queensland, two of Australia's largest universities in terms of Web of Science publications16 and members of the Group of Eight research-intensive universities17.

Figure 3 Web of Science Research Publications per Full-time Equivalent Research Staff for Selected Universities, by Publication Year


Source: Web of Science

As can be seen in Figure 3, the University of Melbourne displayed a relatively steady increase in productivity over time, with no evidence of an upswing from 2002 onwards18. This compares with the Universities of Otago and Auckland, which both experienced periods of decline in research output before displaying an increase in productivity over the last three to four years. Although the University of Queensland does show an increase in productivity from 2002, this precedes the increases exhibited by the New Zealand universities. The differences in the trends in research productivity between the Australian and New Zealand universities suggest that the PBRF effect is not simply a reflection of a wider international trend.

The widening of the gap in the number of articles and reviews listed in the Web of Science per FTE researcher between the Australian and New Zealand universities over the 10-year period in Figure 3 is not surprising, as government funding for research in New Zealand and Australia was allocated on a very different basis during this time. Research funding in Australia was partly allocated based on the quantity of research output19, while research funding at New Zealand universities was based on the number of domestic enrolments at bachelor's level and above until the phase-in of the PBRF from 2004.

The analysis of the number of articles and reviews per FTE researcher would suggest that productivity (by this measure at least) has improved significantly at a number of New Zealand universities following the introduction of the PBRF. However, attempting to establish a causal link between the introduction of the PBRF and increased research output is problematic given the multitude of factors that affect research performance (Adams and Smith 2006).

To control for some of these factors (such as the amount of research income, number of postgraduate research students and normal productivity improvements), regression analysis was applied to panel data for the eight New Zealand universities over a 10-year period. By controlling for these other factors, the impact of the PBRF on research output can be more clearly identified. It also allows for tests of statistical inference to be applied that indicate if any changes in research output following the introduction of the PBRF were statistically significant. The regression and production function methodology is presented in the next section.


A Production Function Approach

The production function approach used to model the research productivity of the New Zealand universities between 1996 and 200520 is based on the approach used by Abbott and Doucouliagos (2004) to model the research output of Australian universities. In this approach it is assumed that the research output of universities is determined by key inputs, such as academic staff and research income, along with other environmental factors such as size of institution and changes in technology over time.

The dependent variable in this analysis, PUBLICATIONS (mean = 437.3, SD = 345.3), is the number of articles and reviews listed in the Web of Science by researchers at each of the eight universities in each year. To take into account the lag that occurs in the publication process, the research publications in a particular year were matched to the inputs in the preceding year. For example, articles and reviews published in 2006 have been linked to inputs that were used in 200521.

To allow for the dynamics of the research process, a lagged dependent variable (PUBLICATIONS_LAGGED) is included as an explanatory variable in the regression model. This also helps reduce the likelihood of serial correlation in the regression model (Abbott and Doucouliagos 2004).

The three input variables in the regression model are the number of FTE research staff at a university (RESEARCH_STAFF), the amount of research income earned by the universities (RESEARCH_INCOME) and the number of EFTS at master's and doctoral level (POSTGRAD) (mean = 1,270, SD = 799)22. RESEARCH_STAFF (mean = 973.3, SD = 519.5) includes all academic staff and research-only staff at a university. RESEARCH_INCOME (mean = $24,638 thousand23, SD = $26,136 thousand24) captures the research income earned by the universities as reported to the Ministry of Education. This includes funding such as external research contract income, but excludes PBRF allocations and research top-ups funding. Research income has been deflated using the Consumer Price Index to adjust for the effects of inflation.

A dummy variable with multiple categories (PBRF#) is used to capture the impact of the PBRF on research output. The reference category, NO_PBRF, represents the years 1996--2001, the period prior to the announcement of the intention to introduce the PBRF. Note that the year in this case refers to the year an input was used rather than the year the articles and reviews were published. PBRF02 represents the year 2002, PBRF03 the year 2003, PBRF04 the year 2004 and PBRF05 the year 2005. Having separate categories for each year allows an analysis of whether the impact of the PBRF on research productivity has altered over time.

As discussed earlier, the coverage of the Web of Science favours those universities that have large science faculties and/or medical schools. To control for this, and also to capture any other provider-specific effects on research output25, a dummy variable with multiple categories representing each university (INSTITUTION#) is included in the regression model26. The reference category in the regression model is the University of Auckland.

Finally, a time trend variable is used to capture changes in technology (as in Abbott and Doucouliagos 2004). TIME takes a value of 0 for inputs used in 1996, 1 in 1997 and so on.
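To make the construction of these variables concrete, the sketch below shows one way the panel could be assembled, assuming a data frame with one row per university and input year. The column names are hypothetical and the data themselves are not reproduced here; whether the lagged dependent variable enters in logs is not specified in the text and is assumed.

```python
# Illustrative sketch (hypothetical column names): constructing the variables
# described above from a panel with one row per university-year.
import numpy as np
import pandas as pd

def build_model_variables(panel: pd.DataFrame) -> pd.DataFrame:
    df = panel.sort_values(["institution", "input_year"]).copy()

    # Natural logs of the inputs (restricted trans-log specification).
    for col in ["research_staff", "research_income", "postgrad"]:
        df[f"ln_{col}"] = np.log(df[col])

    # Dependent variable: publications for input year t are the articles and
    # reviews published in year t+1 (one-year publication lag, as in the text).
    df["ln_publications"] = np.log(df["publications"])

    # Lagged dependent variable, lagged within each university (form assumed).
    df["publications_lagged"] = df.groupby("institution")["ln_publications"].shift(1)

    # Time trend: 0 for inputs used in 1996, 1 for 1997, and so on.
    df["time"] = df["input_year"] - 1996

    # PBRF dummies; the reference category is the 1996-2001 period (NO_PBRF).
    for yr in (2002, 2003, 2004, 2005):
        df[f"pbrf{yr % 100:02d}"] = (df["input_year"] == yr).astype(int)

    # Interaction terms used in Equation 1.
    df["staff_x_postgrad"] = df["ln_research_staff"] * df["ln_postgrad"]
    df["staff_x_time"] = df["ln_research_staff"] * df["time"]
    df["income_x_time"] = df["ln_research_income"] * df["time"]

    return df
```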

The regression model is presented in Equation 1 (below). Note that in this analysis a restricted trans-log specification has been applied. The main input variables have been interacted with time to capture non-neutral (input-biased) technological change, that is, changes over time in the productivity of the individual inputs. In addition, RESEARCH_STAFF and POSTGRAD are allowed to interact in the model.

ln PUBLICATIONSit = β0 + β1 ln RESEARCH_STAFFit + β2 ln RESEARCH_INCOMEit
  + β3 ln POSTGRADit + β4 ln RESEARCH_STAFFit × ln POSTGRADit
  + β5 PBRF#t + β6 PUBLICATIONS_LAGGEDit + β7 TIMEt
  + β8 RESEARCH_STAFFit × TIMEt + β9 RESEARCH_INCOMEit × TIMEt
  + β10 POSTGRADit × TIMEt + β11 INSTITUTION#i + vit
(Equation 1)

where ln is the natural log, i represents the university, t represents the time period and v is an error term.

With a wide range in the size of the universities, there is likely to be a problem with heteroscedasticity. Therefore, an ordinary least-squares procedure that adjusts for the presence of panel-level heteroscedasticity and produces robust standard errors was used to generate the coefficient estimates27. The regression output is presented in Table 1.
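The estimation itself was carried out in Stata using xtpcse with the hetonly option (see the note to Table 1). As a rough, non-identical analogue, the sketch below estimates Equation 1 by OLS with heteroscedasticity-robust standard errors in Python, continuing the hypothetical variable names from the earlier sketch; it approximates, rather than reproduces, Stata's panel-level heteroscedasticity correction.

```python
# Rough analogue (not the author's procedure): OLS on Equation 1 with
# heteroscedasticity-robust (HC1) standard errors. Stata's xtpcse, option
# hetonly models panel-level heteroscedasticity, which this only approximates.
# model_df is assumed to be the output of build_model_variables() above, with
# an "institution" column whose labels include "Auckland".
import statsmodels.formula.api as smf

formula = (
    "ln_publications ~ ln_research_staff + ln_research_income + ln_postgrad"
    " + staff_x_postgrad + pbrf02 + pbrf03 + pbrf04 + pbrf05"
    " + publications_lagged + time + staff_x_time + income_x_time"
    " + C(institution, Treatment(reference='Auckland'))"
)

results = smf.ols(
    formula, data=model_df.dropna(subset=["publications_lagged"])
).fit(cov_type="HC1")
print(results.summary())
```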

The results of the regression analysis indicate that the PBRF has been associated with increased research output at the New Zealand universities, and that its impact has increased over time, controlling for other factors. Although the coefficient of PBRF02 is positive, it is not statistically significant. However, the coefficients of PBRF03, PBRF04 and PBRF05 are positive and statistically significant. Converting these coefficients to percentages suggests that, controlling for other factors, research output at the universities was on average 20% higher in 2003 than in the period prior to the introduction of the PBRF (1996 to 2001). In 2004 research output was 28% higher than prior to the PBRF, and in 2005 it was 34% higher. This suggests that the impact of the PBRF has been significant and growing.
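The percentage figures quoted above are consistent with the standard conversion of a dummy-variable coefficient in a semi-logarithmic model, 100 × (exp(β) − 1), applied to the Table 1 estimates:

```python
# Converting the PBRF dummy coefficients in Table 1 from log points to
# percentage differences relative to the 1996-2001 reference period,
# using the semi-log conversion 100 * (exp(beta) - 1).
import math

for label, beta in [("PBRF03", 0.181), ("PBRF04", 0.249), ("PBRF05", 0.293)]:
    print(f"{label}: {100 * math.expm1(beta):.0f}% higher output")
# Prints approximately 20%, 28% and 34%, matching the text.
```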

Table 1 Determinants of University Web of Science Publications -- Regression Results
Dependent variable = PUBLICATIONS

Explanatory variable            Category      Coefficient          Std error
RESEARCH_STAFF                                 1.491                1.076
RESEARCH_INCOME                                0.312**              0.084
POSTGRAD                                       2.061*               0.947
RESEARCH_STAFF × POSTGRAD                     -0.281*               0.146
PBRF#                           NO_PBRF       Reference category
                                PBRF02         0.060                0.047
                                PBRF03         0.181**              0.057
                                PBRF04         0.249**              0.068
                                PBRF05         0.293**              0.079
PUBLICATIONS_LAGGED                            0.269**              0.110
TIME                                          -0.048                0.045
RESEARCH_STAFF × TIME                          0.069**              0.017
RESEARCH_INCOME × TIME                        -0.044**              0.012
INSTITUTION#                    AUT           -1.871**              0.482
                                LINCOLN       -0.958*               0.474
                                MASSEY        -0.625**              0.130
                                AUCKLAND      Reference category
                                CANTERBURY    -0.627**              0.241
                                OTAGO         -0.149                0.157
                                WAIKATO       -1.182**              0.286
                                VUW           -1.005**              0.277
CONSTANT                                      -9.155                6.866
R²                                             0.99
Wald χ²                                        p < 0.0000
N                                              80

Note: * and ** denote statistical significance at the 5% and 1% levels, respectively. The regression output was obtained using STATA 8.2, xtpcse, option hetonly (Statacorp 2005).

Given the evidence that the PBRF was associated with a significant increase in articles and reviews listed in the Web of Science, a key question is whether the increase in this type of research output may have "crowded out" other forms of research publication, such as books and book chapters. Publication may also have shifted from non-refereed to refereed journal articles, or from local journals to ones that are contained within the Web of Science.

Figure 4 displays Web of Science research publications as a percentage of total reported research output at four New Zealand universities28 between 2003 and 2006. There is little evidence of any "crowding out" effect from the increase in Web of Science research publications. At the University of Canterbury the proportion of Web of Science research publications remains relatively stable at around 24%. At VUW the proportion of Web of Science research publications fluctuates around 15% without displaying any clear signs of crowding out other types of research output. The share of Web of Science publications actually decreases at the University of Otago, indicating that other types of research output have increased at an even faster rate. Although the proportion of Web of Science research publications increases at AUT, the increase is still relatively small: from around 4% in 2003 to 6% in 2006.

Figure 4 Web of Science Articles and Reviews as a Percentage of Total Reported Research Outputs


The regression output in Table 1 also provides a wealth of information about the other determinants of research output. For example, the negative sign of the coefficient for the interaction term RESEARCH_STAFF×POSTGRAD suggests that a higher loading of postgraduate students on research staff involved a trade-off with the number of articles and reviews listed in the Web of Science. This result is similar to that found by Abbott and Doucouliagos (2004) in their study of Australian universities, which they suggest is caused by the increased burden of supervision.

The positive sign of the coefficient for the interaction term RESEARCH_STAFF×TIME suggests that research staff became more productive over time. A number of factors may have contributed to this result, such as improvements in information technology. The maturing of the research culture at AUT would also have helped to increase research productivity.
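Because of the trans-log interactions, the responsiveness of publications to research staff is not given by the RESEARCH_STAFF coefficient alone; it also depends on the postgraduate loading and on time. The purely illustrative calculation below uses the Table 1 point estimates and the sample mean of POSTGRAD reported earlier (about 1,270 EFTS), and ignores the statistical insignificance of the RESEARCH_STAFF coefficient.

```python
# Illustrative only: in the trans-log specification the implied elasticity of
# publications with respect to research staff is
#   d ln(PUB) / d ln(STAFF) = b1 + b4 * ln(POSTGRAD) + b8 * TIME
# evaluated here at the sample mean of POSTGRAD using Table 1 point estimates.
import math

b1, b4, b8 = 1.491, -0.281, 0.069
ln_postgrad_mean = math.log(1270)

for time in (0, 9):  # inputs used in 1996 and in 2005
    elasticity = b1 + b4 * ln_postgrad_mean + b8 * time
    print(f"TIME = {time}: implied staff elasticity ~ {elasticity:.2f}")
```

The calculation simply illustrates the mechanics discussed above: a heavier postgraduate loading pulls the implied staff elasticity down, while the positive RESEARCH_STAFF×TIME term raises it as time passes.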

The negative sign of the coefficient of the interaction term RESEARCH_INCOME×TIME suggests that the contribution of RESEARCH_INCOME to research output diminished over time. A possible reason for this may be that the increase in RESEARCH_INCOME sourced from private businesses (Ministry of Education 2006) may have resulted in the publication of research findings outside the scope of the Web of Science.

The POSTGRAD×TIME interaction term in equation 1 was not statistically significant and so was dropped from the regression model.

Care should be taken when interpreting the coefficients of the institutional dummy variables, which compare the number of articles and reviews listed in the Web of Science by the individual universities to the reference category, the University of Auckland. These coefficients illustrate the impact of a number of institution-specific factors on research output, some of which relate to the bias in the coverage of the Web of Science. For example, the relatively large social sciences and humanities faculties at the University of Waikato and VUW are a factor in the statistically significant negative coefficients. Similarly, as AUT was only granted university status in 2000, the negative sign of the coefficient for the variable representing this university is not unexpected.


Conclusions

The PBRF was designed to improve the average quality of the research in New Zealand tertiary education organisations through linking government funding directly to research performance. This study has shown that the greater scrutiny the PBRF has placed on the research activities of the New Zealand universities has been associated with a significant increase in research productivity at most universities, measured by the number of articles and reviews listed in the Web of Science per FTE research staff. This increase in Web of Science research publications has not been at the expense of other types of research output.

Given the selective nature of the peer-reviewed journal set included in the Web of Science, the increase in the number of research outputs appearing in the Web of Science database implies that the quality of research being produced by New Zealand universities has also improved. However, this can only be confirmed by measuring research quality directly, through exercises such as the PBRF Quality Evaluations.

What this study does confirm is that linking government funding directly to institutional research performance and ensuring the publication of that performance has been associated with significant changes in institutional behaviour.

However, the increase in research productivity raises the important question of whether it involves a trade-off in other areas of university activity, such as teaching and service, and whether the productivity increase can be sustained over the long term.

The impact of increased research output on teaching activities at the universities is outside the scope of this analysis. However, using a distance function approach, which can directly model multiple input and multiple output technology and obtain measures of technical efficiency and productivity change, may offer better insights into the overall effect of performance-based funding on university performance.


References

Abbott, M. and H. Doucouliagos (2004) “Research output of Australian universities” Education Economics, 12(3):251--265.

Adams, J. and D. Smith (2006) “Evaluation of the British Research Assessment Exercise” in L. Bakker, J. Boston, L. Campbell and R. Smyth (eds.) Evaluating the Performance-Based Research Fund, pp. 33--108, Institute of Policy Studies, Wellington.

Auckland University of Technology (2004) Auckland University of Technology Profile 2005--2007, Auckland University of Technology, Auckland.

Auckland University of Technology (2004--2007) Auckland University of Technology Annual Report [2003--2006], Auckland University of Technology, Auckland.

Australian Research Council (2008) Excellence in research for Australia (ERA) initiative, Australian Research Council, Canberra.

Boston, J. (2006) “Rationale for the Performance-Based Research Fund: Personal reflections” in L. Bakker, J. Boston, L. Campbell and R. Smyth (eds.) Evaluating the Performance-Based Research Fund, pp. 5--32, Institute of Policy Studies, Wellington.

Dale, T. and S. Goldfinch (2005) “Article citation rates and productivity of Australasian political science units 1995--2002” Australian Journal of Political Science, 40(3):425--434.

Department of Education, Employment and Workplace Relations (2007) Staff 2006: Selected Higher Education Statistics, Department of Education, Employment and Workplace Relations, http://www.dest.gov.au/sectors/higher_education/publications_resources/profiles/Staff_2006_selected_higher_education_statistics.htm [accessed 16/9/2007].

Elkin, J. (2001) “The impact of the Research Assessment Exercise on serial publication” Serials, 17(3):239--242.

Liefner, I. (2003) “Funding, resource allocation, and performance in higher education systems” Higher Education, 46:469--489.

Macri, J. and D. Sinha (2006) “Ranking methodology for international comparisons of institutions and individuals: An application to economics in Australia and New Zealand” Journal of Economic Surveys, 20(1):125--156.

Massey University (2004) Massey University Profile 2005--2007, Massey University, Palmerston North.

McNay, I. (1998) “The Research Assessment Exercise (RAE) and after: ‘You never know how it will all turn out’” Perspectives, 2(1):19--22.

Ministry of Education (1997) A Future Tertiary Education Policy for New Zealand: Green Paper, Ministry of Education, Wellington.

Ministry of Education (2005) Profile and Trends: New Zealand’s Tertiary Education Sector 2004, Ministry of Education, Wellington.

Ministry of Education (2006) Profile and Trends: New Zealand’s Tertiary Education Sector 2005, Ministry of Education, Wellington.

Ministry of Research, Science and Technology (2006) University Bibliometrics: An Analysis of Publication Outputs 1997--2003, Ministry of Research, Science and Technology, Wellington.

PBRF Working Group (2002) Investing in Excellence: The Report of the PBRF Working Group, Ministry of Education and Transition Tertiary Education Commission, Wellington.

Phelan, T. (1999) “A compendium of issues for citation analysis” Scientometrics, 45(1):117--136.

Statacorp (2005) Stata Statistical Software: Release 8.2, Stata Corporation, College Station, TX.

Tertiary Education Advisory Commission (2001) Shaping the Funding Framework, Tertiary Education Advisory Commission, Wellington.

Tertiary Education Commission (2004) Performance-Based Research Fund: Evaluating Research Excellence -- The 2003 Assessment, Tertiary Education Commission, Wellington.

Tertiary Education Commission (2007) Performance-Based Research Fund: Evaluating Research Excellence -- The 2006 Assessment, Tertiary Education Commission, Wellington.

University of Auckland (2004) University of Auckland Profile 2005--2007, University of Auckland, Auckland.

University of Canterbury (2004) University of Canterbury Profile 2005--2007, University of Canterbury, Christchurch.

University of Canterbury (2007) University of Canterbury Annual Report 2006, University of Canterbury, Christchurch.

University of Otago (2008) University of Otago Annual Report 2007, University of Otago, Dunedin.

University of Waikato (2004) University of Waikato Profile 2005--2007, University of Waikato, Hamilton.

Victoria University of Wellington (2004) Victoria University of Wellington Profile 2005--2007, Victoria University of Wellington, Wellington.

Victoria University of Wellington (2005--2007) Victoria University of Wellington Annual Report [2004--2006], Victoria University of Wellington, Wellington.


Footnotes

1Web of Science is an online academic service that provides access to five databases: Science Citation Index (SCI), Social Sciences Citation Index (SSCI), Arts & Humanities Citation Index (A&HCI), Index Chemicus, and Current Chemical Reactions.

2Vote Education is the government budget appropriation through which funding for education, including tertiary education, is allocated.

3 Other types of research publications captured by the Web of Science, such as book reviews, bibliographies and meeting notices, were excluded from this analysis.

4In this analysis it has been assumed that the colleges of education were merged with their associated universities for the entire period. Similarly, it is assumed that Massey University was merged with Wellington Polytechnic for the entire period. The purpose of this approach is to make any trends in research productivity clearer by removing the impact of the mergers with the colleges of education. It also allows for an analysis of university performance in their current (2008) configurations.

5 The indicators included a measure of the external research income earned by institutions and the number of research degree completions.

6This was when the PBRF Working Group published their detailed recommendations outlining the operational details of the PBRF (PBRF Working Group 2002).

7The data do show some evidence of an increase in productivity following the introduction of the PBRF at the University of Canterbury, University of Otago, Lincoln University, Auckland University of Technology and Victoria University of Wellington.

8The American Economic Association’s electronic bibliography, which includes over 30 years of economics literature from around the world.

9An article and review were assigned to a university if at least one of the authors was from that institution.

10Although journals are added and dropped from the Web of Science journal set each year, the number of year-on-year changes is relatively low. The validity of using the Web of Science for time-series analysis is discussed in Phelan (1999).

11The University of Otago and the University of Auckland have medical schools.

12The total research output data are sourced from Ministry of Education 2005.

13 Although individual researchers may or may not have responded directly to the PBRF signals, as was discussed earlier, management at the universities did respond and began to reorganise and promote research activity.

14The lag between research being carried out and being published can vary depending on the type of publication method and the discipline, and may take considerably longer than one year. However, due to the limited number of observations in this analysis a lag of one year in the research process is assumed.

15“Research staff” in this study includes academic and research-only staff.

16The number of Web of Science publications by the University of Sydney was the largest by an Australian university in 2006, but there was a lack of consistent FTE research staff data available for this university. The University of Melbourne had the second highest number of Web of Science publications and the University of Queensland the third highest of the Australian universities in 2006.

17The Group of Eight is an alliance of eight large metropolitan research-intensive Australian universities. The member institutions are: University of Melbourne, University of Sydney, University of New South Wales, University of Adelaide, University of Western Australia, Australian National University, University of Queensland and Monash University.

18 Staffing data for the Australian universities were sourced from the Department of Education, Employment and Workplace Relations (2007).

19This may change in the future if the proposals in the ‘Excellence in research for Australia’ initiative are implemented. These envisage the incorporation of measures of research quality into the framework for funding research in Australia’s higher education sector (see Australian Research Council 2008).

20 This time frame was selected because input data are available from 1996 onwards. There is a total of 80 observations in the model.

21 The model was also estimated assuming a lag of two years, which produced relatively similar results. Longer lags were not used because they would have reduced the sample size in the model significantly.

22These data have been sourced from the Ministry of Education.

23 $24.6 million.

24 $26.1 million.

25 Such as size of university.

26 This has the effect of making this a fixed-effects panel data model.

27 The procedure used to generate the robust standard errors in STATA (Statacorp 2005) was xtpcse, option hetonly.

28 These universities reported research output in a consistent manner over this four-year period.
