Monday 13 August 2007

more on success rates

I recently received the Connect magazine for August 2007 (No. 53) from EPSRC. The leading article is on “EPSRC Funding rates” and there is a cute little picture of some people running a race, with one getting to the finishing tape first. This sets the tone for the whole article, which seems to have been thought up by someone to defend low funding rates.


Imagine Jim Hacker and Sir Humphrey:

JH – Humphrey, I’ve had a lot of stick recently about funding rates for research grants. They are a bit low, but can’t we do something to make people think it is all OK?

Sir H – Well minister, people have to realize that when the funding rate is 33%, that does not mean that the success rate is one in three.

JH – Really Humphrey, that sounds ideal, but could you explain it to me first?

Sir H – Yes minister!


So what did EPSRC write? Here are some of the gems.

“If there is one issue relating to EPSRC which is guaranteed to spark a lively discussion the funding rates are surely it, as we always receive many more excellent research proposals each year than we have the resources to fund. The figures are often interpreted as a measure of wasted effort, as well as sometimes acting as a daunting disincentive to potential applicants. But we would always advise against reading too much into them in isolation”

“Overall funding rates for EPSRC research grants, including all programmes, for 2006-2007 were 38% by value and 32% by number”

“It’s easy to say what funding rates are, but it’s also important to understand what they are not. Crucially they are not a simple measure of the likelihood of being funded. A funding rate of, for instance, 33% doesn’t mean that any particular proposal has a one in three chance of success. This is because peer review is used to prioritise applications for funding. It’s the intelligence introduced by peer review which means that a “random” interpretation of the figures will be misleading and probably dispiriting too.”

“Success in peer review is in your hands, and there is a lot you can do to influence your proposal's chances. You can find advice on proposal writing, how to choose reviewers and how to respond to their comments on our website. We are always happy to give individual advice on how to produce a competitive case for support”



I personally find the tone of all of this very offensive. What they seem to be saying is: yes, the funding rate is a bit low, but it’s your fault because you don’t write good proposals, and anyway you don’t understand what a 33% funding rate means.

If a 33% funding rate does not mean that any particular proposal has a one in three chance of success, then what does it mean? What they seem to be saying is that, because of peer review, some proposals will have a higher chance of success and some a lower one. If all the proposals rated well by peer review were also funded, that wouldn’t matter too much, but I doubt that is so. And the peer review system is not without its faults.
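To see the distinction they are drawing, here is a minimal sketch with invented numbers (not EPSRC’s figures): if peer review means that nearly all the top-rated proposals are funded and almost none of the rest, the overall rate can still be 33% while individual chances range from near certainty to zero.

```python
# Illustrative sketch only - invented numbers, not EPSRC data.
# Peer review ranks proposals; funding goes mostly to the top-rated ones,
# so the overall rate hides very different individual chances.

bands = {
    "top-rated":    {"applications": 30, "funded": 28},
    "middle-rated": {"applications": 40, "funded": 5},
    "bottom-rated": {"applications": 30, "funded": 0},
}

total_apps = sum(b["applications"] for b in bands.values())       # 100
total_funded = sum(b["funded"] for b in bands.values())           # 33

print(f"overall funding rate: {total_funded / total_apps:.0%}")   # 33%
for name, b in bands.items():
    rate = b["funded"] / b["applications"]
    print(f"  {name}: {rate:.0%} chance of success")               # 93%, 12%, 0%
```

That is presumably the point EPSRC wants to make; the question that matters is whether all the well-rated proposals really do get funded.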

I have suggested some ways of improving the system elsewhere (see July 18th 2007), but it is important to look at the figures for grants funded. In 2006/07 there were 4346 applications to EPSRC. 1408 were funded, so 2938 were not, and that represents a huge effort for no outcome.
“The figures are often interpreted as a measure of wasted effort” - too right they are!
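The arithmetic is easy to check against EPSRC’s own figures quoted above (38% by value, 32% by number):

```python
# The 2006/07 EPSRC figures quoted above, checked directly.
applications = 4346
funded = 1408

print(applications - funded)            # 2938 proposals with nothing to show
print(f"{funded / applications:.0%}")   # ~32%, matching the rate "by number"
```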

To be fair to EPSRC, elsewhere on their web site they comment on the increased chance of success for universities that carry out rigorous quality control and submit fewer applications. Something like this has to come in universally, but it needs to be a partnership between research councils and universities.

Friday 3 August 2007

another big issue on the block

Here is another letter to Nature that refers to the issue of grant success rates:



http://www.nature.com/nature/journal/v448/n7153/full/448533b.html

Friday 27 July 2007

peer review

If you are interested in potentially changing the peer review system in the UK, then have a look at the web site below, which was developed for the Canadian system, where they have many of the same problems.

http://post.queensu.ca/~forsdyke/peerrev.htm

Wednesday 18 July 2007

science funding in the UK



I recently wrote a letter to Nature about science funding in universities in the UK (Nature 448, 22, 2007, see link at end of this article). This followed a rather complacent editorial in the magazine and arose from my own experience and from talking to others. Space in Nature is limited and I wanted to say more about the science funding situation in the UK.

My recent experience with grant applications, together with talking to others around the country, made me think about the whole grant funding situation for biology in the UK. I have applied for many grants; some were funded and others were not. I have also been a member of the BCB committee of the BBSRC, so I have some idea of how the grant system works.

What amazed me when I looked into it was how many grant applications are rejected and that, despite the low success rate, we still keep trying. This is enormously wasteful of effort on the part of applicants, administrators and reviewers.

Let’s start by looking at the success rates. For BBSRC, overall success rates are available in their annual reports and I have put these on a graph (Figure 1). The decline in success from 1999 to now is very striking. In 1999/2000 the success rate was ~40%, whereas now it is about 25%. This statistic hides the fact that now about 15% of the applications rated International, BBSRC’s benchmark criterion for funding, are rejected, whereas in 1999 most of these were funded.



For MRC, figures are available on their web site for 2005/6 and these show an overall success rate of 19% for all applications, which is also true for responsive mode applications. So MRC are doing less well than BBSRC.

What underlies the decline in success rates? One very important factor for BBSRC is the increase in applications over the same period. This is shown in Figure 1: there has been a ~60% increase in applications over the period. Factors that probably contributed to this increase include the decline in funding available from the MRC and the Wellcome Trust over the same period, as well as the need to obtain overheads on grants, which the charities do not provide.
Overall funding for responsive mode grants has increased over the period (Figure 2), but presumably the combination of inflation and the growth in applications meant that the increased overall funding did not maintain the success rate.
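As a back-of-envelope illustration (the numbers below are invented round figures, not BBSRC’s published data), a ~60% rise in applications on its own is enough to take a ~40% success rate down to ~25% if the number of awards stays roughly flat in real terms:

```python
# Back-of-envelope sketch with invented round numbers (not BBSRC's figures):
# a ~60% rise in applications alone takes a 40% success rate down to 25%
# if the number of awards stays roughly flat in real terms.

apps_then = 1000                 # assumed baseline number of applications
rate_then = 0.40                 # ~40% success rate around 1999/2000
awards = apps_then * rate_then   # 400 awards

apps_now = apps_then * 1.6       # ~60% more applications
print(f"success rate now: {awards / apps_now:.0%}")   # 25%
```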



Another way to look at the numbers is to look at the number of failed applications. For BBSRC this is about 1400 a year and for MRC it is about 1200 a year. Just take a moment to think what this means: about 2600 grant applications submitted each year to BBSRC and MRC fail. Think of the applicants’, administrators’ and reviewers’ time that this consumes.
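To put a rough figure on the scale of this effort (the weeks-per-application number below is purely an assumption for illustration, not a measured figure):

```python
# Rough, explicitly hypothetical estimate of the effort tied up in failed
# applications; the weeks-per-application figure is an assumption, not data.

failed_per_year = 1400 + 1200            # BBSRC + MRC, ~2600 failed applications
weeks_per_application = 4                # assumed effort to write and review one

person_years = failed_per_year * weeks_per_application / 52
print(f"~{person_years:.0f} person-years of effort per year")   # ~200
```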

What is the consequence of this massive waste of time? First, people spend time writing grants when they could actually be doing research themselves, so it is bad for research. If it’s bad for research, it is also bad for teaching: if people didn’t have to spend all this time writing grant applications, they could put more time into teaching. I know from my own observation that teaching suffers as a result of the time spent on grant applications. Many negative decisions about teaching are taken in universities on the basis that research has higher priority.

Why do people keep on applying when the success rate is so low? First and foremost, we have good ideas and want to do research. Most of us are optimists and there is always the chance that next time it might be funded. There is also intense pressure on academics in the UK to be applying for grants, and following the change to Full Economic Costing these applications are looked on more favourably if they are directed towards the research councils. Performance indicators in terms of the number of applications made are often enforced in university departments. Without rigorous internal peer review this leads to many poor applications being submitted, contributing to the failure rate.

Another effect of this huge number of applications is to turn the grant committees into rejection committees. They need to reject a large number of grants and so take an essentially negative view, looking for problems rather than applauding good science. Coupled with this is the fact that most rejected applications cannot be resubmitted, so there is no scope for a constructive interplay between applicant and committee.

How can we get round this? One obvious way would be to put more money into the system to increase the success rate. I am not suggesting that all applications should be funded, but many internationally competitive applications are currently unfunded and a simple increase in funding would change this at a stroke.

We can also think about how to change the system for submitting and assessing applications. One possibility would be to put more of the onus onto the universities, which would be asked to review applications more rigorously ahead of submission, weeding out the poorer ones. Each university would be given a quota of submissions and could then decide which ones to submit.

The track record of applicants could be used by committees as a primary tool to grade applications. Currently applications are rated largely on the science proposed, and the track record constitutes only a small part of the assessment. It is often easy to pick holes in a new project, as these are inherently uncertain, but an applicant’s published work may be a clearer indicator of future performance. Obviously new applicants would require a special system.

We must find a way of avoiding this waste of effort, channelling it into something more useful. If we could reduce the attrition rate of grant applications, this would raise confidence all round and might improve the quality of research and teaching in the universities.


Web sites

http://bbsrc.mondosearch.com/cgi-bin/MsmGo.exe?grab_id=0&EXTRA_ARG=IMAGE3.X%3D8%00%26IMAGE3.Y%3D8&host_id=42&page_id=117&query=annual%20report&hiword=ANNUA%20REPORTABLE%20REPORTER%20REPORTERS%20ANNUALLY%20REPORTED%20REPORTING%20annual%20report%20REPORTS%20ANNUALS%20

http://www.mrc.ac.uk/ApplyingforaGrant/SuccessRates/ApplicationSuccessRates/MRC002626#P18_643


http://www.nature.com/nature/journal/v448/n7149/full/448022a.html