Comments on grade inflation – Google welcomes review of college’s ‘grade inflation’ – The Irish Times – Tue, Mar 02, 2010
There’s been a lot of comment about grade inflation at second and third level over the past few days. A couple of points are worth making.
First, this is not a uniquely or peculiarly Irish phenomenon. The Economist has had a few stories on this over the years, including this post from an anonymous US Professor. At Harvard, Harvey Mansfield famously distinguishes between ‘ironic’ (or official transcript-recorded) grades and ‘true and serious’ grades, which are not reported on transcripts (see this article from the Harvard Crimson). It would be nice to see some comparative international analysis of this problem, and not just the usual national bout of self-flagellation that accompanies stories like this. A sensible place to start would be to take the results of UK universities that are similarly placed to Irish ones (as they have a similar grading system) and compare grade distributions between the two countries. This blog provides a list of some issues and citations for the UK.
Second, modes of assessment have changed dramatically over recent years. There is much more continuous assessment (CA) than heretofore. With easy access to superb online research materials (figures, journals, textbooks, etc.), search engines, as well as spell and grammar checks in word-processing programmes, it is much easier for students to produce high-quality CA work than ever before. This inevitably leads to a high degree of range compression, as the standard of work produced will of necessity be very high. If these CA items are weighted heavily in overall results, then there will be an inevitable skewing of grades. One purpose of exams is to discriminate in the statistical sense between individuals measured on a certain set of variables. It may well be that CA militates against this statistical discrimination because of simple upward range compression.
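The range-compression point can be illustrated with a toy simulation (all figures here are invented for illustration, not drawn from any real grade data): if CA marks sit in a narrow, high band while exam marks are widely spread, then the more heavily CA is weighted, the higher the mean and the smaller the spread of final results, and the less the final grade discriminates between students.

```python
import random
import statistics

random.seed(42)

# Hypothetical cohort of 1,000 students (parameters invented):
# exam marks vary widely; CA marks are compressed into a high, narrow band.
exam = [random.gauss(58, 12) for _ in range(1000)]  # wide spread
ca = [random.gauss(72, 4) for _ in range(1000)]     # compressed and high

def final(exam_weight):
    """Blend exam and CA marks with the given exam weighting."""
    return [exam_weight * e + (1 - exam_weight) * c
            for e, c in zip(exam, ca)]

# As the exam weight falls (CA weight rises), the mean of final
# marks rises and their standard deviation shrinks.
for w in (1.0, 0.5, 0.2):
    marks = final(w)
    print(f"exam weight {w:.0%}: "
          f"mean {statistics.mean(marks):.1f}, "
          f"sd {statistics.stdev(marks):.1f}")
```

The shrinking standard deviation is the "upward range compression" at issue: marks bunch together near the top, so rank ordering students becomes harder even though nobody's individual mark is wrong.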
Third, upward movements in grades to some extent reflect inputs. Students with more than five hundred points in the Leaving Certificate should be expected to get very good degree results, unless somehow we expect university to degrade performance and capacity over time.
Fourth, why not turn this issue into an opportunity? We should set a national target of moving performance on the OECD PISA* to the very top range (that of Finland, say). These results cannot be gamed or subjected to grade inflation.
Fifth (tongue in cheek), don’t forget the Flynn Effect!
*The Programme for International Student Assessment (PISA) is an internationally standardised assessment that was jointly developed by participating economies and administered to 15-year-olds in schools.
Tests are typically administered to between 4,500 and 10,000 students in each country.