Full metadata record
DC Field: Value [Language]

dc.contributor.author: Ioannidis, J. P. [en]
dc.contributor.author: Patsopoulos, N. A. [en]
dc.contributor.author: Kavvoura, F. K. [en]
dc.contributor.author: Tatsioni, A. [en]
dc.contributor.author: Evangelou, E. [en]
dc.contributor.author: Kouri, I. [en]
dc.contributor.author: Contopoulos-Ioannidis, D. G. [en]
dc.contributor.author: Liberopoulos, G. [en]
dc.rights: Default Licence
dc.subject: research assessment exercise [en]
dc.subject: world universities [en]
dc.subject: fatal attraction [en]
dc.title: International ranking systems for universities and institutions: a critical appraisal [en]
heal.type.en: Journal article [en]
heal.type.el: Journal article (Άρθρο Περιοδικού) [el]
heal.identifier.primary: doi:10.1186/1741-7015-5-30
heal.identifier.secondary: <Go to ISI>://000252409300001
heal.recordProvider: University of Ioannina. School of Sciences and Technologies. Department of Biological Applications and Technologies [el]
heal.abstract: [en]
Background: Ranking of universities and institutions has attracted wide attention recently. Several systems have been proposed that attempt to rank academic institutions worldwide.
Methods: We review the two most publicly visible ranking systems, the Shanghai Jiao Tong University 'Academic Ranking of World Universities' and the Times Higher Education Supplement 'World University Rankings' and also briefly review other ranking systems that use different criteria. We assess the construct validity for educational and research excellence and the measurement validity of each of the proposed ranking criteria, and try to identify generic challenges in international ranking of universities and institutions.
Results: None of the reviewed criteria for international ranking seems to have very good construct validity for both educational and research excellence, and most don't have very good construct validity even for just one of these two aspects of excellence. Measurement error for many items is also considerable or is not possible to determine due to lack of publication of the relevant data and methodology details. The concordance between the 2006 rankings by Shanghai and Times is modest at best, with only 133 universities shared in their top 200 lists. The examination of the existing international ranking systems suggests that generic challenges include adjustment for institutional size, definition of institutions, implications of average measurements of excellence versus measurements of extremes, adjustments for scientific field, time frame of measurement and allocation of credit for excellence.
Conclusion: Naive lists of international institutional rankings that do not address these fundamental challenges with transparent methods are misleading and should be abandoned. We make some suggestions on how focused and standardized evaluations of excellence could be improved and placed in proper context.
heal.journalName: BMC Med [en]
heal.journalType: peer reviewed
Appears in Collections: Articles in scientific journals (Open)

This item is licensed under a Creative Commons License.