Showing posts with label Ranking.

Sunday, November 25, 2018

Annals of Ranking: Which decades produced "better" Nobel Prizes in Science?


Patrick Collison and Michael Nielsen have the click-bait article of the month in The Atlantic, Science Is Getting Less Bang for Its Buck, with the following summary (abstract?):

Despite vast increases in the time and money spent on research, progress is barely keeping pace with the past. What went wrong?

Here's their methodology:

we ran a survey asking scientists to compare Nobel Prize–winning discoveries in their fields. We then used those rankings to determine how scientists think the quality of Nobel Prize–winning discoveries has changed over the decades.

As a sample survey question, we might ask a physicist which was a more important contribution to scientific understanding: the discovery of the neutron (the particle that makes up roughly half the ordinary matter in the universe) or the discovery of the cosmic-microwave-background radiation (the afterglow of the Big Bang). Think of the survey as a round-robin tournament, competitively matching discoveries against each other, with expert scientists judging which is better.

For the physics prize, we surveyed 93 physicists from the world’s top academic physics departments (according to the Shanghai Rankings of World Universities), and they judged 1,370 pairs of discoveries. [...]

Collison and Nielsen did this decade-wise comparison of Nobel-winning discoveries across the nine decades spanning 1901-1990 [The authors note that "[the prize-winning] work is attributed to the year in which the discovery was made, not when the subsequent prize was awarded"].

Not surprisingly, the second and third decades (1911-1930) get the best ratings.
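The article doesn't spell out the aggregation model, but one standard way to turn such pairwise "which is more important?" judgments into per-decade scores is a Bradley-Terry fit, sketched below. The discoveries and years are real, but the vote counts are invented for illustration, and this is not necessarily the authors' actual procedure.

```python
# Toy sketch (not the authors' actual method): estimate a strength score for each
# Nobel-winning discovery from pairwise judgments via a Bradley-Terry iteration,
# then average the scores by the decade of the discovery. Vote counts are invented.

from collections import defaultdict

# wins[(a, b)] = number of judges who preferred discovery a over discovery b
wins = {
    ("neutron", "CMB"): 12, ("CMB", "neutron"): 8,
    ("neutron", "transistor"): 7, ("transistor", "neutron"): 13,
    ("CMB", "transistor"): 9, ("transistor", "CMB"): 11,
}
discovery_year = {"neutron": 1932, "CMB": 1964, "transistor": 1947}

items = sorted(discovery_year)
strength = {i: 1.0 for i in items}

# Standard Bradley-Terry fixed-point update: p_i <- W_i / sum_j [ n_ij / (p_i + p_j) ]
for _ in range(200):
    new = {}
    for i in items:
        total_wins = sum(w for (a, b), w in wins.items() if a == i)
        denom = 0.0
        for j in items:
            if j == i:
                continue
            n_ij = wins.get((i, j), 0) + wins.get((j, i), 0)
            if n_ij:
                denom += n_ij / (strength[i] + strength[j])
        new[i] = total_wins / denom if denom else strength[i]
    total = sum(new.values())
    strength = {i: v / total for i, v in new.items()}

# Average strength per decade (decade of the discovery, not of the prize)
by_decade = defaultdict(list)
for name, year in discovery_year.items():
    by_decade[10 * (year // 10)].append(strength[name])

for dec in sorted(by_decade):
    scores = by_decade[dec]
    print(f"{dec}s: {sum(scores) / len(scores):.3f}")
```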

Wednesday, November 11, 2015

Stuff Our Government Says: Global University Rankings Edition


A lot of hue and cry is raised about our higher education institutes not figuring in global ranking. The reason is not lack of high quality research work but the fact that in India, a large section of research work is done in vernacular languages, whereas global rankings only consider research in English
-- HRD Minister Smriti Irani [Source: The Indian Express]

Wednesday, November 04, 2015

The Road to the Top 100


Anubhuti Vishnoi in The Economic Times: Centre to fund 10 institutes for next 3-4 years to help them find a place among top 100 on global academic rankings:

The Centre will soon pick 10 higher education institutes with potential and provide them with substantial funding over the next four years so that Indian institutes can finally storm into the top 100 on global academic rankings like QS and Times ... These 10 institutes, it is proposed, will be granted funds — ranging from Rs 100-500 crore for the next 3-4 years so that they can create world class research infrastructure and laboratories. The end target is getting Indian institutes among global top 100.

Saturday, December 27, 2014

University Assessments in the US and the UK


This month saw two significant events in higher ed elsewhere.

The first was in the US, where the Department of Education released a "draft framework" outlining a set of parameters which can form the basis for rating colleges and universities. The report is open for public discussion and debate, before the rating policy is finalized. See the NYTimes story on the report, and Kevin Carey's commentary on it. The ratings are meant to help aspiring undergraduate students in choosing the right colleges to apply to.

The second was the much-awaited announcement of the results of an extensive assessment exercise called the REF (Research Excellence Framework). As the name suggests, this exercise is only about the research conducted at UK universities, and its results have a strong impact on universities as well as on individual departments. See The Guardian story: REF 2014: why is it such a big deal?.

The Guardian's coverage is a good place to start, but you can get all the data at the REF site.

See also: Five reasons why the REF is not fit for purpose.

Saturday, December 13, 2014

Ranking Tail and Institutional Dogs


Ranking of universities in the US has thrown up several cases of fraudulent reporting by places like George Washington and Claremont McKenna. Others have taken a more strategic route by re-prioritizing their spending to target higher scores in the metrics that matter. I just linked to a Boston Magazine story on Northeastern's efforts to align its priorities with those of US News.

We now have a BBC story about the French government taking this strategic route, which will cost "only" 7.5 billion euros:

As part of a huge government-driven academic and economic project, there will be a new university called Paris-Saclay, with a campus south of the French capital. The project has initial funding of 7.5bn euros (£5.9bn) for an endowment, buildings and transport links.

The French government is bringing together 19 institutions into a single structure, with the aim of building a university of a size and scale that can compete with global giants like Harvard or the Massachusetts Institute of Technology (MIT).

Dominique Vernay, the president of this new university, says that within a decade he wants Paris-Saclay to be among the top ranking world universities.

"My goal is to be a top 10 institution," he says. In Europe, he wants Paris-Saclay to be in the "top two or three".

* * *

Some time ago, we also saw a study that looked at how much it would cost Rochester -- "consistently ranked in the mid-thirties" -- to break into the top 20 in the US News list. It arrived at a figure of 112 million dollars to take care of just two of the metrics -- faculty salary and resources provided to students.

Friday, December 12, 2014

Links


  1. Cat Ferguson, Adam Marcus, and Ivan Oransky in Nature: Publishing: The peer-review scam. "When a handful of authors were caught reviewing their own papers, it exposed weaknesses in modern publishing systems. Editors are trying to plug the holes."

  2. Matt Kutner in Boston Magazine: How to Game the College Rankings. "Northeastern University executed one of the most dramatic turnarounds in higher education. Its recipe for success? A single-minded focus on just one list."

  3. S. Rukmini in The Hindu: 40% faculty posts vacant in Central varsities. "Faculty vacancy in the Indian Institutes of Technology (IITs) was 40 per cent as of July this year, most acute in Varanasi, Roorkee (above 50 per cent), Kharagpur and Delhi. Vacancies were highest for OBC faculty. ...In the Indian Institutes of Management (IIMs), faculty vacancies stood at over 20 per cent, highest in Indore (52 per cent) and Ranchi (48 per cent)."

Tuesday, August 12, 2014

Links


  1. On this World Elephant Day, you should enjoy this cute-overload video shared by Sanjeeta on our Institute's Ecological Students Society blog.

  2. Andy Thomason in CHE: How Did the Federal Government Rate Your College a Century Ago?

  3. Another bit of historical curiosity: Did Srinivasa Ramanujan fail in math? A. Venkatachalapathy clarifies with some documentary evidence.

  4. Amir Alexander in SciAm: The Glory of Math Is to Matter.

    ... [M]ost fields of higher mathematics remain as they were conceived, with no practical application in sight. So is higher mathematics just an intellectual game played by exquisitely trained professionals for no purpose? And if so, why should we care about it?

Sunday, June 29, 2014

The Right Response To Ranking Exercises: We are X, And We Should Be The Best X We Can Be


Prof. Peter N. Stearns, the soon-to-be-ex Provost of George Mason University, has what I think is one of the best responses to the ranking exercises that seem to have a nasty effect on the sleep cycles of many university leaders. In his recent post entitled Mason Goals, this is what he has to say (I hope he won't mind the extended excerpt; bold emphasis added by me):

... [W]here should we be heading? I don’t mean the details of the new strategic plan, which is ambitious and fine, though my comments relate to the overall directions of the plan. I refer more to University identity.

And here we confront a puzzle: George Mason is really hard to categorize. When I first arrived I assumed my job was to help make the University even more recognized in the standard ways – move up in US News rankings, get mentioned more often as a research hub, become more selective, and so on. And we did do some of these things. But we also wanted to keep identities we already had, that were equally valuable: center of student diversity (US News shamefully ignores this in its main ratings); accessible to large numbers of first-generation students – and not just accessible – serving as a means for their academic success; eager to seize new opportunities and innovate where appropriate, without as much traditionalist resistance as is common in many other places.

We wanted, in other words, to be George Mason. I heard talk of earlier goals of becoming the “Harvard of the Potomac”, but this reference has faded partly because we simply lack the means, and partly because that’s not fully what we want to be anyway. Yet at the same time we’re not simply innovative and accessible. We really do want to move meaningful research forward. We really do want to combine opportunity with serious quality standards — otherwise we might have more degrees to brag about, but without the real service to students a good university must pledge. We want to be a distinctive mixture, and that’s what I hope we’ve been accomplishing and will accomplish in the future. We want to maintain an active, creative tension between serious conventional standards and the distinctive flavor we’ve developed as an up-and-comer.

Several years ago, pressed by a Board of Visitors interested in Mason aiming at “world class” standards, we hired a consultant who actually said it most clearly: strive to be the best George Mason we can be. Take pride in the difficulty people have in pinpointing us too easily. Take pride in the combinations. In the process we’ll find, as we already do with some of our international visitors, that other institutions will be seeking to adopt our formula.

The "Price" for Getting into the Top 20


A sobering way of examining university ranking exercises is to put a price on the goal of getting into the Top 20 (or Top whatever) in a given list. Here's an Inside Higher Ed news story about a recent paper that looked into what it would cost the University of Rochester, which is "ranked consistently in the mid-thirties" in the US News list, to break into the Top 20 (the paper itself is available only at a price that I am not willing to pay ;-)):

If it wanted to move into the top 20, Rochester would have to do a lot on several of the various factors U.S. News uses to rank colleges. To move up one spot because of faculty compensation, Rochester would have to increase the average faculty salary by about $10,000. To move up one spot on resources provided to students, it would have to spend $12,000 more per student. Those two things alone would cost $112 million a year.

To get into the top 20, Rochester would also have to increase its graduation rate by 2 percent, enroll more students who were in the top 10 percent of their high school graduating class, get more alumni to give, cut the acceptance rate and increase the SAT and ACT scores of incoming students. Some of those things, like offering aid money to highly qualified students, might further increase the expense.

But that’s not all, the paper argues. Rochester would still have to do well in the rankings magazine’s “beauty contest.”

Because 15 percent of the ranking is based on reputation among other administrators, even massive expenditures year after year and huge leaps in student quality and graduation would not be enough. The reputation score as judged by its peers would need to increase from 3.4 to 4.2 on a scale of 5, something that has only a .01 percent chance of happening, the paper said.
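Since the $10,000 figure is an increase in the average faculty salary and the $12,000 figure is per student, the recurring cost of climbing even one spot scales with headcount. A minimal sketch of that arithmetic follows; the headcounts are placeholders I made up, not Rochester's actual figures, and this is not the paper's model.

```python
# Back-of-the-envelope sketch of the per-spot cost on the two metrics quoted above.
# Headcounts below are placeholders, not the University of Rochester's real numbers.

def annual_cost_per_spot(faculty_headcount, student_headcount,
                         salary_increase=10_000, per_student_spend=12_000):
    """Recurring annual cost of moving up one U.S. News spot on faculty
    compensation and per-student spending, given institutional headcounts."""
    return (faculty_headcount * salary_increase +
            student_headcount * per_student_spend)

# Purely illustrative headcounts:
cost = annual_cost_per_spot(faculty_headcount=1_500, student_headcount=9_000)
print(f"Hypothetical annual cost for one spot: ${cost:,.0f}")
```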

IIT-M Director Bhaskar Ramamurthi on Global University Rankings


It's good to see an institutional leader trying to drill some sense into policy-makers who ought to know better than to go blindly by global ranking exercises. Here's the concluding paragraph from Prof. Ramamurthi's op-ed in The Indian Express:

The contributions of the IITs are to be assessed along several dimensions. Some of these are relevant globally and are used by international ranking agencies, while other important ones are totally ignored. We should reflect on the relative weightage given to the dimensions assessed and not lose sight of those that are not. To the extent that the rankings tell us something about where we stand globally with respect to research, visibility, etc, they are relevant, and the IITs should strive to improve their position. Above all, we should not blindly adopt these rankings as an end in themselves, nor allow ourselves to be railroaded into pursuing select dimensions of performance while neglecting others, especially those that are critical to our national development goals.

Monday, June 02, 2014

Links: Global Ranking of Universities


  1. D.D. Guttenplan in NYTimes: Re-evaluating the college ranking game.

    So who will rank the rankings?

    That was the inescapable question when representatives of the four leading ranking organizations sat on the same panel at a conference here last month.

    As Bob Morse, the research director for U.S. News and World Report, pointed out, rankings have become a fiercely competitive global business.

    So the presence of Mr. Morse; Phil Baty, editor of the Times Higher Education Rankings; Nian Cai Liu, originator of the Academic Ranking of World Universities (better known as the Shanghai rankings); and Ben Sowter of the QS World University Rankings would have been enough to make the gathering “a historic event,” Mr. Morse said.

    But in addition to trading friendly digs at their competitors’ methodologies, the rankers had to listen to some stinging criticism — not just from disgruntled academics complaining that their institutions have been undervalued, or education ministers responding to an absence of their country’s universities on a given list, but from their own invited guests. Even the conference host, Michael Arthur, the president of University College London, took a jab.

  2. Aisha Labi in NYTimes: E.U. Seeking Better Clarity on Rankings:

    A new international effort to gauge the performance of universities went online last month promising to be a nuanced tool for students and institutions in the contentious field of global rankings.

    The project, U-Multirank, which was announced in 2011, is backed by the European Union and aims to foster greater transparency about higher education globally, including in the United States. But while its approach has received praise, some experts say it still has a way to go before achieving its goals, and some higher education groups have already raised questions about its methods.

Friday, March 14, 2014

Annals of Gaming: Does Size Matter?


Bloomberg's Oliver Staley has an excellent story focusing on a particular method to get ahead in global university league tables: merging universities, with examples from France and Finland. This discussion has some relevance to India since there have been suggestions that all the IITs put together might get the 'IIT System' to break into the top 100 in the rank lists.

Nations Chasing Harvard Merge Colleges to Ascend Rankings:

Twenty colleges and research institutes are combining to create Universite Paris-Saclay, soon to be one of France’s largest universities, at a cost of about 6.5 billion euros ($9 billion). It’s France’s bid to crack the top of rankings that increasingly dominate international higher education.

“Our ambition is to be among the top 10” in the rankings compiled by Shanghai Jiao Tong University, said Dominique Vernay, chairman of the foundation creating Paris-Saclay. “The first goal is to be the top university in continental Europe.”

Countries from Finland to Portugal are shaping their higher education policies based on outside rankings, eager for the validation and attention the annual lists bestow, even while they are criticized as flawed or misleading. Because bigger is perceived as better in these lists, governments are merging campuses in hopes of attracting research money and higher caliber faculty and students.

The high-stakes pursuit of bragging rights is distorting universities’ missions, favoring research over teaching and science over the humanities, said Ellen Hazelkorn, director of the Higher Education Policy Research Unit at the Dublin Institute of Technology.

“It’s all about national prestige,” said Hazelkorn, who has written widely about rankings. “Rankings are less about students and more about geopolitics.”

Sunday, October 06, 2013

Global Rankings: Why Do Indian Policy Makers Take Them Seriously?


Over at University Ranking Watch, Richard Holmes makes a point that sounds quite plausible to me: a few high-profile, highly cited papers can make a huge difference to an institution with a small research base. He cites two examples from this year's THE-TR rankings, Tokyo Metropolitan University and Panjab University, and suggests that their high-profile papers are possibly related to the LHC collaborations.

In an op-ed (which he has posted on his blog with data tables) on the inconsistencies in global ranking exercises, Prof. Gautam Barua, former director of IIT-Guwahati, echoes this view:

The high scores of Panjab and IIT-G vis-à-vis IIT-D could be explained by this. Panjab University's high energy physics group (and to a lesser extent IIT-G's) is part of global experiments at CERN and Fermi Labs, and papers from that project have very high citations. Thus, a small group of international collaborations is providing a high score. Isn't the median number of citations per faculty a better measure than the average (there are other issues, for example, citations in the sciences are usually much more than in engineering)?
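Barua's median-versus-mean point is easy to see with a toy example: a couple of mega-collaboration papers drag the mean up dramatically while the median barely moves. The citation counts below are invented.

```python
# Toy illustration (invented numbers): a few very highly cited collaboration
# papers inflate the mean citations-per-faculty while the median barely moves.

from statistics import mean, median

# Citations credited to each of 20 hypothetical faculty members
typical_faculty = [3, 5, 2, 8, 4, 6, 1, 7, 5, 3, 2, 9, 4, 6, 5, 3, 2, 4, 6, 5]
print(mean(typical_faculty), median(typical_faculty))   # 4.5 and 4.5

# Same department, but two members are on an LHC-style mega-collaboration paper
with_collab = typical_faculty[:-2] + [900, 1100]
print(mean(with_collab), median(with_collab))           # mean ~104, median still 4.5
```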

The global ranking exercises like THE-TR and QS rely on pretty dubious measures, including something called the reputation survey. Even on the so-called objective measures (such as citation metrics, which come with their own problems), they have screwed up -- remember Alexandria? Thanks to folks like Richard Holmes, we know how their "mistakes" and corrections and flip-flops have led to wild fluctuations in the ranking fortunes of Malaya over the years.

When a bunch of money-grubbing entities come along and tell the world that they will rank universities across the globe (irrespective of the vast differences among them), and end up doing a demonstrably shoddy job of it year after year, shouldn't we laugh them off the stage?

No! We treat them like they are superstars.

We welcome them to our living room, and have a tête-à-tête in which we ask them to "educate" us on what we need to do to get more Indian institutions in their top 200 or top 400 or whatever.

And we give their top-400 lists a privileged position in our higher-ed policies.

Forget about growing a spine -- it's time people grew some self-respect.

Friday, October 04, 2013

Trust the West to Find the Best


Here's one from the Interdisciplinary Department of Huh?-Who-knew?

The awesome twosome, Times Higher Education (THE) and Thomson Reuters, informed us all that we have had this hidden gem among us all along: Panjab University is the best in Asia in citations [bold emphasis added]:

Ranked in the 226-250 bracket in The Times Higher Education World University Rankings 2013-14 announced in London, Panjab University (PU) is ranked 32 in Asia. But besides the fact that PU is the best among Indian universities - even leaving behind the IITs - it is also the best in Asia when it comes to its research being cited in journals and studies across the world.

PU's score for citation, 84.7 on a scale of 100, is higher than the University of Tokyo, Japan, which has been otherwise ranked 1 in Asia and has a world ranking of 23 as per the study. Tokyo's citation score is 69.8. The citation score was based on the frequency with which research of those from PU was used by other researchers.

Sunday, August 25, 2013

Tail Wags the Dog


  • Exhibit 1:

    Under the proposed rules, foreign institutions that figure among the top 400 universities in the world — according to rankings published by the Times Higher Education, ... Quacquarelli Symonds, ... or Shanghai Jiao Tong University — will be able to set up campuses [in India]. [Bold emphasis added]

  • Exhibit 2:

    In a bid to get back on the top of the global best institutes of technology chart, a team of four IIT directors would hold talks with officials of the Ministry of Human Resource Development (MHRD) and the Times Higher Education (THE) World University Rankings here on Wednesday. [Bold emphasis added]

* * *

"Practical men, who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist huckster-peddlers from university rankings." [With apologies to Keynes...]

Friday, August 16, 2013

A new, field-specific, citation-centric rating / ranking of universities


It's described in a paper entitled Ranking and mapping of universities and research-focused institutions worldwide based on highly-cited papers: A visualization of results from multi-level models, by Lutz Bornmann, Moritz Stefaner, Felix de Moya Anegon, and Ruediger Mutz.

It looks at just one metric for each institution in each research field: the fraction of its papers that fall among the top 10 percent of papers in that field.
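In other words, it is a PP(top 10%)-style indicator: the share of an institution's papers that land in the most-cited tenth of their field. A minimal sketch of such a computation follows; the citation counts are made up, and none of the paper's actual field- and year-normalizations are reproduced here.

```python
# Minimal sketch of a PP(top 10%)-style indicator: for one field, find the
# citation count marking the top decile of the world's papers, then compute
# the fraction of an institution's papers at or above that mark.
# All citation counts here are invented.

def top10_share(institution_citations, world_citations):
    ranked = sorted(world_citations)
    cutoff = ranked[int(0.9 * len(ranked))]    # ~90th-percentile citation count
    hits = sum(1 for c in institution_citations if c >= cutoff)
    return hits / len(institution_citations)

world = [0, 1, 1, 2, 3, 3, 4, 5, 6, 8, 9, 12, 15, 20, 25, 30, 40, 60, 90, 150]
institute = [2, 5, 14, 22, 95]
print(top10_share(institute, world))   # 0.2 with these toy numbers
```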

Since this is a field-specific exercise, it is slightly better than university-level rankings (even after discounting their unsound/stupid methodologies). I would still rank it as a bad exercise, since it encourages people to see the research enterprise essentially as a race to get into the top 10 percent of papers.

Anyway, what makes it worth blogging about is the time-sink interactive web app [password provided on demand] created by the authors, which allows you to see for yourself where the good places (and the bad places) are. Proceed with caution -- you might end up spending a lot of time on some metric of colossal insignificance. [See this, for example].

For the pointer, I blame (er, thank) Doug Natelson.

Thursday, February 09, 2012

Annals of Gaming (the System)


Stephen Budiansky describes a scheme once used by the Case Law School to get ahead in the US News rankings that is way too awesome: three birds with one stone! [Hat tip: Henry Farrell at Crooked Timber]

Their other tactic was pure genius: the law school hired as adjunct professors local alumni who already had lucrative careers (thereby increasing the faculty-student ratio, a key U.S. News statistic used in determining ranking), paid them exorbitant salaries they did not need (thereby increasing average faculty salary, another U.S. News data point), then made it understood that since they did not really need all that money they were expected to donate it all back to the school (thereby increasing the alumni giving rate, another U.S. News data point): three birds with one stone!
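With entirely made-up numbers, here is how that single tactic moves all three ranking inputs at once:

```python
# Toy arithmetic (all numbers invented) showing how hiring well-paid alumni
# adjuncts who donate their salary back moves three ranking inputs at once.

students = 600
faculty, salary_total = 40, 40 * 120_000
alumni, donors = 10_000, 1_500

def metrics(faculty, salary_total, donors):
    return (students / faculty,       # student-faculty ratio (lower is better)
            salary_total / faculty,   # average faculty salary
            donors / alumni)          # alumni giving rate

print(metrics(faculty, salary_total, donors))

# Hire 10 alumni adjuncts at $200k each, who donate the salary back:
print(metrics(faculty + 10, salary_total + 10 * 200_000, donors + 10))
```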

Sunday, July 03, 2011

Links


  1. Jeremiah Jenne at Jottings from the Granite Studio: It’s a Mad Mad 90th Anniversary. "I present the Mad Men guide to 90 years of the CCP."

  2. Vanessa Fuhrmans in WSJ: In Search of a New Course -- Germany's once-lauded education system is under fire. But fixing it hasn't been easy.

  3. Carl Zimmer in NYTimes: It’s Science, but Not Necessarily Right. "As a series of controversies over the past few months have demonstrated, science fixes its mistakes more slowly, more fitfully and with more difficulty than Sagan’s words would suggest."

  4. Global University Rankings and Their Impact [pdf] -- a report from the European University Association.

  5. David Leonhardt in NYTimes: Top Colleges, Largely for the Elite.

Friday, April 01, 2011

More stats on India's scientific enterprise


After reading the previous post, my colleague Prof. U. Ramamurty sent me the link to this Science Watch listing of field-wise comparison of India's performance against the world average. It has quite a few surprises.

First, the unsurprising bit: India's average citations per paper is lower than the world average in every field.

The surprise is in the fields that come closest to the world average: Engineering (a deficit of 16 percent), Computer Science (20%), Materials Science (22%), Physics (22%) and Psychiatry / Psychology (33%) are at the top. We see a lot of biology-related fields (agriculture, medicine, biochemistry, microbiology) among those where India's average is less than half the world average.
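The "deficit" here is presumably just the gap between India's citations-per-paper and the world average, expressed as a percentage of the world figure; a short sketch with placeholder values:

```python
# The deficit figures above are read as (1 - India_cpp / World_cpp) * 100.
# Citations-per-paper values below are placeholders, not the Science Watch data.

def deficit_percent(india_cpp, world_cpp):
    return round((1 - india_cpp / world_cpp) * 100)

print(deficit_percent(india_cpp=4.2, world_cpp=5.0))   # 16, i.e. 16% below the world average
```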

Once again, this table represents a snapshot; it has no timelines or trends. The accompanying report has some (but only some) information that points to a positive trend in India's share of publications and citations:

... [S]ince 2000 [India's] output has increased from some 16,000 papers to 40,000, world share has risen from 2.2% to 3.4%, and citation impact has improved from 40% to nearly 60% of the world average. While that means that Indian research still underperforms in per-paper influence compared with other nations, the gains represented by these statistics are noteworthy.

For the 11-year period from 2000 to 2010, India accounted for 2.8% of all the scientific publications. What are the fields in which India "held the highest world share"?

... agricultural sciences (5.8%), chemistry (5.4%), materials science (4.8%), pharmacology (4.4%), plant and animal sciences (3.7%), physics (3.6%), engineering (3.3%), and geosciences (3.2%) – all higher than India’s overall 2.8% share.

Thursday, March 31, 2011

Chemistry in Asia


Now that the India-Pakistan game is over, I guess we can get back to normal life. [My heart goes out to Misbah-ul-Haq, though. How can the same shit happen to the same man twice -- last man out against India in the final of the T-20 World Cup in 2007, and now this, the semi-final in this ODI World Cup].

While I'm not a big fan of scientometrics [mainly because their use is inappropriate in 'judging' the contributions of individuals], I do see their value in comparisons as long as (a) they are confined to single subject areas, and (b) they involve larger entities (such as departments, institutions, or even countries). With (a), we can avoid inappropriate comparisons -- e.g., between the Courant Institute of Mathematical Sciences and the Cold Spring Harbor Laboratory -- across fields with different citation practices. And with (b), we will have better and more meaningful statistics due to larger numbers of publications and citations.

Science Watch has a comparative table that meets these two conditions. So, go have a look at the ranking of Asia-Pacific Nations in Chemistry, 2000-2010.

The ranking is in terms of citations per paper; only Singapore (~citations per paper), Australia (12.5) and Japan (~12) do better than the world average (~11). Singapore, the Asia-Pacific topper, is actually ranked 12th in the world. India and China have about 7 citations per paper, and are ranked, respectively, 8th and 9th in Asia, and 38th and 39th in the world; however, China out-publishes India by a factor of nearly 3!

I still have a quibble. This report is a snapshot. I would much prefer an analysis of how the countries have done over the years; for example, the same data -- spanning 11 years, from 2000 to 2010 -- could have been analyzed for six overlapping 6-year periods, starting with 2000-05 and going all the way up to 2005-10. Such an analysis is better at describing which way each country is headed.
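A sketch of that sliding-window analysis, assuming one had yearly paper and citation counts for a country (the values below are placeholders, not real data for any country):

```python
# Sketch of the overlapping-window analysis suggested above: citations per paper
# for each 6-year window from 2000-05 to 2005-10. Yearly counts are placeholders.

papers =    {2000: 2000, 2001: 2100, 2002: 2300, 2003: 2500, 2004: 2700,
             2005: 2900, 2006: 3200, 2007: 3500, 2008: 3900, 2009: 4300, 2010: 4800}
citations = {2000: 9000, 2001: 9800, 2002: 11000, 2003: 12500, 2004: 14000,
             2005: 15500, 2006: 17500, 2007: 20000, 2008: 23000, 2009: 26000, 2010: 30000}

for start in range(2000, 2006):               # windows 2000-05, 2001-06, ..., 2005-10
    years = range(start, start + 6)
    cpp = sum(citations[y] for y in years) / sum(papers[y] for y in years)
    print(f"{start}-{start + 5}: {cpp:.2f} citations per paper")
```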

Of course, for policy makers, what would be even better is a sub-field level analysis, which could help identify a country's strong areas as well as weak areas.

Oh well, this is all we have for now.