I read Moneyball in 2004 and recall two things: a baseball team used statistics about individual contribution to team winning to build a winning team on the cheap (other teams at the time selected players based on gut feel and on statistics poorly correlated with team success; players who looked good in person to baseball insiders, or good on paper to a naive analysis, were overpaid), and some trivia about fan-gathered versus team- or league-controlled game data.
The key thing for this post is that a team was able to exploit better metrics and profit (or at least win).
Just as many baseball enthusiasts were dissatisfied with simple player metrics like home runs and steals, and searched for metrics better correlated with team success (sabermetrics), many academia enthusiasts are dissatisfied with simple academic metrics (actually all based on the number of citations to journal articles) and are looking for metrics better correlated with an academic’s contribution to knowledge (altmetrics).
Among other things, altmetrics could lead researchers to spend time doing things that might be more useful than publishing journal articles, bring people into research who are good at these other things (creation of useful datasets is often given as an example) but not at writing journal articles, and help break the lock-in and extraordinary profits enjoyed by a few legacy publishers. Without altmetrics, these things are happening only very slowly, as career advancement in academia is currently highly dependent on journal article citation metrics.
As far as I can tell, altmetrics are in their infancy at best: nobody knows how to measure contribution to knowledge, let alone innovation, and baseball enthusiasts faced a much, much more constrained problem: contribution to winning baseball games. But given that so little is known, and that current metrics are so obviously inadequate yet so strictly adhered to, some academics who do well on journal article citation metrics are vastly over-recruited and overpaid, while many academics and would-be academics who don’t, aren’t. This ought to mean there could be big wins from relatively crude improvements.
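To make the "crude improvements" intuition concrete, here is a minimal toy sketch; it is entirely my own illustration with made-up numbers, not anything from Moneyball or the altmetrics literature. The assumption is just that citations are a noisy proxy for an unobservable "true" contribution to knowledge; in that case, even crudely adding in a second noisy proxy (say, useful datasets produced) tracks the real thing better than citations alone.

```python
# Toy sketch, entirely synthetic: two noisy proxies for an unobservable
# "contribution" signal, combined in the crudest possible way (a sum).
import random

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
truth, citations, crude = [], [], []
for _ in range(1000):
    contribution = random.gauss(0, 1)            # unobservable "true" signal
    cites = contribution + random.gauss(0, 1.5)  # noisy proxy #1
    data = contribution + random.gauss(0, 1.5)   # noisy proxy #2 (hypothetical)
    truth.append(contribution)
    citations.append(cites)
    crude.append(cites + data)                   # crudest composite: a sum

print("citations alone vs. contribution:", round(pearson(truth, citations), 2))
print("crude composite vs. contribution:", round(pearson(truth, crude), 2))
```

On this synthetic data the crude composite correlates with "true" contribution at around 0.69, versus roughly 0.56 for citations alone. The point is only that crude combinations of noisy signals can beat a single entrenched one, not that real altmetrics would be this easy.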
Who should gamble on potential crude improvements over journal article citation metrics? Entities that hire academics, in particular universities, perhaps even more particularly ones that are in the “big leagues” (considered a research university?) but nowhere near the top, and without cash to recruit superstars per gut feel or journal article citation metrics. I vaguely understand that universities make huge, conscious, expensive efforts to create superstar departments, but nearly all universities aren’t Columbia, able to spend their way to recruiting superstars from Harvard and Princeton. Instead, why not make a long-term gamble on some plausible altmetric? At best, such a university or department will greatly outperform over the next decade and get credit beyond that for pioneering new practices that everyone copies. At worst, it will remain mediocre, perhaps slip a bit over the next decade, and get a bit of skeptical press about the effort (if made public). The benefits to society from such experimentation could be large.
Are there universities or departments pursuing such a strategy? I am in no position to know. I did search a bit and came up with “What do the social sciences have in common with baseball? The story of Moneyball, academic impact, and quantitative methods”. I’m pretty sure the author is writing about hiring social scientists who specialize in quantitative methods, not hiring social scientists based on speculative quantitative methods. What about universities outside wealthy jurisdictions?
…
Speaking of baseball players and academics, just yesterday I realized that academics have the equivalent of player statistics pages when I discovered my brother’s (he’s my only close relative in academia, as far as I know) via his new site. I’ll have to ask him how he feels about giving such a public performance. My first reaction is that it is good for society. Such pages would be good for more professions: for most we have not conventional metrics like home runs or citations that need improvement, but zero metrics, only gut feel or worse. Lots of fields, employment and otherwise, are ripe for disruption.
Addendum: Another view is that metrics lead to bad outcomes, and that rather than adopting more sophisticated metrics, academia should become more like most employment and shun metrics altogether, hiring purely on gut feel; other fields should continue as before and fight any encroachment of metrics. Of course these theories may also be experimented with on a team-by-team, university-by-university, organization-by-organization basis.
Further reading: http://taxprof.typepad.com/taxprof_blog/2009/07/somin-can-moneyball-strategies-still-work-for-law-schools.html — includes links to things you might find interesting.
Thanks, I had vaguely heard of the GMU story (the law & econ and free market parts), but it didn’t occur to me when writing the above. I skimmed some of what you linked to, and my initial read is that law school hiring is so poor that “altmetrics” considering anything more than citations would be several steps ahead. The crudest possible methods can still be advantageous:
In Moneyball, scouts were the myth-based old system, ready to be outcompeted. But something like scouting in odd places is considered innovative for law schools? GMU discovered the Dominican Republic (I dimly recall from somewhere, perhaps not Moneyball, that particular locales were “discovered” at various times by U.S. professional baseball) or something … or, if one wanted to play up the free-market-scholars-were-discriminated-against angle, GMU started recruiting Jewish or African American players, as happened in various U.S. sports leagues. Crude stuff in any case.
And regarding citations:
The law school is a primitive ass! (Meaning law schools, not GMU in particular, which seems to be less primitive; plural wouldn’t have worked as well, if the previous sentence works at all.)
Drifts of $20 bills in the streets, but the costs of picking them up are apparently huge.