Archive

Archive for the ‘futures’ Category

Rethinking Intellectual Property protection from Crooked Timber: ‘Information Feudalism’

January 29, 2011 2 comments

Information Feudalism — Crooked Timber.

Matt Yglesias writes:

A lot of our politics is about symbolism. And symbolically intellectual property represents itself in the contemporary United States as a kind of property—it’s right there in the name. But it’s better thought of as a kind of regulation. Patents and copyrights are modeled, economically, the same as you would model any state-created monopoly.

To which the Crooked Timber post responds:

I think the idea that intellectual property is property is too entrenched, at this point, for this to be an effective rhetorical strategy. Furthermore, rhetoric aside, philosophically the real breakthrough would be for people to realize that defending property rights is not tantamount to defending freedom. What strong IP protection generates is not a free market but something more like information feudalism: a market-unfriendly clusterfuck of fiefdoms and inescapably inefficient lord-vassal terms-of-service arrangements that any friend of freedom, in any ordinary sense, ought to look upon with disgust. The reason why libertarian rhetoric – defend property rights! – can underwrite feudalism, of all things, is that a certain sort of libertarianism, i.e. so-called propertarianism, really just plain is a form of feudalism. I’ve made the case at length.

More via the link above. A good question is whether it is actually worth spending lots of time protecting IP in universities. Academics publish in peer-reviewed journals, which are in turn gradually succumbing to open access (OA) trends and policies. OA is a bit of a misnomer in the sense that even for traditional academic journals, access to papers is available for a small fee (papers published in Nature are $32 each, for example); free access might be a better term than open access. Much of this work will also end up in open access institutional repositories. Academic research is therefore already mostly open and freely available through peer-reviewed publication, and academic IP in reality mostly comprises expertise, judgement, track record and know-how, not discrete inventions of new products (the kind that might produce those badly wanted substantial revenue streams). Should universities spend so much time, money and effort on protecting IP? Maybe, but maybe not…


Mismeasuring scientific quality (and an argument in favour of diversity of measurement systems)

December 27, 2010 2 comments

There was a short piece here recently on the misuse of impact factors to measure scientific quality, and how this in turn leads to dependence on drugs like Sciagra™ and other dangerous variants such as Psyagra™ and Genagra™.

Here’s an interesting and important post from Michael Nielsen on the mismeasurement of science. The essence of his argument is straightforward: unidimensional reduction of a multidimensional variable set is going to lead to significant loss of important information (or at least that’s how I read it):

My argument … is essentially an argument against homogeneity in the evaluation of science: it’s not the use of metrics I’m objecting to, per se, rather it’s the idea that a relatively small number of metrics may become broadly influential. I shall argue that it’s much better if the system is very diverse, with all sorts of different ways being used to evaluate science. Crucially, my argument is independent of the details of what metrics are being broadly adopted: no matter how well-designed a particular metric may be, we shall see that it would be better to use a more heterogeneous system.

Nielsen notes three problems with centralised metrics (relying solely on an h-index, citation counts, publication counts, or whatever else you fancy):

Centralized metrics suppress cognitive diversity: Over the past decade the complexity theorist Scott Page and his collaborators have proved some remarkable results about the use of metrics to identify the “best” people to solve a problem (ref,ref).

Centralized metrics create perverse incentives: Imagine, for the sake of argument, that the US National Science Foundation (NSF) wanted to encourage scientists to use YouTube videos as a way of sharing scientific results. The videos could, for example, be used as a way of explaining crucial-but-hard-to-verbally-describe details of experiments. To encourage the use of videos, the NSF announces that from now on they’d like grant applications to include viewing statistics for YouTube videos as a metric for the impact of prior research. Now, this proposal obviously has many problems, but for the sake of argument please just imagine it was being done. Suppose also that after this policy was implemented a new video service came online that was far better than YouTube. If the new service was good enough then people in the general consumer market would quickly switch to the new service. But even if the new service was far better than YouTube, most scientists – at least those with any interest in NSF funding – wouldn’t switch until the NSF changed its policy. Meanwhile, the NSF would have little reason to change their policy, until lots of scientists were using the new service. In short, this centralized metric would incentivize scientists to use inferior systems, and so inhibit them from using the best tools.

Centralized metrics misallocate resources: One of the causes of the financial crash of 2008 was a serious mistake made by rating agencies such as Moody’s, S&P, and Fitch. The mistake was to systematically underestimate the risk of investing in financial instruments derived from housing mortgages. Because so many investors relied on the rating agencies to make investment decisions, the erroneous ratings caused an enormous misallocation of capital, which propped up a bubble in the housing market. It was only after homeowners began to default on their mortgages in unusually large numbers that the market realized that the ratings agencies were mistaken, and the bubble collapsed. It’s easy to blame the rating agencies for this collapse, but this kind of misallocation of resources is inevitable in any system which relies on centralized decision-making. The reason is that any mistakes made at the central point, no matter how small, then spread and affect the entire system.

What is breathtaking, of course, is that scientists, who spend so much time devising sensitive measurements of complex phenomena, can sometimes suffer a bizarre cognitive pathology when it comes to how the quality of science itself should be measured. The sudden rise of the h-index is surely proof of that. Nothing can substitute for the hard work of actually reading the papers and judging their quality and creativity. Grillner and colleagues recommend that “Minimally, we must forego using impact factors as a proxy for excellence and replace them with in-depth analyses of the science produced by candidates for positions and grants. This requires more time and effort from senior scientists and cooperation from international communities, because not every country has the necessary expertise in all areas of science.” Nielsen makes a similar recommendation.
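As an aside, it is worth seeing just how reductive a single metric is. Here is a minimal sketch (plain Python, with invented citation counts) of the standard h-index calculation; two hypothetical researchers with very different publication records collapse to exactly the same score, which is precisely the information-loss problem Nielsen describes:

```python
def h_index(citations):
    """h-index: the largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Two hypothetical publication records (citation counts are made up):
steady  = [6, 6, 6, 6, 6, 6]         # consistent, moderately cited work
one_hit = [400, 50, 40, 6, 6, 6]     # one landmark paper plus minor ones

print(h_index(steady), h_index(one_hit))  # prints "6 6" -- the metric cannot tell them apart
```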

The Happy Planet index

December 24, 2010 Leave a comment

Something to play with over Christmas.

Introduction

The questions on the following pages will ask you about where you live, your health, lifestyle, and how you feel about life. The answers you give are used to calculate your own personal score on the Happy Planet Index. How happy are you… and at what price to the environment?!
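For anyone curious about the arithmetic behind the score, the headline idea of the index is (roughly) experienced well-being and life expectancy set against ecological footprint. Here is a minimal sketch with made-up numbers, ignoring the statistical adjustments the published index applies:

```python
def happy_planet_index(well_being, life_expectancy, footprint):
    """Simplified HPI sketch: 'happy life years' per unit of ecological footprint.

    well_being      -- average life satisfaction on a 0-10 scale
    life_expectancy -- years
    footprint       -- ecological footprint, global hectares per person
    (The published index rescales these terms; this is only the headline idea.)
    """
    happy_life_years = well_being * life_expectancy
    return happy_life_years / footprint

# Same reported happiness and life expectancy, very different footprints (invented figures):
print(round(happy_planet_index(7.0, 78, 8.0), 1))   # heavy consumer: 68.2
print(round(happy_planet_index(7.0, 78, 2.5), 1))   # light footprint: 218.4
```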

Three Views of Matt Ridley, The Rational Optimist – tagline ‘ideas having sex’!

December 24, 2010 Leave a comment

Three Views of Matt Ridley.

Posted by David Boaz

Matt Ridley’s new book, The Rational Optimist: How Prosperity Evolves, is garnering rave reviews. Ridley, science writer and popularizer of evolutionary psychology, shows how it was trade and specialization of labor–and the resulting massive growth in technological sophistication–that hauled humanity from its impoverished past to its comparatively rich present. These trends will continue, he argues, and will solve many of today’s most pressing problems, from the spread of disease to the threat of climate change.

The Cato Institute has now presented three different looks at the book, with a review in the Cato Journal, another in Regulation, and an event at Cato with Matt Ridley himself.

FWIW, it is one of the most enjoyable books I have read this year, and a great counter to the pervasive misery-mongering that is currently rife. The tagline ‘ideas having sex’ is a great metaphor for the advancement of knowledge. Here is his TED talk – well worth watching.

Further on the (over)production of PhDs

December 23, 2010 Leave a comment

PhD production as a process of self-discovery by Chris Blattman (post reproduced in full). See also this post. A different and less-depressing view of PhD (over)production.

There is an oversupply of PhDs. Although a doctorate is designed as training for a job in academia, the number of PhD positions is unrelated to the number of job openings.

Meanwhile, business leaders complain about shortages of high-level skills, suggesting PhDs are not teaching the right things. The fiercest critics compare research doctorates to Ponzi or pyramid schemes.

From The Economist.

Many of the arguments are valid. But make two plausible assumptions and you get a different answer.

Assumption 1: Innovation, including academic research, is the fundamental driver of long term health, wealth and happiness for the human race. (The “including academic research” bit is the biggest leap.)

Assumption 2: Unfortunately it’s very difficult to say beforehand who will and who will not produce great, or even good, research. (Even after five years departments have trouble predicting which of their crop will excel.)

In this world, each extra PhD raises the chances of one more brilliant, world-changing idea. While hardly comforting to the thousands who toil without job prospects, the collective benefits just might outweigh all the individual misery.

The decision might be individually rational as well, especially if students are no better at predicting their success than their advisors (they probably aren’t).

(A similar analogy comes from Lant Pritchett, who points out that you need a system that produces an enormous number of terrible dance recitals to get the handful of sublime performers. The same logic applies, he argues, to development projects and policies.)

One counterpoint: Here is where I would expect to see overconfidence bias lead to oversupply (and few of the collective benefits thereof). So maybe we need a system that gives the least promising an easier out that saves face.
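The collective-benefit argument above is, at bottom, an expected-value calculation. A minimal sketch with entirely invented numbers (probability of a breakthrough per graduate, value of a breakthrough, training cost per PhD) shows how the sums could come out either way:

```python
def net_collective_value(n_phds, p_breakthrough, breakthrough_value, cost_per_phd):
    """Expected value of breakthroughs minus the total cost of training (all inputs hypothetical)."""
    expected_breakthroughs = n_phds * p_breakthrough
    return expected_breakthroughs * breakthrough_value - n_phds * cost_per_phd

# 10,000 extra PhDs, each costing $200k to train, chasing a breakthrough worth $5bn:
print(net_collective_value(10_000, 1e-4, 5e9, 2e5))  # 1-in-10,000 chance each: +$3.0bn, oversupply 'pays'
print(net_collective_value(10_000, 1e-5, 5e9, 2e5))  # 1-in-100,000 chance each: -$1.5bn, it does not
```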

The World Of Big Data – The Daily Dish | By Andrew Sullivan

December 20, 2010 Leave a comment

Great post on ‘The World Of Big Data’ by Andrew Sullivan – reproduced in full below.

In passing, a Government truly interested in developing the smart economy would engage in massive data dumps with the presumption that just about every piece of data it holds (excluding the most sensitive pieces of information) from ministerial diaries to fuel consumption records for Garda cars to activity logs for mobile phones to numbers of toilet rolls used in Government Departments would be dumped in real time onto externally-interrogable databases. This would be geek-heaven and would generate new technological applications beyond prediction. And the activity would be local – could an analyst sitting in Taiwan really make sense of local nuances? The applications would be universal, portable and saleable, however. They would seed a local high-tech industry – maybe even a local Irish Google. Can’t see the Civil Service going for it, though…

Elizabeth Pisani explains (pdf) why large amounts of data collected by organizations like Google and Facebook could change science for the better, and how it already has. Here she recounts the work of John Graunt from the 17th century:

Graunt collected mortality rolls and other parish records and, in effect, threw them at the wall, looking for patterns in births, deaths, weather and commerce. … He scraped parish rolls for insights in the same way as today’s data miners transmute the dross of our Twitter feeds into gold for marketing departments. Graunt made observations on everything from polygamy to traffic congestion in London, concluding: “That the old Streets are unfit for the present frequency of Coaches… That the opinions of Plagues accompanying the Entrance of Kings, is false and seditious; That London, the Metropolis of England, is perhaps a Head too big for the Body, and possibly too strong.”

She concludes:

A big advantage of Big Data research is that algorithms, scraping, mining and mashing are usually low cost, once you’ve paid the nerds’ salaries. And the data itself is often droppings produced by an existing activity. “You may as well just let the boffins go at it. They’re not going to hurt anyone, and they may just come up with something useful,” said [Joe] Cain.

We still measure impact and dole out funding on the basis of papers published in peer-reviewed journals. It’s a system which works well for thought-bubble experiments but is ill-suited to the Big Data world. We need new ways of sorting the wheat from the chaff, and of rewarding collaborative, speculative science.

[UPDATE] Something I noticed in The Irish Times:

PUBLIC SECTOR: It’s ‘plus ça change’ in the public service sector, as senior civil servants cling to cronyism and outdated attitudes, writes GERALD FLYNN:

…it seems now that it was just more empty promises – repeating similar pledges given in 2008. As we come to the end of yet another year, there is still no new senior public service structure; no chief information officer for e-government has been appointed; no reconstitution of top-level appointments has taken place; and no new public service board has been appointed [emphasis added].

So nothing will happen.

Clay Shirky on Collapsing Societies, Institutions and Business Models (… and how about universities?)

November 11, 2010 Leave a comment

Clay Shirky, in a much-commented-upon post, addresses collapsing business models, institutions and societies (this post has been around a while). Read the whole post – it seems very relevant to the way all sorts of institutions are actively resisting the call of the future at present. Universities in general seem resistant to this sort of institutional collapse, as they have survived for so long (but will particular universities be able to resist such collapse, given the host of pressures that are building?). To rewrite the financial watchdog warning: perhaps past survival of a university is no guarantee of future survival!

A quote:

In 1988, Joseph Tainter wrote a chilling book called The Collapse of Complex Societies. Tainter looked at several societies that gradually arrived at a level of remarkable sophistication then suddenly collapsed: the Romans, the Lowlands Maya, the inhabitants of Chaco canyon. Every one of those groups had rich traditions, complex social structures, advanced technology, but despite their sophistication, they collapsed, impoverishing and scattering their citizens and leaving little but future archeological sites as evidence of previous greatness. Tainter asked himself whether there was some explanation common to these sudden dissolutions.

The answer he arrived at was that they hadn’t collapsed despite their cultural sophistication, they’d collapsed because of it. Subject to violent compression, Tainter’s story goes like this: a group of people, through a combination of social organization and environmental luck, finds itself with a surplus of resources. Managing this surplus makes society more complex—agriculture rewards mathematical skill, granaries require new forms of construction, and so on.

Early on, the marginal value of this complexity is positive—each additional bit of complexity more than pays for itself in improved output—but over time, the law of diminishing returns reduces the marginal value, until it disappears completely. At this point, any additional complexity is pure cost.

Tainter’s thesis is that when society’s elite members add one layer of bureaucracy or demand one tribute too many, they end up extracting all the value from their environment it is possible to extract and then some.

The ‘and then some’ is what causes the trouble. Complex societies collapse because, when some stress comes, those societies have become too inflexible to respond. In retrospect, this can seem mystifying. Why didn’t these societies just re-tool in less complex ways? The answer Tainter gives is the simplest one: When societies fail to respond to reduced circumstances through orderly downsizing, it isn’t because they don’t want to, it’s because they can’t.

In such systems, there is no way to make things a little bit simpler – the whole edifice becomes a huge, interlocking system not readily amenable to change. Tainter doesn’t regard the sudden decoherence of these societies as either a tragedy or a mistake—”[U]nder a situation of declining marginal returns collapse may be the most appropriate response”, to use his pitiless phrase. Furthermore, even when moderate adjustments could be made, they tend to be resisted, because any simplification discomfits elites.

When the value of complexity turns negative, a society plagued by an inability to react remains as complex as ever, right up to the moment where it becomes suddenly and dramatically simpler, which is to say right up to the moment of collapse. Collapse is simply the last remaining method of simplification.
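Tainter’s diminishing-returns argument is easy to make concrete. A toy numerical sketch (invented figures: a logarithmic benefit from complexity set against a constant upkeep cost per layer) shows early layers more than paying for themselves and later ones turning into pure cost:

```python
import math

COST_PER_LAYER = 1.0  # constant upkeep cost of each added layer of complexity (invented)

def benefit(layers):
    """Total benefit of complexity; logarithmic purely to illustrate diminishing returns."""
    return 10 * math.log(1 + layers)

for layers in range(1, 11):
    marginal = benefit(layers) - benefit(layers - 1) - COST_PER_LAYER
    verdict = "worth it" if marginal > 0 else "pure cost"
    print(f"layer {layers:2d}: marginal value {marginal:+.2f} ({verdict})")
```

With these made-up numbers the marginal value turns negative around the tenth layer: exactly the point at which, in Tainter’s telling, any additional complexity is pure cost.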