Friday, 2 December 2011

Protecting Brazil's forests

 
Fiddling while the Amazon burns

Keeping the world's biggest forest standing depends on greens, Amerindians and enlightened farmers working together, if lawmakers let them

From the Economist: Dec 3rd 2011 | JACI-PARANÁ, RONDÔNIA | from the print edition

DRIVE out of Porto Velho, the capital of the Amazonian state of Rondônia, and you see the trouble the world's largest forest is in. Lorry after lorry trundles by laden with logs; more logs lie by the road, dumped by smugglers at the rumor of a (rare) road check, waiting to be collected later. Charred tree-stumps show where ranchers burned what the loggers left behind; a few cattle roam sparsely through the scrubby fields. In places the acid subsoil shows through, sandy and bone-pale. Seen from above, the roads look like hatchet blows, with dirt tracks radiating outward like thinner wounds. The picture is reproduced across the Amazon's arc of deforestation.


The Brazilian Amazon is now home to 24m people, many of them settlers who trekked those roads in the 1960s and 1970s, lured by a government promise that those who farmed unproductive land could keep it. Chaotic or corrupt land registries left some without secure title. Rubber-tappers, loggers, miners and charcoal-burners came too. The most recent arrivals are 20,000 construction workers building dams on the Madeira and Xingu rivers to provide electricity to Brazil's populous south. They have attracted some 80,000 camp followers, many of whom squat on supposedly protected land.

The population of Jaci-Paraná, the nearest town to the Jirau dam being built on the Madeira, has risen from 3,500 to 21,000 in a decade but it still has just four police. Prostitutes and drug-dealers do well. On payday, says Maria Pereira, a teacher, busloads of construction workers hit town to drink and fight. Knife-killings are common. When the dam is finished, many of the new residents will move on. Behind them, a bit more of the Amazon will be gone.

Brazil's government no longer encourages cutting down the forest. Nearly half of it now lies within indigenous reserves, or state and federal parks where most logging is banned. Private landowners must abide by the Forest Code, a law dating from 1965 that requires them to leave the forest standing on part of their farms (four-fifths in the Amazon, less elsewhere), and in particular around the sources and banks of rivers, and on hillsides.

But the code is routinely flouted. Less than 1% of the fines levied for failing to observe it are ever paid, because of uncertain ownership and poor enforcement. The Suruí, an Amerindian people, recently mapped its territory in Rondônia, on paper strictly protected. The tribe was shocked to find that 7% had been cleared.

In Brasília, 2,000km (1,250 miles) and a world away, politicians are haggling over laws that will affect the fate of the forest. Some legislators are pushing a bill that would give Congress, rather than the president, the power to create new reserves. That would probably mean fewer new ones, a blow for the forest, says Ivaneide Bandeira of Kanindé, a non-profit group in Rondônia. Indigenous people protect the forest better than anyone else, she says.

The Senate is poised to vote on a new version of the Forest Code, already approved by the lower house. The president, Dilma Rousseff, wants a final version on her desk before Christmas. Everyone agrees that change is needed. The share of private land that must be set aside has risen since 1965 and farmers who were once in compliance but omitted to update their paperwork can end up lumped in with lawbreakers. Kátia Abreu, a senator who is the president of the main farm lobby, says farmers find such uncertainty deeply worrying. Environmentalists dislike it too, since it encourages loggers and land-grabbers by fuelling disrespect for the law.

But the consensus has gone no further. The farm lobby wanted all past land clearance regularized, arguing that if farmers had to replant trees, crop output would fall, food prices soar and poor Brazilians go hungry. Greens countered that an amnesty would fuel future deforestation. So far, at least, the farm lobby is winning. The current draft allows farmers to dodge fines for illegal logging and postpone their obligation to replant by simply declaring that their violations were committed before July 2008 and by enrolling in a vague and leisurely environmental recovery programme, to be run by individual states.

This is an amnesty in all but name, says Maria Cecília Brito, the head of WWF-Brazil, a conservation group. Without safeguards, states will be able to postpone forever the requirement to act. After several years in which the annual rate of deforestation fell, this year it has risen, possibly because landowners think the new code will let them get away with it. Law-abiding farmers are outraged. When Darci Ferrarin bought a large farm in Mato Grosso in 1998, he knew that its riverbanks had been illegally cleared. He paid to replant. Those who deforested illegally should go to jail, declares his son, Darci Junior.

The only promising aspect of the new code, thinks Roberto Smeraldi of Amazônia Brasileira, a green NGO, is that it offers benefits such as subsidized loans to landowners who have always stuck by the rules, or who are reforesting faster than the law demands. But he laments the missed opportunity for a grand bargain to align opposed interests. A cap-and-trade system like those used to limit industrial pollution in rivers could have helped farmers short of set-aside to comply with the law by paying neighbors with more than the legal minimum to maintain it. That would both have spared farmers from costly replanting and cut future deforestation by making standing forests financially valuable.

Ms Rousseff promises to veto any amnesty for illegal deforesters. But the fig leaf of the environmental recovery programme may give her scope to temporize, and with a heavy legislative schedule she may be tempted to do so. If she does, the Amazon's best hope will lie with the enlightened farmers and indigenous tribes who care for their land better than the state is willing to.

For Mr Ferrarin, the way to halt deforestation is to use existing farmland better. Almost half his farm of 13,350 hectares (33,000 acres) is set aside as forest; the rest supports 3,000 cattle as well as soya and several other crops, farmed in rotation. Innovative no-till methods cut carbon emissions, fertilizer use and labor. The Ferrarins run workshops to teach other farmers about such integrated farming techniques. Mr Ferrarin's daughter, Valkiria, runs a cattle-breeding programme, with an on-site IVF clinic where embryos from prize animals are implanted in surrogates. A productive farm can support an extended family for several generations, he says.

Cassio Carvalho do Val's father settled in Redenção in Pará in 1959. It was then virgin rainforest: the last 150km of the journey was by donkey, carrying dried meat, rice and beans. Nine-tenths of the 300,000 hectares he was granted has since been sold, but the farm is still vast (the average farm in the United States comprises around 160 hectares), and unproductive, with just one cow per hectare of pasture. But his son has started to fatten his cows with grain and plans to try integrated farming. It's the dream of every crop farmer to be a rancher, he says with a laugh. It's so much easier. But he thinks he needs to keep up with the times.

The Suruí set an example

Some of Brazils indigenous peoples are redoubling their efforts to protect the standing forest. The 1,300 Suruí have moved their 25 villages to the borders of their territory to get early warning of incursions. With help from Kanindé and others, since 2005 they have started to reforest where intruders have cleared. To the inexperienced eye, the new trees already look ancient (though to the Suruí the sparser cover is still obvious). Next year the tribe will host other indigenous peoples who want to repair deforestation on their own lands. They hope to start teaching non-indigenous folk, too.

The Suruí are the first Brazilian tribal people to set up a REDD project, an international aid scheme to prevent deforestation. Up to 10% of the income generated will go to local non-Indians, to show them that standing forest can create jobs and income. We are not saying, don’t use the forest, explains the chief, Almir Narayamoga Suruí. We are saying you should think about the medium and long term when you decide how to use it. That will be easier if the politicians approve a Forest Code that looks to the future, not the past, and then provide the means to enforce it.

from the print edition | The Americas

Friday, 4 November 2011

Google to Host Terabytes of Open-Source Science Data

Sources at Google have disclosed that the humble domain, http://research.google.com, will soon provide a home for terabytes of open-source scientific datasets. The storage will be free to scientists and access to the data will be free for all. The project, known as Palimpsest and first previewed to the scientific community at the Science Foo camp at the Googleplex last August, missed its original launch date this week, but will debut soon.
Building on the company’s acquisition of the data visualization technology Trendalyzer from the oft-lauded, TED-presenting Gapminder team, Google will also be offering algorithms for the examination and probing of the information. The new site will have YouTube-style annotating and commenting features.
The storage would fill a major need for scientists who want to openly share their data, and would allow citizen scientists access to an unprecedented amount of data to explore. For example, two planned datasets are all 120 terabytes of Hubble Space Telescope data and the images from the Archimedes Palimpsest, the 10th century manuscript that inspired the Google dataset storage project.
UPDATE (12:01pm): Attila Csordas of Pimm has a lot more details on the project, including a set of slides that Jon Trowbridge of Google gave at a presentation in Paris last year. WIRED’s own Thomas Goetz also mentioned the project in his fantastic piece on freeing dark data.

One major issue with science’s huge datasets is how to get them to Google. In this post by a SciFoo attendee over at business|bytes|genes|molecules, the collection plan was described:
(Google people) are providing a 3TB drive array (Linux RAID5). The array is provided in a “suitcase” and shipped to anyone who wants to send their data to Google. Anyone interested gives Google the file tree, and they SLURP the data off the drive. I believe they can extend this to a larger array (my memory says 20TB).
You can check out more details on why hard drives are the preferred distribution method at Pimm. And we hear that Google is hunting for cool datasets, so if you have one, it might pay to get in touch with them.

Freeing the Dark Data of Failed Scientific Experiments

By Thomas Goetz | 09.25.07
It's Time to Free the Dark Data of Failed Scientific Experiments
Photo: Mauricio Alejo 
 
In 1981, the New England Journal of Medicine published a Harvard study that showed an unexpected link between drinking coffee and pancreatic cancer. As it happened, researchers were anticipating a connection between alcohol or tobacco and cancer. But according to the survey of several hundred patients, booze and cigarettes didn't seem to increase your risk. Then came a surprise: An incidental survey question suggested that coffee did increase the chances of pancreatic cancer. So that's what got published.

Those positive results, alas, were entirely anomalous; 20 years of follow-up research showed the coffee-cancer connection to be bunk. Nonetheless, it's a textbook example of so-called publication bias, where science gets skewed because only positive correlations see the light of day. After all, the surprising findings are what makes the news (and careers).

So what happens to all the research that doesn't yield a dramatic outcome — or, worse, the opposite of what researchers had hoped? It ends up stuffed in some lab drawer. The result is a vast body of squandered knowledge that represents a waste of resources and a drag on scientific progress. This information — call it dark data — must be set free.

For the past couple of years, there's been much talk about open access, the idea that more scientific publications should be freely available — not locked behind firewalls and subscriptions. Thanks to the Public Library of Science (PLoS) and other organizations, that notion is making headway. Liberating dark data takes this ethos one step further. It also makes many scientists deeply uncomfortable, because it calls for them to reveal their "failures." But in this data-intensive age, those apparent dead ends could be more important than the breakthroughs. After all, some of today's most compelling research efforts aren't one-off studies that eke out statistically significant results, they're meta-studies — studies of studies — that crunch data from dozens of sources, producing results that are much more likely to be true. What's more, your dead end may be another scientist's missing link, the elusive chunk of data they needed. Freeing up dark data could represent one of the biggest boons to research in decades, fueling advances in genetics, neuroscience, and biotech.
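(A minimal sketch of why pooling matters, not something from the article itself: treat each of $k$ comparable, independent studies as estimating the same effect with standard error $\mathrm{SE}_{\text{single}}$. Combining their data, including the null results that never got published, shrinks the uncertainty roughly as

$$\mathrm{SE}_{\text{pooled}} \approx \frac{\mathrm{SE}_{\text{single}}}{\sqrt{k}},$$

so a meta-analysis drawing on 25 comparable studies is about five times more precise than any single one, which is why quietly shelved "dark" results still carry real statistical value.)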

So why doesn't it happen? In part, it's a logistics problem: Advocating the release of dark data is one thing, but it's quite another to actually collect it, juggling different formats and standards. And, of course, there's the issue of storage. These days, an astronomical study of quasars or an ambitious bioinformatics project can generate several terabytes of data. Few have the capacity to store that, let alone analyze it.

Google, among others, is lending a hand with its Palimpsest project, offering to store and share monster-size data sets (making the data searchable isn't a part of the effort). As storage costs drop, similar data banks will emerge, along with format standards, and it should become ever easier to share results, good or bad.

Technology is actually the simple part. The tougher problem lies in the culture of science. More and more, research is funded by commercial entities, which deem any results proprietary. And even among fair-minded academics, the pressures of time, tender, and tenure can make openness an afterthought. If their research is successful, many academics guard their data like Gollum, wringing all the publication opportunities they can out of it over years. If the research doesn't pan out, there's a strong incentive to move on, ASAP, and a disincentive to linger in eddies that may not advance one's job prospects.
There are some islands of innovation. Since 2002, the Journal of Negative Results in Biomedicine has offered a peer-reviewed home to results that go negative or against the grain. Earlier this year, the journal Nature started Nature Precedings, a Web-based forum for prepublication research and unpublished manuscripts in biomedicine, chemistry, and the earth sciences. At Drexel University, chemist Jean-Claude Bradley practices "open notebook" science — chronicling his lab's work and sharing data via blog and wiki. And PLoS is planning an open repository for research and data that is otherwise abandoned.

These are great first steps. But freeing dark data should be the norm, not the exception. Once the storage and format problems are solved, scientists will need easy ways to search and retrieve each other's data. (And there should be a simple way, perhaps via metadata, to make sure that work is always duly cited and acknowledged by others.) Congress should mandate that all federally funded research be disseminated, whatever the results.

Getting science comfortable with exposing its dark data is really just the beginning. Once you start looking for it, dark data is everywhere: It's locked away in out-of-print books and orphaned art, the stuff that Creative Commons and Google Book Search have been bringing to light. Speaking of which: Hey, Google! Know all those research projects your employees do that the company will never green-light? How about letting the rest of the world take a crack at them?

Deputy editor Thomas Goetz (thomas@wired.com) wrote about DNA diagnostics in issue 15.08.

The Thirteen Mindfulness Trainings


The Thirteen Mindfulness Trainings form the moral guidelines to develop harmony in any simple community. One of the essential elements of the Mindfulness Trainings is that they are directly applied in our daily lives. Every moment of our lives gives us the chance to put them into practice.
The idea is to recite the Mindfulness Trainings regularly so that we can review our behaviour and observe where we have not lived up to our aspirations. It is important that we observe rather than judge ourselves, so that gradually, without resistance, our lives become imbued with the qualities they represent.

Understanding (prajna), concentration (samadhi) and Mindfulness Trainings or ethics (sila) are the threefold trainings that the Buddha passed on to his lay students. The practice of each of these trainings is equally important.

The encounter between Eastern philosophies and the West is bringing us something very exciting, very important. When Buddhism enters a country, that country always acquires a new form of Buddhism. The resulting form must be suitable and appropriate to the psychology and the culture of the society that it serves.

The Thirteen Mindfulness Trainings come from different sources. The last five come directly from the Five Precepts followed by all the Tibetan Buddhist schools (sila). The first eight come from the Tiep Hien Order, founded in Vietnam during the war, which grew out of the Zen school of Lin Chi.

Tiep and Hien are Vietnamese words of Chinese origin. I would like to explain the meaning of these words, because understanding them helps in understanding the spirit of the Trainings.

Tiep means “to be in touch.” First of all, to be in touch with oneself in order to find out the source of wisdom, understanding, and compassion in each of us. Being in touch with oneself is the meaning of meditation, to be aware of what is going on in your body, in your feelings, in your mind. That is the first meaning of Tiep.

Tiep also means to be in touch with Buddhas and Bodhisattvas, the enlightened people in whom full understanding and compassion are tangible and effective. Being in touch with oneself means being in touch with this source of wisdom and compassion.

Hien means “the present time.” We have to be in the present time, because only the present is real, only in the present moment can we be alive. We do not practice for the sake of the future, to be reborn in a paradise, but to be peace, to be compassion, to be joy right now.

The First Mindfulness Training: Openness

Aware of the suffering created by fanaticism and intolerance, I am determined not to be idolatrous about or bound to any doctrine, theory or ideology, even Buddhist ones. Buddhist teachings are guiding means to help me learn to look deeply and to develop my understanding and compassion. They are not doctrines to fight, kill or die for.

The Second Mindfulness Training: Non-attachment to Views

Aware of the suffering created by attachment to views and wrong perceptions, I am determined to avoid being narrow-minded and bound to present views. I will learn and practise non-attachment from views in order to be open to others’ insights and experiences. I am aware that the knowledge I presently possess is always changing and not an absolute truth. Truth is found in life and I will observe life within and around me in every moment, ready to learn throughout my life.

The Third Mindfulness Training: Freedom of Thought

Aware of the suffering brought about when I impose my views on others, I am committed not to force others, even my children, by any means whatsoever, to adopt my views. I will respect the right of others to be different and to choose what to believe and how to decide.

The Fourth Mindfulness Training: Awareness of Suffering

Aware that looking deeply at the nature of suffering can help me develop compassion and find ways out of suffering, I am determined not to avoid or close my eyes before suffering. I am determined to be with those who suffer, so I can understand their situation deeply and help them transform their suffering into compassion, peace and joy.

The Fifth Mindfulness Training: Living Simply

Aware that true happiness is rooted in peace, solidarity, freedom and compassion, and not in wealth or fame, I am determined not to take as the aim of my life fame, profit, wealth, or sensual pleasure. I am committed to living simply and sharing my time, energy and material resources with those around me in need, including my blood and spiritual family, my friends, my ancestors and my descendants.

The Sixth Mindfulness Training: Dealing with Anger

Aware that anger blocks communication and creates suffering, I am determined to take care of the energy of anger when it arises and to recognize and transform the seeds of anger that lie deep in my consciousness. When anger comes up, I am determined not to do or say anything, but to find the best way to ease my anger. I will learn to look with the eyes of compassion on those I think are the cause of my anger.

The Seventh Mindfulness Training: Dwelling Happily in the Present Moment

Aware that life is available only in the present moment and that it is possible to live happily in the here and now, I am committed to training myself to live deeply each moment of daily life. I will try not to lose myself in dispersion or be carried away by regrets about the past, worries about the future, or cravings, anger or jealousy in the present. I will practise mindful breathing to come back to what is happening in the present moment. I am determined to learn the art of mindful living by touching the wondrous, refreshing and healing elements that are inside and around me, and by nourishing seeds of joy, peace, love and understanding in myself, thus facilitating the work of transformation and healing in my consciousness.

The Eighth Mindfulness Training: Communication and Reconciliation

Aware that lack of communication always brings separation and suffering, I will make every effort to keep communications open and to reconcile and resolve all conflicts, however small. Aware of the suffering caused by unmindful speech and the inability to listen to others, I am committed to cultivating loving speech and deep listening in order to bring joy and happiness to others and relieve others from their suffering.

The Ninth Mindfulness Training: Truthful Speech

Aware that words can create suffering or happiness, I am committed to speaking truthfully and constructively, using only words that inspire hope and confidence. I am determined not to say untruthful things for the sake of personal interest or to impress people, nor to utter words that might cause division or hatred. I will not spread news that I do not know to be certain, nor criticize or condemn things of which I am not sure.

It is important to remember the four faults of speech:
  • lying.
  • pronouncing offensive words (hurtful speech).
  • uttering words that can cause division between two or more people by talking negatively of some of them, spreading rumors, or sowing doubts. We have to be careful not to increase discord between people but to make all efforts to reconcile and resolve all conflicts, however small.
  • indulging in idle chatter. We must abstain from any kind of unmindful speech: spreading news we do not know to be certain, or criticizing or condemning people.
Our speech must embody mindfulness and we must be completely aware and responsible while pronouncing words.
Try making your words wiser than the silence that has been broken.
The Tenth Mindfulness Training: Reverence for Life

I am committed to abstaining from doing harm to any sentient being. This commitment implies the determination to respect the life of all beings, not only human beings but also all animals and plants, regardless of their size, and an increasing awareness of the sanctity of life. Moreover, it goes beyond respect for the sacredness of life: it is a commitment to neither harm nor offend anyone under any circumstances and to promote peace and peace education.

The Eleventh Mindfulness Training: Respect and Generosity

I am committed to abstaining from taking what has not been given to me. This commitment implies respect for other beings’ property. It not only means not stealing; it encourages us to cultivate the attitude of not demanding or claiming what we wish, and instead to learn ways to work for the benefit of people, animals and plants, practicing generosity, sharing our time, energy and resources with those in need, and patiently waiting to receive the results of our good actions.

The Twelfth Mindfulness Training: Correct use of our Senses

I am committed to abstaining from the wrong use of my senses. This commitment implies contemplating our body as a temple that we take care of and cultivate for spiritual growth. What is advised here is to abstain from the disorderly use of the body, sexuality and bodily sensations, avoiding falling into gluttony, lust, incest and all kinds of disorders produced by inadequate sensory behavior.

The Thirteenth Mindfulness Training: Mindful Consumption

Aware of the suffering caused by unmindful consumption, I am committed to abstaining from consuming items that cloud or blur the mind. This implies not only abstaining from alcohol and intoxicating drugs but also taking care of the quality of the information we consume through all our senses: what we are reading, what we are seeing, whose company we keep. We are aware that everything we consume feeds our mind and the collective mind of our family and society.

Wednesday, 12 October 2011

Entering the anthropocene: ‘Geonauts’ or sorcerer’s apprentices?

Ignacy Sachs
Centre de recherche sur le Brésil Contemporain, Ecole des Hautes Etudes en Sciences Sociales, Paris
Abstract
The Second Earth Summit to be held in Rio de Janeiro in 2012 will coincide with the ratification by the International Commission on Stratigraphy of the concept of a new geological era, the anthropocene. This term emphasizes the acknowledgement of the increasing impact of human intervention on the future of the Spaceship Earth. Humanity is thus at a crossroads and we need, more than ever, to abide by the principle of responsibility. We must mobilize ourselves to learn how to speedily mitigate deleterious climate change without losing sight of the urgent need to reduce the abyssal social disparities. The immediate imperative is to propose long-term development strategies to go hand in hand with an aggiornamento of long-term democratic planning. Such strategies must rely on two pillars: food security and energy security. Last but not least, the United Nations ought to take advantage of the forthcoming Earth Summit to set in motion a global transition towards a socially inclusionary and environmentally sustainable path. 

Keywords
aggiornamento of democratic planning, anthropocene, climate change, Earth Summit Rio 2012, evergreen revolution, food and energy security, green economy and social inclusion, principle of responsibility


According to Paul J Crutzen and Eugene F Stoermer, a new geological era – the anthropocene – started with the industrial revolution. This term has been chosen to emphasize the astounding expansion of mankind, both in numbers and per capita exploitation of the earth’s resources: the tenfold increase of human passengers on the Spaceship Earth during the past three centuries to 6000 million, accompanied by a growth in cattle population to 1400 million, a tenfold growth of urbanization in the past century and the near exhaustion of the fossil fuels that were generated over several hundred million years (see Crutzen & Stoermer, 2000).1 In addition, we could mention that between 1800 and 2010, the output of the world-economy increased by a factor of almost 50, yet about one billion people still suffer from food insecurity (see Diniz Alves, 2011).

The entry into the anthropocene should be seen as an unprecedented disruption in the long history of the co-evolution between our species and the biosphere insofar as ‘it is at once the golden age marked by great discoveries, scientific progress, democracy, the lengthening of human life and the era of blindness; we had not seen it coming, we were and would be for ever the most powerful’ (Lorius & Carpentier, 2010: 13). The least we can say is that we ought to abide by the principle of responsibility as formulated by Hans Jonas (1984).

We might remind ourselves that the first disruption occurred some twelve thousand years ago and has been known as the Neolithic revolution, marked by the domestication of various vegetable and animal species, the sedentarization of human settlements and the very beginnings of urbanization.2 The second, recognized ex post as the starting-point of the anthropocene, was triggered by the fantastic changes brought about by the industrial revolution in terms of demographic growth, scientific and technical progress, for good and evil, two world wars and major upheavals in the geopolitical setting, from the colonial age to the emancipation of the Third World, to which we should add the rise and fall of the Soviet Union.

As a matter of fact, we have been living up to now in the anthropocene without acknowledging it, like the main character of Molière’s Bourgeois Gentilhomme who did not know that he was speaking prose. Most likely, the ratification of the new term by the International Commission on Stratigraphy will broadly coincide with the second Rio de Janeiro Earth Summit, scheduled to meet in the middle of 2012.

Changing the time-scale, we should point to the acceleration of history since the end of the Second World War with the following major events briefly enumerated here: the peaceful independence of India in 1947 (followed by the outbreak of hostilities between India and Pakistan), the victory of the communists in China in 1949, the Bandung Conference of Afro-Asian nations in 1955 and the recognition of the five principles of peaceful coexistence (Panchsheel, proposed by India and the People’s Republic of China), the decolonization of Africa in 1960, the invasion of Czechoslovakia in 1968, followed by the fall of the Berlin wall in 1989 and the implosion of the Soviet Union, and, finally, the recent awareness of an impending ecological catastrophe, unless we manage in the next few decades to reduce drastically our planet’s greenhouse gas emissions.

Humanity is thus at a crossroads. Will we continue to behave like sorcerers’ apprentices moved by greed and locked into ‘short-termism’ (as rightly stressed by Deepak Nayyar)? Or shall we speedily mobilize ourselves to learn the new function of ‘geonauts’, in Erik Orsenna’s words, co-pilots of Spaceship Earth, capable of mitigating the deleterious climate change brought about by excessive greenhouse gas emissions without losing sight of the social imperative – the urgent need to reduce the abyssal disparities between the affluent minority and those, much more numerous, who continue to go to bed hungry in spite of the progress achieved by the world-economy?

A caveat should be introduced here. The adaptive capacity is not equally distributed among the human passengers of the Spaceship Earth. One can assume that the Dutch could, were it necessary, strengthen their dykes to protect themselves from the rising sea-levels. However, the same cannot be said for the inhabitants of the Maldives and of Bangladesh, unless the latter can count on the solidarity of richer nations, by no means to be taken for granted in the present international setup.

On a more philosophical level, we shall never be full ‘masters of nature’, as thought by Descartes. But we can still hope to contribute to the ascent of man by acting as Pascal’s ‘thinking reeds’ to reduce the greenhouse gas emissions and thus adapt ourselves to the still plentiful potentialities of various biomes in order to meet the basic necessities of life for the whole population of Spaceship Earth: more than six billion today, at least nine billion by the middle of the century when demographic growth is likely to come to a standstill.

Until now, there has been no reason to listen to Cassandras who claim that our planet – Gaia – will destroy us unless we learn to preserve it and reduce the world population to half a billion equipped with nuclear energy – surprisingly deemed to be the safest (see Lovelock, 2008). Nor should we indulge in unrestricted epistemological optimism as illustrated in a recent book edited by Sylvie Brunel and Jean-Robert Pitte (2010). The anthropocene era requires an urgent dialogue between scientists and citizens in order to overcome the narrow confines of technoscience, which has no legitimacy to define its own research programmes (Testard, Sinaï & Bourgain, 2010).

Our long-term future should be thought of as the unfolding of a civilization of being in the equitable sharing of having (JL Lebret).3 Yet, as shown in the path-breaking study conducted by the Bariloche Foundation in Argentina (Herrera et al., 1977) as a response to the Club of Rome’s Limits to Growth (Meadows et al., 1972), eliminating appalling social disparities and lifting everybody above the threshold of a decent material life is a precondition for moving towards this higher stage of our history, in which an ever greater share of societal time will be devoted to cultural activities in the broadest meaning of this term, and Huizinga’s (1955) homo ludens will gain the upper hand over homo faber.

Our immediate task is to propose long-term development strategies, environmentally sound and socially inclusionary,4 at the antipodes of the course defined by the unconstrained play of market forces. Left to themselves, markets are short-sighted and socially insensitive, as the present crisis has yet again shown. At the same time, we should reject, at least for the next few decades, the proposition to halt material growth and even start a process of ‘degrowth’, as suggested by Serge Latouche (2006).

It follows that we must give utmost priority to an aggiornamento of long-term democratic planning as the main instrument of governance within each nation and at the global level.

Towards a new planning paradigm for a green and inclusionary economy

As a starting-point, we could use the following quite comprehensive quote:

The green and inclusionary economy is a new form of organization of productive activities, enabling an improvement in the well-being of humanity and a reduction in social inequalities, while avoiding to expose the biosphere and the future generations to significant environmental risks and ecological scarcity. It deals with the reconfiguration processes of economic activities and infrastructure, in order to bring better returns to natural, human and economic capital investment, at the same time as it reduces the greenhouse gas emissions, uses less natural resources, produces less residues and allows for waste recycling, the generalization of sanitation and the reuse of raw materials and manufactured products. It is an economy that achieves more with less and uses a smaller quantity of material goods and a greater quantity of immaterial and intangible goods and services. The green economy implies forest reconstitution, biodiversity protection, promotion of sustainable agriculture, aquiculture and water resources, as well as urban planning and the nurturing of sustainable transportation and housing. It is an economy which fosters and articulates the society of knowledge with sustainable development, creation of green jobs and decrease in polluting activities, generating the growth of new income opportunities, less consumerism and greater social inclusion. (Diniz Alves, 2011)

This is a tall order indeed. The UNEP (2011) has just released the first comprehensive study of the green economy, rightly aimed at reconciling the twin development goals of environmental prudence and social justice. Fortunately, we are not starting from scratch even though future historians of our time will have some difficulty in explaining the ups and downs of planning over the last hundred years.

Born as the offspring of a war economy, central planning was adopted as the main tool of governance by the Soviet Union, at a moment when the only instrument available to planners in this huge country was the abacus. It spread to many other countries, both socialist and capitalist, after the Second World War. Even the US government went so far as to advise Latin American countries in the early 1960s to produce development plans as part of the Alliance for Progress launched by President Kennedy to counter the influence of the Cuban revolution.

Paradoxically, planning lost its appeal at the very moment when it could count on powerful new tools associated with the computer revolution. Part of the opprobrium attached nowadays to planning comes from the misdeeds of authoritarian regimes, which used planning as a cover for arbitrary actions. However, the ultimate explanation is to be sought in the implosion of the Soviet Union and the neoliberal counter-reform fostered by the so-called Washington consensus.

The tide is turning once more as the present crisis has exposed as bogus the alleged capacity of markets to regulate themselves.

We owe to M Kalecki, author of a remarkable and pioneering methodology of long-term planning in the Polish economy (Kalecki, 1993; see also Feiwel, 1975: 414–433), the shortest definition of planning: ‘planning is variant thinking’.

As a matter of fact, the simple extrapolation of the past trend seldom corresponds to the best possible use of an economy’s potential for the satisfaction of the population’s basic and less basic needs. Two pitfalls must be avoided: on the one hand, we may encounter bottlenecks which prevent further growth, unless they are taken care of; on the other, it would be a pity not to make full use of the growth potential of the economy, postponing in this way the satisfaction of the population’s urgent needs.

At the same time, we must recognize the ethical and political dimension of the trade-off between, on the one hand, more investment, leading to quicker growth and paid for by lower consumption in the short run, and, on the other, greater consumption in the short run, compensated for by a slower long-term growth rate. No algorithm exists to find an optimum. Hence the importance of a democratic setting in which these trade-offs are examined and debated before a decision is taken. Two remarks are in order here.

Quite clearly, the absence or the weakness of this debate was the Achilles’ heel of planning in former socialist countries. Furthermore, political leaders were always pressing for the highest possible growth rate, as if the competition between socialism and capitalism were to be solved in this way.5

Another limitation of the planning experiences in the post-war period stemmed from the non-inclusion of the environmental dimension. Planning methodologies should incorporate such concepts as the ecological footprint and biocapacity, leading to the distinction between deficit and surplus countries in terms of their biocapacity. International action should be geared to assist the countries with a low footprint to make better use of their biocapacity, while countries with a high footprint should be called to order.6
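A minimal formalization of the deficit/surplus distinction, added here as an illustrative sketch rather than the author's own notation: for a country with per-capita ecological footprint $EF$ and per-capita biocapacity $BC$, both measured in global hectares,

$$D = EF - BC,$$

where $D > 0$ marks an ecological-deficit country and $D < 0$ an ecological-surplus country. On commonly cited rough figures of $EF \approx 2.7$ and $BC \approx 1.8$ global hectares per person for the world as a whole, humanity is already running a deficit of about $0.9$ gha per person, which is the aggregate overshoot such planning methodologies would have to allocate and reduce.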

The best way of advancing in this direction would consist in deciding at the 2012 Rio de Janeiro Second Earth Summit that all UN member countries ought to produce, say in a 2-year time span, comprehensive national development plans, facing the double challenge of climate change and of the urgent need to overcome poverty and social inequality.

Such plans, meant to be socially inclusionary and environmentally sound, should be built on two pillars: food and energy security.7

Food and energy security
By food security we mean an adequate and regular supply of calories and proteins to all members of the workforce and their families, made accessible through markets and self-produced by consumers, both in rural and urban settings, or else distributed by the State and charities, so as to ensure that the workforce is in good enough condition to fulfil its productive functions. According to MS Swaminathan (2004), food security has three major dimensions:

  • availability of food – a function of production;
  • access to food – a function of purchasing power/access to sustainable livelihoods; and
  • absorption of food in the body – determined by access to safe drinking water and non-food factors such as environmental hygiene, primary health care and primary education.
Improving food security calls for further progress in the green and blue revolutions, without forgetting that land reforms, a not so popular theme nowadays, may be in many countries a precondition to moving in these directions. 
MS Swaminathan (2004) coined the term ‘evergreen revolution’ to highlight a pathway ‘where advances in crop and farm animal productivity are not accompanied by either ecological or social harm’, and the small producers are the main beneficiaries. According to him, the growing paradox between grain mountains and hungry millions can be overcome by food-for-ecodevelopment initiatives managed at the local level by community food banks (CFBs) operated by women’s self-help groups. Such CFBs would be instrumental in addressing chronic, hidden and transient hunger with low transaction costs and transparency. Where animal husbandry is important, the CFBs could also operate food and fodder banks.
Other initiatives might include growing food within cities, exploring the potential of small, yet highly productive super-vegetable gardens, and using biochar (charcoal) as a soil enhancer, following the example of the indigenous populations of the Amazon region, whose practice resulted in the creation of the so-called terras pretas, known for their fertility (see Bakewell-Stone, 2010). With biochar we find ourselves on the threshold of a third wave in the green revolution, allowing millions of urban and peri-urban dwellers to improve their daily food consumption8 and calling for a reconsideration of the rural–urban divide.
Side by side with the advances of the evergreen revolution, we should explore the potential of the blue revolution, with its two main components:

• Shifting from fishing to fish breeding, both along the seashores and in the continental waters – rivers, lakes and manmade reservoirs often associated with the building of dams to produce hydroelectricity. There are reasons to believe that the hitherto untapped potential for fish breeding, especially of vegetarian species,9 is still considerable. The future may belong more to the expansion of pisciculture than that of cattle breeding on account of the environmental harm caused by cattle’s methane exhalations and the felling of forests to expand grazing lands.

This expansion should go side by side with increasing the number of cattle per hectare and converting the degraded extensive pastures thus released into agricultural land. An important research programme conducted at the Instituto Socioambiental de São Paulo (ISA) by a team coordinated by Gerd Sparovek showed, on the basis of the census conducted in 2006, that pastures currently account for 158 million hectares, i.e. one-fifth of Brazilian territory, the equivalent of almost three Frances (a rough arithmetic check of these figures appears after this list). Around 20 percent of these pastures occupy land with a reasonable aptitude for agriculture (see Notícias da Amazônia, 2011).

• Growing microalgae and algae for bioenergy production, a promising technological frontier likely to be operational within the near future.10 Insofar as the blue revolution transfers the production of animal proteins and of bioenergy from limited agricultural land to yet unexplored sea expanses, it ought to play a major role in long-term development strategies which aspire to improve the living standards of a growing human population, which, as already mentioned, will reach nine billion in the middle of the century before stabilizing.
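As a rough arithmetic check of the pasture figures cited above (the land areas are approximate and are not given in the text itself: roughly 851 million hectares for Brazil and about 55 million hectares for metropolitan France):

$$\frac{158\ \text{Mha}}{851\ \text{Mha}} \approx 0.19 \approx \tfrac{1}{5}, \qquad \frac{158\ \text{Mha}}{55\ \text{Mha}} \approx 2.9,$$

so pastures do cover about one-fifth of Brazil and close to three times the area of France, and the roughly 20 percent of them with agricultural aptitude would correspond to some 30 million hectares of potential cropland.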

By 2050, shall we be able to fill the plates of nine billion men, women and children every day? John Parker, in a recent well-documented special report on feeding the world (The Economist, 2011), tells us that, though not easy, it should be perfectly possible to feed nine billion people by 2050. It will require boosting yields and reducing harvest losses in Africa, where production averages one ton of grain per hectare, as compared with the four to five tons per hectare achieved through the green revolution and the ten tons per hectare obtained at the Rothamsted farm on the outskirts of London, a leading British research outfit. The report ends on an optimistic note:

There are plenty of reasons to worry about food: uncertain politics, volatile prices, hunger amid plenty. Yet, when all is said and done, the world is at the start of a new agricultural revolution that could, for the first time ever, feed all mankind adequately. The genomes of most major crops have been sequenced and the benefits of that are starting to appear. Countries from Brazil to Vietnam have shown that, given the right technology, sensible policies and a bit of luck, they can transform themselves from basket cases to bread baskets. (The Economist, 2011: 18)

An even greater optimism permeates the UNEP report (2011) entitled Towards a Green Economy – Pathways to Sustainable Development and Poverty Eradication. Its authors claim that ‘Moving towards a green economy has the potential to achieve sustainable development and eradicate poverty on an unprecedented scale, with speed and effectiveness’ (2011: 622). Green economy is defined as low carbon, resource efficient and socially inclusionary. The authors of this document go so far as to say that the so-called trade-off between economic progress and environmental sustainability is a myth and that a green economy delivers more jobs in the short, medium and long term than business as usual. The least one can say is that this assertion is yet to be demonstrated. The fad for green has equally spread to other international agencies, such as the OECD (see OECD, 2010).11

Energy security refers to the adequate supply of stationary energy and fuels, allowing for increases in labour productivity as well as for the transportation of goods and people.

The main problem here is to phase out the production and consumption of fossil fuels responsible for emitting greenhouse gases in the atmosphere and thus causing global warming which will be instrumental in bringing about, if unchecked, deleterious climate changes; our future as a species may be endangered unless we manage in the next few decades to radically reduce these emissions.

Hence the importance of the search for new energy paradigms responding to the three criteria of greater sobriety, greater efficiency and, whenever possible, substitution of fossil fuels by renewables: wind, solar and biomass, the latter subject to the need for respecting the postulate of food security.12 How fast and how far can we move in these directions? According to the bold and ambitious (over-ambitious?) scenario prepared by the WWF (2011), humanity might shift one hundred percent to renewable energy by 2050, while phasing out nuclear energy deemed costly and too dangerous.

Whither the United Nations?

A final word should go to the United Nations. This, the main international organization, also requires an aggiornamento. Besides the long-overdue reform of the Security Council, the United Nations ought at last to move in the direction of transferring one percent of the world’s GNP from richer countries to those whose GNP per head is well below the world average. It might in addition consider establishing tolls on oceans, as well as an international tax on carbon, so as to substantially increase the funds available for assisting the least-developed countries in their development.

Another urgent task for the UN is to promote meaningful scientific and technical cooperation between countries sharing similar biomes.

Last but not least, the UN should take advantage of the forthcoming Second Earth Summit in 2012 to set in motion the process of defining national long-term plans with a view to their harmonization and coordination, in order to smooth the world’s transition towards a socially inclusive and environmentally sustainable path.

Author biography
Ignacy Sachs, eco-socioeconomist, is Honorary Professor at the École des Hautes Études en Sciences Sociales in Paris, associate researcher at the Institute of Advanced Studies, São Paulo University, Brazil and currently a consultant for the Brazilian Ministry of Agrarian Development. He is the author of several books, among which: Transition Strategies towards the 21st Century (foreword by Maurice F. Strong, Delhi: Interest Publications for Research and Information System for the Non-Aligned and Other Developing Countries, 1993, with translations into French, Italian, Portuguese, Japanese and Polish), Understanding Development: People, Markets and the State in Mixed Economies (New Delhi: Oxford University Press, 2000), (as editor with Jorge Wilheim & Paulo Sergio) Brasil: um século de transformações (São Paulo: Companhia das Letras, 2001), Desenvolvimento includente, sustentável, sustentado, with a preface by Celso Furtado (Rio de Janeiro: Garamond, 2004), Rumo à ecossocioeconomia: Teoria e prática do desenvolvimento, ed. Paulo Freire Vieira (São Paulo: Cortez Editora, 2007), La troisième rive (Paris: Bourin Editeur, 2007; also published in Brazil under the title A terceira margem, São Paulo: Companhia das Letras, 2009) and, as co-author with Ladislau Dowbor and Carlos Lopes, Crises e oportunidades em tempos de mudança (Imperatriz, MA: Ética Editora, 2010).

 Notes

Prepared for the special issue of Social Science Information on the occasion of its fiftieth anniversary.

1. See Crutzen & Stoermer (2000); also Lorius & Carpentier (2010: 126): ‘[L]’anthropocène, cette drôle de petite fenêtre dans l’histoire de la Terre, où l’homme a découvert les énergies fossiles, les a exploitées, consommées, brûlées, et entièrement épuisées, détruisant son atmosphère, ses océans, ses sols, et massacrant le vivant.’

2. See the pioneering book by Gordon Childe, What Happened in History, 1942.

3. Civilisation de l’être dans le partage équitable de l’avoir (‘La civilisation de l’être dans le partage équitable de l’avoir’, cited by P Blancher: ‘Quel développement? Humain parce que durable’; available at: www.economie-humanisme.org/Revue360…). Who will put it better in so few words?

4. The adjective inclusionary (rather than inclusive) has been used by AK Sen.

5. To temper the enthusiasm of growth maniacs, Kalecki had the following joke: the highest growth rate in the short run will be achieved by investing the whole GNP, thus starving to death the whole population.

6. The reader may consult the Global Footprint Network’s website: http://www.footprintnetwork.org/en/index.php/GFN/; see also Wackernagel & Rees (1999); Boutaud & Gondran (2009).

7. Furthermore, they should take advantage of recent discussions on economic, social and environmental indicators; see in this respect Méda (2008); Stiglitz, Sen & Fitoussi (2009); and also a critical appraisal of the same by another group of scholars, FAIR (Forum pour d’Autres Indicateurs de Richesse) (2011: in particular pp. 41–42).

8. According to data provided by the NGO Pro-natura International (http://www.pronatura.org/), a biochar-enriched Super Vegetable Garden of less than 60 m2 may provide a balanced diet for a family of 10 with 80% less water consumption.

9. To avoid ‘fish cannibalism’ among carnivore species.

10. According to Bill Gibbons, from the South Dakota State University, the new generation of ethanol produced from blue algae (cyanobacteria) is around the corner, just 4 or 5 years away (Gibbons, 2011).

11. See OECD (2010). The OECD’s work on green growth will form a major part of its contribution to Rio+20 along with the forthcoming Environmental Outlook to 2050.

12. See on this point Dessus & Gassin (2004) and the negawatt scenario, available at: http://www.negawatt.org/V4%20scenario%20nW/scenario.htm.

References

Bakewell-Stone P (2010) Introduction to Biochar in Tropical Agriculture, July 2010; available at: www.pronatura.org.

Boutaud A, Gondran N (2009) L’empreinte écologique. Paris: La Découverte.

Brunel S, Pitte J-R (eds) (2010) Le ciel ne va pas nous tomber sur la tête. Paris: JC Lattès.

Childe G (1942) What Happened in History. London: Penguin Books.

Crutzen PJ, Stoermer EF (2000) The ‘Anthropocene’. Global Change Newsletter – The International Geosphere–Biosphere Programme (IGBP): A Study of Global Change of the International Council for Science (ICSU) 41: May.

Dessus B, Gassin E (2004) So watt? L’énergie, une affaire de citoyens. La Tour d’Aigues: Editions de l’Aube.

Diniz Alves JE (2011) Economia verde, limpa e inclusiva: novo paradigma de sustentabilidade. Mercado Etico/Ecodebate, 10/02/11; available at: http://mercadoetico.terra.com.br/arquivo/economia-verde-limpa-e-inclusiva-novo-paradigma-de-sustentabilidade/ (last access: 22/02/11).

The Economist (2011) The 9 Billion-People Question – A Special Report on Feeding the World. 26 February: 18.

FAIR (Forum pour d’Autres Indicateurs de Richesse) (2011) La richesse autrement – Alternatives Economiques. (Hors-série poche no. 48, Mars).

Feiwel GR (1975) How to plan and how not to plan. In: Feiwel GR, The Intellectual Capital of Michał Kalecki – A Study in Economic Theory and Policy. Knoxville: The University of Tennessee Press, 414–433.

Gibbons B (2011) Ethanol’s Evolution – Promising Research Goes Beyond Corn. Available at: http://www.sdstate.edu/news/featurestories/ethanols-evolution.cfm, last access: 28/02/11.

Herrera A, et al. (1977) Un monde pour tous: Le modèle mondial Latino-Américain. Paris: Presses Universitaires de France.

Huizinga J (1955[1938]) Homo Ludens, a Study of the Play Element in Culture. Boston: Beacon Press.

Jonas H (1984) The Imperative of Responsibility: In Search of Ethics for the Technological Age. Chicago, IL: University of Chicago Press. (Transl. of Das Prinzip Verantwortung, 1979, by Jonas H, Herr D).

Kalecki M (1993) Collected Works of Michał Kalecki, Volume IV – Socialism: Economic Growth and Efficiency of Investment, ed. Osiatynski J; transl. Jung B. Oxford: Clarendon Press.

Latouche S (2006) Le pari de la décroissance. Paris: Fayard.

Lorius C, Carpentier L (2010) Voyage dans l’anthropocène – Cette nouvelle ère dont nous sommes les héros. Arles: Actes Sud.

Lovelock J (2008) La revanche de Gaïa – Préserver la planète avant qu’elle ne nous détruise. Paris: Editions J’ai Lu.

Meadows D, et al. (1972) The Limits to Growth. New York: Universe Books.

Méda D (2008) Au-delà du PIB, pour une autre mesure de la richesse. Paris: Flammarion.

Notícias da Amazônia (2011) Há Terras disponíveis para ampliar a produção sem aumentar o desmatamento. Available at: www.amazonia.org.br, 02/03/11.

OECD (2010) Interim Report of the Green Growth Strategy: Implementing our Commitment for a Sustainable Future. Meeting of the OECD Council at Ministerial Level, 27–28 May 2010.

Stiglitz J, Sen A, Fitoussi J-P (2009) Performances économiques et progrès social – Rapport de la Commission sur les performances économiques et du progrès social pour le président de la République. Paris: Odile Jacob.

Swaminathan MS (2004) Evergreen revolution and sustainable food security. Paper delivered at the Conference on Agricultural Biotechnology: Finding Common International Goals, National Agricultural Biotechnology Council, 2004. (nabc.cals.cornell.edu/pubs/nabc_16/talks/Swaminathan.pdf).

Testard J, Sinaï A, Bourgain C (eds) (2010) Labo planète ou comment 2030 se prépare sans les citoyens. Paris: Editions des Mille et une nuits.

UNEP (United Nations Environment Programme) (2011) Towards a Green Economy – Pathways to Sustainable Development and Poverty Eradication. Nairobi. Available at: www.unep.org/greeneconomy.

Wackernagel M, Rees W (1999) Notre empreinte écologique. Montréal: Ecosociété.

WWF (World Wildlife Fund) (2011) The Energy Report – 100% Renewable Energy by 2050. Gland.

Corresponding author:
Ignacy Sachs, Centre de Recherche sur le Brésil Contemporain, EHESS, 190 avenue de France, 75013 Paris, France
Email: ignacy.sachs@gmail.com; isachs@msh-paris.fr
Funding
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Mood of possibility defines E F Schumacher centenary festival

Audience and speakers excited that conditions may finally be right for the ideas of the green economist to become reality
German-born economist Ernst Friedrich Schumacher (1911-1977). Photograph: David Montgomery/Getty Images
"The most exciting time to be alive" is not a phrase that trips off the tongue of many politicians currently grappling with a global debt crisis and the threat of recession, but it was almost a mantra at the centenary festival for the economist and "soul of the green movement", E F Schumacher.
The great and the good of the movement, including activists, academics and even a few bankers, turned up at the weekend event in Bristol to pay homage to the author of Small is Beautiful, the landmark 1973 environmental text that questioned the drive for relentless GDP expansion.

With many economies now flat or in decline, the financial system in crisis and the climate increasingly erratic, the crowds that gathered in Colston Hall had come not just to celebrate the life of Schumacher but to bask in the possibility that conditions may finally be ripe for his ideas to be implemented.
"The current economic model is broken and no one is clear about how to fix it. I think that makes Schumacher's ideas more resonant," said Caroline Lucas, the leader of the Green party. "It's time to shift towards an economy that isn't based on an accumulation of stuff."

The timing of this festival of alternative thinking could not have been more apposite. The day before the opening, Mervyn King, governor of the Bank of England, announced £75bn of quantitative easing to tackle what he described as "the most serious financial crisis at least since the 1930s if not ever." Next year, world leaders will gather at a United Nations conference in Brazil to try to map out the path to a "green economy".

"This 100-year anniversary is an opportunity to expose the fallacy of the economic system. Schumacher is becoming more influential because of the crisis," said Satish Kumar, editor of Resurgence, in an opening address. He later told the Guardian: "Schumacher was the soul of the green movement. He realised the environment is not just an empirical, technical, policy matter; it is related to human values, which are a part of natural values."

Tim Jackson, a senior adviser on the Sustainable Development Commission under the last government, said this was the most exciting time to be alive because the potential for radical change had never been greater.

Jackson said the global financial system was now near the point of collapse due to the obsession with growth, which he described as a "fetish for enormous proportions". Schumacher, he said, prefigured the current anxiety about selfish, novelty-seeking consumerism that encouraged people to "spend money they don't have on things they don't need to create impressions that won't last on people they don't care about."

Schumacher was born in Germany and became a naturalised British citizen after catching the attention of John Maynard Keynes. He was heavily influenced by Leopold Kohr who coined the "small is beautiful" dictum, and Mahatma Gandhi, who underscored the importance of a spiritual dimension to economics. Schumacher called his approach "Buddhist economics", though joked it might just as easily have been "Christian economics, but that wouldn't have sold as well."

Though he met Jimmy Carter and other world leaders in the 1970s, his ideas went out of vogue during the Thatcher-Reagan years. But today, it is once again fashionable to quote Schumacher. David Cameron cites him as an inspiration for the "big society" and his promise to lead "the greenest government ever". There was short shrift for such claims among the society's true believers, one of whom noted that "politicians and bankers have managed to achieve zero growth only by mistake".
The mood of imminence and possibility was very different from that at the annual conventions of the main UK political parties, which were marked by poor turn-outs and lacklustre speeches. Schumacher Society organisers said this year's gathering drew more than twice as many people as in a usual year, filling the 800-seat venue.

The speakers - not linked by formal affiliation but by the shared influence of Schumacher - were not short of big ideas.

Lawyer Polly Higgins called for the United Nations to add "ecocide" to its list of "crimes against peace"; Rob Hopkins, the founder of Transition, described a localisation drive to prepare for peak oil; and green financier Peter Blom of Triodos Bank proposed a shake-up of business school teaching and greater "biomimicry" in the financial sector to strengthen a system that has come to resemble a fragile monoculture.

"We've seen some new things today: a green lawyer, a green politician and a green banker," said Diane Schumacher. "If there was any three groups of people that Fritz [Schumacher] suspected, it was them. He'd be delighted by the revolutionary stuff coming out of their mouths."

Some in the audience said Schumacher's heirs were too idealistic, too white, too middle-class. This has been a common refrain for 30 years, but there was also a feeling that, given the current crisis, even such proposals may be too timid.

Bill McKibben, the US climate activist, struck the most assertive note, with a video message explaining why he pulled out of the festival at the last minute so he could join the fight against the proposed Keystone XL pipeline that would take oil from Canada's tar sands down to the Gulf of Mexico.
McKibben was arrested earlier this year while challenging this pipeline, but he said it was necessary to continue with direct action against the fossil fuel industry because "the worst thing that ever happened to the world is now happening".

The tar sands, he said, contained enough oil to raise the amount of carbon dioxide in the planet's atmosphere from the current 393 parts per million (ppm) level to 530ppm, which would warm the world by about 1C.

"I look forward to standing shoulder to shoulder with you in the most important fight humans have ever fought," he said, in a sign-off that met with some of the loudest applause of the weekend.

Redefining the Meaning of No. 1

Opinion

HERE in America, we seem to be more interested in finishing first than we are in figuring out what race we ought to be in.

The refrain is insistent, from President Obama on down. He, like others in both parties, urges us on — to build or educate or invest or cut the deficit — so that “America can be No. 1 again.”
We want to be No. 1 — but why, and at what?

The size of our economy is one measure of success, but it’s not the only measure.

Isn’t the important question not how we remain No. 1 but rather, what we want to be best at — and even, whether we want to lead at all?

But we are Americans and we seem to think the rest of the world looks best when framed in our rear-view mirror.

We outstrip the world by many measures but lag, sometimes shockingly, in many others. The metrics by which we choose to measure our success determine our priorities. Yet some of the metrics we rate as most important, like G.D.P., stock indices or trade data, are so deeply flawed as to be irrelevant or, worse, dangerous distractions. And at the same time, countries that could hardly hope to outperform the world in any category are far ahead of us when it comes to things that matter more to people. Choosing metrics to measure our society is not a value-free process. As a country we have consistently relied on indicators that keep us focused on the interests of business, financial institutions or the defense industry, whereas equity, quality of life and even social mobility metrics are played down.

Calculating national income is a relatively new concept. Previously, countries measured their economic well-being by tallying land holdings or counting railroad boxcars. But in the midst of the Great Depression, Congress, showing a great deal more intellectual curiosity than it does today, commissioned a group of economists led by a future Nobel Prize winner named Simon Kuznets to better measure economic activity.

Although Kuznets and his team fulfilled their mission, they released their results with considerable unease. Not only were they aware that the statistic they devised ignored many types of economic activity — from the work of housewives to illegal enterprises — they also knew their number did not assess the social benefits of what they were tracking.

Kuznets warned of this: “The welfare of a nation can, therefore, scarcely be inferred from a measurement of national income” like the one they created. That hasn’t stopped us from making this misleading number perhaps the most influential statistic in the world.

Americans use G.D.P. in discussions about how well we are doing. It’s at the heart of discussions of whether we are in a recession or not, ahead or falling behind.

Yet, when China “passes” us, it will remain for the most part a very poor country racked with social problems. And as we have seen, though the past decade was marked mostly by United States “growth,” recent Census data shows that since 1999, median American incomes have fallen more than 7 percent while the top 1 percent showed gains. Almost one in four American children live in poverty. We have a high level of unemployment compared to many of our peers.

THE G.D.P. number is not the only culprit, of course. Listening to the news, you might be forgiven if you thought that stock market performance was linked to reality. But markets are oceans of teeming emotions that make the average hormone-infused high school look calmly rational, and much of the “data” that moves markets is just bunk. Trade deficit numbers may be scary but they are also frighteningly flawed, doing a terrible job of accounting for trade in services, trade via the Internet, and inter-company trade, to pick just three among many problem areas.

Worse than the shortcomings of these statistics are the consequences of our over-dependence on them as measures of the success of our society. A country, for example, that overemphasizes G.D.P. growth and market performance is likely to focus policies on the big drivers of those — corporations and financial institutions — even when, as during the recent past, there has been little correlation between the performance of big businesses or elites and that of most people.

Furthermore, of course, the purpose of a society is not merely the creation of wealth, especially if most of it goes to the few. Even John Locke, who famously enumerated our fundamental rights as being to life, liberty and property, qualified this by asserting that people should appropriate only what they could use, leaving “enough and as good” for others. Thomas Jefferson later consciously replaced the right to property with a right to “the pursuit of happiness.” And happiness has become the watchword for those seeking different measures that might better guide governments.

According to the economist Carol Graham, the author of a recent book called “The Pursuit of Happiness: An Economy of Well-Being,” “happiness is, in the end, a much more complicated concept than income. Yet it is also a laudable and much more ambitious policy objective.” While she notes distinctions between approaches to happiness — with some societies more focused on goals like contentment and others on the creation of equal opportunities — she joins a growing chorus of leading thinkers who suggest the time has come to rethink how we measure our performance and how we set our goals.

This diverse group has included thinkers and public figures like President Nicolas Sarkozy of France, who established a commission in 2008 to address the issue that was co-led by the Nobel Prize-winning economist Joseph E. Stiglitz; the Columbia economist Jeffrey D. Sachs; the British prime minister, David Cameron; and the trail-blazing people of Bhutan, who since 1972 have set a goal of raising their gross national happiness.

Dr. Graham admits that it’s a challenge to set criteria for measuring happiness. However, in a conversation, she told me she did not see it as an insurmountable one: “It doesn’t have to be perfect; after all, it took us decades to agree upon what to include in G.D.P. and it is still far from a perfect metric.”

But for Americans, beyond choosing the right goals, there remains the issue of being No. 1. Many of us have lived our lives in a country that has thought itself the world’s most powerful and successful. But with the United States economy in a frustrating stall as China rises, it seems that period is coming to an end. We are suffering a national identity crisis, and politicians are competing with one another to win favor by assuring a return to old familiar ways.

This approach, too, is problematic. We, as a developed nation, are unlikely to grow at the rapid pace of emerging powers (the United States is currently ranked 127th in real G.D.P. growth rate). Europe and Japan, too, are grappling with the realities of being maturing societies.

But maturing societies can offer many benefits to their citizens that are unavailable to most in the rapidly growing world — the products of rich educational and cultural resources, capable institutions, stability and prosperity.

AS a consequence, countries that at different times in history were among the world’s great powers, such as Sweden, the Netherlands, France, Britain and Germany, have gradually shifted their sights, either in the wake of defeat or after protracted periods of grappling with decline, from winning the great power sweepstakes to topping lists of nations offering the best quality of life.

When Newsweek ranked the “world’s best countries” based on measures of health, education and politics, the United States ranked 11th. In the 2011 Quality of Life Index by Nation Ranking, the United States was 31st. Similarly, in recent rankings of the world’s most livable cities, the Economist Intelligence Unit has the top American entry at No. 29, Mercer’s Quality of Living Survey has the first United States entry at No. 31 and Monocle magazine showed only 3 United States cities in the top 25.
On each of these lists, the top performers were heavily concentrated in Northern Europe, Australia and Canada with strong showings in East Asian countries from Japan to Singapore. It is no accident that there is a heavy overlap between the top performing countries and those that also outperform the United States in terms of educational performance — acknowledging, of course, the mistake it would be to overemphasize any one factor in contributing to something as complex as overall quality of life. Nearly all the world’s quality-of-life leaders are also countries that spend more on infrastructure than the United States does. In addition, almost all are more environmentally conscious and offer more comprehensive social safety nets and national health care to their citizens.

That virtually all of the top performers place a much greater emphasis on government’s role in ensuring social well-being is also undeniable. But the politics of such distinctions aside, the focus of those governments on social outcomes — on policies that enhance contentment and security as well as enriching both human capabilities and opportunities — may be seen as yet another sign of maturity.
It is also worth noting that providing the basics to ensure a high quality of life is not a formula for excess or the kind of economic calamities befalling parts of Europe today. For example, many of the countries that top quality-of-life lists, like Sweden, Luxembourg, Denmark, the Netherlands and Norway, all rank high in lists of fiscally responsible nations — well ahead of the United States, which ranks 28th on the Sovereign Fiscal Responsibility Index.

What these societies have in common is that rather than striving to be the biggest they instead aspire to be constantly better. Which, in the end, offers an important antidote to both the rhetoric of decline and mindless boosterism: the recognition that whether we are falling behind or achieving new heights is greatly determined both by what goals we set and how we measure our performance.

 David J. Rothkopf is the author of the forthcoming “Power, Inc.: The Epic Rivalry Between Big Business and Government — and the Reckoning that Lies Ahead.”

quarta-feira, 14 de setembro de 2011

Free Science, One Paper at a Time

Howard Eisen, 1942-1987

On Father’s Day three years ago, biologist Jonathan Eisen decided he’d like to republish all his father’s papers. His father, Howard Eisen, a biologist and a researcher at the National Institutes of Health, had published 40-some-odd papers by the time he died by suicide at age 45. That had been in February 1987, while Jonathan, a sophomore at college, was on the verge of discovering his own love of biology. At the time, virtually all scientific papers were just on paper. Now, of course, everything happens online, and Jonathan, who in addition to researching and teaching also serves as an editor for the open-access, online-only journal PLoS Biology, knows this well. So three years ago, Jonathan decided to reclaim his father’s papers from print limbo and make them freely available online. He wanted to make them part of the scientific record. He also wanted, he says, “to leave a more positive presence” — to ensure his father had a public legacy first and foremost as a scientist.
I researched and wrote this article last summer and fall (2010) under assignment from a magazine that accepted and paid for it but, in the way these things sometimes work, decided not to run it and gave me its blessing to publish it elsewhere. I’m publishing it here at Neuron Culture and also in an identical post at my website, daviddobbs.net. It is based on extensive reporting. I’d like to thank in particular Jonathan Eisen, for reasons that the article will make obvious; Cameron Neylon, Peter Murray-Rust, Richard Grant, Michael Nielsen, Martin Fenner, Leslie Carr, and Lord Rees, whose ideas are fundamental to the story and its subject; Mark Patterson and Brian Mossop of PLoS; Victor Henning, Jason Hoyt, Ian Mulvany, William Gunn, and Jan Reichelt, all of Mendeley; and Melody Dye, Kristi Holmes, John Timmer, and Sara Wood, who along with Reichelt participated in a session on open science I organized at ScienceOnline 2011. I also had several conversations off-record; you know who you are — thanks.
How hard could it be? Howard Eisen had been a federal employee, so his work rightly lay in some sense in the public domain. And Jonathan, as an heir, presumably owned copyright anyway, along with his brother Michael (also a biologist, and one of the founders of the Public Library of Science, the innovative journal group that publishes PLoS Biology). Yet to the brothers’ continuing chagrin, Jonathan has found securing and publishing his father’s papers to be far harder than he expected.

For instance, even though Jonathan has access to the enormous University of California library system, which subscribes to a particularly high number of journals, he often can’t even find some of his father’s papers. And when he finds a paper in a journal the university doesn’t subscribe to, he is asked to pay as much as $50 to read the paper — even though his father did the work with public funds. He’s not alone; one recent study found that even most university researchers have access to only about half the papers they need to cite for a given bit of research. Just yesterday, in fact, Jonathan asked on Twitter if anyone could send him a copy of one of his father’s papers and confronted a paywall asking for his credit card number. “I ain’t payin’,” he replied.
Meanwhile, Jonathan has found and downloaded the PDFs for about half his father’s papers, but he remains uncertain whether he could safely post them on his website. While some publishers allow such “collegial sharing,” others leave their policies unclear, and he worries about getting sued. His brother urged Jonathan to post them.

“Come on,” Michael wrote in a comment at Jonathan’s blog. “I DARE them to sue us.”

Jonathan has posted the whole list at his blog and uploaded what PDFs he could obtain, and so far he has not been sued or asked to take them down. Yet he remains wary and unsatisfied. He knows that few researchers will find his father’s papers if they reside only on his web page. So for now, his father’s work remains buried in an old structure — a calcified matrix. Though Jonathan bangs away at the surrounding rock, he knows he hasn’t really pried the work loose. This frustrates him on two fronts: it stops him from freeing his father’s work, and it confirms to him that science, which should be a fluid medium, still has much of its content trapped in old structures.

“I started this partly to test how hard it would be to try to make science more available in the current system,” he says. “I’m finding that even with my father’s papers, or even with my own, it’s not very easy.”

Jonathan Eisen’s quest has solidified his conviction that science needs to radically rework the way it collects and shares its data, methods, and findings. He has plenty of company. A growing number of prominent scientists want to replace the aging journal system with something faster, cheaper, and richer. The current system, they note, grew out of meeting notes and journals published by societies in Europe over three centuries ago. Back then, quarterly or monthly volumes could accommodate the flow of ideas and data from most disciplines, and the printed journal, though it required a top-heavy, expensive printing and publishing infrastructure, was the most efficient way to share those ideas.
“But now,” says Jonathan Eisen, “there’s this thing called the Internet. It changes not just how things can be done but how they should be done.”

As Stanford biochemist and PLoS co-founder Patrick Brown put it a few years ago, “What seemed an impossible ideal in 1836, when Antonio Panizzi, librarian of the British Museum, wrote, ‘I want a poor student to have the same means of indulging his learned curiosity, … of consulting the same authorities, … as the richest man in the kingdom,’ is today within reach. With the Internet, we have the means to make humanity’s treasury of knowledge freely available to scientists, teachers, students and the public around the world.”

“The existing system worked well for quite a while,” says Jonathan Eisen. “But it was not designed by theory. It was designed by constraints.” In a world that provides communications conduits far larger and faster, those constraints have now made science’s traditional pipeline a bottleneck.
~
To get a sense of how the current system curbs science, consider a rare case in which researchers attacked a big medical problem with an open-science model. In 2004, in the United States, a network of government and private researchers, including large drug companies, used open-science principles to accelerate research into Alzheimer’s. The project, as Gina Kolata aptly described it in the New York Times last summer, “was an agreement … not just to raise money, not just to do research on a vast scale, but also to share all the data, making every single finding immediately available to anyone with a computer anywhere in the world. Before that, researchers worked separately, siloing off much of their work. Now methods and data formats were standardized. The data would immediately enter the public domain, where anyone could build on it.”

An extraordinary project ensued. The U.S. National Institute on Aging contributed over $40 million, and 20 companies and two nonprofit groups kicked in another $25 million to fund the first six years. The program produced an explosion of papers on early diagnosis and helped generate more than 100 studies to test drugs or other treatments. It greatly sped and opened the flow of findings and data. According to the New York Times, the project’s entire massive database had been downloaded more than 3,200 times by last summer, and the data sets containing images of brain scans were downloaded almost a million times. Everyone was so pleased with the results that they renewed the accord this year. And all because, as a researcher told Kolata, “we parked our egos and intellectual-property noses at the door.”

The language used here — everything entering the public domain, the dismantling of silos, the parking of egos and IP padlocks — might have been lifted from an open-science manifesto. And even Big Science appreciated the outcome. To open-science advocates, this raises a good and somewhat obvious question: Why don’t we do science like this all the time?

Part of the answer, strangely, is the very thing at the center of science: the paper. Once science’s main conduit, the paper has become its choke point.

It’s not just that the paper is slow, though that is a huge problem. A researcher who submits a paper to a traditional journal right now, for instance, won’t see the published piece for about a year. She must wait while the paper gets passed around among editors, then goes through rounds of peer review by experts in her field, who might and often do object not just to her methods or data but to her findings and interpretations. Finally, she must wait while it moves through an editing, layout, and publishing pipeline that itself might run anywhere from 2 to 12 weeks.

Yet the paper is not simply slow; it’s heavy. Even as increasingly data-rich science has outgrown the paper’s ability to deliver and describe all that science has to offer — its deep databases, its often elaborate methods — we’ve loaded it up needlessly with reputational weight and vital functions other than carrying data.

The paper is meant to be a conduit for the real content and currency of the science: the ideas, methods, data, and findings of the people who do science. But the tremendous publishing and commercial infrastructure built around the academic paper over the last half-century has concentrated so many functions and so much value in the journal that the paper itself, rather than the information in it, has become science’s main currency. It is the paper you must buy; the paper you must publish; the paper you must cite; the paper on which not just citations but tenure, reputation, status, and even school rankings are built.
~
To get an idea of the paper’s excess weight, go to Cambridge, England, and find Mark Patterson. Patterson is a scientific-publishing old hand gone rogue. He formerly worked at two of the biggest scientific publishing companies, Elsevier and Nature Publishing Group (NPG), each of which puts out scores of journals. A few years ago he moved to the staff at PLoS.*  Patterson is now director of publishing there, and since he joined, PLoS has leveraged open-science principles to become one of the world’s biggest publishers of peer-reviewed science and the biggest single publisher of biomedical literature. Readers like it because they get free access to good science. Researchers like it because their work reaches more readers and colleagues. PLoS’s success is heartening open-science advocates greatly — and unsettling the traditional publishers.

To describe PLoS’s innovations, Patterson likes to talk about how PLoS’s most innovative journal, PLoS One, deals with four essential functions of science that are currently wrapped up in the scientific paper: registration, certification, dissemination, and preservation. The current publishing regime, he argues, locks up these functions too closely in the current, conventional version of the scientific paper — even though some of these functions can be met more efficiently by other means.

So what are these functions?

Registration is essentially a scientific claim of discovery — a marker crediting a particular researcher with an idea or finding. The current system registers these contributions via a paper’s submission date. Certification is essentially quality control: ensuring a paper is solid science. It is traditionally done via peer review. Dissemination means getting the stuff out there — publication and distribution, in printed journals or online. And preservation, or archiving, involves the  maintenance of the papers and citations to create a breadcrumb trail other researchers can later follow back to an idea or finding.
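
These four functions can be pictured as a small data structure, pairing each one with the mechanism the conventional journal bundles it into and, anticipating the argument below, one possible alternative. Here is a minimal, purely illustrative sketch in Python; the "alternative" entries are examples drawn from or consistent with this article, not an authoritative mapping.

```python
# Illustrative only: the four functions the scientific paper currently bundles,
# the conventional journal mechanism for each, and one possible alternative.
functions = {
    "registration": {
        "conventional": "the paper's submission date at a journal",
        "alternative": "a timestamped public deposit of the idea, data, or preprint",
    },
    "certification": {
        "conventional": "pre-publication peer review of the whole paper",
        "alternative": "technical-merit review plus post-publication community evaluation",
    },
    "dissemination": {
        "conventional": "subscription access to the printed or online journal",
        "alternative": "open-access publication, free for anyone to read",
    },
    "preservation": {
        "conventional": "the publisher's archive and citation trail",
        "alternative": "independent repositories mirroring papers, data, and metadata",
    },
}

for name, ways in functions.items():
    print(f"{name}: {ways['conventional']} -> {ways['alternative']}")
```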

“The current journal system does all four of those things,” says Patterson. “But it doesn’t necessarily do them all well. The trick is finding a system that gets each of these done most efficiently, sometimes by other means, instead of having them all held by the publisher.” He and others contend that science would gain both speed and rigor by “unbundling” some of these functions from the paper and doing them in new ways.

PLoS loosens things up mainly in distribution and quality control. All of its journals are open-access — that is, free to read. Instead of making every would-be reader either buy a journal subscription or pay a per-article price of $15 to $50, PLoS collects a fee from the researcher to publish — usually about $1,400 — and then publishes the paper online and makes it free. The author fee is substantial, but it’s actually a small addition to the other costs of doing science, and it performs the essential function of getting the work out there. It’s Panizzi’s dream realized: every poor schoolchild — or at least every schoolchild with web access — can read PLoS. Researchers like this, and it works. A recent study showed that, on average, papers and data published open-access receive more citations than those behind paywalls.

PLoS’s rapid growth has shaken things up. Some journal groups, such as Elsevier, have responded by allowing authors to pay to have a paper open-access on publication. Yet commercial publishers that do this tend to retain certain rights that PLoS does not, and they’re less likely to release underlying data, metadata about the publications, or other data and rights. And the practice creates a weird and uncertain market: You can go to, say, Neuron, and find, in the same issue, one paper you can download for free and another that costs $30. The difference? The authors of the latter paper didn’t pay the open-access fee.

Meanwhile, PLoS’s biggest, most cross-disciplinary journal, PLoS One, streamlines quality control in a way that’s more complex and raises more ire. The traditional route, peer review, generally involves having two or three experts evaluate the entire paper — data, methods, findings, conclusions, significance. The publisher relays these peer critiques to the author, usually with requests for either changes or clarifications. If the author answers those to the publisher’s satisfaction, the paper gets approved.

PLoS One uses a similar process but — crucially — asks its reviewers to judge only on technical merits, and not on any assessment of the paper’s novelty, significance, or impact. “The idea,” says Patterson, “is to let the importance be determined later by how much the paper’s ideas and findings and conclusions are taken up by the community. We’re letting the scientific community at large determine a paper’s value and importance, rather than just a couple of reviewers.”

This makes many people at Patterson’s old workplaces uneasy. Gerry Altmann, editor of Cognition, an Elsevier journal, and an open-minded man, doubts this sort of post-publication filter can serve the purpose. “Peer review should be about ensuring that there’s a robust fit between findings and conclusions, and that a paper sits well within the context of a discipline,” Altmann told me. “These are insidious changes.”

Can the hivemind do quality control? Patterson answers by noting that any paper’s true value — its lasting contribution — is generally decided by the scientific community even under the current system. Yet he acknowledged that at present few scientists actually go online and make comments or otherwise review papers published there. We’re a long way from the vision of an active scientific community replacing peer review with a crowd-sourced rigor and fact-checking. The hivemind apparently has better things to do. Altmann thinks it’s starry-eyed to think that will change.

Others say researchers would engage in these tasks if it were worth their while. They argue that you can make it worthwhile by giving researchers credit for a wider range of contributions to science, starting with post-publication peer review and evaluations.

This is the idea behind ORCID, a program that would give each researcher a unique, immutable digital identification, somewhat like a permanent URL. That ID would serve like a deposit account: the researcher would accumulate reputational credit not just for papers published, but also for other contributions the scientific community deems valuable. Reviews of others’ work could thus generate deposits, as could public outreach, talks, putting data online, even blogging — anything that helps science but currently goes unrewarded. This would allow hiring, tenure, grant, and awards committees to weigh a broader set of contributions to science. ORCID holds particular promise because it has already lined up buy-in from publishing giants Nature and Thomson Reuters (though it’s unclear what contributions various stakeholders will agree to credit).
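
To make the "deposit account" idea concrete, here is a minimal sketch of a researcher record that accumulates credit for different kinds of contributions. The field names, example ID, and weights are invented for illustration; they are not ORCID's actual data model or scoring scheme (ORCID itself assigns identifiers, not credit).

```python
from dataclasses import dataclass, field

# Hypothetical contribution weights, invented for this sketch.
CREDIT_WEIGHTS = {
    "paper": 10.0,
    "peer_review": 3.0,
    "dataset": 5.0,
    "outreach_talk": 1.0,
    "blog_post": 0.5,
}

@dataclass
class ResearcherRecord:
    """A researcher identified by a permanent ID, accumulating credit."""
    researcher_id: str                   # ORCID-style format, e.g. "0000-0002-1825-0097"
    contributions: list = field(default_factory=list)

    def add(self, kind: str, description: str) -> None:
        self.contributions.append((kind, description))

    def credit(self) -> float:
        """Total reputational credit under the illustrative weights above."""
        return sum(CREDIT_WEIGHTS.get(kind, 0.0) for kind, _ in self.contributions)

record = ResearcherRecord("0000-0002-1825-0097")
record.add("paper", "Evolution of extremophile bacteria")
record.add("peer_review", "Post-publication review of a newly published paper")
record.add("dataset", "Genome assembly deposited in a public archive")
print(record.credit())  # 18.0 under these made-up weights
```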

What would such a system look like? One idea is being developed by a team led by Luca de Alfaro, of the University of California, Santa Cruz. Working with Google, the team hopes to develop broader-based reputational metrics that are built, writes de Alfaro in a recent essay in The Scientist, “on two pillars”: tenure, grant, and similar rewards for authors of papers and their reviewers alike; and — crucially — a content-driven way of gauging the merit of both papers and reviews. Authors would get credit for work of high value, as measured by citations, re-use of data, and discussion generated. Reviewers, meanwhile, would get credit based not just on output but on how well their reviews predicted a work’s future value.

“Thus two skills would be required of a successful reviewer,” writes de Alfaro: “the ability to produce reviews later deemed by the community to be accurate, and the ability to do so early, anticipating the consensus. This is the main factor that would drive well-respected people to act as talent scouts, and to review freshly published papers, rather than piling up on works by famous authors.” De Alfaro says much of the technology to weigh such variables already exists in algorithms used at Google and (for evaluations of reviewers) Amazon.

Such a system could readily be incorporated into a program like ORCID. It could also give researchers incentives and credits — points, essentially — for public outreach or for openly sharing underlying data and details about method, both after and even before publication, so that other researchers can more easily test or use the data and methods. In short, a more flexible credit system could generate more activity in almost any area of science simply by weighting it more heavily.
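
As a rough illustration of the reviewer side of such a scheme, the toy function below scores a review by how close its judgment turned out to be to the eventual community consensus and by how early it was filed, echoing the two skills de Alfaro describes. The formula and constants are invented for this sketch, not de Alfaro's actual algorithm.

```python
def reviewer_credit(review_score, consensus_score, days_after_publication,
                    max_error=10.0, horizon_days=365.0):
    """Toy credit function: accuracy (closeness to the eventual consensus)
    multiplied by timeliness (how early the review was written).
    All constants are illustrative, not taken from de Alfaro's proposal."""
    accuracy = max(0.0, 1.0 - abs(review_score - consensus_score) / max_error)
    timeliness = max(0.0, 1.0 - days_after_publication / horizon_days)
    return accuracy * timeliness

# An early, accurate review outscores a late "pile-on" review of a famous paper.
print(reviewer_credit(8, 9, days_after_publication=14))   # ~0.87
print(reviewer_credit(9, 9, days_after_publication=300))  # ~0.18
```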
~
These many pressures and alternatives seem to be loosening the publisher’s grip. Last June, librarians at the University of California system balked when the Nature Publishing Group sent a contract renewal containing a 400 percent price hike on the scores of NPG journals the huge library system subscribes to. The increase would have pushed the cost to over $17,000 per journal. The librarians objected that it was ludicrous for universities to fund research and then pay to read it. They threatened to boycott NPG not just as subscribers but as contributors to the journals. NPG softened and worked out a deal.

Meanwhile, universities and researchers are rebelling in other ways. Some are starting open-access journals or opening up some they already publish. And PLoS  continues to create new models, including fast-track journals for time-sensitive disciplines, such as those that cover the flu and other infectious diseases, to cut the traditional one-year publication cycle down to a day. Another outfit, LiquidPub, is launching what it calls “liquid journals,” in which “social computing and liquid knowledge will shape and navigate information waters.” Phillip Lord and Robert Stevens, of Newcastle University and the University of Manchester, have created KnowledgeBlog, a publishing framework based on blog technology. Even Shakespeare scholars are entering the open-science world:  Last summer, the Shakespeare Quarterly ran an experiment in which it not only put its journal online but opened the job of peer review to the public, so that anyone who cared to register could comment, say, on the racial implications of playing Titus Andronicus as an “American Gangsta.”

And then there are those such as Newcastle University computer scientist Phillip Lord, mentioned earlier, who just publishes on Wordpress. A blog may seem a sketchy way to publish science. Yet in a way it makes sense. Science, however rigorous, implicitly recognizes that every explanation is provisional; there’s no finished version. So what could be more fitting than to revamp science through a platform explicitly built to be revised, commented on, and updated?
~
Yet if a more open scientific publishing landscape may seem inevitable, it’s hardly clear how to get there. Talk of inevitability hasn’t much helped Jonathan Eisen get his father’s papers out in the open. He has struggled to find the right leverage point, or perhaps the right tool, to lift them onto a platform any more prominent than his own web page.

A few months ago, however, Jonathan ran into a tool that added some leverage —  and just might chip away as well at the calcified matrix in which his father’s and others’ scientific work  has been stuck.
It is the simplest of academic tools: a desktop reference manager called Mendeley. Yet it comes with an extra dimension: a website at which you can share papers you like, creating a metadata-rich index that can lead other users to your user profile and papers, and vice-versa. You load your bibliography up — all those papers on cognitive neuroscience, say, or dark energy, or, if you’re Jonathan Eisen, evolutionary biology and extremophile bacteria — and Mendeley’s algorithms link you up with papers you might have overlooked and the researchers who wrote, read, or collected them. Maybe, Jonathan wondered aloud on Twitter, he could create a posthumous profile for his dad and post his papers up there. Mendeley promptly told him he could. He did. Howard Eisen now has his own Mendeley page, with all 41 of his papers listed and 24 of them uploaded as PDFs. Now you, as well as anyone in the research community who takes a minute to sign up at Mendeley, can find and read them and add them to your own library. Since Mendeley now has over 800,000 members and is growing at an accelerating rate, this puts Howard Eisen and his work, if not in science’s mainstream, then in a sizeable and fast-growing tributary.
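
The way a shared, metadata-rich index can surface overlooked papers is easiest to see with a toy example. The sketch below counts how often papers co-occur in different users' libraries and recommends the strongest co-occurring papers you don't already have. It is only a generic illustration of the idea, not Mendeley's actual algorithm, and the paper IDs are made up.

```python
from collections import Counter
from itertools import combinations

# Toy libraries: user -> set of paper IDs (invented identifiers).
libraries = {
    "user_a": {"eisen1985", "woese1990", "pace1997"},
    "user_b": {"eisen1985", "pace1997", "delong1992"},
    "user_c": {"eisen1985", "delong1992"},
}

# Count how often each pair of papers appears together in someone's library.
co_counts = Counter()
for papers in libraries.values():
    for pair in combinations(sorted(papers), 2):
        co_counts[pair] += 1

def recommend(my_papers, top_n=3):
    """Suggest papers that co-occur most often with ones I already have."""
    scores = Counter()
    for (a, b), n in co_counts.items():
        if a in my_papers and b not in my_papers:
            scores[b] += n
        elif b in my_papers and a not in my_papers:
            scores[a] += n
    return scores.most_common(top_n)

print(recommend({"eisen1985"}))
# [('pace1997', 2), ('delong1992', 2), ('woese1990', 1)]
```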

Jonathan also likes Mendeley because it seems to advance the larger open-science agenda. It’s a sort of friendly Trojan horse. You download a reference manager — a good one, and free — and suddenly have a tool that can help open science.

“Smart,” says Leslie Carr, director of the Web Science Training Center at the University of Southampton. “Most people who’ve tried to create software to drive open science have started off on the web and tried to encourage people to share. Mendeley starts where the researchers already are, with a tool researchers need, which is a desktop reference manager to manage their bibliography and  organize their thoughts. Then the very act of looking for more papers leads them into an open science model based on sharing.”

Mendeley exists because its CEO and co-founder, Victor Henning, needed a tool to understand better the cross-disciplinary mountain of literature he’d compiled for his thesis at Bauhaus-University of Weimar. “I had this huge trove of papers from different disciplines,” Henning told me, “and I wanted to see where the connections and overlaps and gaps were.” But when he looked for software to do this, he found nothing he liked.

“This was 2004, 2005. Last.fm was happening. By then I was collaborating a lot, and we were talking about doing a couple different people’s data. Then we realized you could do a social version of this. Why not map a bunch of people’s data? That would give an even better picture of the ideas in play.” By this time, being a business student, he was thinking: startup. He also realized he didn’t like most of the reference managers on the market. So he thought: let’s roll that in too.

Thus emerged Mendeley. The name rose from its dual mission: Mendeleev was the Russian chemist who created the periodic table, which organized the known elements into a structure that suggests the properties of other elements still to be found. Mendel was the 19th-century monk and botanist who saw that crossing two packets of information could yield a third packet that derived but differed from the first two. Two nice models of how science works.

The focus on the paper came of pure necessity. The American bank robber Willie Sutton said he robbed banks because “that’s where the money is.” At least for now, the paper is where the data are. But the people at Mendeley know quite well that a) the paper will get unbundled and in many functions displaced and b) they’re now grasping a bundle with a bunch more stuff in it. But they’re most interested in the threads that run from paper to paper. They mean to charge not for the bundled information but for helping people find the connections between the bundles. The company offers a free version that accommodates smallish libraries. If your library runs bigger than 500 MB, you can pay $10 a month to run the company’s algorithms and store a copy of your data and papers in the cloud. If you’re a company or a department or simply someone who wants to run some highly sophisticated or customized analyses on aggregated scientific data — and on the all-important hivemind indications of what’s newly hot — you can pay more, providing Mendeley another income stream. Mendeley also talks of striking a deal, maybe, with publishers to make papers available on a rough iTunes model: a buck a paper, perhaps, with algorithms running in the background to help you find papers you’d like but don’t know about.

Many feel this model holds a lot of potential not only to make papers available more freely (or cheaply) but to help unbundle and redistribute the functions now unnecessarily bundled with the paper. But can Mendeley do this? It seems to possess the vision and flexibility of mind. While Mendeley is necessarily focused on the PDF right now, for instance, there’s no reason it can’t adjust its databases and algorithms to index, share, and analyze the importance of contributions other than traditionally published papers; it can support new metrics. And the company’s application programming interface, or API, recently published, should allow outside developers to create modules and add-ons to track new metrics, including author identifiers such as ORCID.
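
As a sketch of the kind of add-on such an API could enable, the snippet below pulls a researcher's documents from an imagined endpoint and totals their readership as a simple alternative metric. The base URL, endpoint path, parameters, and response shape are entirely hypothetical placeholders; the published Mendeley API documentation defines the real interface.

```python
import requests  # third-party HTTP library

# Hypothetical endpoint and response shape, purely for illustration.
API_BASE = "https://api.example-mendeley.test"

def total_readership(author_id, api_key):
    """Fetch the documents tied to one author ID (e.g. an ORCID-style
    identifier) and total their reader counts as a toy alternative metric."""
    resp = requests.get(
        f"{API_BASE}/authors/{author_id}/documents",  # hypothetical path
        params={"key": api_key},
        timeout=10,
    )
    resp.raise_for_status()
    documents = resp.json()  # assumed: a list of {"title": ..., "readers": ...}
    return sum(doc.get("readers", 0) for doc in documents)
```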

The chassis, then, can accommodate changes under the hood. The trickier part may be getting the steering right — that is, creating a UI that offers a powerful and full-featured but easy and intuitive way to use both the traditional reference manager and the broader social, sharing, and analytical tools.
They’re still working on that. “Most of the people I talk to who’ve used this,” says Leslie Carr, “think that the desktop and the web sharing aren’t as well integrated as they could be, from a software perspective, and that the analytic tools aren’t as accessible or transparent as they should be.” I find the same thing myself: Many of the metrics and connections between papers aren’t accessible on the desktop, presumably because they require the server’s data and processing power, and finding them on the web interface feels vaguely opaque. Even when you find some relationships, you worry you’re missing something.

Yet the company seems both open and responsive. When users pressed last summer for more hivemind information and more fluid sharing, the company substantially upgraded the website’s social-sharing module. It was quick to produce iPhone and iPad versions of its software. In general it seems fairly nimble and eager to meet user needs.

On the other hand, a lot could stop them. They could run out of money; with over 40 employees, their burn rate is high, but then again, their funding angels seem both confident and deeply pocketed. They could get sued. They could fail to add features fast enough to satisfy demanding users.  They could not quite create the magic that software needs to be transformative. In short, they’ll need what any gamechanger needs: a good concept, some serious programming and promotional chops, and luck.
Mendeley chief scientist Jason Hoyt thinks the real killer app in open science will not be software but … the researcher. He recently made the call in a blog post titled “Dear researcher, which side of history will you be on?”

For the past three centuries, he noted, technology has prevented us from fulfilling Panizzi’s dream of fast, free science. But the technology is there now, and so are the business models, as PLoS has shown. So what is the revolution waiting for?

It is waiting, wrote Hoyt, “for us, the researchers.”
We could choose to publish in only Open Access. We could choose to reward tenure for Open Data. We could choose to only reward publications or data that are proven to be reused and make either a marked economic or research impact. Instead, we choose to follow a model that promotes prestige as the primary objective. …

“The future, I suspect, will look upon our society and practice with regards to scientific knowledge-share as we similarly do now with the Dark Ages. Each time we hold back data or publish research that isn’t immediately open to all, we have chosen to be on the wrong side of history.
He has a point. It’s interesting, for instance, to imagine what would happen if researchers and university librarians got together and created a global version of the sort of revolt that the University of California librarians threatened. “You get all the librarians together on this,” says Cameron Neylon, a director at the UK’s Science and Technology Facilities Council and an academic editor at PLoS, “and this is pretty much over.” And Librarians at the Ramparts sure makes a nice image.

Jonathan Eisen, too, thinks that opening science will require the researchers to step up. But he suspects they won’t step up in number until reward systems offer some incentive more tangible than being on history’s good side. Only then will the upslope ease. In the meantime, Jonathan continues to push his father’s papers up that hill, and he waits to see how well Mendeley, among his other efforts, can help pull them up into the open. Jonathan tends to push hard in strong spurts around Father’s Day, make some progress, then set the load down a while before resuming.

“It’s one of those things that’s just going to take some time,” he says. “I didn’t think it would be quite so hard. But we’ll get there.”
~ ~ ~
Copyright 2011 David Dobbs. All rights reserved. You may excerpt short sections, as per fair use, as long as you link back to this article. For permission to reprint in whole, please drop me a line.

*Disclosure: I sometimes write for Nature Publishing Group and have friends both there and at PLoS.
NOTE: In the week or so after this was published, Jonathan Eisen was inspired to substantially complete the job of assembling his father’s publications at Mendeley. See my short follow-up post here.

Corrections:
May 12, 2011: • Fixed some typos. • Changed pounds to dollars. • The original version made it sound as if all PLoS journals evaluated submitted papers based only on method, rather than method plus significance. The current version is corrected to state that only PLoS’s flagship journal, PLoS One, uses that streamlined method of peer review.

May 13, 2011: • Corrected i.d. of Jonathan Eisen’s position at PLoS, where he is an academic editor-in-chief of PLoS Biology. • Prior version called PLoS One the “flagship journal” of PLoS. A couple of people differed. Changed to note that it is PLoS’s most innovative and cross-disciplinary journal. • Clarified criteria by which PLoS One referees papers. • Corrected description of KnowledgeBlog, which was created by Phillip Lord and Robert Stevens, not Peter Murray-Rust (who told me about it).

Related:

Resources & more reading — a post of links I put together
Jonathan Eisen’s page on trying to find his dad’s papers
Jonathan Eisen Frees (Almost All) His Father’s Papers (a follow-up) | Neuron Culture
How to Crack Open Science – from ScienceOnline
A TED Talk to Open Your Eyes to Open Science
Google to Host Terabytes of Open-Source Science Data
Open Access | Wired Science | Wired.com (all Wired Science stories tagged “open science”)
Open Data | Wired Science | Wired.com (all Wired Science stories tagged “open data”)
Open-Access Debate: Public Library of Science Responds (Wired.com story from 2007)

See the original article at Wired Neuron site, with a very interesting discussion at the end.