Scenes from the Open Science Summit
Ronald Bailey's first dispatch from the conference that aims to launch Enlightenment 2.0
Berkeley, California—The inaugural Open Science Summit kicked off Thursday afternoon at the University of California, Berkeley's International House. Some 200 participants have gathered to "update the social contract for science." The summit's chief organizer, a young intellectual entrepreneur named Joseph P. Jackson III, says his aim is to jumpstart Enlightenment 2.0. Let's hope he succeeds, because the first panel of presenters made it clear that Enlightenment 1.0 is mired in a bureaucracy run by careerist professionals.
Before the formal presentations began, the summit opened with a kind of free-form forum in which participants stood up to ask panel members questions. The first question from the floor was: What is "open science" anyway? My personal favorite definition was enunciated at the outset by quirky Cambridge University chemist Peter Murray-Rust: "The 'open' bit means that it is available to anyone in the world to do whatever they like without any restrictions." Statistician and Yale Law School postdoctoral associate Victoria Stodden described the open science movement as an amalgam of folks who want open access to the peer-reviewed scientific literature, open access to all scientific data, and more efficient ways of doing science, largely through enhanced cooperation among researchers. Surprisingly, none of the initial answers addressed one of the more interesting motivating themes behind the conference: how the falling costs of enabling technologies are empowering citizen scientists to participate fully in the scientific enterprise, liberating science from the stifling bonds of the government-academic-corporate research complex.
The negative effect of those bonds was highlighted when University of Manchester computer scientist and research social-networking guru Carole Goble noted that openness is being stymied in part because many young pre-tenure researchers are afraid to share their results before publication. They fear that sharing will allow rivals to "steal" their results and that journal editors will refuse to publish anything that has already been aired in public. Murray-Rust acknowledged that this is a real problem, one he hoped would be solved within the next 10 years.
From the floor, a young mathematician (unidentified) then ridiculed the fields of biology and chemistry as backward fuddy-duddies with regard to publication and scientific priority. She pointed out that the norm among mathematicians and physicists is to post anything reasonable to the arXiv preprint server as soon as it is produced; everyone can then see that the work is out there and who got there first. Peer review, such as it is, happens when the authors get around to sending their results to the journals. One life scientist from the floor pointed out that genomic research has already fostered a similar culture in which researchers make their data public.
Another issue raised was the sharing of negative and inconclusive results. Among other things, the failure to publish such results skews statistical analyses that aim to determine the effectiveness of drugs or the alleged toxicity of chemicals. Making negative or inconclusive results public would also considerably speed up scientific research by sparing researchers from traveling down previously explored dead ends. Some panelists noted that several journals have tried publishing negative results in the past and failed, partly because researchers barely have enough time to write up and publish their successful results. Australian National University chemist Cameron Neylon responded that software developments will soon make these excuses untenable: it will be possible to standardize the results of failed experiments and upload them to the Internet where others can find them. Neylon also argued that researchers should want both more positive results and more rubbish published, because then someone can build a Google for science. Instead of filtering information before publication, filter it once it's out there.
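To make Neylon's idea concrete, here is a minimal sketch of my own (an illustration of the general notion, not Neylon's actual software; the record fields are hypothetical) showing how a failed experiment could be captured as a standardized, machine-readable record that a future "Google for science" could index:

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ExperimentRecord:
    """A minimal, standardized record of an experiment, successful or not.

    The field names here are hypothetical, chosen only to illustrate the idea.
    """
    researcher: str
    hypothesis: str
    protocol: str   # free text, or a link to a published protocol
    outcome: str    # "positive", "negative", or "inconclusive"
    notes: str

def publish(record: ExperimentRecord) -> str:
    """Serialize the record as JSON, ready to upload to a public repository."""
    payload = asdict(record)
    payload["timestamp"] = datetime.now(timezone.utc).isoformat()
    return json.dumps(payload, indent=2)

# A negative result that would normally never leave the lab notebook:
print(publish(ExperimentRecord(
    researcher="A. Chemist",
    hypothesis="Catalyst X speeds up reaction Y at room temperature",
    protocol="standard bench protocol, 25 C, 1 atm",
    outcome="negative",
    notes="No measurable rate increase over the control.",
)))
```

Once failed experiments sit in a common format like this, post-publication filtering becomes a search problem rather than a gatekeeping problem, which is precisely Neylon's point.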
The free-form forum was followed by a number of brief, formal, TED-like presentations. To give readers an idea of what is on offer at the Open Science Summit, let's look at a selection from Thursday night's presentations. One of the fiercer presenters was statistician Victoria Stodden. She argues for framing the open science movement in terms of two principles: reproducibility and knowledge sharing. She views this as a return to the traditional scientific method of complete disclosure. According to Stodden, computation is now pervasive in research, yet many scientists fail or even refuse to release the computer code they used to produce their reported results. Without that code, outside investigators cannot reproduce the results. Although she didn't say it, this means that outsiders are being forced to take reported results on faith. Stodden noted that the "Climategate" scandal erupted in part because a clique of researchers refused to share their data and computer models with skeptical outsiders. This very week, three cancer treatment trials were halted because outside statisticians could not reproduce the results of Duke University scientists. Fortunately, as Stodden pointed out, this situation may be changing: funders like the National Science Foundation are beginning to require grantees to develop data release plans, and journals are adopting requirements that researchers share their code and data at the time of publication.
Morgan Langille, a young University of California, Davis genomics researcher, unveiled his BioTorrents file-sharing service, which uses the BitTorrent protocol to let researchers find and speedily download vast data files. It is a bit surprising that it has taken biologists this long to figure out how to leverage the technology. Kudos to Langille. Next up was Jason Hoyt, a geneticist and research director at Mendeley, which offers a software system that aims to break through the disciplinary silos that wall relevant results off from researchers. Ultimately, Mendeley wants to build the world's largest academic database. In the 18 months since it went online, Mendeley has attracted 450,000 users and aggregated the metadata from 29 million papers. Hoyt noted that it took Thomson Reuters 50 years to aggregate 50 million scientific papers.
Neuroscientist Martha Bagnall noted that a lot of valuable criticism of newly published work takes place among colleagues in the lab but never gets aired in public. She observed that lots of journals now allow commenting, but there is a big ironic problem: while peer review is anonymous, commenting at journals is not. Anonymity matters because researchers fear reprisal from criticized colleagues when their own papers or grant proposals come up for review. So to capture the lab's informal criticism and make it publicly available, she and her colleagues created The Third Reviewer website. The site lists journal research by discipline, complete with abstracts, and lets anonymous commenters have at it without fear of reprisal. It also gives authors an opportunity to defend their research. Third Reviewer is now expanding to other disciplines.
In one of the more exciting presentations, young scientists D.J. Strouse and Casey Stark unveiled their Colab open source science site at the summit. They argue that open science is more than open publishing. The current state of scientific publication, Strouse and Stark complained, is static; keeping up with the latest results is much like "playing ping pong under a strobe light." In Colab, researchers can collaborate dynamically by describing a problem, figuring it out together, and publishing their results on the site, where those results can be continuously improved in public. Since the entire process is open, a researcher's idea is time-stamped so that everyone knows who gets credit for scientific priority. A researcher who still fears being scooped can make her research problem private and invite specific collaborators to work on it, publishing only once they are done.
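How can a time stamp establish priority? Here is a rough sketch, assuming a simple commit-and-reveal scheme (my illustration, not Colab's actual mechanism): publish a cryptographic hash of the idea along with a timestamp. The hash commits you to the idea without revealing it; disclosing the idea later proves it matches the hash, and the timestamp shows when the claim was recorded.

```python
import hashlib
from datetime import datetime, timezone

def priority_stamp(author: str, idea: str) -> dict:
    """Return a verifiable claim of priority: a hash of the idea plus a timestamp.

    Publishing the hash commits the author to the idea's content without
    revealing it; revealing the idea later proves it matches the hash.
    """
    digest = hashlib.sha256(idea.encode("utf-8")).hexdigest()
    return {
        "author": author,
        "sha256": digest,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

stamp = priority_stamp("A. Researcher", "A new bound on channel capacity ...")
print(stamp)
# Later, anyone can verify:
#   hashlib.sha256(idea.encode("utf-8")).hexdigest() == stamp["sha256"]
```

On an open site the record itself is public, so disputes over who got there first reduce to comparing timestamps.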
On the subject of empowering citizen scientists, a number of young entrepreneurs discussed their efforts to build cheap biotech research equipment. Josh Perfetto talked about the OpenPCR (polymerase chain reaction) project, which aims to build a PCR thermocycler costing under $400. PCR thermocyclers are used to amplify DNA samples for analysis. Using the Kickstarter microfinance platform, Perfetto and his colleagues raised more than $12,000 for their project and have now completed a desktop prototype they plan to offer later this year. Similarly, University of Michigan stem cell biology Ph.D. student James Peyser and colleagues have created the Otyp project, which aims to get real biotech experiments—e.g., putting genes for green fluorescence into E. coli bacteria—into high school classrooms. Also using Kickstarter, Otyp is developing affordable open source hardware, wetware, and software to achieve this goal.
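For readers wondering what a thermocycler actually does: PCR amplifies DNA by repeatedly cycling a sample through three temperature steps, roughly doubling the target DNA each cycle. Here is a toy sketch of that cycling logic (illustrative only, not the OpenPCR firmware; the temperatures and hold times are typical textbook values):

```python
import time

# Typical PCR temperature steps (name, degrees Celsius, hold in seconds).
PCR_STEPS = [("denature", 95, 30), ("anneal", 55, 30), ("extend", 72, 60)]

def run_pcr(cycles: int = 30) -> None:
    """Step a simulated heat block through the standard PCR cycle.

    Each pass roughly doubles the target DNA, so 30 cycles yield
    about a 2**30 (roughly billion-fold) amplification.
    """
    for cycle in range(1, cycles + 1):
        for name, temp_c, hold_s in PCR_STEPS:
            # A real thermocycler would drive a heater and fan toward temp_c here.
            print(f"cycle {cycle:2d}: {name:8s} hold at {temp_c} C for {hold_s} s")
            time.sleep(0.01)  # stand-in for the real hold time

run_pcr(cycles=2)  # short demo run
```

The hardware challenge, and most of the cost, lies in heating and cooling the block quickly and precisely, which is why a sub-$400 machine is notable.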
The final presenter of the evening was Todd Kuiken from the Woodrow Wilson Center in Washington, D.C. Kuiken heads up the Center's DIYBiosafety project. Its aim is to create a culture of research safety among DIY biotechnologists. Apparently, some people are nervous about do-it-yourselfers genetically manipulating bacteria, plants, and animals in their garages and kitchens. Go figure.
Further dispatches from the conference will detail presentations on open peer review, distributed decentralized amateur science, alternative research funding methods, and more.
Ronald Bailey is Reason's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.