Solutions: Open Science

Source: The Corbett Report

 

“The idea that science should be opened to the wide public—even to the wild public—is one that produces a great deal of consternation among the defenders of the scientific status quo. What role do the unwashed masses have to play in the hallowed halls of the modern Church of Science? Aren’t these spaces reserved for the white-robed priests of this secular religion?

Thankfully, as more and more innovators step up to the plate to provide ideas for the wider public to access scientific knowledge and play an increasingly important role in developing, sharing and using that knowledge, the ideas of “citizen science” and “open science” are no longer something to be laughed at.” ~ James Corbett

 

by James Corbett
March 22, 2019

 

In the face of the crisis of science, it is easy to throw our hands up and watch as the old guard of the scientific establishment circles the wagons and goes back to business as usual. But there are real solutions to these problems, and we all—scientists and non-scientists alike—have a part to play in implementing them. Today on The Corbett Report we explore Solutions: Open Science.

 

https://youtu.be/MlVVUgWsBRo

Watch this video on BitChute / DTube / YouTube or Download the mp4

 

TRANSCRIPT

Biostitutes selling dodgy data to the highest bidder. Scientific frauds fudging figures to publish before they perish. Statistical charlatans p-hacking significant results in the confidence that no one will be checking their work.

Last time on The Corbett Report, we examined The Crisis of Science, or, more precisely, the crises of science: the Replication Crisis; the Crisis of Fraud; the Crisis of Publication; and the Crisis of Peer Review. We also explored the shared root of these problems in the rise of Big Science, where large-scale capital investments are increasingly a requirement for cutting edge research.

DWIGHT D. EISENHOWER: Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.

The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present—and is gravely to be regarded.

Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite.

SOURCE: Eisenhower Farewell Address

Big Science requires Big Money, either from Big Corporations or Big Government. But as we have already seen, when Big Corporations are funding the research, the “science” is invariably skewed in the interests of the company that is paying for it, and when Big Government is funding the research, the “science” is invariably skewed by political interests, lobbyists, and military contractors. Even worse, we sometimes get the admixture of the two, combining Eisenhower’s twin nightmare of a military-industrial complex with a scientific-technological elite.

This is the problem facing humanity at the crossroads of the 21st century, on the cusp of innovations in robotics, computing, genomics and other breakthrough sciences that have the potential to transform our world forever—for better or for worse.

In the face of such monumental challenges, it is easy to throw our hands up and watch as the old guard of the scientific establishment circles the wagons and goes back to business as usual. But there are real solutions to these problems, and we all—scientists and non-scientists alike—have a part to play in implementing them.

Today let’s explore Solutions: Open Science.

This is The Corbett Report.

Ever since the publication of John Ioannidis’ groundbreaking 2005 paper, “Why Most Published Research Findings Are False,” the scientific community has been engaged in a debate about what this crisis of science signifies, what kinds of measures are needed to fix it, and even whether there is really a crisis at all.

But as the data continues to pour in from every field of study, the conclusion is by now unquestionable: the scientific institutions that exist today are producing extremely untrustworthy results.

BRIAN NOSEK: […][I]s there actually a reproducibility crisis? And Nature went as far as to say “Let’s ask people and see if they agree that there is a crisis.” And so they surveyed 1,500 researchers and 90% of them agreed that there is a significant crisis or I don’t know what a slight crisis is but a slight crisis.

SOURCE: Professor Brian Nosek on the reproducibility crisis and open science in psychology

JEVIN WEST: In industry, CEOs and leaders in the field of biotech and pharma are coming out and saying “Well, we’ve known this for a long time. We already know that, you know, probably fifty percent of the studies published in top-tier academic journals can’t be repeated. We know it. We can’t repeat it in our labs.” This should be unnerving because we depend on science to fly in those planes, to get that antibiotic that you need when you get sick and have an infection when you land in the emergency room. This is a big deal.

SOURCE: Calling Bullshit 7.4: A Replication Crisis

IOANNIDIS: They could replicate only 6 of the 53 landmark studies for oncology drug target projects and the conclusion was that “the failure to win the war on cancer has been blamed on many factors but recently a new culprit has emerged: too many basic scientific discoveries are just wrong.” And we just need to do the whole job from scratch as if these papers did not even exist.

This is very worrisome. Hedge funds don’t trust science any longer. So this is from a business journal. They claimed that at least 50% of published studies, even those in higher academic journals, cannot be repeated with the same conclusions by an industrial lab. And the potential for not being able to reproduce academic data is a disincentive to early-stage investors. At least one firm now is hiring CEOs to independently validate academic science prior to putting up serious money. What this means is that these companies, these hedge funds, they say that the scientific literature is just for the scientists, it’s not serious. It’s more of a toy. And if you really want to be serious and not waste your money, you’d better try to do it from scratch and make sure that it works. Otherwise, you’re running a very large risk.

SOURCE: CLB | Dr. John Ioannidis on The Reliability of Biomedical Evidence and How to Improve It

It is getting harder for researchers to deny that there is a problem. But as with any such crisis, if the problem is defined narrowly enough then the “solution” to that problem can be limited to a few cosmetic alterations of the existing system.

If we take the crisis of science as merely a problem with shoddy statistical analysis, for example, then surely all that is needed is to put more time and effort into training scientists in the proper use of statistical tools. With an increased awareness of the problem of p-hacking or other statistical tricks, journal editors and reviewers could put extra time into scrutinizing the results of statistical analyses in research papers.
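The mechanics behind that concern are easy to demonstrate. Below is a minimal sketch in Python (my own illustration, not anything from the episode; it assumes numpy and scipy are available) showing why unreported multiple testing is a problem: test enough unrelated variables against pure noise and roughly one in twenty will cross the conventional p < 0.05 bar by chance, and reporting only those “hits” is exactly the kind of p-hacking reviewers are being asked to catch.

```python
# p-hacking in miniature: many comparisons against a pure-noise outcome.
# Roughly 5% of the unrelated predictors will look "significant" by chance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_variables = 50, 40

outcome = rng.normal(size=n_subjects)                    # no real effect anywhere
predictors = rng.normal(size=(n_variables, n_subjects))  # 40 unrelated variables

p_values = [stats.pearsonr(x, outcome)[1] for x in predictors]
false_hits = sum(p < 0.05 for p in p_values)

print(f"{false_hits} of {n_variables} unrelated variables came out 'significant' at p < 0.05")
```

Pre-registering which comparisons will be run, or correcting the significance threshold for the number of tests performed, removes exactly this degree of freedom.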

Or if the crisis is simply a problem of fraud, then an awareness campaign about the problem could pressure researchers to publish their raw data for scrutiny by the wider scientific community.

If the crisis is just a result of the publication pressures that modern academics are subjected to, then the creation of alternative journals that publish negative results or inconclusive findings could provide an outlet for researchers to earn publication credit while being forthcoming with their failures.

Indeed, all of these problems and many more have been identified and all sorts of solutions have been proposed or even implemented to help remedy them.

There are growing calls to raise the threshold for “statistical significance,” to issue guidelines for the use of p values in research, or even to ban p values from papers outright, as the journal Basic and Applied Social Psychology did in 2015.

There are calls for more publications to require scientists to publish raw data, methodology and other relevant information along with their research so that their experiments can be more reliably replicated.

A number of journals dedicated to publishing negative and null results have been created in recent years, and in 2017 the Journal of Negative Results in Biomedicine ceased publication after declaring that it had succeeded in its mission of convincing other, mainstream journals to publish more articles reporting negative or null results.

Sites like RetractionWatch keep an eye on the fraud, abuse, mistakes and misdeeds of scientists, publishers and institutions around the world, drawing attention to scandals and problems in the system rather than trying to sweep them under the rug.

All of these ideas, and many more, are important and necessary steps in fixing some of the problems that have come to plague modern institutional science. But they are not sufficient to solve the crisis of science. Because, as even the leaders of this movement to re-imagine science will readily admit, this crisis is not about p values or publishers or practices. It is about the nature of the scientific community itself.

IOANNIDIS: Who should take responsibility for the replication culture? Well, I think that one option is if you have the whole field coalescing—which is what’s happening in genetics—it could be the same investigators. If you have multiple investigators, each one of them kind of cross-checking each other, they can have multiple analytical teams look at the same data. Hopefully that would be pretty objective.

Someone might fear that this might be too much inbred so you need different investigators, and if you want different investigators then who is that going to be? If you have an all-inclusive consortium approach it’s difficult to find such people. Maybe you can find some who still belong to the same school and therefore you don’t have really independence in the replication process.

One option is to try to see if there’s investigators of competing theories and hypotheses. If they can be convinced, if they can look at the data—well, provided the data, the methods, the software, the script is available—if they can also repeat a study according to what they think is the best way to do it and they get the same results, I think this is very very strong evidence. But that model may not necessarily always be available.

You can have also combinations to the above, or you can open the process to the wide public. Now, the wide public could also be the wild public. Now lots of senior investigators will start saying, “I’m a senior scientist. I have trained for 500 years to become so experienced, and how can I have someone who’s clueless, who has never tried his hands on the field, look at my research?” We need to be careful, but we also need to be open. And there’s many research questions that indeed involving the wide public in some sort of citizen scientist model might be the way to go and to compare notes on what we get.

SOURCE: ESOF 2018 – Enhancing reproducible research – John Ioannidis

The idea that science should be opened to the wide public—even to the wild public—is one that produces a great deal of consternation among the defenders of the scientific status quo. What role do the unwashed masses have to play in the hallowed halls of the modern Church of Science? Aren’t these spaces reserved for the white-robed priests of this secular religion?

Thankfully, as more and more innovators step up to the plate to provide ideas for the wider public to access scientific knowledge and play an increasingly important role in developing, sharing and using that knowledge, the ideas of “citizen science” and “open science” are no longer something to be laughed at.

At the root of this revolutionary approach to the scientific process is the understanding that access to scientific knowledge is the key to enabling meaningful public participation. In the wake of the open-everything ethos that the internet has helped to foster, it may be difficult to remember, but the debate over whether or not scientific data and discoveries should be locked away behind paywalls and kept within the cloistered confines of academia was raging just a few short years ago. And it was a debate that cost at least one activist his life.

ALYONA MINKOVSKI: Well, today we have news for you about Aaron Swartz. He’s the executive director of Demand Progress, a co-founder of Reddit, and he’s been a frequent guest on this show. But yesterday he was arrested and charged with violating federal hacking laws for downloading four million documents from JSTOR from MIT’s network. Now, if convicted of the felony charges Swartz could face up to 35 years in prison and a 1 million dollar fine.

JSTOR is a company that provides digitized copies of academic journals. It’s used in universities all over the country, and they’ve already come out saying that they did not refer this case to the feds and that all the information has been returned. But the arrest has once again shone a light on the fight for open access to information.

SOURCE: Aaron Swartz Arrested: The Open Access Debate

AMY GOODMAN: Aaron Swartz committed suicide on Friday. He hanged himself in his Brooklyn apartment. He was 26 years old.

His death occurred just weeks before he was to go on trial for using computers at MIT—that’s the Massachusetts Institute of Technology—to download millions of copyrighted academic articles from JSTOR, a subscription database of scholarly papers. JSTOR declined to press charges but prosecutors moved the case forward. Aaron Swartz faced up to 35 years in prison and a million dollars in fines for allegedly violating the Computer Fraud and Abuse Act. When the case first came to light, the United States Attorney for the District of Massachusetts Carmen Ortiz said, quote, “stealing is stealing whether you use a computer command or a crowbar, and whether you take documents, data or dollars.”

SOURCE: “An Incredible Soul”: Lawrence Lessig on Aaron Swartz After Leading Cyberactivist’s Suicide. 1 of 2

In 2008, internet pioneer and cyber visionary Aaron Swartz penned the “Guerilla Open Access Manifesto,” laying out the basis for the Open Access Movement.

Information is power. But like all power, there are those who want to keep it for themselves. The world’s entire scientific and cultural heritage, published over centuries in books and journals, is increasingly being digitized and locked up by a handful of private corporations. Want to read the papers featuring the most famous results of the sciences? You’ll need to send enormous amounts to publishers like Reed Elsevier.

There are those struggling to change this. The Open Access Movement has fought valiantly to ensure that scientists do not sign their copyrights away but instead ensure their work is published on the Internet, under terms that allow anyone to access it.

The document ended with a call to action:

We need to take information, wherever it is stored, make our copies and share them with the world. We need to take stuff that’s out of copyright and add it to the archive. We need to buy secret databases and put them on the Web. We need to download scientific journals and upload them to file sharing networks. We need to fight for Guerilla Open Access.

As we now know, this document, innocuous as it may seem, led to tragedy, as Swartz’ own attempt to liberate the information from JSTOR—a digital library of academic journals—led to his arrest and, ultimately, his death. But the Open Access Movement did not die with Aaron Swartz. Today, an increasing number of researchers are committed to publishing in open access journals and in online spaces, like the Public Library of Science (PLoS) website, that are freely available to the public.

But the idea of open access is not about knowledge for its own sake. It is about the radical potential of such a movement to open the doors of academia’s ivory towers and to encourage a greater role for the public in the scientific process. Open access is just the first domino in a series of ideas that lead to a radically different view of science and its place in society.

The first level of public participation in the scientific process itself involves a “citizen scientist” model that is drawing increasing attention from the wider scientific community. In this model, interested amateurs help scientists to collect, store, process and even analyze data as part of a wider research project. The modern manifestation of this idea takes its cue from the life sciences, where outdoor enthusiasts have been called upon to help projects like the UK Butterfly Monitoring Scheme, tracking the range and size of local butterfly populations, and the North American Bird Phenology Program, keeping tabs on the location and migration patterns of various bird populations.

With the advent of personal computing and the internet, these initiatives were extended to even more arcane fields of scientific research. Pioneered by projects like SETI@home, which uses spare computing resources of volunteers on the internet to analyze radio signals for signs of extraterrestrial intelligence, citizen science portals such as Zooniverse have been created to allow non-specialists to participate in a wide array of research projects across nearly every conceivable discipline.

But this model of citizen science, heavily promoted on the Ted Talk circuit and in the mainstream scientific press, does not question the fundamental divide between scientists and the wider public. In these cases, volunteers are merely being used to collect data or to dedicate their spare computing power to analyzing data as part of a larger project directed by a team of scientists.

More radical still are ways that people are coming together to collaborate on solving problems themselves. In these projects, participation in every step of the process is encouraged and ideas are debated and discussed openly as a self-formed group discovers the answer to a question they themselves have asked.

MICHAEL NIELSEN: So my talk today is about open science, which sits in roughly the same relationship to science—basic scientific research, mostly academic research I’ll be talking about—as open source software does to the commercial software world. And so what I want to explore is the extent to which open-source principles or style principles can be applied to the practice of basic scientific research.

We’re going to start off with an example where this has been done successfully. So, the example starts with this man: Timothy Gowers. Gowers is a mathematician. He’s actually one of the world’s leading mathematicians. He’s, amongst other things, the recipient of the Fields Medal, which is often called the Nobel Prize in mathematics. Gowers, in addition to being a Fields Medal-winning mathematician is also a blogger. That’s not that uncommon actually amongst leading mathematicians. Of the 42 living Fields medalists, four of them in fact have started blogs. So that’s about one in ten, which I don’t know how that compares to the general population but it’s pretty good.

Anyway, in January of 2009, Gowers wrote this very interesting post with the title “Is massively collaborative mathematics possible?” And what he was proposing to do in this post was to use his blog as a medium to attack a difficult unsolved mathematical problem—a problem which he said he would “love to solve”—completely in the open using his blog as a way of posting his partial progress and his ideas. And what’s more, he issued an open invitation inviting anybody in the world who thought that they had an idea to contribute to post that idea in the comments section of the blog. So he called this experiment “the polymath project.”

Well, the polymath project got off to actually quite a slow start. In the first seven hours after he opened his blog up to comments, not a single person wrote in with any suggestions. But then a mathematician at the University of British Columbia named Jozsef Solymosi posted a suggestion—basically it’s a simplified variation of the original problem, which he was suggesting might be a bit easier to attack. And then 15 minutes after that, a high school teacher, in fact, from Arizona named Jason Dyer wrote a short suggestion. And just three minutes after that Terence Tao—also actually a Fields medalist, he’s a mathematician at UCLA—posted a suggestion. And so things were really off and running at this point.

Over the next 37 days, in fact, 27 different people would post 800 substantive mathematical comments containing 170,000 words. That’s a lot of mathematics done very quickly. It was hard actually . . . I was following along—I didn’t contribute substantively, but I was following along quite closely—and it was difficult simply to find the time just to read all the contributions. It was really going remarkably quickly. You’d see people, you know, propose an idea in a very half-baked form, and then often it would be very rapidly developed, sometimes by other people. Sometimes of course it would be discarded, but other times it would then be incorporated into the canon of knowledge. Gowers described this process as being to normal research as driving is to pushing a car.

And at the end of the 37 days he used his blog to announce that the problem had most probably been solved. In fact, a generalization of the original problem which they were attacking. They still had to go back and check that they hadn’t made any silly mistakes. In fact, everything did indeed check out ultimately and they wrote two papers based on it. It took months more to do all the cleanup work, but the back of the problem had in fact been solved at this point.

Now of course the reason I’m talking about this polymath project is not really so much because of the particular mathematical problem. You know, it’s not important because it solved a particular mathematical problem; it’s, rather, important because of what it suggests. It suggests that we can use some of these sorts of tools as kind of cognitive tools to potentially speed up the solution, not of simple everyday problems but actually of problems which challenge some of the smartest people in the world. Yeah, that’s really exciting. These are problems right at the limit of human intellectual ability. And not just, you know, one particular problem, but perhaps broadly across many different fields.

SOURCE: Michael Nielsen: “Reinventing Discovery” | Talks at Google

The implications of this type of spontaneous, collaborative problem solving extend far beyond the field of mathematics. In a world that is increasingly being transformed by scientific pursuits—and where the cost of mistakes is correspondingly high—a public that is skeptical about scientific institutions, government regulators and other supposed “authorities” is increasingly taking responsibility for scientific fact-checking into its own hands.

One stark demonstration of this fact came in the wake of the Fukushima Daiichi nuclear meltdowns in March 2011. As we now know, Japanese officials withheld data from the government’s own “SPEEDI Network,” a computer system that had been set up specifically to provide forecasts of nuclear radiation fallout in the event of an emergency. When the data was finally released months later, it was revealed that local officials, having been kept in the dark by government scientists, had evacuated residents directly into the path of the fallout.

The situation left residents and concerned citizens around the globe scrambling for accurate, up-to-date information about radiation readings, and distrustful of the government agencies who were interested in keeping that information from the public. The response was a spontaneous, volunteer-organized citizen science project called Safecast, which designed a radiation-measuring device able to take readings of an area every five seconds and upload that data to an open-source database.
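To make the data flow concrete, here is a simplified, hypothetical sketch in Python of the logging pattern just described: a geotagged Geiger reading taken every five seconds and posted to an open database. The sensor helpers and the API endpoint are placeholders of my own, not Safecast’s actual hardware interface or API.

```python
# Hypothetical sketch of a mobile radiation logger: read, geotag, upload, repeat.
import time
import requests

API_URL = "https://example.org/measurements"  # placeholder, not Safecast's real endpoint

def read_geiger_cpm() -> float:
    """Placeholder: return counts-per-minute from an attached Geiger counter."""
    raise NotImplementedError

def read_gps() -> tuple[float, float]:
    """Placeholder: return (latitude, longitude) from a GPS module."""
    raise NotImplementedError

def log_forever(interval_seconds: float = 5.0) -> None:
    while True:
        cpm = read_geiger_cpm()
        lat, lon = read_gps()
        requests.post(API_URL, json={
            "value": cpm,
            "unit": "cpm",
            "latitude": lat,
            "longitude": lon,
            "captured_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        }, timeout=10)
        time.sleep(interval_seconds)
```

Because every record carries its own coordinates and timestamp, readings from thousands of independent volunteers can be pooled into a single public map without any central authority taking the measurements.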

The product of this remarkable initiative has been the creation of the largest database of its kind in the world, one that has been independently verified as accurate. And it was started and continues to operate as an independent, decentralized, global volunteer project of concerned citizens, scientist and non-scientist alike.

SEAN BONNER: “[…] And then [in] March 2011 in Fukushima, this earthquake, tsunami and nuclear meltdown, like, triple thing happened, and everybody was very confused. We didn’t know what was going on and I had these connections in Tokyo, so I was reaching out to try to find out what’s going on.

Other people were scrambling around and nobody really knew what was happening, and the little bit of information that was starting to come out really made no sense. People would see a map like this would be published and, like, what does this even mean? I don’t even know. Nobody knew. And so I started talking to Joi again, and Joi introduced me to his friend Pieter, who lived in Tokyo and had lived in Tokyo for like 35 years and had family who was in one of the areas that got really severely impacted by the tsunami.

So we just started talking about how can we get some information together, because there’s no information available for people. Nobody knows what’s happening. And so we thought, “OK, let’s reach out to everybody we know. We’ve got to find somebody who knows something about this—the different pieces—and we can pull them together and, you know, continue this conversation somehow.” So we all reached out to whoever we might know that might have some connection.

And so, for me that looked like my hacker friends at their crash base in Los Angeles and at Tokyo hackerspace. My friend Matt Alt, who I had done the Toei website with, who was now living in Tokyo, and he helped translate a lot of the Japanese stuff that was coming out from the official news sources on stuff. Bunny, who I knew from hacker conferences and who jumped in and started helping us build hardware. Haian, who was a designer that I knew from Ideo, and she was creating visualizations with the data we were putting together. And Paul, who I knew from the Metro blog in Dublin who jumped in and started helping us write the back-end software to manage it all. And Joi and Pieter had the same sort of thing. They found all these people and pulled them together. And so we all got together and created this thing that ended up being this organization called “Safecast.”

At first, we just duct-taped some Geiger counters to car windows and started driving around and tried to get an idea of what was happening. And [we] realized that those measurements were changing much faster and it was a little bit of a different story than kind of these big averages that were being published by anything official. So we created a hardware and software platform and these little devices that have GPS on them and Geiger counters, and they take readings every five seconds and then upload it into this giant data-set. You could attach them to cars or bikes or anything and we could take them around. And so we started putting these maps together, and these circles are the evacuation zones.

So we started seeing this story where inside of the evacuation zones, maybe the levels weren’t necessarily that bad, but outside of the evacuation zones they were much worse. And this was kind of conflicting because there were most certainly situations where people had been moved from areas with low radiation into areas with high radiation and we didn’t quite get what was going on.

So seven years on, this is what our data looks like in that area. We really mapped out every street and created this absolutely perfect picture of what’s happening. But an important piece of this is that we’re not going and measuring. Rather we created the tools and the platform so that the people there can measure on their own. So these areas are being measured by the people who live there and are impacted by it and this really gave them a chance to have a say in what was going on with it. They got to measure stuff, they weren’t getting answers from other places. But it also had some very interesting real-world impacts in that it forced the officials to do something.

They actually changed the evacuation zones after we published this data showing that these things were different. And we expanded this out and this is the data we have for Japan. It’s basically every single street in Japan. We’ve measured time and time and time again. But it turns out that the data that wasn’t available in Japan, also [it] wasn’t available anywhere else in the world. Nobody had this kind of stuff, so we started reaching out to other people, and people in other places are measuring.

So this is what we have in Europe and this is what we have in the US. And you can see these, they’re—you know, Sony attached a sensor to a car and went on a drive down a road, right? This is what we have around the world. And obviously there’s some major holes that we still need to help fill in, but it’s getting there and it’s already the largest data-set that’s ever existed of its kind in any way. Almost 100 million data points. And we put all of the data into the public domain. And it’s actually growing faster all the time, it’s not slowing down in any way.

So, if you remember, I said that maybe some people will kind of participate once something gets going. I’ve learned through this that sometimes some is all you need. You don’t need everybody to do it, you just need some people who are gonna be active with it. And with Safecast I tried to build in these things that I noticed in all of these other things, where the people are independent on their own. We gave them the tools, we gave them the best practices, but they’re doing the thing on their own without any hindrance or control from outside on this.

There’s lots of different ways for people to help with the project. Some people are making visualizations, some people are collecting data, some people are building devices. All of these different things people can do with it and then again, it removes the reliance on some outside authority for the people in the areas that are measuring it.

But it’s not just about disaster and stuff. So this is Pieter—who I mentioned before—and a few years ago we went to Washington DC to put on a workshop about Safecast and what we’re doing with this.

So if you wanted to see the publicly available radiation data for Washington, DC, the day before our event, it would have looked like this. There’s absolutely nothing available. So we had this two-day event where people came in and they built their own sensors, found out how they worked, understood it, got their sensors up and running, and then we sent them out [to] walk around Washington DC and just measure stuff and then come back and we’ll put everything together. And so this was the data that was available just after one day of people walking around. We mapped out the whole city and found some interesting stuff. There’s, like, the World War II memorial over here that was built with very radioactive granite and all these different things that you might not have known otherwise and that was really cool for people.

But a much more interesting thing happened shortly thereafter, in that the US government released their data-set of radiation in Washington, DC, right? So they had this data, but since they were the only people that had the data they kept it secret and then as soon as there was another comprehensive data-set available there was no reason for them to keep it secret anymore and so they released it.

And so it’s this kind of thing where releasing this open data actually creates even more public data than we had our hands in at all, which is where people start throwing around these kind of words like “revolution.” Which is cool, but the result of that is that, you know, this does in fact change the world in all of these ways.

And so I’ve been talking a lot about radiation, but last year we actually started measuring air quality as well because that’s another thing that maybe if we’re putting sensors in it might be really useful to people. And so this is where we just put a bunch of them around Los Angeles last year and on the system right now you can see what’s happening right now, or five minutes ago, or historical over the last week, or over the last month. And you start seeing these trends, and where all this—and you start comparing the data from all the different sensors, and start kind of understanding what it is that people are breathing in the city.

So, to tie this back into the sort of citizen science idea right, I don’t really like separating this out like somehow “citizen science” is different than real science or something. Because if it’s valid science it’s valid science. It doesn’t matter who’s doing it as long as the results stand up.”

SOURCE: re:publica 2018 – Sean Bonner: Citizen Science and Environmental Data: Why Everybody beats Anybody

In some ways, Safecast is the fulfillment of the vision that Aaron Swartz laid out in the Guerilla Open Access Manifesto. Open access, open source data, extended peer review and other such proposals for reforming the practice of science do not offer the public the chance to peek behind the curtain at the doings of the scientists; they help tear down that curtain, and the distinction between scientists and the wider public generally.

But the story of Safecast also provides a key insight into why citizen science is needed now more than ever. From nuclear energy to genetically modified foods to vaccines to gene editing to nanotechnology to autonomous weapons, the debate over scientific knowledge and discoveries is increasingly important, and political. The pace of science in the 21st century is dizzying, and as science’s ability to transform our world accelerates, the debate over the proper place for these technologies in society is increasingly being handed over to the scientists themselves.

But this has the process exactly backwards. As philosophers of science like Andrea Saltelli and the co-authors of Science on the Verge point out, our naive conception of scientists as apolitical arbiters of truth is going to have to be adjusted to the reality of modern-day science before the entire process of scientific knowledge production is undermined.

JAMES CORBETT: In this day and age, science has become specialized on models and statistics in a way that, I think, in the popular conception of “folk science,” is not the central pursuit of ultimate truth. In what could be termed “folk science” or the “Cartesian dream”—[which] are a couple of terms that are used in Science On The Verge—people tend to think of science in a certain mindset, but obviously that doesn’t apply to the way that science is conducted these days. What can you tell us about that difference between the popular conception of science and the way it is actually practiced in modern policy settings?

ANDREA SALTELLI: For me, this is a core problem of modernity. There is really a chasm between how science is perceived by the general public and by the scientists themselves for a large majority. A kind of positivistic vision of science as an offspring of the Enlightenment, which is concerned with the production of fact separate from values and emotion, and science as objective, and so on and so forth—and hence a science capable of informing policy with the production of disinterested and objective knowledge—and the reality of what science is, and the many uses to which science is put—from the construction of algorithms, to visual intelligence, to the production of various kinds of chemicals which may or may not be extremely dangerous, opioids, neonicotinoids for pesticide, and then the chapter of military technology and so on and so forth. So we have a science today in the practice of the working scientist which is quite far from the vision of Enlightenment science, and I think this difference is a problem in the center and we should resolve it. Otherwise we risk having a very polarized discussion about science which can only have as a result a collapse of trust in science.

CORBETT: And of course that is part and parcel of that “Crisis of Science” that I was gesturing towards recently on the podcast. And I did note a specific line that jumped out at me from the preface of Science on the Verge, which was written by Daniel Sarewitz. He wrote, “The use of science in guiding human affairs is always a political act.” Now that’s a bold statement because again I think that rubs up against the conception—the sort of folk science conception—that science is completely value neutral and we’re just looking at facts and evidence about the world. But the use of science in guiding human affairs is always a political act. What does that mean in the modern context, where we’re dealing with such incredibly important matters that have policy implications for everyone around the globe?

SALTELLI: Well there is a long chain of consideration which should be put down there. [The] first one is even when we are talking about a simple piece of datum—as Jerry Ravetz writes in one of his early books—before a single datum is collected, a lot of the work has already been done by way of framing the problem, defining what it is that needs to be tackled and how it can be measured and so on and so forth. So, when the social scientists say that data or evidence is a result of a social construction, this doesn’t mean that this is arbitrary. It’s simply what it means. It’s the result of a negotiation, a social construction, but unfortunately there is—because of this positivism or neo-positivism very often found in natural sciences—this tendency to regard this as a dangerous intrusion of social sciences into natural sciences.

So that, for instance, typically—you may know that natural scientists strongly resent being the subject of study from the social sciences, when they go there as anthropologists and measure what science in action actually does, following the title of a famous book by Bruno Latour. So there is this kind of science war now always boiling in the underground, which makes this conversation a bit difficult, because if it were not for that, the idea that the production of evidence for policy is a political affair would be a no-brainer! Of course! Because not only do you have the datum, but then the datum becomes evidence, and then the evidence must be constructed as an argument. And this is not something which a policymaker does by himself, he does it with a scientist. So obviously it’s a highly political affair.

SOURCE: Interview 1424 – Andrea Saltelli on The Crisis of Science

If science is always a political act, then drawing a line around scientific activity and preserving it as the special domain of an elite cadre of specialists is itself an act of disenfranchisement. By pushing the public away from the scientific field, those with a political or corporate agenda to push can use their money to subvert the scientific process behind the scenes, and hide behind the ivory tower walls when the public questions the pronouncements of the scientists.

This is why open access, open data, open science is so feared by the status quo establishment, which benefits from the symbiotic relationship between big business, big government and big science.

None of this is to say that the expertise of trained scientists will no longer be needed as radically decentralized scientific endeavours like Safecast rise to the fore. But it is a sign that the public no longer has to sit on its hands and watch helplessly as an unquestioned and unquestionable priest class hoards their data and their findings for the benefit of the corporations and governments who foot their bill.

Given the immensity of the challenges we face as humanity pushes the boundaries of the possible in ever bolder ways, it’s easy for those on the sidelines to throw their hands up and leave this all for the scientists to sort out. Or, worse yet, to turn their backs on science and the scientific method altogether. But these problems are bigger than the scientific community, and their solutions will require all of us to engage in the process of redefining science and its place in society.

As concerned citizens, we either become part of the solution by engaging in the emergence of the open scientific community, or we become mere spectators as the big questions are increasingly asked and answered for us.
