Cristina Grasseni

The Anthropology of Innovation


Is responsible innovation participatory, and if so, how?

by Redazione FGB [1], 10 March 2007

Cristina Grasseni
Professor Galison [2], you have seen that, as a Foundation devoted to promoting responsible innovation, we have set out to try and answer a few questions, which we posted last year in a public call for comments [3]. What do you think about public participation in particular? And how does this relate to your most recent work?

Peter Galison
The European Commission is responsible for formulating a report every 5 years or so on many of its concerns. One of the branches of the European Commission is interested in the question of science in society. Through that branch of the EC, a group of us organized a meeting, well, we have a project that will last for several years, to think about new ways of understanding the relationship between science and politics, to try to figure out what we think about this and, in particular, to go beyond the first formulations of the way this question has been framed within the EC. In particular, we think that it has become somewhat routine to talk about the need for broader participation among the public, more public understanding of science, to smooth the way so that the kind of rough intersection between science and the public over questions like genetically modified organisms can somehow proceed in a better, smoother way. All this involves some of the issues that have surrounded the Precautionary Principle, which has become a popular matter of discussion within Europe, and broader questions of science, responsibility, accountability, transparency, regulation and so on.
We wanted to open up the discussion to thinking about new kinds of problems that haven't been part of how things have been discussed so far and to expand the domain of application beyond the hot issue of genetically modified organisms or just nuclear power and ‘yes or no', ‘it's a good idea or a bad idea'. So we got together a group of about 40 people from many different countries, mostly from History of Science background, but also some people from Philosophy, Sociology and Anthropology, to try to think about what we've learned about the past and how the relationship between science and politics in different contexts might shed light on the current situation and how to formulate new and, we thought/hoped, more interesting questions about going forward.
So, we had our first meeting in June 2006, and we grouped it into various categories involving science and managers, for instance, or the regulation of science, where we discussed among other things questions like secrecy and we thought it was quite successful in opening up some of these questions: questions from biomedicine, questions from physics, nuclear waste. A broad number of topics from many different domains.
More interesting than the topics, per se, was to try to expand the kind of questions we're asking. For example, ‘Is it really the case that expanding the number of people involved in formulating a consensus position always gives you a better position?' That shouldn't be axiomatic, that should be a question that we ask empirically. Are there different kinds of consensus building that are not merely a compromise among different groups? I think that this is just the beginning of a longer project. We're going to meet again at the Max Planck Institute for the History of Science in Berlin, probably with a smaller group now that we've had this broad-gauged discussion and try to focus our questions more specifically.

Can you maybe give some examples, especially about these last two points that you made? Is it actually the case that expanding the number of people involved improves the situation? Can you give some examples of cases in which it did not?

Well, an example that worries me somewhat is not from Europe but from the United States, where people, historians, engineers, sociologists, community activists... people are beginning to look at what happened in the Katrina hurricane that destroyed much of New Orleans. Many of the systems of dykes that were established there, and that failed during the hurricane, were built quite a while ago. When they were built, there were representatives from many different stakeholders, as they like to say in the United States: from the City, from the State, the Federal Government, the Corps of Engineers, and others. For example, the people on the land near the canals didn't want these dykes to be too big, because that would use up some of their land and make the use of their property more difficult or less valuable, and there were many other people arguing different things: how to balance cost effectiveness, and so on. It's pretty clear that the compromise that was reached by these groups was, in the end, rather catastrophic. So it would be interesting to look in detail at the process by which that compromise was arrived at, and to try to understand, in this case where a lot of different groups were involved and compromises were made, whether certain compromises might have put in danger the solidity of the dyke construction.
So it's not to say that involving the public is necessarily a bad thing. I don't mean that at all, but it means that we have to look carefully at the system of checks, compromises, and how transparency works and what groups are there to oversee the safety of large-scale public scientific and technical constructions. All these questions seem to me to be very important for us and it's not enough to simply invoke the Precautionary Principle here. It's a difficult problem. I worry that a lot of the discussion within the European Union has been overly hopeful, in the sense that it seems to rely a lot on an almost reflexive confidence that if you involve a lot more people you'll always get a better solution. It sounds rather idealistic to me.

Maybe the crux of the question is how to learn from past experience, how to learn from these mistakes. Maybe this ties into what you just said about looking into different kinds of consensus building. Do you think, and this ties in with the mission and, if you like, with the preoccupation of the Foundation, which is that of asking ourselves: can we only deliberate about responsibility ex post facto, after something has gone wrong? Or can we actually learn to build scenarios in which it is clear how responsibility is woven into the application of an innovation? Maybe the critique of the Precautionary Principle has something to do with this. I just wondered, for instance, whether from the conference you just mentioned you have come out with some preliminary ideas about the different kinds of consensus building that would allow us to create this scenario?

I think we will, we've just heard 40 different positions on these things and I think that we're going to meet again, those of us who are organizing this, to try to group together responses in different ways to try to synthesize these results. I don't want to speak for the whole of the group quite yet. We're still trying to figure this out. I think that we can learn from the past, not so much because I think that's the only time we can assign responsibility, but because we can look at some different models of how decisions about large-scale technical and scientific efforts have been made.
Shall we say it positively? I think it is possible for people, and groups, to have responsibility for aspects of scientific and technical projects, even in advance. But I think when one is grouping together the many different structures that go into making a large technical decision (regulatory agencies, scientists, engineers, chemists, geologists, biologists and so on) one of the real difficulties is knowing which groups should appropriately have a major say and which groups should not.
Let me give you an example in the case of nuclear waste at Hanford, which is one of the big nuclear weapons sites in the U.S., in Washington State. It was there that they produced plutonium for one of the first two bombs that were dropped, the bomb that was dropped on Nagasaki in particular. The site was used for many years after that. It produced an enormous amount of waste, because you use very toxic chemicals to separate plutonium from uranium, and the plutonium itself is extremely long-lived as radioactive waste. There were various estimates about how long it would take to reach the Columbia River, which was a very sensitive and important waterway and ecological region within the state. Probably the biggest single problem that they had was that the physicists, who knew a lot about radioactivity, probably much more than any other group, were not so good at understanding what happened in the ground water, in the ground, the rock, the sand that separated the plant from the river by a number of miles. So the models that they made didn't capture some of the details of the rock formations. They thought it would take hundreds or maybe thousands of years for the waste to reach the Columbia River, and it got there within a couple of years. This was a big shock. It turns out that the people who really know about the movement of water through the ground are geologists. This isn't very surprising, I suppose, in retrospect, but at the time they made a wrong call. They thought, ‘We can model this. We know how to model systems of fluid flow.' They made a model which in abstract physics terms was very good, but it didn't correspond to the fact that there were tiny little fissures inside the rocks. Every time radioactive waste got into one of these fissures, it zipped right through the rock much faster than they expected. I think the example has a more general applicability, which is trying to figure out who knows about different aspects of these systems.
That's a difficult problem to know in advance; the physicists really thought they understood this. A disaster like Katrina can be thought of in much the same terms. That is to say, it's a very complicated system. It involves land use, the properties of these dykes, weather systems, tide systems, the effect on storm surge of cutting down a lot of the wetlands outside of New Orleans in order to get the oil tankers out, and so on. It had many, many features, ranging from the global climate situation to the local plant structure in the water near New Orleans. So I think one of the hardest aspects is figuring out who one really needs in these complex systems, then entrusting the right groups with the relevant aspects of these decisions, and finally integrating them. These are very difficult problems. But they don't seem to be identical with merely saying, ‘Everybody should have a voice,' or with the kind of terms we've become used to, which are rather unsophisticated, like, ‘The public doesn't understand science, so what we need is more popularisation,' or, ‘The best way to manage disagreements among constituent groups is to compromise'.

I think you've hit on the interest of the Foundation, especially when you underline how important it is to model complex scenarios, first of all, and how this is a necessary step in order to be able to allocate responsibility within a complex system; also the fact that this ties in with the distribution of knowledge within that complex system. I just wonder how this applies to your most recent work on secrecy. Here you have a complex system which, very importantly, also integrates the agency of states, of state operators, and so sits within a political agenda, and I wonder how much it ties in with the idea of surveillance.

Let me start further back, with how I got interested in this, because I know you know this earlier work that I've done, the film on the hydrogen bomb, and I've also done other writing on that with the historian Barton Bernstein. I was interested in the H-bomb because it seemed to me, as complex and potentially deadly as it was, to be a relatively constrained, that is to say, a relatively focused question around a relatively small number of actors, compared to some of these other questions. In particular the issue was this: in the atomic bomb project during WWII, there was very little dissent, really. There were a lot of reasons for this. Partly because it was a weapon built in the middle of an all-out war, partly because the physicists, chemists and metallurgists who were involved in it were very young and didn't have a lot of worldly experience, partly because there was already bombing on an unbelievable scale. In a war where 50 million people were killed, the only thing people could think of was how to get the bomb before the Nazis did. And so during the atomic bomb project there was not a lot of dissent. There were a few groups late in the war, mainly in Chicago, who tried to get a demonstration test conducted on an area outside of cities, a test done in a way that would impress the Japanese generals but not destroy a city. They were not successful in that idea, for various reasons. But, on the whole, there was a consensus that this had to be done and that it had to be done as quickly as possible.
In the 1960s, 70s, 80s, 90s and in our current situation, nuclear weapon development, maintenance, and so on is done, in many of the countries that do it, mostly protected by a veil of secrecy and outside of the civilian arena. These are people who work almost entirely within the separate sphere of weapons development and maintenance. And that's true in France, Britain, China, Russia and the newer nuclear countries, like Pakistan and India. So there was a very brief window, after WWII until about 1954, when the people who were very important in building the new kinds of weapons, this proposal to build the hydrogen bomb, were both leading scientists in the civilian world, not under the constraints of war, and had the autonomy and the authority to speak on these questions, and the experience of knowing that they had that authority, not only in their civilian lives. Hans Bethe, for instance, won the Nobel Prize for figuring out why the sun shines, but was also the head of the theory group at Los Alamos. So he had this double authority. That was true for many of the others: Oppenheimer, Teller. These were very important civilian scientists and also people who had fundamental and, indeed, irreplaceable knowledge of nuclear weapons in those years. I was interested in how this relatively small community, and the expanded number around them within the government, high officials in the military who were cleared to know about this, the government officials around President Truman, and then the public, though the public came in somewhat later; I was interested in understanding how this community struggled over whether it was a morally and politically good idea to build this complicated new weapon that would be a thousand times more powerful than an atomic bomb. Some of them thought that if the Soviet Union got it first it would be a catastrophe for Europe and the U.S. and the Soviets would dominate the world; others thought that if the U.S. built this weapon it would lead to an arms race that could annihilate humankind.
So, the stakes couldn't have been higher. Each side thought that they were defending the future of the world and yet they came from a relatively coherent community, people with basically the same physics background, the same experience in WWII. They were comparable; they weren't Martians and Venusians, right? They were a group of physicists who spoke roughly the same language and not just English, but the language of Physics, the language of weapons-making.

They shared a community.

They shared a community. Many of them had been quite good friends and, in fact, they split over this issue very, very, very strongly. Out of it, for instance, came the so-called Oppenheimer Trial of April 1954, which destroyed him professionally, and probably personally too, led by people who, like Teller, had been very friendly with him. Teller and Bethe used to go on vacations together. They were very close friends. So it was a very big personal as well as political division; a moral division, a political division. This is a case that bears exactly on the question: a new technology, a weapons technology, and the question of the scientists' responsibility, what their role was, what secrecy meant, those deliberations, very high stakes. It seemed to me a very interesting place to look for a clash, really a sort of unique moment, in the sense that it came after the all-out war of WWII and before the Cold War set in place an apparatus that made this kind of debate very difficult.

How was this developed into your later work? And how do you think it relates to the issue of responsibility?

Well, in the case of the H-bomb, everyone talked about the scientists' responsibility. All of them thought that they held a particular responsibility, but they framed it in different ways. The physicists understood the consequences of nuclear weapons particularly well; they understood better than anyone what had happened at Hiroshima and Nagasaki, and they understood, all too graphically, what would happen if one were to build a weapon a thousand times bigger and what it would mean for the larger cities: instead of destroying one square kilometre, as in Hiroshima or Nagasaki, it would destroy the totality of a very large city like Moscow, London or New York. So they saw this as being their responsibility. The argument was that they had a particular responsibility because of what they had done, what they understood and what they could forecast about the future. Other people, especially some like Teller, who wanted desperately to have the H-bomb built because he thought it was the only way to save the world against Soviet aggression, said, ‘Scientists understand the technical aspects; they have a responsibility to make this weapon, if it can be made, based on the decision of the elected leaders of the country'. So according to him it was arrogant of the scientists to think that they could take responsibility for not building the weapon. He and others said, ‘Can you imagine if the Russians threatened, or worse, used an H-bomb, and the scientists then went to the people, in the first instance in the United States, and said: Sorry, this country doesn't have one because we decided it was immoral and we didn't feel comfortable building it?' And he said that was an impossible moral-political thing to do.
Another argument that they made, the people for the H-bomb, was that anything of this kind that is physically possible eventually will get made by somebody. The idea is that humankind is somehow going to explore technology; technology, in some way, almost advances by itself. No one is going to just stop thinking about these things. It's the kind of inertia of technical development, and given that, they ought to do it: understand it, and do it first.
So it was really a battle over understanding what the scientists' responsibilities were. Do their special skills make it particularly incumbent on them to block things, or is that an arrogation of power that's inappropriate in the face of either the decisions of elected officials or the natural momentum of technological and scientific systems? I think that, in that sense, this case raises problems that occur again and again. And you can imagine the same terms of debate, maybe not in Cold War terms, but with somebody saying, ‘If genetically modified crops can be done, they will be done, so we should understand them as much as possible. That's our obligation as scientists.' And other people saying, ‘No, we understand the dangers; this is exactly what we should not do'. And so on. I think that in many cases these sorts of questions arise.
Now, one of the things I got interested in was secrecy, and how this debate occurred in a secret bubble for a long time. Physicists started thinking about the H-bomb as early as 1942. It only became public in 1950, and when it did it created a huge, huge debate. All over the world. It was discussed in newspapers, on radio shows; Einstein appeared on television. Bethe appeared on television. Television was relatively new, so this was a big deal. People were furious about it. There was a great deal of fear, and argument, over this. It was at the height of McCarthyism in the U.S.; it was a big deal. As I said, it led to the Oppenheimer trial; it led to the second laboratory for nuclear weapons in the U.S., because some of the scientists felt that Los Alamos was dragging its feet. It had major consequences for the arms race. Of course, the Soviet Union was working all out to build the H-bomb, too. There, obviously, the debate didn't exist, because if you argued against such a thing, Beria, who was in charge of the project, would have you shot. So that simplified the discussion.
But it made me think about secrecy, and thinking about the history of nuclear weapons has always been, for me, entwined with the idea of how secrecy functions: Can secrecy work? Does secrecy work? How does it work? What does it mean to have knowledge about science that can't be shared? What is a lab? How do laboratories work under these circumstances? How did we come to the current circumstances, with so much secrecy in the world? With Robb Moss, a filmmaker at Harvard with whom I teach a course on film and science, we started to think about making a film about this least-filmable of subjects. We're interested in making not so much a completely historical film, but a film that starts with our current situation of having so much secrecy in the world and tries to understand how we came to be in this circumstance in the modern period, that is to say, from the time of the Manhattan Project on atomic weapons, through the Cold War period, to the period of counter-terrorism and terrorism. Our current situation.
Here we have, in some ways, a much more complicated problem. The nuclear weapons establishment is much larger than it was; it's one shared by 7 or 8 different countries, the danger of proliferation is extreme; instead of there being 1 or 2 nuclear weapons in the world there are now tens of thousands of nuclear weapons in the world. There are weapons of every conceivable kind, and it's really a problem from hell about how to control these things, even what it means to control these weapons when you have hundreds of thousands of people working on them and handling them. More broadly, how did the system around nuclear weapons come to include so much else in our lives: surveillance, monitoring of phone calls, financial transactions, travel, property ownership, communication and otherwise? I've been interested in these problems.

So in a sense, the hot stuff is no longer plutonium, but it's information in itself. I just wonder how much you can disclose about this scenario that you've come to work on, if you can tell me more about it or not. I don't know how secretive you want to be about it!...

No, I'm not secretive about it. No, I'm happy to tell you about the film. Again, my idea in the H-bomb film is still something I care about, which is to present a film not in the classical form of the public understanding of science, of simply making scientific ideas clear, nor to make a film about triumph: the initial idea, the setbacks, and then the triumph of a certain scientific formulation. Instead my idea is to try to show conflict, and in such a way that people watching the film can understand enough about an issue to feel the force of the arguments on different sides, not even necessarily just on two sides; to engage in something that is current and fundamental and useful for them in reasoning about problems. Even when we do have discussions about something like climate change, genetically modified organisms, cloning, the big topics, and many others that we know, they tend to be either rather pedantic and pedagogically oriented, or triumphalist, or propagandistic. I don't think that we can function as a democracy if people are spoon-fed this kind of information. I really believe, both as someone involved in the humanities and social sciences side of things and from the scientific side, that the issues at hand that are technological and scientific are perfectly understandable to people who are not scientists and engineers, as long as they're explained without jargon.
I think people often make the mistake of thinking that what's needed is simplification, when in fact what's needed is the removal of jargon. I think that most people, and this has been my experience in many of the things that I have done, are very sophisticated about arguments. I actually have a lot of faith in the thoughtfulness of our fellow citizens, and I think that it's a huge error to mistake jargon for complexity. Jargon makes it impossible for people to understand things, not because the subject is complicated, but because it's insulated from them by terms they don't know. I think that's true even of non-controversial scientific ideas like relativity or quantum mechanics, or what a black hole is. These things can be explained. You can explain a hard scientific controversy if you cut out the jargon. You can do it using terms that have real sophistication and that are respectful of the positions opposed to the position that you're making a film, writing an article or giving a speech about. So that's really the premise for all of my work, the sort of political side of my work.
In the secrecy project we've interviewed quite a number of people who work for the Central Intelligence Agency, for the National Security Agency, as well as journalists and people who are activists from the Federation of American Scientists, for example, against secrecy. We've interviewed people who are assembling archives of declassified documents so that people will have an opportunity to better understand aspects of historical and current events. Our idea in the film is to look at the increase of secrecy to try to understand how it came to function the way it does, to put forward the best and strongest arguments for certain kinds of secrecy and for openness in various ways. My own view in many of these debates is that one of the most important aspects of the current discussions about secrecy is the necessity of oversight. And that there be mechanisms that are set in place that make sure that when people, say a government agency, for instance, wants to take a certain action there is some independent group that is evaluating this.
Sometimes, this independent group may itself have to operate under secrecy. But just the fact that you have to face somebody who is not a member of your immediate agency, group or culture has a huge effect...

...in terms of self-censoring.

Yes. And we know that from other walks of life. For instance, we have courts that operate under secrecy. All of the countries in Western Europe and the U.S. do. There's secrecy for different reasons. We have sexual crimes, for instance, where we expect, in fact we demand, that the courts will respect a certain degree of privacy about the name of a child who has been abused, for instance, or the details of what happened, in order to protect the child. But we expect the courts to be able to deliberate and make these decisions under the law while maintaining restrictions on the flow of information. We have cases of libel where we expect restrictions on the distribution of information, but we expect our courts to function. We have courts that have to deal with arguments over proprietary knowledge. If one firm steals the plans for building a radio from another firm, the information cannot be plastered on the front page of every European newspaper, or it would aggravate the crime that has already occurred. So we expect our courts to be able to handle proprietary information, but to do so under the law and independently of the companies involved.
In the U.S. we have what is called the FISA (Foreign Intelligence Surveillance Act) court, which government agencies have to go to when they're running an intelligence operation and want to do certain things. They have to go to the court and say, ‘We want to do surveillance on this'. Now, attempts to go around that court have been at the heart of some of the big disputes in the United States over the last years. But the court that was established in the late 1970s has played a very important role and has handled tens of thousands of cases. These are all examples of oversight that's not completely public, for very different reasons, but they're important, and if we let the oversight function fall by the wayside, that's a tremendous risk. I think it's very dangerous and bad for government. Those procedures must be followed. Then there are other issues involved that are very complicated and which I don't think admit a simple solution.
For instance, in every Western European country and in the United States, there are times when the newspapers want to publish something and the government doesn't, involving an intelligence operation or actions by the government that are secret for other reasons. Each country handles this in different ways. In many European countries it's handled by a kind of self-censorship on the part of newspapers, radio and television. Sometimes it's handled by the fact that leaders of the media went to school with the people in the government, and they get a phone call one evening that says, ‘We don't want you to publish that. We hear you're going to do that and we don't want you to. Don't do it.' And they don't. In Britain, the Official Secrets Act has tremendous power over what can be disclosed, and it's taken very seriously. In most European countries historians won't touch the history of their country's nuclear weapons programs. There is no serious history of British nuclear weapons in Britain; there's none in France; it's completely typical. There's no good history of the German intelligence agencies or the French intelligence agencies. So there are a lot of things that simply don't get discussed. In the U.S. there has been a great debate because of exposures of things, and sometimes the things that get exposed are abuses. In those cases, if you ask me personally, I think they should be exposed. There are cases where something is exposed that is purely a method of collecting what most people would recognize as legitimate intelligence information, and that seems relatively clear; I mean, it's clear to me, but it's not clear to many journalists, who don't think there should be any limits. I think there are cases when it would be a bad idea to expose information. Then there are cases that are very complicated, where some people see them as abuse and some people see them as legitimate intelligence.

And, to conclude, do you give examples in your film?

Yes, we do. We have people who debate these issues. So my hope is that this film will serve, again, like the H-bomb film, as something that treats these disagreements seriously and isn't simply a mocking position where one parodies one side or the other. What I'd like to see more of, in areas that other people know much more about than I do, is the presentation of some of these debates in ways that give the most forceful, best possible presentation of the different positions on an issue, so that more and more we, and I include myself, get to know the major technical, scientific, moral and political arguments that bear on us.
Because I think that as we go forward over the next years, more of the decisions that we have to make within a democracy are going to have technical and scientific components. Our political systems are not designed for that. We have a pretty good idea of how to argue out in public topics such as whether taxes should be increased; we don't always do it in a fair or good way, but we have a pretty good idea of how to do that. We have a pretty good idea about discussions of the more typical political sort, but not about how to discuss questions such as genetically modified organisms or climate change or cloning or nuclear waste or nanotechnology, and so on. We're really bad at that, and, to come full circle now, simply saying, ‘Get everybody involved and have a kind of town meeting where everybody's opinion comes in and we all compromise and that'll be the best solution' is hopelessly naive. It really doesn't get at the question of how you can put people in a position where they can make informed decisions about what's going on. I think that the leftover vocabulary from 1968 or 1986 is just not good enough.


Links in this document:

  1. /schedabiografica/Redazione FGB
  2. /it/pagine/2007/09/peter_galison.html
  3. /en/grasseni/2006/03/quale_responsabilita_parte_2_c.html