The description of CERN in "Growing through Sabotage"

Post your research in political economy, share ideas and workshop your empirical work.

Moderator: sanha926


Postby YGodler » Fri Apr 27, 2018 5:18 am

In "Growing through Sabotage" Bichler and Nitzan rely on Ulf Martin's account of CERN (based in turn on Knorr-Cetina's account), and refer to it as characterized by only a weak impulse to command and as organizationally "flat" (pp. 28-29).
But physicist and radical thinker Jeff Schmidt has provided a very different picture of CERN in his book "Disciplined Minds". On p. 90 Schmidt writes that:

"...In particle physics, for example, it is not unusual for several hundred physicists from several dozen institutions to collaborate on a single experiment at an accelerator laboratory. Thus, one routine paper in Physics Letters, reporting results from a standard-model experiment at the European particle accelerator laboratory CERN, has 562 authors from 39 universities and government laboratories. (The names under the title consume three of the article's nine pages.) Each institutional group of physicists contributes a component of the experiment, and individual physicists within each group specialize even further. A professor might be preoccupied with budgets, production schedules or purchasing decisions. A research associate or postdoc might work on an electronics system, part of a particle detector or a computer program for receiving data. One might find a graduate student calibrating photomultiplier tubes, soldering the hundreds of wires in a wire-chamber particle detector, tracing impulses through an electronics system or sitting in front of a computer terminal debugging a program. In any case, the typical physicist in the collaboration does work in which no amount of creativity could significantly influence the overall course of the experiment. And with narrow work assignments, individual initiative is more likely to cross boundaries and to be seen as intrusive, even if it makes sense scientifically."

For further reading on the issues raised in this paragraph Schmidt refers to:
Pickering, Andrew R., and W. Peter Trower. "Sociological problems of high-energy physics." Nature 318 (1985): 243-245.

I think this is not simply a matter of exchanging footnotes or comparing quotes. The CERN example plays an important role in "Growing through Sabotage" in establishing that "[H]ierarchies...are not necessarily the most effective way to convert energy" (p. 29); but if the CERN example is actually hierarchical, then one would need a better example of flat coordination to sustain Bichler and Nitzan's tentative conclusion. Schmidt, in contrast, argues that the "extreme specialization" and "hierarchical division of labor" (p. 90) at CERN lead to individual initiative and creativity being discouraged. Of course, Schmidt's description could be wrong or could operate at a different level of analysis, but at least prima facie this contrasting account of CERN should be addressed if Bichler and Nitzan's tentative conclusion is to be entertained.

Re: The description of CERN in "Growing through Sabotage"

Postby uma » Mon May 07, 2018 2:27 am

Hi. Since I brought this CERN stuff up, here is a short comment; what I have in mind will eventually require a longer elaboration. I've put Jeff Schmidt on my reading list.

Hierarchy

(Jeff Schmidt) It is not unusual for several hundred physicists from several dozen institutions to collaborate on a single experiment at an accelerator laboratory. Thus, one routine paper in Physics Letters, reporting results from a standard-model experiment at the European particle accelerator laboratory CERN, has 562 authors from 39 universities and government laboratories.


Indeed, look at the announcement of the discovery of the Higgs boson by the ATLAS Collaboration [1]. The 2932 members of the collaboration at that time are listed at the end (pages 25 to 32). The rule is that once someone becomes a member of the collaboration, their name appears on all of its publications starting three months later, and continues to appear until three months after that person has left. Also observe that the names are ordered alphabetically: there is no marking out of heads of the collaboration versus lower orders. Alphabetical order is what any dictionary does, and just as in dictionaries it does not imply importance (words beginning with "Z" are no less important than those beginning with "A", even though they appear later). The idea behind this way of attribution is that nobody can claim priority, because everybody's work is equally important to the collaboration, whether it is a great innovation or some bug fixing, theoretical or practical, and so on. I cannot imagine a less hierarchical attribution system.

[1] The ATLAS Collaboration. Observation of a new particle in the search for the Standard Model Higgs boson with the ATLAS detector at the LHC. Physics Letters B, 716(1):1-29, September 2012.
http://arxiv.org/abs/1207.7214.

In contrast, in the social sciences the order of names does imply importance, which is why the institution's head often comes first even if his postdoc did all the work alone. It is the social sciences that seem to have a hierarchical attitude. And note that the institutions are often two orders of magnitude smaller (say 20 to 30 persons) than the ATLAS Collaboration. (NB: I think this is why Bichler and Nitzan sometimes order their names Bichler/Nitzan and sometimes Nitzan/Bichler, in order to communicate equality; but it has the side effect that their publications appear at two positions in the reference list instead of just one.)
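The attribution rule described above — names listed alphabetically, appearing three months after a member joins and disappearing three months after they leave — can be sketched as follows. This is only an illustration: the names, dates, the `author_list` function, and the 90-day approximation of "three months" are my own, not ATLAS's actual records or software.

```python
from datetime import date, timedelta

GRACE = timedelta(days=90)  # rough stand-in for "three months"

def author_list(members, pub_date):
    """Alphabetical author list for a publication on pub_date.

    Each member is (name, joined, left); left is None while the
    person is still in the collaboration. A name appears from three
    months after joining until three months after leaving.
    """
    names = [
        name
        for name, joined, left in members
        if joined + GRACE <= pub_date
        and (left is None or pub_date <= left + GRACE)
    ]
    # Alphabetical order: no distinction between heads and lower orders.
    return sorted(names)

# Hypothetical membership records for illustration only.
members = [
    ("Aad, G.", date(2010, 1, 1), None),
    ("Zimmermann, S.", date(2012, 5, 1), None),          # joined < 3 months ago
    ("Abbott, B.", date(2009, 6, 1), date(2012, 1, 1)),  # left > 3 months ago
]
print(author_list(members, date(2012, 7, 4)))
```

Run against the date of the Higgs announcement, only the long-standing member appears: the recent joiner and the long-departed member are both filtered out by the three-month rule.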

(Jeff Schmidt) Each institutional group of physicists contributes a component of the experiment, and individual physicists within each group specialize even further. A professor might be preoccupied with budgets, production schedules or purchasing decisions. A research associate or postdoc might work on an electronics system, part of a particle detector or a computer program for receiving data. One might find a graduate student calibrating photomultiplier tubes, soldering the hundreds of wires in a wire-chamber particle detector, tracing impulses through an electronics system or sitting in front of a computer terminal debugging a program.


What Schmidt calls extreme specialization I call super-specialization: nobody really knows what the others do. But this is precisely Knorr-Cetina's point about why there is so little hierarchy: because nobody understands enough of the others' work, nobody could give them orders [2]. The quoted passage from Schmidt in fact supports this view. It also supports the meritocratic aspect of entry to the collaboration: candidates need to prove that they can make unique contributions to the project and that they are world-class in their field, but at the same time ready to do shift work in the control room [2].

[2] Johann Grolle. Ein Urknall auf Erden. Spiegel, 2008(27):102-113, 2008.
http://hep.ph.liv.ac.uk/%7Emklein/lhcPR ... _Erden.pdf.
(Sorry for giving just a German reference at this point.)

The hierarchical division of labor of which Schmidt speaks is difficult to make sense of in this context. Any large artifact is of course technically split into modules, these into submodules, sub-submodules and so on, with the creation of each sub-...-module demanding specialized work. For example, a house is sketched by the architect and calculated by the structural engineers; there is a construction supervisor who manages the general coordination, a team that installs the heating system, and so on. But that does not imply a political hierarchy as to who decides what to do at all levels, and especially not as to whether to build the house at all.

Creativity

(Creativity is overrated.)

(Jeff Schmidt) In any case, the typical physicist in the collaboration does work in which no amount of creativity could significantly influence the overall course of the experiment. And with narrow work assignments, individual initiative is more likely to cross boundaries and to be seen as intrusive, even if it makes sense scientifically.


I have a nice example of what Schmidt is talking about: the ROOT framework for data analysis, created at CERN and used there and at many other physics institutions [3]. I investigated this because I was comparing data analysis frameworks, and I thought: if it is used in such an experiment (and is free software), it might be a nice choice. But then there is the critique that it is actually rather badly designed, from a usability as well as a software engineering point of view [4-6]. Indeed, we (are said to) live in the age of "big data", and ROOT is made for analysing really big data sets; yet no one outside the physics community seems to use it (unlike the many other frameworks that have appeared in the free software scene). Only with version 7 do the maintainers of ROOT seem to be addressing some of the most severe criticisms concerning fundamental design issues, at the price of introducing backward-incompatible changes. So here you have it: a part of the system which everybody agrees has severe defects but which cannot simply be changed.

[3] CERN. ROOT a Data Analysis Framework. https://root.cern.ch/, 2018.
[4] Andy Buckley. The problem with ROOT (a.k.a. The ROOT of all Evil). http://insectnation.org/articles/proble ... -root.html, August 2007.
[5] Hacker News. ROOT - Data Analysis Framework: Comments.
https://news.ycombinator.com/item?id=11529660, 2016.
[6] Quora. Why does CERN use ROOT?
https://www.quora.com/Why-does-CERN-use-ROOT?share=1, 2016.

The ATLAS experiment (and of course its sister, the CMS experiment!) is a big research machine, conceived in the mid-1990s. And it is vastly complex, perhaps the most complex human artifact ever created, both in its conceptualisation (beginning with quantum physics as its foundation) and in its technological execution. In this respect it dwarfs cathedrals, Indian temples, the Alhambra, or anything created before.
Building an artifact like this requires that decisions be taken at every point on how to proceed. And any decision taken rules out all the other possibilities that may have been considered, or that are only recognised afterward. This is called path dependency. It is a matter of fact in human life: you cannot undo the past.

Once the big artifact is created, its operation relies on continuous maintenance. There is not much room anymore for creative changes that would introduce fundamental breaks or a chain of other reworkings (of unknown outcomes). For example, the ROOT framework is part and parcel of the data analysis done at CERN; everybody there uses it (all the graphs of [1] were made with it), so it is not something you can just change because it is bad software engineering (computer programming in general is mostly bug fixing and maintenance, whether in free or closed-source software). This is why a conservative attitude towards creativity is actually rational.

Path dependency is a matter of life in society in general. Take the frequent uprisings in the banlieues of France. In the 1950s the French administration decided to solve the housing problem by building huge suburbs of urban housing; now there are big social problems, but you cannot simply undo the wrong decision of half a century ago. Similar arguments could be made for the urban sprawl caused by the scrapping of public transport in the US, or for the Autobahn-based transportation system in Germany.

Since hierarchy and creativity are the two points in the Jeff Schmidt quotation, we get the following

Summary

The Jeff Schmidt quotation does not indicate a problem with hierarchy but rather supports the views of Knorr-Cetina on this point (I have other issues with her work). And the alleged uncreativity of the work at CERN is actually a matter of life as such, namely the fact that artifacts, once created, need to continue to function and be maintained; it has nothing to do with what makes CERN special or interesting for CasP.

On the other hand, the intention is neither to idealise CERN or any other human activity in which we might glimpse aspects of an autonomous postcapitalist world, nor to take CERN as a model for such a world. (The arguments are similar to those Castoriadis made for why Athens is not a model for the autonomous society but remains a most important inspiration, both for the issues raised (questions asked) and for the solutions institutionalised (answers found) [7-10].) And indeed, there are problems with modern science and its institutionalisation (at CERN and elsewhere), some of which have to do with the fact that science takes place in the capitalist world, and some perhaps with its particular type of rationality. These are questions that would need to be taken up in a more elaborate treatment.

[7] Cornelius Castoriadis. The Greek Polis and the Creation of Democracy. In Philosophy, Politics, Autonomy, pages 81-123. Oxford University Press, New York, 1991 (1982).
[8] Cornelius Castoriadis. The Athenian Democracy: False and True Questions. In The Rising Tide of Insignificancy (The Big Sleep), pages 311-328, 2003. http://www.notbored.org/RTI.html
[9] Cornelius Castoriadis. Done and To Be Done. In The Castoriadis Reader, pages 361-417. Blackwell, Oxford, 1997 (1989). https://becomingpoor.files.wordpress.co ... reader.pdf
[10] Cornelius Castoriadis. What Democracy? In Figures of the Thinkable, pages 195-246. Notbored.org, Brooklyn, NY, 2005 (1990). http://www.notbored.org/FTPK.pdf

Cheers, Ulf

Re: The description of CERN in "Growing through Sabotage"

Postby YGodler » Tue May 08, 2018 10:29 am

Hi Ulf,

Many thanks for the thoughtful and detailed response, as well as for the reading recommendations. I learned a lot from them and will continue to.

Nonetheless, I have a few follow-up questions.

You write

The idea behind this way of attribution is that nobody can claim priority, because everybody's work is equally important to the collaboration, whether it is a great innovation or some bug fixing, theoretical or practical, and so on. I cannot imagine a less hierarchical attribution system.


But wasn't Bichler and Nitzan's point more about the lack of a command hierarchy in decision-making [e.g. "...so there is little room for 'command' to begin with (who could tell them what to do?)", p. 28], rather than about the absence of hierarchy in the attribution system?

At least in principle, the attribution system can be non-hierarchical but the decision-making process may be hierarchical nonetheless.


The insight about path dependency is interesting, but contrary to how I understand your concluding remarks, I think it might have severe implications for CasP.

You write on the one hand that

Once the big artifact is created, its operation relies on continuous maintenance. There is not much room anymore for creative changes that would introduce fundamental breaks or a chain of other reworkings (of unknown outcomes).


But then you conclude that

...the alleged uncreativity of the work at CERN is actually a matter of life as such, namely the fact that artifacts, once created, need to continue to function and be maintained; it has nothing to do with what makes CERN special or interesting for CasP.


I want to focus on the relevance (or lack thereof) of this uncreativity to CasP. If we grant that this kind of uncreativity is the necessary product of the creation of any big artifact, and juxtapose this insight against the dialectic between power and creativity on which CasP seems to rest, doesn't this mean that we are doomed to sacrifice creativity even in (arguably) non-hierarchical forms of social organization? If so, creativity can be constrained by conditions other than power.


Best, Yigal