Science in the twenty-first century

27 July 2021


The State Treasury, or the Hearth Cities, Jan Toorop, 1895. Credits: Rijks Museum

A momentary and felicitous time fault has made possible internet access to the history of science during the whole of the twenty-first century, as written at the beginning of the twenty-third. Owing to the intrinsic uncertainty of the future, two different versions have been retrieved. The first is a story of downfall, precipitated by the world Catastrophe of the mid-21st century, followed by a cautious but hopeful recovery in a newly emerging society, freed from the market economy and military power. The second, on the contrary, shows a qualitative transformation of science under the increasing pressure of economic demands, leading to a temporary eclipse of fundamental research, overshadowed by technical applications. One may only hope that other possible scenarios will become available.

Wishing to work on the history of science at the beginning of our 21st century, I recently sought to gather some bibliography on the internet. Imagine my surprise at finding, among works by contemporary researchers, a strange document dated 2221. Reading its English summary, offered on the site, quickly convinced me that the date was not a typographical error, and that I was dealing with a text from the future, written two centuries from now, which had reached us through an exceptional temporal fault that allowed this retrotransmission. On downloading it, I noticed that it was written in a language I could not understand. A fellow linguist identified it as a strange variant of Songhay (the place of publication indicated was Timbuktu), which she nevertheless managed to translate. It is this text that is given first below.


A few days later, I tried to consult the same site for further study. But, to my surprise, the text had completely changed, except for the first paragraph. Now published in Singapore and printed in Chinese ideograms, it was written in a somewhat deviant Mandarin, but likewise translatable. Its content was quite different, as you will see when you read this second version. My subsequent attempts to explore the site all failed on the fateful screen display: "Error 404, file not found". Clearly, the temporal rift of which I had been the happy beneficiary had closed.


At least, having two versions of the future, I may reassure myself that the possibility of either one does not guarantee its effective realisation, and that the future is not predetermined. Let us hope, furthermore, that new time faults will make alternative histories available.



I. The 21st century: downfall and renascence


Extract from the collective work Introduction to the History of Science, University Press of Timbuktu, 2221.


Summary of the previous chapter


We have seen that the 20th century witnessed a remarkable development of the natural sciences. The physical sciences of the microcosm (from atoms to nuclei and quarks) and those of the macrocosm (from stellar systems to galaxies, quasars and the Universe itself), not forgetting those of the mesocosm (new states of matter, fluid dynamics), made immense progress. The same was true of the life sciences, both in the field of genetics, at the molecular level among others, and in the domain of the evolution of life. We have noted, however, that most of the fundamental advances, reflecting real epistemological breakthroughs, were essentially the work of the first half of the 20th century: relativity, quantum theory, evolutionary cosmology, molecular genetics, and the foundations of computer science.


The second half of the 20th century saw an unprecedented expansion of technologies based on this new scientific knowledge. The use of nuclear energy, first for military and then for civilian purposes, micro- and nanoelectronics, photonics (lasers), controlled chemical synthesis, computer science and telecommunications, antibiotic and antiviral medical treatments, the cloning of higher mammals, etc., all bear witness to these advances. However, taking place in a society governed by an increasingly unregulated market economy, these technologies became more and more subject to the law of profit (1). As a result, scientific institutions were gradually directed towards private economic objectives, aiming at short-term payoff*, to the detriment of open and unfocused research in the general interest.


At the same time, the social and human sciences were oriented towards responding to practical demands, in terms of "communication" (as advertising* and propaganda* were called at the time), commercial management, social organisation and political control. The intellectual ambitions of the sociologists and anthropologists of the 1950s to 1980s were derided, and their work was practically relegated to the status of cultural antiquities.


As a result, the Nobel Prizes now systematically rewarded technical innovations, sometimes minor but considered beneficial, such as the discovery of a catalytic process fuelled by direct solar energy for degrading the century-old accumulation of plastic waste in the oceans (Chemistry Prize, 2059).

The beginning of the 21st century


The commodification* of knowledge intensified. The large industrial multinational trusts* (weaponry, electronics, computers, pharmaceuticals, petrochemicals, agrochemicals) and military institutions took on a major role in financing scientific research, whether in their own laboratories or by backing public research organisations on a contract basis. Amplifying a movement that had its origins in the Manhattan Project for the production of nuclear weapons during the Second World War, scientific research was increasingly organised on the model of industrial production, with the associated constraints of competitiveness and productivity. A major and ever-increasing share of scientific activity was devoted to technical applications capable of fuelling the market economy* and power politics.


Research into open fundamental questions was gradually reduced to a bare minimum. Thus, the physics of the constitution of matter at the sub-nuclear level experienced its swan song in 2012 with the discovery of the Higgs boson. The cost of the equipment, the cumbersome nature of its collective organisation and the lack of utilitarian prospects led to the closure of the large particle accelerators (such as CERN's LHC) around 2030, and to the almost complete abandonment of this field. Research into cosmological enigmas, such as the nature of "dark matter", was similarly given up, the questions remaining unsolved for lack of technological prospects; plans for new giant telescopes were laid aside, their cost being deemed prohibitive in the atmosphere of chronic economic uncertainty* that characterised the years 2020 to 2040.


In addition to economic constraints, epistemic dead-ends were also encountered. For example, the formal sophistication of the hypotheses proposed to solve the open problems of particle physics, and the impossibility of testing them experimentally (as was the case with so-called string theory), made it possible, with hindsight, to compare these attempts with the astronomical theories of Antiquity and the Middle Ages, which, with the help of their epicycles, deferents and other equants, were able to emulate observations without explaining them. Similarly, in the life sciences, epigenetics, which aroused immense interest during the first years of the 21st century by promising to overcome the simplistic visions of molecular biology, very quickly came up against insurmountable difficulties because of the complexity of the networks of biochemical interactions involved. The same was true of the problems posed by ageing, cancer, cloning and morphogenesis, not to mention the structure and workings of the brain, where research stalled.


At the end of the 20th century, computer simulation had established itself as a fully-fledged scientific practice, extending the traditional duality of theory and experiment into a genuine epistemological triality. However, despite the development of quantum computers, which remained very costly and fragile, this methodological approach only very partially fulfilled the hopes it had raised. It became clear that a numerical simulation, even a successful one, did not by itself provide much conceptual understanding. A well-known quip attributed to the famous theoretical physicist Eugene Wigner had anticipated this by the middle of the 20th century. To a student who proudly showed him the results of a computer calculation in molecular physics that correctly reproduced the experimental values, he commented: "I am glad to know that the computer has understood the problem. But I would like to understand it as well."


Thus, gradually ceasing to produce new fundamental discoveries, scientists were led to focus solely on the implementation of knowledge already acquired, with a view to short-term technical applications, improving their efficiency, reliability and profitability* so as to constantly renew the market supply* of consumer goods* (telephones, computers, cars, televisions, medicines, textiles, etc.). If certain sectors, primarily pure mathematics, escaped this subjugation, it was at the price of an increasing isolation that locked them into veritable intellectual ghettos.

The social and human sciences experienced a parallel decadence. The movement outlined in the previous section intensified, to the point that most of the basic research institutions in these fields were simply abolished. The teaching of these disciplines was essentially confined to practical training in business and management, in a purely utilitarian perspective.


The scientific professions were also experiencing a very serious renewal crisis. Not only did serious budget cuts affect research institutions, drastically reducing their recruitment possibilities, but vocations became scarcer, with the young men and women most interested in science turning away from jobs that were increasingly subject to institutional constraints, industrial models and functional tasks, to the detriment of any major intellectual ambition.



The Old Garden of Sorrows, Jan Toorop, ca. 1890. Credits: Rijks Museum

In parallel with the decline in the production of knowledge, a serious loss of confidence in scientific activities grew within society. As the practical effectiveness of 20th-century technoscience had manifested itself both in real advances in the well-being of mankind and in grave threats to its future, the scientistic ideology, with its belief in a progress of society driven essentially by the progress of science, lost all strength of conviction. Scientists had promised too much and kept their promises too little or too badly for their credibility to remain intact. The disavowal was amplified by insidious disparagement campaigns against independent scientific studies, conducted by the economic and industrial powers whose profound harmfulness those studies had revealed, for example concerning the threat of global warming or the misdeeds of industrial food.


With hindsight, this period can be analysed as the end of a historical phase lasting four centuries, beginning with the Scientific Revolution at the start of the 17th century. This stage, which was in fact very special in the history of science and, more generally, of cultures, may be characterised by the efficient combination of the will to understand the world and the desire to transform it. The Cartesian programme of making ourselves "as it were, masters and possessors of nature" was not realised until the end of the eighteenth century and finally took shape in the nineteenth and twentieth. But this exceptional confluence of theoria and praxis proved unstable, the latter imposing itself on the former. Science, now the source of new techniques, found itself trapped by its promises and was called upon to fulfil them without delay. Transformed into technoscience, it was thus gradually deprived of its intellectual ambition in favour of its material efficiency and, in a way, disappeared as such, or at least entered a period of latency.


Although this conversion was received at the time as an incomprehensible surprise by the devotees of a science they had come to regard as universal and eternal, it was just another episode in a long history of such eclipses and reappearances. Indeed, Greek science had died out after Alexandria, with Roman culture hardly taking over; and, to take a second example, Arab-Islamic science, after seven centuries of major developments, had collapsed in the 13th century, long before Europe took up the torch in the 17th (2).


The middle of the 21st century


This section can only be very brief: the Catastrophe (see boxed reminder) simply put an end to all collective scientific activity on the surface of the globe - as with so many other social practices. The extermination of a large part of the population did not spare scientists, institutions collapsed, facilities were destroyed, and much of the knowledge accumulated in libraries, both physical and digital, was lost forever.



Reminder. The Catastrophe (2040-2060). Although, at the beginning of this twenty-third century, the Catastrophe is the cornerstone of our historical memory, let us recall the essentials of the events. From 2030 onwards, the global warming that had begun by the end of the 20th century accelerated dramatically. Agricultural production in the temperate zones fell, threatening food shortages even in the most developed nations. The United States, then led by neo-conservatives expressing the interests of the military-industrial complex, unilaterally embarked on vast geo-engineering operations, creating a screen of sulphur aerosols in the upper atmosphere in order to attenuate solar radiation. After some initial successes, the effects of this global manipulation proved disastrous, suppressing the Asian monsoon and worsening the agricultural disaster and the general economic crisis. Social conflicts escalated, and authoritarian populist formations, exploiting popular resentment, took power in most major nations. The ensuing xenophobic and chauvinistic backlash only aggravated internal and external tensions.

It was against this background of instability that the cataclysmic events of 2045 began. On August 6, just a century after the bombing of Hiroshima, a terrorist group combining Japanese revanchist ultras and neo-jihadist extremists from the Middle East detonated small nuclear charges in New York, symbolically destroying the Statue of Liberty and Wall Street and contaminating much of the city. Before the United States could react, but probably with its approval, Israel launched a nuclear attack on the military sites of the Syrian-Iraqi caliphate deemed responsible. Outrage in the Muslim world prompted Pakistan to retaliate, completely destroying Israel but triggering an Indian counterattack. Within hours, overt and covert military alliances spread the conflict between Islamic states supported by Russia on the one hand and India associated with the Western powers on the other. The computerised automation of military detection and response systems left little room for human intervention. The thermonuclear exchanges, which came to an end soon enough, once the stocks of bombs available on both sides were exhausted, were automatically followed by the use of biological weapons that all the warring parties had long been secretly preparing. Anthrax, plague and other devastating infectious diseases struck down what remained of populations already greatly weakened by the radioactivity of the nuclear fallout. China, which had remained neutral, did not in the end escape the Catastrophe.

The conflict ended after only a few weeks due to the mutual destruction of the armed forces. But about half the population of Eurasia and North America had disappeared and the entire northern hemisphere was devastated and polluted. The huge amounts of dust sent into the upper atmosphere by the explosions led to a dramatic reverse climate change ("nuclear winter"). Large parts of the territories had become uninhabitable and the survivors subsisted in miserable conditions on scattered patches.

The southern hemisphere hoped it would escape the disaster, and for a brief period Africa, South America and Indonesia believed their time of power had come. But after a few years, intense migratory movements, despite the precariousness of the available means of transport, brought large masses of the pitiful remaining populations of the North to the lands of a more promising South, upsetting the demographic and cultural balance. The violent rejection of the newcomers by the indigenous populations destabilised state institutions and led, through innumerable localised conflicts, to widespread chaos. After two decades, the South, although less depopulated, was as devastated as the North.

Humanity, or what was left of it, gradually reorganised itself on the basis of autonomous regional communities, comprising between a few thousand and a few hundred thousand members, relatively isolated from each other because of the uninhabitable areas separating them. They gradually managed to re-establish links at a distance, without ever reconstituting collective units the size of the former nations. Thus was formed what is still today, more than a century and a half after the Catastrophe, the form taken by the human community: the Network, made up of diversified entities, without common borders and without any centralised institution. The autonomy of these multiple clusters guarantees the stability of the Network thanks to mutual tolerance based on the absence of territorial conflicts and the benefits of remote exchanges between groups with limited but diverse and complementary resources.



In a provocatively titled book, Power without knowledge?, the economist Jhen Sok-yen showed in 2056 that, from the 17th century to the 20th, the conjunction of progress in our ability to know the world and our possibilities to act on it was in fact a historical rarity.

The end of the 21st century


A number of the less precarious emerging communities, which included survivors of the former intellectual professions, witnessed a return of interest in theoretical knowledge, philosophical speculation and scientific issues. By collecting the books and journals that had been spared, it was possible to reconstitute some of the pre-disaster libraries. It should be noted that, although electronic documentation was overabundant and hegemonic at the beginning of the 21st century, it was almost completely lost: most computers had not withstood the radiation, and the data banks and their communication networks even less so. Small and varied study and training groups were formed, gathering young people eager for knowledge around the veterans, in much the same way as the schools of Antiquity (Plato's Academy, Aristotle's Lyceum) or the cenacles of the Renaissance.



The Old Garden of Sorrows, Jan Toorop, 1889 – 1890. Credits: Rijks Museum.

But their activities, however modest, immediately aroused great suspicion and met with strong hostility. Most of their fellow citizens considered it a moral aberration that scientific research, which had made an essential contribution to the Catastrophe by enabling the development of weapons of mass destruction, should be resumed. Taking note of this mistrust, the new academies made it their duty to accompany their work with a profound ethical and political reflection. Looking back at the history of science in the 20th century, the protagonists of the revival understood the fatal chain that had, as we have seen, transformed science into technoscience and subjected the desire to know to the will to power, subordinating theoria to praxis. Admittedly, the new conditions of decentralised organisation of a now disarmed humanity, and the non-market modalities of exchange, no longer allowed a stranglehold of military and economic powers on knowledge. There was, however, a risk that a redeployment of the natural sciences might revive the Promethean project and the forms of social organisation that had made it a reality, ultimately reproducing its disastrous perverse effects.


These "seekers", as they were called, quickly convinced themselves that before relaunching the natural sciences, they had to revive the sciences of mankind and of the (new) society. The old hierarchy of disciplines was thus overturned, with the anthropic sciences (sociology, ethnology, anthropology, psychology, mediology, etc.) taking precedence over the physical sciences (of matter and of life), and relying on a solid philosophical and cultural foundation. This approach had been announced by one of the great creators of the previous century, Bertolt Brecht, who as early as 1939 had prophetically written:


"The more we take from nature through the organisation of work, through great discoveries and inventions, the more we seem to fall into the insecurity of existence. It is not we who dominate things, it would seem, but the things that dominate us. This appearance is due to the fact that some humans, through things, dominate other humans. We will only be free from natural powers when we are free from the violence of humans. If we want to profit as humans from our knowledge of nature, we must add to our knowledge of nature the knowledge of human society." B. Brecht, L'Achat du cuivre, L'Arche, 1955 (translated by JMLL)



The development of a permanent self-reflexivity allowed the seekers to become aware of the epistemological limits of the former cutting-edge disciplines (cosmology, sub-nuclear physics, subcellular biology), which had remained blind to their conceptual shortcomings. Carried away by an increasingly sophisticated technicality, in both their mathematical formalisms and their experimental procedures, these sciences had come to neglect any critical reflection on their notions and formulations. This was quite evident in the unsuitable vocabulary they had adopted for their latest advances: terms such as "relativity", "uncertainty principle", "deterministic chaos", "big bang", "inflation", "dark matter", "superstring", "genetic code", "adaptation" and many others proved so ambiguous that a linguistic overhaul was necessary, similar to those which Linnaeus, Lavoisier and others had carried out in their time for the classical sciences, but on a much larger scale.


This project was carried out collectively, uniting the hitherto scattered groups of seekers into a flexible and open community. The discussions were long and arduous. Eventually a radical solution was agreed upon. In order not to favour any of the languages that had previously been dominant in the scientific field, it was decided to adopt for the new terminologies a universal symbolic writing based on Chinese ideograms, allowing their verbalisation in any common language (the high proportion of seekers of Asian origin was obviously not unrelated to this choice). This major linguistic reform, through the critical attention it required to be paid to the results of 20th-century and even earlier research, led to the realisation that many promising avenues of investigation had remained unexploited. Thus the history of science, far from being a simple memento of the past, was able to play an active role in the production of new knowledge, as well as in the dissemination of older knowledge.


At the same time, the seekers realised that the development during the 20th century of a social body of scientific researchers, whose sole task was to produce knowledge, without necessarily having to reflect on or share it, had played a definite role in the decadence of the early 21st century. Thus, starting from the principle that one can only understand what one can transmit (and vice versa), the seekers made it their duty to carry out the production and dissemination of knowledge together. In the same perspective, they understood that the renewal of science required it to be re-cultivated and replenished from the arts and letters, taking into account the diversity of the civilisational traditions that had survived the Catastrophe.


We shall see in the next chapter how this new science was able to develop in the 22nd century.


Index:

  • Advertisement: publications, in one form or another, aiming to promote the consumption of goods or services so as to increase the profits of the producers. See also propaganda.

  • Commodification: the transformation of formerly free resources into goods for sale (so-called commodities).

  • Consumer goods: typically, artefacts, either material or virtual, conceived for the consumption of the overall population, thus separating the use from the production.

  • Economic uncertainty: the inability to predict economic trends, making it difficult to allocate investments so as to generate profit. See profit.

  • Market economy: economy organized on the confrontation between sellers and buyers (i.e., supply and demand), resulting in a social structure dominated by monetary exchanges.

  • Payoff: a return on investment, that is, the point at which an investment starts to generate profits. See profit.

  • Profit: the difference between the gains and the costs of a given process or good in an institution such as a private company, resulting in an increase in the wealth of its owners.

  • Profitability: the degree to which an activity yields profit. See profit.

  • Propaganda: the concerted propagation of social, cultural or ethical representations by privately or governmentally controlled media in order to further a political or ideological agenda. See advertisement.

  • Supply: what is offered to buyers on a market.

  • Trust: a company in a dominant position in a market sector.




II. The 21st century: eclipse

Excerpt from the collective work Introduction to the History of Science, University Press of Singapore, 2221.


Summary of the previous chapter


We have seen that the 20th century witnessed a remarkable development of the natural sciences. The physical sciences of the microcosm (from atoms to nuclei and quarks) and those of the macrocosm (from stellar systems to galaxies, quasars and the Universe itself), not forgetting those of the mesocosm (new states of matter, fluid dynamics), made immense progress. The same was true of the life sciences, both in the field of genetics, at the molecular level among others, and in the domain of the evolution of life. We have noted, however, that most of the fundamental advances, reflecting real epistemological breakthroughs, were essentially the work of the first half of the 20th century: relativity, quantum theory, evolutionary cosmology, molecular genetics, and the foundations of computer science.


It should be remarked, however, that in the first half of the 20th century most of these discoveries were made in small-scale fundamental research laboratories, that their theoretical content remained largely esoteric, and that their practical and economic impact was very limited. They had to wait until the middle of the century, in the 1940s, to be massively applied in technology, essentially under the demands of the Second World War. The war economy, allowing and even requiring considerable investments by the belligerent states without subjecting them to financial criteria of profitability, ensured the development of numerous technical applications implementing the theoretical principles established in the preceding decades. The prime example is certainly that of nuclear weapons: nuclear fission, discovered in the laboratory on a modest experimental scale at the end of 1938, was exploited on a gigantic industrial scale from 1942 onwards, with investments equivalent to several tens of billions of contemporary dollars and a workforce of more than one hundred thousand people. The bombs dropped on Hiroshima and Nagasaki in the summer of 1945 were a striking demonstration that science could rapidly be brought out of its academic ghettos so as to have, through military-industrial forms of organisation, crucial impacts on society at large. Telecommunications (electronics, radar), ballistic rockets (the German V2s) and chemistry (classical explosives) are other examples.


The second half of the 20th century saw an unprecedented expansion of technologies based on this new scientific knowledge. The market economy, from which science had hitherto remained relatively isolated, gradually penetrated the laboratories and learnt to turn them to its advantage. This is evidenced by the widespread use of nuclear energy, first for military and then for civilian purposes, micro- and then nanoelectronics, photonics (lasers), controlled chemical synthesis, computer science and telecommunications, antibiotic and then antiviral medical treatments, the cloning of higher mammals, etc. Large private capitalist companies set up their own research laboratories outside the academic system and were thus able to rapidly develop many highly profitable innovations. A typical example is that of AT&T (American Telephone & Telegraph), the successor to the Bell Telephone Company. Its research laboratories, the Bell Labs, filed tens of thousands of patents throughout the 20th century and were at the origin of the transistor, the laser, optical fibres, etc. Their work was of major importance in fields such as telecommunications (telephone networks, television transmission, satellite communications, etc.) and computing (programming languages and operating systems: Unix, C, etc.). Nothing better illustrates the coupling between fundamental research and technological innovation than the numerous Nobel Prizes in physics (more than ten) and Turing Awards in computer science (four) given to researchers at Bell Labs.


Thus, as fundamental scientific knowledge made consumer goods ever more profitable, private industry took over technological development from the State, while public institutions dwindled, regarded as trapped in archaic motivations and as favouring speculative, unfocused research with no immediate applications.


As a result, scientific institutions were gradually directed towards private economic objectives, aiming at short-term payoff*, to the detriment of open and unfocused research in the general interest.

At the same time, the social and human sciences, which until then had essentially sought theoretical knowledge of the collective behaviour of humanity, but had too often indulged in esoteric school quarrels and contented themselves with a walk-on part in the service of cultural elitism, were told that their survival demanded that they too respond to the imperatives of liberal society. They therefore gradually turned to meeting practical demands, in terms of communication, commercial management, work organisation and political stabilisation, and progressively abandoned their cultural vocation.


The beginning of the 21st century


The involvement of scientific knowledge in the real world intensified, influencing social relations through the technical devices imposed on everyday life by the market economy. The large multinational companies (weaponry, electronics, IT, pharmaceuticals, petrochemicals, agrochemicals) took on a major role in financing scientific research, whether in their own laboratories or by funding public research organisations on a contract basis. Amplifying a movement that had its origins in the Manhattan Project for the manufacture of nuclear weapons during the Second World War, as mentioned above, the very organisation of scientific research was increasingly based on the model of industrial production, adopting efficiency criteria tied to the requirements of competitiveness and productivity. A major and ever-increasing part of scientific activity was devoted to technical applications capable of fuelling the market economy and power politics.



Loss of Faith, Jan Toorop, 1894. Credits: Rijks Museum.

Fundamental research saw its importance decrease. After several decades of effort, researchers were unable to solve the most serious problems they had identified. This was the case with the cosmological puzzles, where the notions of "dark matter" and "dark energy" appeared as mere fig leaves masking the need to rethink the very foundations of general relativity. The reconciliation of the latter with quantum theory failed in the same way, and neither string theory nor loop quantum gravity met with the hoped-for success. The same was true of particle physics: despite the discovery of the Higgs boson, the essential questions, such as the scale of the particle masses, saw no progress. In mathematics itself, the ambitious programmes of intellectual synthesis outlined at the end of the 20th century by visionaries such as A. Grothendieck and R. Langlands stalled, and the millions of dollars promised by the Clay Institute for the resolution of the seven "millennium problems" remained without recipients. In the life sciences, the complexity of biochemical reaction networks did not allow for any synthetic advances in cell biology, and the mechanisms of ageing proved opaque, as did the processes of carcinogenesis. As a result, public funding rapidly decreased and new projects for large scientific instruments (particle accelerators, giant telescopes, etc.) were abandoned.


Only the field of astrophysics was somewhat spared, serving as an intellectual and scientific alibi for the large space companies, which were in fact involved in the development of satellite telecommunications networks for both civilian and military purposes. Investments by private firms such as SpaceX and others drastically lowered the cost of launchers and satellites, reducing to a minimum the role of state agencies, at least in the United States (NASA) and Europe (ESA); in China, the public-private distinction made little sense. An entire sector of the entertainment industry prospered by producing blockbusters and spectacular TV series exploiting the dramatic thrills of what was still presented as the "conquest of space". However, the new lunar expeditions met with only a meagre and short-lived success, and plans for human expeditions to Mars were never realised, as the promises of extraterrestrial resources proved illusory. And while the cost of remote automated planetary and cometary missions dropped considerably due to advances in probe miniaturisation, the scientific results were scant. In particular, not the slightest trace of extraterrestrial life was detected, neither in the rocks of Mars nor in the subsurface oceans of Europa.


The middle of the 21st century


It became clear that the Cartesian ambition - "through science, to become as masters and possessors of nature" - after its first major achievements in the 19th and 20th centuries, had to be abandoned.


At the level of human society as a whole, scientific research had certainly made it possible, as early as the end of the 20th century, to highlight many problems, alerting the world to anthropogenic global warming, the threat of newly emerging epidemic diseases, the harmful effects of synthetic chemical compounds in food, etc. But it was realised that the solutions to these problems were not within the remit of basic science, and depended on radical political decisions. As the case of medicine had long shown, correct diagnoses are far from systematically leading to effective therapies. Democratic regimes had proved powerless to undertake the necessary and far-reaching economic and social reforms, with short-term electoral stakes outweighing difficult long-term decisions. Ironically, the painful reorganisations that became necessary were finally carried out by the authoritarian oligarchic regimes that had gradually replaced the old systems of representative democracy. Although initially driven by retrograde nationalist and populist currents, these regimes, once they had spread to continental scales - China and the United States had shown the way; Europe, South America and Africa finally followed their example, through dramatic convulsions that it is not our purpose to recall here - proved robust enough to allow new, more perceptive leaders to impose on reluctant populations the strict energy, food, demographic and other regulations necessary to maintain relative welfare. By analogy with the Europe of the second half of the eighteenth century, whose modernisation had been made possible by authoritarian yet perceptive rulers, the decades 2040-2070 have been described as new avatars of an "enlightened despotism" that finally proved more effective than the "extinct democracy" it succeeded.


At the same time, researchers realised that the development during the 20th century of a social body of scientific professionals, whose sole task was to produce knowledge, without necessarily having to reflect on it or share it, had played a definite role in the decadence of the early 21st century.

Basic scientific research continued its slow decline. However, certain academic institutions and public laboratories survived, where a few dedicated minds continued to work, with rather limited means, on the great unanswered questions, as the monks had done in the medieval scriptoria.



The Battle of the Nobels

Nothing better characterises the change of epoch in the 21st century than the symbolic episode known as the "Battle of the Nobels", which broke out in 2051, on the occasion of the 150th anniversary of the Nobel Prize. Faced with the continuous deterioration of scientific activity, which had made the awarding of prizes in previous years highly problematic, a number of members of the scientific Nobel committees proposed, with the consent of the research community, to suspend the Nobel Prizes sine die, hoping in this way to raise public awareness and push the public authorities to redress the situation. However, a no less important fraction of the committees vigorously opposed this prospect, recalling that Nobel had instituted the prizes to distinguish those "who, during the preceding year, have conferred the greatest benefit on mankind". The debate raged, and many commentators pointed out that most Nobel Prizes in the 20th and 21st centuries had rewarded researchers for discoveries that were indeed exceptional, but whose benefits for suffering humanity had been highly hypothetical. The Nobel Prize in Physics awarded in 1912 to Nils Gustaf Dalén "for his invention of automatic regulators for use in conjunction with gas accumulators for illuminating lighthouses and buoys", an innovation that undoubtedly saved the lives of countless sailors, was contrasted with the one awarded a century later to F. Englert and P. Higgs "for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles", which had no practical implications. Within the Nobel Foundation, the "realists" eventually prevailed over the "utopians", who resigned en bloc, only to be immediately replaced by representatives of the major industrial foundations (Google, Amazon, Lenovo, etc.).
As a result, the Nobel Prizes now systematically rewarded technical innovations, sometimes minor ones, but considered beneficial: for example, the discovery of a catalytic process fuelled by direct solar energy for degrading the century-old accumulated plastic waste in the oceans (Chemistry, 2059), a decisive improvement in the making of geopolymers significantly reducing the cost and carbon footprint of cement production (Physics, 2076), or the discovery of a genetic skin treatment greatly curtailing wrinkle formation and hair loss in ageing people (Medicine, 2083).



In fact, the applied sciences continued to flourish, mainly in large laboratories affiliated with multinational companies in the chemical, pharmacological, food, energy, transport and other industries. As the second half of the twentieth century had already shown, many technical developments were possible on the basis of already accumulated knowledge, without the need for further fundamental advances. Thus, new technologies emerged, lighter and more economical than those that had dominated the 20th century: individual and collective means of electric transport, local production of renewable energy (solar, wind), decentralised and urbanised organic agriculture, natural therapies, etc. Of course, heavy technologies did not disappear, since transport and communication networks still required them.


The development of ever richer data banks, which progressively retrieved the entire cultural and scientific production of human history, allowed the technical implementation of relevant ancient knowledge that had fallen into oblivion. In many fields, particularly concerning the understanding of social phenomena, over which the human sciences had stumbled in the previous century, the implementation of powerful algorithms exploiting the accumulated big data revealed very useful correlations. The effectiveness of this purely empirical knowledge silenced the critics who questioned the absence of theoretical interpretations and the abandonment of causal explanations, the demand for which was lampooned as a manifestation of archaism.


Thus, the previous four centuries came to be considered a singular era in the development of humanity. In a provocatively titled book, Power without knowledge?, the economist Jhen Sok-yen showed in 2056 that, from the 17th century to the 20th, the conjunction of progress in our ability to know the world and our possibilities to act on it had in fact been a historical rarity. The development of fundamental knowledge, dealing with questions of speculative interest without applied finality, was the result, he argued, of very particular, relatively ephemeral and temporally disconnected historical episodes: classical Greece, the Arab-Muslim world in its early days, Western Europe and then North America until the end of the 20th century. In general, most great civilisations had not based their technical powers on their scientific knowledge, contrary to the Baconian adage. Whether in the Roman Empire, the pre-Columbian societies, the Ottoman Empire, the great pre-colonial African kingdoms, Mughal India or, last but not least, the long and powerful ancient Chinese civilisation, their technical achievements, often remarkable, had little theoretical foundation and had developed autonomously. Medieval Europe itself had made essential practical advances long before the Scientific Revolution of the 17th century, such as water mills, the horse collar, the stern rudder, paper and even the wheelbarrow (most of which came from elsewhere, in fact from China), all essential ingredients in the transition to modernity. Jhen, critically examining the relationship between scientific progress and economic development, took up and amplified Derek de Solla Price's 1976 thesis, quite heterodox at the time, on the "extrinsic value" of scientific research.
Faced with the failure of all econometric attempts to demonstrate the efficiency of investments in basic research through their direct impact on economic development and GDP growth, de Solla Price had simply disqualified the question itself, proposing instead that the existence of a body of professional scientists was justified, not by the possible results of their research, but by their role as guarantors and transmitters of sufficient theoretical and practical competence for society to be able to develop and exploit its technological innovations, most often independently of theoretical knowledge. Jhen's theory soon became the consensus and served as a criterion for political decisions in the technosciences.


Taking note of this mistrust, the new academies made it their duty to accompany their work with a profound ethical and political reflection. Looking back at the history of science in the 20th century, the protagonists of the revival understood the fatal chain that had, as we have seen, transformed science into technoscience, subjecting the desire to know to the will to power, and theoria to praxis.

The end of the 21st century


However, in the 2080s, the work carried out by stubborn researchers in the modest fundamental-research laboratories that remained led to several discoveries with important technological and economic repercussions. Here are just two examples. After almost a century of controversy following Fleischmann and Pons's doubtful work on "cold fusion", nuclear fusion reactions in condensed matter were finally discovered, suggesting the possibility of low-cost, small-scale nuclear power generation. In a completely different field, advances in molecular biology revealed a "genetic supercode" in DNA chains, generalising the intron-splicing mechanisms and governing the organisation of "words" (nucleotide triplets, or codons) into real "sentences" carrying essential information, thus reopening the prospects of abandoned gene therapies.


As these advances could not be transformed into technological innovations without significant financial and human investment, the debate on the usefulness of fundamental research resumed and the dogma of Jhen's theory lost its force. However, it took some time before the political and ideological context allowed a reorganisation of large-scale basic scientific research on entirely new bases. This is what we shall see in the next chapter.



NOTES


1. Editor's note: Here and in the following, we mark with an asterisk terms incomprehensible today, for which we offer a brief index at the end of this chapter. For more details, it is advisable to refer to books on the history of economics.


2. Editor's note: It is interesting to remark that categories like "Greek science" or "the Arab-Islamic world", in spite of their cultural and political shortcomings, are used in this text, probably for reasons of didactic simplicity. It is to be hoped that in the 23rd century, more scholarly books will avoid using them.
