In the 1980s, introduced by my private « guardian angel », William Skyvington, I began working for an advertising agency, Copexen, creating corporate publications (Siemenscope) and contributing to annual reports (for Schlumberger, for example). One day, the head of Copexen suggested I write a history of electronics, to be sponsored by Thomson. I accepted, on condition that I could conduct a genuine investigation, interviewing witnesses and actors of this technological revolution around the world, especially in Japan and the United States, in Germany, and in Great Britain. As all that was very expensive, we chose to mount it as an international co-production, and it was, I think, the first time that such a work of popularization was written in French and directed by a French woman.
We also decided to create an advisory board of Japanese, American and European Nobel Prize winners and renowned scientists, and to ask industrialists to write inserts on particular technical points, the general text itself remaining fluent and easy to read. It occurred to me, then, to follow somebody like John Bardeen, twice a Nobel Prize winner (for the invention of the transistor and for his work on superconductivity), across the States, from Chicago to Santa Barbara, to convince him to participate. He accepted … provided I first learned that a transistor was not a radio, but a piece of electronics. And he was a wonderful, very patient advisor.
At La Radiotechnique.
In the space of fifty years, electronics has completely revolutionized communications, information techniques, medicine, war, and daily life. Behind the now familiar inventions (radar, television, radio, record player, transistor, laser and household computer) stand prophets, scientists, isolated inventors and risk-taking investors, far-seeing laboratories, and the new strategies which have made them possible. The microprocessor, or chip, has allowed reductions in both size and price and has laid the foundations for the electronic civilization in which we live today. Now electronics is the basis of the second greatest technological revolution in the history of the planet: microbiology and biotechnology.
This is the first book to attempt a synthesis of the electronic epoch in both words and pictures. It recreates the discoveries, hopes and visions, and the thousand and one mishaps which characterize this exciting intellectual adventure. Articles contributed by internationally acclaimed experts are accompanied by clear drawings that help the reader grasp the often complex theories behind electronics; included are old photographs, some never before published, recreating the ‘prehistoric’ phase of electronics, as well as contemporary photos that break down the barriers between American, Japanese and European laboratories and illustrate current achievements and research.
Board of Scientific Advisors: Pierre Aigrain, Director of the Science and Technical Division of the Thomson Group; Prof. Hiroyasu Funakubo, University of Tokyo; William Gosling, School of Electrical Engineering, University of Bath; Frank Tetzner, President of the International Association of Radiotechnical and Electronic Media; and two Nobel Prize winners, Philip Warren Anderson (Nobel 1977), Bell Telephone Laboratories, and the inventor of the transistor, John Bardeen (Nobel 1956 and 1972), University of Illinois at Urbana-Champaign.
« The Electronic Epoch is the first book to attempt a synthesis of the electronic epoch in both words and pictures. […] The book is beautifully printed (in Japan), skilfully laid out so that word and image balance beautifully, and there is so much to learn from this excellent text. » Umbrella, March 1983.
“The marvels of modern technology are vividly shown in Elizabeth Antebi’s The Electronic Epoch, an encyclopaedic survey of the revolution brought about by electronics. […] An impressive and stimulating volume, it offers a wealth of fascinating lore and superb illustrations.” George L. George, Back Stage, March 1983.
“The pictorial research that went into this upbeat, encyclopaedic survey is certainly impressive. Hundreds of photos from American, Japanese and European labs and museums complement rare old prints depicting the birth of the electronic age. The up-to-date text, brimming with confidence in our streamlined future, starts with a look at the consumer market, then rapidly moves on to applications in medicine, warfare, telecommunications, data processing, scientific research, quantum and nuclear electronics. Articles contributed by an international roster of scientists, technicians, engineers and journalists provide fairly comprehensible explanations of lasers, radar, superconductors, Maxwell’s experiments, and so forth. This wide-angled overview is better at describing exciting technical developments than at unravelling their effects on society.” Publishers Weekly, Dec 17 1982.
“This book attempts to synthesize the electronic epoch in both words and pictures. It recreates and examines the discoveries and visions that came out of a variety of mishaps which characterized the intellectual adventure leading to the electronic revolution.” Lighting Design and Application, May 1983.
« Till now, nobody had ever published the worldwide history of this technology that is revolutionizing our daily life. The book has three fundamental qualities: first, nothing is assumed already known or left in shadowy doubt — you don’t need to be an expert in anything to understand the explanations; second, the enthusiasm of this laywoman (Bachelor in Literature and Art History!) discovering an unknown planet is infectious; and the third and greatest quality of the book is the writing: an alert, precise and dense style, avoiding the lingo of most computer scientists and electronics engineers. Magnificent photos and drawings contribute to the success of this unusual reference book. » Pierre Virolleaud, Usine Nouvelle, 11 November, 1982.
« From Paris to Tokyo, from New York to San Francisco, the Mecca of microelectronics, Elizabeth Antébi has followed the stages of the “technological earthquake” that is spreading over the planet and transforming our daily landscape. Her Electronic Epoch introduces us to the mutations that, from the most sophisticated machine to the most familiar gadget, are taking place in all kinds of fields: plants, health, telecommunications, information, war. For her, ‘things will disappear and be replaced more and more by functions and programs, in a world reduced to earphones and screens.’ ‘Originally, it was a revolution set in motion by pioneers, alone and modest but with an encyclopedic knowledge (Hertz, Marconi …); the Second World War, however, was a decisive turning point, where David ceded to Goliath, adventure to money, the lonely inventor to the team. From this time began the reign of the marketing men. It was also the beginning of the fragmentation of a world in which experts and technicians each worked, as in the cell of a hive, from which they could no longer envisage the general destiny.’ […] Could too much information in some way kill information? Why is reality not reducible to the language of science? Could the Memory of the World be deciphered by a decoding machine, as the English mathematician Alan Turing dreamed from 1936 on? Such are the questions (among many others) asked in the book. » Frédéric de Towarnicki, Le Figaro, December 17, 1982.
« At the microphone of Michel Clerc, yesterday evening, a young woman of thirty-seven, Elizabeth Antébi, presented a huge book of which she is at once the publisher and the author, The Electronic Epoch. That encyclopaedic survey, written in a lively style, is the first history ever written of the amazing technological saga that is currently turning upside down our way of life, our strategies, even our way of conceiving and dealing with the new economic order. » Le Nouveau Journal/RTL
« Amazing! Who would choose electronics as the subject for a coffee-table book and Christmas gift? The challenge was taken up: Elizabeth Antébi has succeeded in transforming this discipline into a space of dream. […] A French encyclopedia on electronics translated and published in the States? A true consolation for our defeat in the last Davis Cup… » Fabien Gruhier, Le Nouvel Observateur, 25 December, 1982.
« This book is a paradox. First, everybody talks about the electronic revolution, but nobody has ever written a book like this about it. […] Second, Elizabeth Antébi has no scientific background, but she knows how to investigate; she proved it when she published “The Right of Asylums in the Soviet Union”, when she wrote articles on venture capitalists, and when she shot a movie on science-fiction writers. The result is remarkable. » François de Witt, L’Expansion, September 23, 1983.
« This book is one of those precious works that everyone should have in his “tool kit”. And we must pause to consider Elizabeth Antébi’s conclusion: ‘We must reflect on the destiny of our technical and scientific society in non-technical, non-scientific terms. Despite the nightmares described in science fiction, machines will never dominate man, although it is possible that man, having spent all his ingenuity in creating them, has none left to imagine their creative use, and, by refusing to see them for what they really are (extraordinary tools to simplify daily life, sometimes provided with artificial intelligence but lacking an innate power to think), will accord them a divinatory decision-making power that they were not designed to wield.’ » Sophie Séroussi, Libération, March 14, 1983.
« A ‘super-book’ for the fans of electronics, but also for the witnesses we all are of this fantastic revolution of humanity. […] The book represents the first synthesis in the world of the electronic epoch, through images and text. And if we listen to the (very flattering) scientific reactions around us, in the press and in the laboratories, it is a success. […] The photos are beautiful, the drawings elegant and precise, the explanations clear and intelligent. » E. Gogien, Tonus, January 10, 1983.
« A woman alone, facing the Revolution of the XXth century ! Elizabeth Antébi’s enterprise is daring, but it has succeeded. She gives us the amazing and wonderfully written history of a new continent of communication and knowledge. It is at once an encyclopaedia, a book of art and an indispensable meditation on the revolution of knowledge.» La Croix, December 24, 1982.
«This history has never before been written. Strange, don’t you think ? Happily it’s now been done by Elizabeth Antébi, 37 years old, Bachelor of both Literature and Art History, writer, TV director, who decided to travel around the world for three years to investigate.» Didier Williame, La Vie, 5 January, 1983.
« E. Antébi’s book is a monument. » Les Echos.
« Not a technical book, nor a popularization really; it gives us above all a history of Man, with, between the lines, the passion of the author for the study of the mechanisms of power.» J-F D., Industries et Techniques, February 20, 1983.
« A new vision of the universe around us. » Sciences et Avenir, February 1983.
« Near the end, Elizabeth Antébi aptly quotes Bertolt Brecht: ‘Behind the drums trot the calves, the calves that supply the skin of the drums.’ And she concludes: ‘We have to avoid creating a scenario in which man trots behind the machines whose intelligence he has supplied, without preserving his own ability to think.’ » Le Parisien Libéré, December 18-19, 1982.
« Her book is really a work of art, if we consider the beautiful photos facilitating our access to that incredible fairy-tale universe. » Le Monde Informatique, January 10, 1983.
What they said:
« It is a unique contribution to the electronic field, and it is obvious that a great deal of work must have gone into its creation. » Merrill I. Skolnik, Superintendent, Radar Division, Naval Research Laboratory, Department of the Navy.
« I have heard excellent comments about your book on The Electronic Epoch and am glad to know that it is in publication by Van Nostrand-Reinhold and is also being published in French and German. It must have been a tremendous task to assemble so much material and to write such a sound and very readable account of so many complex topics. The pictures are marvellous and add a great deal to the text. »
John Bardeen, twice winner of the Nobel Prize in Physics, for the invention of the transistor (with Shockley and Brattain) and for his work on superconductivity.
« This is to acknowledge the safe receipt of your magnificent book and to congratulate you on the successful outcome of this tremendous project: the first history of electronics written from an international viewpoint. » Pat Hawker, Independent Broadcasting Authority (IBA).
« Mr. Lubalin, our Vice President of Investor Relations, read and admired very much your book, The Electronic Epoch. Mr. Lubalin is in charge of the annual report for Avnet, Inc. Avnet is the world’s largest industrial distributor of electronic components and computer products. […] We were wondering if you would be interested in undertaking this 8-page history. We would request that you be responsible for all research, photo procurement, writing, and editing to fit. » Joan Karpinski, Avnet Investor Relations.
« My very sincere congratulations on that impressive work. The images are very carefully selected, the text well written, the style elegant. Bravo! » William Gosling, Plessey.
The origins of electronics may be traced back to many different moments in time. The early Greeks, for example, discovered the electrostatic properties of amber, for which Thales of Miletus (in ancient Ionia) used the Greek word “electron”. In the late nineteenth century, Thomas Edison noticed that under certain conditions of vacuum the hot filament of a tube emitted unexplained “electrical charges” that moved in the opposite direction to the main supply current. A little later, O.W. Richardson became interested in this Edison effect, and in 1903 he set down his own theory of thermionic emission. The words “electronics” and “electron” cover a multitude of meanings and applications. In 1891 Johnstone Stoney suggested that “electron” be used to represent the basic electrical particle. “Electronic”, as a term, first appeared in the title of a confidential report written by John Fleming in 1902. The term received a wider airing in 1904 in the German yearbook Jahrbuch der Radioaktivität und Elektronik (Yearbook of Radioactivity and Electronics). In 1930 the term was institutionalized when the American editor Ronald Fink used it as the title of his monthly magazine Electronics. These different discoveries and developments all stand out as milestones, but they leave the basic question unanswered: what is the real nature of electronics? Is electronics the branch of physics dealing with the behavior of electrons? Or is electronics a branch of technology that is basically concerned with the applications and essential characteristics of the electron: absence of inertia, sensitivity to exterior fields, and ability to amplify?
In fact, both definitions are valid. The history of electronics is the history of a science and its almost immediate technological applications, which in less than fifty years have completely transformed man’s traditions, his environment, and his way of thinking.
Electronics, in the modern sense of the word, means utilizing the flow of electrons either within a vacuum or inside matter. This electron flow can be used to transmit, receive, erase, or store information using the techniques of oscillation, modulation, detection, and amplification for the coding or decoding of messages. This information may be transmitted and utilized in the form of electromagnetic waves ranging from very low frequencies to lightwave frequencies to produce currents or electrical or magnetic fields, which are all subject to the laws of physics. These different techniques bring into play a number of different disciplines: physics of course, but also mathematics (theories of coding, information, noise, etc.); chemistry (properties of the materials used); and, particularly in the field of computer processing, formal logic and semiotics (semantics and syntax).
Today electronics occupies a primary role within those sciences and technologies involved in the processing of information, in the broadest possible sense of the term. Electronics was born out of the curiosity of scientists and the ingenuity of engineers, but its evolution can often be linked to the needs of the military establishment and the increasing importance of industrial automation. The directions it has taken and the roles it has played and continues to play are not unconnected to political and social phenomena. Electronics has become a weapon in the fierce competition between companies in the private sector of industry and also in the public arena of national interests. This is something which cannot be ignored.
The history of modern electronics takes root in that great turning point in the thinking of both physicists and philosophers which occurred at the end of the nineteenth century and the dawn of the twentieth. The Cavendish Laboratory at Cambridge University in England is a symbol of this great divide in thought that provided the base for a reconstitution of twentieth-century physics, through the work of such scientists as James Clerk Maxwell in the nineteenth century, and J.J. Thomson, Lord Rutherford, the Dutch physicist Balthasar van der Pol, Sir Edward Appleton, and the inventor of the diode, John Fleming, in the twentieth. After the formulation of Newton’s laws of mechanics, scientists attempted to isolate certain natural phenomena in order to study them independently of other phenomena and, thus, to formulate the laws of the universe, leading to the development of techniques that would become systematically more sensitive, accurate, and reliable. It was at this point that the “mathematical description of nature” proposed by Werner Heisenberg really came under examination; according to the German philosopher Martin Heidegger, it was a continuation of the “mathematical blueprint of nature” first outlined by Galileo in his contention that the most suitable language to explain the universe was the lingua mathematica, the language of mathematics. From that point on, political, psychological, and even philosophical language was modeled on mathematical or, more broadly speaking, scientific language. People began to believe that science would eventually be able to explain everything, since everything takes place in a world that is material, tangible, measurable, and capable of being broken down into its constituent elements.
But electronics is also the offspring of electricity and shares the same great ancestors: Faraday, Ampère, Maxwell, Hertz. The concept of the electrical current, which is both immaterial and intangible, had therefore already thrown doubt on this new belief. To avoid dangerous abstraction, scientists very quickly invented the material “ether”; this was supposed to transport the electrical current.
This concept of the “ether” was the subject of much controversy. Heinrich Hertz, for example, was totally convinced of its existence. It was finally proved to be a figment of the scientific imagination through an experiment undertaken by two men, Albert Michelson and Edward Morley, who had actually set out to prove its existence. These first years of the twentieth century saw a whole conception of the universe swing in the balance. Becquerel’s work on uranium and the work of Lorentz, Perrin, Wiechert, Kaufmann, Thomson, and many others on the electron attacked the seemingly untouchable edifice of Newtonian mechanics. Work on the electron dealt a blow to the theory which considered the indivisible atom to be the smallest particle of matter. Einstein’s special theory of relativity and quantum mechanics, which dealt with phenomena of infinitesimal magnitudes, marked a total break with the traditional conception of the physical universe. In the field of the microcosm, quantum theory presented a flagrant contradiction of the old Leibniz postulate that “nature makes no sudden leaps”.
This break with traditional ideas aroused a great deal of controversy. The scientists who made these discoveries had to struggle not only against the violent objections of their contemporaries, but also against their own personal and intellectual reticence. In his Memoirs, J.J. Thomson discloses his extreme reluctance to announce the discovery of the electron, while the far-from-timid Wilhelm Röntgen, who discovered X-rays, only confirmed his support of the electron hypothesis after a great deal of procrastination. On the philosophical level, the whole concept of determinism seemed to be under attack. This can be seen in Einstein’s celebrated opinion, “God doesn’t play dice with the world!”, and in the long correspondence in which he debated this topic with Max Born. The debate still goes on today.
Man is no longer content just to observe matter. We violate it, bombard it, make it explode inside particle accelerators; we are now exploring ideas (antimatter, quarks, etc.) which even more fiercely contradict the systems constructed by physicists. This means that we are slowly getting used to the idea that science does not propose one overall explanation but a number of possible models. Today, scientists do not refer to a particle without also referring to the outside influences upon it; they look at it in terms of the conditions of the experiment. As Werner Heisenberg writes:
When we look at the objects making up our daily environment, the physical process by which this observation is made possible plays only a secondary role. But each and every process of scientific observation causes considerable disturbance to elementary particles of matter. We can no longer continue to talk about the behavior of a particle without taking into account the observation process itself. As a consequence the natural laws, which are formulated mathematically in quantum theory, are no longer concerned with elementary particles themselves so much as the knowledge which we have of them. We can no longer ask whether these particles always exist “in themselves” in time and space in exactly the same way; in fact we can only talk of the events which occur when by means of the reciprocal action of the particle and some other system, such as the measuring instruments used, we attempt to define the particle’s behavior. (Heisenberg, 1962: 18).
This reflection has a double significance. On the one hand it takes mathematical theory as enunciated by Descartes, for example, a good deal further, since the scientist can, in the name of technical imperatives, desert the simple study of the nature of the phenomenon and the rules of traditional scientific observation in favor of calculating the interrelated mathematical relationships between these “events which take place”.
The same reflection can also open up a totally different philosophical domain. “Natural sciences”, as Heisenberg says, “always presuppose the existence of man”. Here he leaves the door open for Heidegger’s research. To arrive at a way of thinking about nature which escapes science (or, rather, does not come within its jurisdiction), the only method left to us seems to be philosophy or poetry, because science is incapable of reflecting upon itself. It is the impossibility of studying the behavior of the elementary particle in itself, of objectifying it, and the necessity of limiting ourselves to the knowledge we have of it, that now preoccupies electronics specialists tempted by the dream of replacing God when they attempt to explain their inventions by analogy with the various parts of the human body: coupled oscillators and the heart, coded modulation and the nervous system, computers and the brain. In fact, we have to put ourselves in the position of consciously controlling a transformation that can no longer be denied; and we must learn to control it carefully, so that we are not reduced to mindless cogs in a machine gone haywire. For many years, scientists considered it impossible to bring these questions down to the level of the layman, and this very choice of words signifies the contempt which scientists and technicians feel for anything less than a completely “scientific” explanation of phenomena. But now, more than ever before, these questions have a great bearing on the daily life of mankind. The ongoing fragmentation of techniques affects even the technocrats themselves; they have become links in a long chain, incapable of seeing either its beginning or its end. For many years they have been conscious of possessing a power based on the ignorance of their fellow men; now they themselves feel increasingly isolated and manipulated. They have become victims of an ignorance which they themselves fostered.
The end-link of the chain is made up of the end-users of these electronic “miracles” who have “resigned” from the game. Either they have given in to the blind consumerism that sociologists so vociferously denounce, or they have rejected the whole “technological package”. In both cases they refuse to even try to understand.
This “generation gap” is already widening into a yawning abyss. A whole generation of people born after 1950 was brought up with television. Another generation, that of the 1980s, is already familiar with the computer. We have all seen those television documentaries in which the interviewer is totally disconcerted by his encounter with young people or even children who speak a very different language from his, but who are perfectly at ease in a dialog with the computer. Some of these adolescents have even succeeded in producing quite extraordinary technical inventions. The old-fashioned question “What kind of a job do you want when you grow up?” has very little meaning for them in the context of the wide range of possibilities from which they will be able to choose.
Nevertheless, we should realize that this is not a total break with the past, but merely a continuation of the “revolution” that started at the beginning of the century. Every generation now alive has grown up with and been affected by the radio, whose signals reached into even the most distant and wretched corners of our planet. The radio, in its various roles (an arm of political propaganda used by dictators, an instrument of advertising suggestion, a mirror of cultural taste, an arbiter of fashion), was the first electronic product to abolish frontiers and transform social behavior. But today, we are still only marginally aware of the important role radio has played in the past and is playing now.
Any technological innovation on this level arouses fear at first, a fear based on ignorance. A science-fiction writer recounts the story of a letter he received after the historic landing on the moon: “So where will our dead go now?” Thus, it is not only timely but urgent for us to take up the challenge. To orient and control our own lives, which are increasingly affected by developments in electronics, we must come to understand the basic concepts and history of electronics. We must become aware of the importance of science and technology and of the choices they imply. On the political level we must look at the implications in terms of power: centralized or decentralized control of telecommunication networks and data banks, the possibility of technology transfer, increased centralization and bureaucracy. On the social level we have to examine the implications in terms of employment and the complementary education of minds and social behavior. On the philosophical level we must look at the effects of electronics on personal freedom, creativity, and thought. On the economic level we must determine the speed of assimilation of technological change and decision making. For example, let us look at one of the most important economic struggles going on in the world today; it centers around a basic component found in almost all electronic equipment: the integrated circuit.
According to a number of complementary estimates, the world market in integrated circuits, which was worth under 1 billion dollars in 1970 and 9 billion in 1980, will rise to almost 15 billion dollars by 1985. The world turnover for data processing and peripheral data processing equipment reached almost 55 billion dollars in 1978. In December 1979, Business Week published a Dataquest survey estimating that by the beginning of the 1980s the Japanese would have taken over some 60 percent of the American market for the integrated circuit memories most in demand at the time (64-Kbit ROMs). Dataquest showed that NEC, Hitachi, and Fujitsu had, in fact, captured the American market.
The Japanese Ministry of Commerce and Industry immediately earmarked 100 million dollars for a program designed to reduce the size of integrated circuits even more. Most of the leading Japanese electronics firms had already taken part in an earlier joint research project, which cost some 250 million dollars, between 1976 and 1979. New research centers are springing up, particularly in Japan and the United States, within industry and on university campuses. Several recent developments give us food for thought: the defeat of the Swiss watchmaking industry at the hands of Seiko, one of the first companies to understand the importance of a radical technological change in the quartz watch, or the short-sightedness of a number of huge East Coast American companies that did not convert in time to semiconductors and saw the bulk of the electronics industry settle in the West.
We have begun by drawing up a report on electronics through an examination of its applications throughout the world: consumer goods, medicine, defense, telecommunications, industry, and research. Modern electronics is the offspring of what we have somewhat hastily termed “revolutions”: the revolution of ’48 (the transistor) and the revolution of ’68 (microelectronics, the laser and quantum electronics). The development of semiconductors, insulating materials which become conductive under certain conditions, has made extreme miniaturization possible. But most of the electronic equipment we use today (radio, television, electron microscope, radar, etc.) was developed before or during World War II, before the completion of basic research on semiconductors, and is in fact based on the technology of the thermionic tube.
If electronics seems at first glance to be a jumble of technical applications, it is nonetheless closely linked to scientific research as carried out by scientists such as J.J. Thomson, J. Perrin, R.A. Millikan, J.C. Maxwell, and H. Hertz, who had the courage to question the whole traditional view of physics. These were the men who made it possible to define two very important electronic developments: the photoelectric effect and the cathode-ray oscilloscope, for example. In conclusion, electronics is the history of men who were pioneers and visionaries and of their work in research laboratories all over the world, both on the campuses of large universities and at the giant industrial complexes of the private sector.
A.F. If your house was on fire, what favorite
object would you make sure to save?
J.C. I think I would save the fire.
Electronics disturbs and confuses because it compels us to break with former habits of work, leisure, and daily occupation. This break with traditional activities has aroused a number of fears, some perfectly justified and others little more than myths. One of the most frequently expressed anxieties is that man will become subject to the law of the mighty computer, reduced to little more than a computer file of his past, his impulses and desires and personal life. But what we tend to forget is that right down through the ages it has been the habit of political authorities to keep files and to try to manipulate individuals; the only difference is that the means are now a little more sophisticated. The computer therefore poses the problem of a shift in political balance, particularly in the democratic countries, by playing off the old centers of power against new institutions. This fear is also related to the image men have of themselves in a civilization in which social exchanges are often based purely on appearance. From this point of view it would be men themselves who may tend to reduce themselves to a simple combination of the information given by the computer. This need for identification seems to stem from the same spirit that makes us identify ourselves to others by giving our name, our age, and our profession. People do this every day, and we know that it is a very approximate means of social communication. The computer is merely an extension of this same desire.
Another very frequently expressed fear concerns the changes in the employment situation which any technological upheaval necessarily generates. Generally speaking, this fear is not very soundly based, and the old adage “the machine kills employment” is not necessarily correct. First, according to a number of statistical studies, the number of unemployed people in the United States has not increased since automation of industry first began, and in Japan between 1948 and 1976 the number of job opportunities actually rose by 60 percent. Another important point is that some sectors have even been saved by the advent of electronics (watches and typewriters being two examples), and that this in turn safeguarded jobs. The problems lie more on the level of the strategies needed for job transfer and the retraining of the labor force for new markets (manufacture of data processing equipment, software, telematics, etc.). This urgent priority has been held back by ideological or even demagogical fears and taboos which must be recognized for what they are. The old saying “the machine kills employment” depicts the machine as a Machiavellian and perverse being; what is even worse, it suggests that work, rather than well-being, is an end in itself. This doctrine may be legitimate in a planned economy, but we have long been aware of its effects on the standard of living of workers and above all on the quality of their lives. Why, for example, in a country like the Soviet Union, have some techniques such as the laser, satellites, and electronic weapons systems been perfectly mastered, while others such as data processing have remained extremely weak? The first reply that comes to mind is that the Soviet Union, anxious to preserve its position as a world power in comparison with the United States, has concentrated its energies on techniques of defense or aggression. But there may also be another reason.
The machine liberates man from repetitive daily tasks, thereby giving him more possibilities for initiative and a perhaps disturbing freedom of spirit. We could go even further and say that the responsibility of programmers would be even greater and less easily controlled than the isolated initiatives of individuals working alone. In this sense, the machine can become synonymous with the broadest definition of liberty rather than a symbol of bureaucratic oppression. It is up to man himself to decide to what use he is going to put computers, and this is a political choice of great importance.
We could also point out, to the credit of the West, the rise in living standards that accompanies mass production, the possibility of reinvesting extra funds, and the necessity for a competitive economy on both the domestic and the international level.
At the end of the eighteenth century, the concept of happiness advanced by Saint-Just was a completely new idea in Europe. Two centuries later, happiness has gone from an idea to an ideology, from an ideology to a personal ideal, and from there to a reality well within our reach. Electronics calls first and foremost for a new strategy of job transfer, but it also suggests a modification of behavior and desires, and this is perhaps its most revolutionary aspect. It allows man to reduce the time he spends working and to improve the conditions of his work. Above all, it allows him for the first time to dominate machines and systems not with the force of his arms, as he did with steam and electrical machines, but with the force of his intelligence. It is also the first time that competition on a world scale, in fields such as telematics, tends to be based on the quality of the content rather than on that of the product.
We cannot continue trying to assimilate this world, modified by electronics, with the mental structures of the past. We can no longer “enter the future by going backwards”. As the sociologist Alfred Sauvy says: “The predominance of dogma and assumption and smug, closed attitudes is what is blocking society and preventing it from compensating for the large gaps between itself, the techniques it has developed, and its own social ideals.” Paradoxically, man is afraid of abandoning the sphere of mechanical activity to the machine. He is in the grip of the same atavistic anxiety as the scribes of the Middle Ages with their hostility to printing. It seems as if he is afraid of his new liberty, of this possible obligation to engage in activities concerned with the mind, with creativity and imagination. Forced to question his own nature and role, he falls into a kind of metaphysical anguish.
This sense of aversion and refusal is demonstrated by the great caution with which the new services offered by electronics are greeted: television is too often used to retransmit plays or circus acts, or discussions which could just as well be broadcast by radio, because people have not succeeded in matching the new technique with a new content. Video tape recorders are often used merely to record television programs; electronic music often reproduces the sounds made by traditional instruments instead of finding its own new sounds. Perhaps this prudence and caution is dictated by the fact that new prophets have not yet replaced the old: McLuhan has not replaced Gutenberg; nor Cage, Mozart; nor Peter Foldès, Rembrandt.
These fears are also conditioned by the abolition of frontiers and the telescoping of time. Radio allowed man to carry on a dialog with men at the opposite end of the earth and to receive information from the farthest reaches of the universe. Television allows him to participate in the life and death of other human beings on this planet from the comfort of his armchair. The electron microscope and the radiotelescope allow him to explore the infinitely large and the infinitely small. The particle accelerator subjects nature to a kind of torture in order to uncover her secrets. Satellites push back the frontiers of the explored universe, and the computer multiplies the speed of calculation and accelerates decision making.
In the era of satellites and telematics, national independence will be even more difficult to maintain and will tend to be replaced by the interdependence of all the nations of the world. But this interdependence may trigger an accelerated culture shock and a telescoping of time. Will it be possible to preserve the “souls” of the different nations, to allow them to maintain their own specific cultural identity? One of the solutions proposed to end the cleavage between poor and rich nations, between the owners of energy and the masters of technology, will be technology transfer in exchange for raw materials. But this will mean that some races and peoples will have to make a leap of several decades, even centuries. There is the risk of provoking a violent rejection by cultures not prepared for the idea of the image, as we have seen from the followers of Islam or other peoples whose ancestral customs forbid pictorial representations of reality. If we fail to respect the national identity of the countries to whom we want to “offer” these new technologies, we risk creating serious and violent crises. The interference of technology and access to information also pose the problem of monopolies held by one large country or a multinational company which can then dominate distribution networks and data banks. These problems may perhaps be solved by the setting up of a system of international laws, but also by an instinct for coherence and competition: this is what happened when the French group Matra-Hachette or the German group Bertelsmann set themselves up in opposition to the Los Angeles Times, which possesses its own forests, paper factories, newspapers, television channels, record companies, publishing companies, and video networks. But even such regroupings can pose the problem of freedom of creation and expression vis-à-vis the industrial or political powers.
Creative people (writers, producers, and artists) must become aware of these problems and arm themselves with the means of protecting these freedoms. It is a great temptation, and one often succumbed to, to put science and politics in the same bag and say: “It’s all impossible to understand.” But this is the best way of delivering oneself over to the powers that be, which, for their part, are never likely to grant a jot more than the people demand.
To avoid this danger of “homogenized” information, both workers and consumers must be vigilant and try to grasp the new mechanisms at work in the system in which they are obliged to live. We should not forget that electronics did not suddenly appear out of the blue. It must be seen within the context of the Western attempt to conquer and dominate nature and as part of the great journey towards knowledge and technical progress that inspired the French Encyclopedists in the eighteenth century.
Some ten years ago, Joseph Needham, in his book on Chinese Science, asked a fundamental question:
Why did modern science, the mathematization of hypotheses about Nature, with all its implication for advanced technology, take its meteoric rise only in the West at the time of Galileo? (Needham, 1969: 16).
He replied to this question by making an allusion to the legal practice of the Middle Ages which did not hesitate to hang a cock accused of laying an egg.
The Chinese were not so presumptuous as to suppose that they knew sufficiently well the laws laid down by God for non-human things to obey, to enable them to indict an animal at law for transgressing them. On the contrary, the Chinese reaction would undoubtedly have been to treat these rare and frightening phenomena as chhien kao (reprimands from heaven), and it was the emperor or the provincial governor whose position would have been endangered, not the cock. (Ibid., 329).
This little anecdote is quite illustrative of the behavior of the Western world, the heir to the logical Euclidean tradition. It is true that a whole other current of Greek thought is in fact opposed to that Euclidean vision, the foundation of our own technological era, reinforced by Cartesian dualism, which considered the universe as inert matter quite distinct from man and capable of being enslaved by science: a system of thought which provided a framework for Newtonian mechanics. Quantum physics has renewed the debate by changing our angle of observation and integrating the observer himself into the universe he observes. Nevertheless the language of probabilities is still a mathematical language, as distinct from poetic or philosophical language, which cannot be reduced to fractions. The danger lies in losing sight of this basic distinction: science cannot reflect on science, and scientists and technicians alone cannot resolve the global problems posed by technology.
The profound revolution that quantum physics has wrought in our way of scientifically regarding the world is that we are no longer concerned with suggesting explanations of nature, but rather with establishing scales of probability. Even the particle itself is no longer an atom of matter; it does not exist, it is only manifested as a “possibility of existence”. In March 1927 Werner Heisenberg enunciated the principle of uncertainty, according to which one cannot simultaneously determine both the position and the momentum of a particle. This brought into question one of the basic principles of conventional science: determinism. On September 26 of the same year, Niels Bohr stated the principle of complementarity: a phenomenon such as light can appear, depending on the experimental equipment used to observe it, as a particle-type entity or a wave-type entity. These two complementary aspects of the same phenomenon, almost impossible to study in the same experimental situation, are both necessary for a complete comprehension of the quantum phenomenon.
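For readers who want the formal statement behind this, the uncertainty principle is conventionally written today as an inequality bounding the product of the two uncertainties (a modern formulation, slightly sharper than Heisenberg’s original order-of-magnitude estimate, and not given in the text itself):

```latex
% Heisenberg uncertainty relation (modern form):
% the product of the uncertainties in position (Delta x)
% and momentum (Delta p) has a lower bound fixed by
% Planck's constant h.
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad \hbar = \frac{h}{2\pi}
```

The more precisely a particle’s position is pinned down, the larger the minimum uncertainty in its momentum, and vice versa; it is this trade-off in nature itself, not any imperfection of the instruments, that called determinism into question.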
The principle of complementarity led to the rejection of another basic cornerstone of classical science: objectivity, the possibility of describing physical reality independently of its observation. Faced with these abstractions, some physicists and researchers have referred to texts belonging to the Oriental mystical tradition. Many of them, including Crookes and Lodge, Costa de Beauregard and Brian Josephson, have been attracted by parapsychology. This is where the ambiguity arises: we cannot really talk about “reality”; rather, we are dealing with possible or probable reality, although it is reality all the same. This conceals a completely different approach, one that would tend to relate anything other than the “reducible” reality of physics, even quantum physics, to thought itself, which of its very nature escapes the bonds of science.
The scenarios drawn up by the Club of Rome, MIT, or various German futurologists, even though they do provide a kind of useful navigational aid for the future, fall down precisely because of the rigidity of this attempt at crystallization of thought. All these scenarios have quite often been thrown out of kilter by the advent of the irrational, the so-called human factor. These errors perhaps prove the impossibility of ever making medium or long-term forecasts. In any case they have only served to increase our confusion.
This confusion is further aggravated by the fact that the machine is tending to become less a “manipulated object” and is increasingly becoming part of the fabric of our daily lives. The same terminal can perform and integrate a number of functions; the idea of the “machine” is giving way to the idea of “services”. The machine processes and transmits information of all kinds; thus it can no longer be considered as a neutral object totally divorced from its user, but rather as a vehicle for an “intelligent” program or even a “dialog” with man. But in fact all these computer-related terms (programming, intelligence, dialog, and memory) can lead us into error, because they may bring back to life the terrors aroused, in another and less sophisticated age, by such fantastic monsters as the Golem, Frankenstein, Goldorak, and Doctor Strangelove. These fears may even be exaggerated by the fact that electronics, born of scientific theory the layman can understand only with the greatest of difficulty, has succeeded in making the machine almost invisible, the size of a “chip”, and that it brings into use techniques (x-rays, infrared, and ultrasound) almost impossible to imagine without recourse to the kind of abstraction with which the average human being is not at all familiar. But the truth is that intellectual comfort has never been one of the rights of man. There is no question today of creating the “electronic man”, or a machine that can solve every problem put to it by means of its computing skills, such as the machine already conceived by Raymond Lulle in the Middle Ages for demonstrating the existence of God. We should perhaps ask ourselves whether the reason for man’s present disorientation is that he believed for more than four centuries that science would resolve all the questions he put to it, but now finds himself even less well-armed than before to solve the ancestral riddle of the world and his own way of looking at it.
Modern Everyman suffers from vertigo, is entrapped in a maze of confused or over-complex ideas, and is in the process of turning away from the basic options before him because of his inability to comprehend them.
Let us reconsider Brecht’s famous verse:
Behind the drums
trot the calves
the calves that supply
the skin of the drums.
We have to avoid creating a scenario in which man trots behind the machines whose intelligence he has supplied, without preserving his own ability to think. Today, because of the multiplication of techniques and the rapidity of inventions and innovations, decisions escape the experts, who by their very nature are incapable of following the intricate thread of Ariadne out of their labyrinth, of grasping all the links in the chain. They, more than the layman, are incapable of achieving an overall and homogeneous view of the situation. We must reflect on the destiny of our technical and scientific society in nontechnical, nonscientific terms. Despite the nightmares described in science fiction, machines will never dominate man, although it is possible that man, having spent all his ingenuity in creating them, has none left to imagine their creative use, and by refusing to see them for what they really are (extraordinary tools to simplify daily life, sometimes provided with artificial intelligence but lacking the innate power to think) will accord them a divinatory decision-making power that they were not designed to wield.