Dawn of the 20th century
'... the emergence, among the rank and file of the working-class world, of the conviction that education may be used as an instrument of social emancipation...' - RH Tawney, 1924
A new century
The beginning of the 20th century was a time of rising political and social advancement for ordinary working people. The formation of the Labour Representation Committee in 1900, and the subsequent election of twenty-nine Labour MPs in 1906, were just two indications that a new era had come.
As a central figure in early 20th-century adult education (and later deputy leader of the Labour Party), Arthur Greenwood observed: 'The time was ripe for a development of adult education. A generation of compulsory education had begun to bear fruit, and working-class organisations, no longer struggling for mere existence, had become an integral part of the background of working class life.' 1
Oxford and the WEA
The Workers' Educational Association (WEA) began to take shape in the first years of the new century. Its founder, Albert Mansbridge, was 'an almost archetypal lower-middle-class scholar...intelligent, but educationally frustrated, and locked into low-grade, white-collar employment.' 2 Mansbridge had been forced to leave school at fourteen; he worked as a clerk and a cashier, and he was an enthusiastic frequenter of university extension lectures.
The WEA was founded in 1903 at a conference held in Oxford, and from the start had the strong support of the University, particularly at Balliol, St John's and New College, where many dons had been stalwarts of the 19th-century University Extension movement.
Though the WEA quickly grew into a federation of entities for the furtherance of adult education - trade unions, adult schools, several University Extension authorities, co-operative societies, literary societies - the support from Oxford was crucial to its early and rapid success.
'The WEA may have begun in Mansbridge's kitchen in Ilford, but in a very real sense its early home was Oxford, and the credibility it won with Oxford dons was crucial to its strategy of winning public acceptance and state funding for its initiatives.' 3
1. Arthur Greenwood, 'Labour and Adult Education', in St John Parry's Cambridge Essays on Adult Education, 1920
2. Lawrence Goldman, Dons and Workers, 1995, p 105
3. Ibid., p 109
The text in these 'History of the Department' pages is to be found in the book 'Dons and Workers: Oxford and Adult Education Since 1850', by Dr Lawrence Goldman, Fellow and Tutor in Modern History at St Peter's College, Oxford, and a former member of the Department for Continuing Education.
Technology from 1900 to 1945
Recent history is notoriously difficult to write, because of the mass of material and the problem of distinguishing the significant from the insignificant among events that have virtually the power of contemporary experience. In respect to the recent history of technology, however, one fact stands out clearly: despite the immense achievements of technology by 1900, the following decades witnessed more advance over a wide range of activities than the whole of previously recorded history. The airplane, the rocket and interplanetary probes, electronics, atomic power, antibiotics, insecticides, and a host of new materials have all been invented and developed to create an unparalleled social situation, full of possibilities and dangers, which would have been virtually unimaginable before the present century.
In venturing to interpret the events of the 20th century, it will be convenient to separate the years before 1945 from those that followed. The years 1900 to 1945 were dominated by the two World Wars, while those since 1945 were preoccupied by the need to avoid another major war. The dividing point is one of outstanding social and technological significance: the detonation of the first atomic bomb at Alamogordo, N.M., in July 1945.
There were profound political changes in the 20th century related to technological capacity and leadership. It may be an exaggeration to regard the 20th century as “the American century,” but the rise of the United States as a superstate was sufficiently rapid and dramatic to excuse the hyperbole. It was a rise based upon tremendous natural resources exploited to secure increased productivity through widespread industrialization, and the success of the United States in achieving this objective was tested and demonstrated in the two World Wars. Technological leadership passed from Britain and the European nations to the United States in the course of these wars. This is not to say that the springs of innovation went dry in Europe. Many important inventions of the 20th century originated there. But it was the United States that had the capacity to assimilate innovations and take full advantage of them at times when other countries were deficient in one or other of the vital social resources without which a brilliant invention cannot be converted into a commercial success. As with Britain in the Industrial Revolution, the technological vitality of the United States in the 20th century was demonstrated less by any particular innovations than by its ability to adopt new ideas from whatever source they came.
The two World Wars were themselves the most important instruments of technological as well as political change in the 20th century. The rapid evolution of the airplane is a striking illustration of this process, while the appearance of the tank in the first conflict and of the atomic bomb in the second show the same signs of response to an urgent military stimulus. It has been said that World War I was a chemists’ war, on the basis of the immense importance of high explosives and poison gas. In other respects the two wars hastened the development of technology by extending the institutional apparatus for the encouragement of innovation by both the state and private industry. This process went further in some countries than in others, but no major belligerent nation could resist entirely the need to support and coordinate its scientific-technological effort. The wars were thus responsible for speeding the transformation from “little science,” with research still largely restricted to small-scale efforts by a few isolated scientists, to “big science,” with the emphasis on large research teams sponsored by governments and corporations, working collectively on the development and application of new techniques. While the extent of this transformation must not be overstated, and recent research has tended to stress the continuing need for the independent inventor at least in the stimulation of innovation, there can be little doubt that the change in the scale of technological enterprises had far-reaching consequences. It was one of the most momentous transformations of the 20th century, for it altered the quality of industrial and social organization. In the process it assured technology, for the first time in its long history, a position of importance and even honour in social esteem.
Fuel and power
There were no fundamental innovations in fuel and power before the breakthrough of 1945, but there were several significant developments in techniques that had originated in the previous century. An outstanding development of this type was the internal-combustion engine, which was continuously improved to meet the needs of road vehicles and airplanes. The high-compression engine burning heavy-oil fuels, invented by Rudolf Diesel in the 1890s, was developed to serve as a submarine power unit in World War I and was subsequently adapted to heavy road haulage duties and to agricultural tractors. Moreover, the sort of development that had transformed the reciprocating steam engine into the steam turbine occurred with the internal-combustion engine, the gas turbine replacing the reciprocating engine for specialized purposes such as aero-engines, in which a high power-to-weight ratio is important. Admittedly, this adaptation had not proceeded very far by 1945, although the first jet-powered aircraft were in service by the end of the war. The theory of the gas turbine, however, had been understood since the 1920s at least, and in 1929 Sir Frank Whittle, then taking a flying instructor’s course with the Royal Air Force, combined it with the principle of jet propulsion in the engine for which he took out a patent in the following year. But the construction of a satisfactory gas-turbine engine was delayed for a decade by the lack of resources, and particularly by the need to develop new metal alloys that could withstand the high temperatures generated in the engine. This problem was solved by the development of a nickel-chromium alloy, and, with the gradual solution of the other problems, work went on in both Germany and Britain to seize a military advantage by applying the jet engine to combat aircraft.
The principle of the gas turbine is that of compressing and burning air and fuel in a combustion chamber and using the exhaust jet from this process to provide the reaction that propels the engine forward. In its turbopropeller form, which developed only after World War II, the exhaust drives a shaft carrying a normal airscrew (propeller). Compression is achieved in a gas-turbine engine by a rotary compressor, itself driven by the turbine. In the so-called ramjet engine, intended to operate at high speeds, the momentum of the engine through the air achieves adequate compression. The gas turbine has been the subject of experiments in road, rail, and marine transport, but for all purposes except that of air transport its advantages have not so far been such as to make it a viable rival to traditional reciprocating engines.
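The reaction principle described above can be put in numbers with elementary momentum theory: thrust is the mass flow of air multiplied by the velocity change the engine imparts to it. The sketch below uses illustrative figures, not data for any historical engine.

```python
# Momentum-theory estimate of jet-engine thrust:
#   thrust = mass flow rate * (exhaust velocity - flight velocity)
# All figures are illustrative assumptions, not measurements.

def jet_thrust(mass_flow_kg_s: float, v_exhaust_m_s: float, v_flight_m_s: float) -> float:
    """Net thrust in newtons from the change in momentum of the air stream."""
    return mass_flow_kg_s * (v_exhaust_m_s - v_flight_m_s)

# Suppose an early turbojet passes 20 kg of air per second, accelerating it
# from 150 m/s (flight speed) to 600 m/s (exhaust jet speed).
thrust = jet_thrust(20.0, 600.0, 150.0)
print(f"Net thrust = {thrust / 1000:.0f} kN")  # 20 * 450 = 9000 N = 9 kN
```

The same formula shows why high exhaust velocity matters more than engine weight for the power-to-weight ratio mentioned above: thrust grows with the jet's velocity change, not with the mass of reciprocating machinery.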
As far as fuel is concerned, the gas turbine burns mainly the middle fractions (kerosene, or paraffin) of refined oil, but the general tendency of its widespread application was to increase still further the dependence of the industrialized nations on the producers of crude oil, which became a raw material of immense economic value and international political significance. The refining of this material itself underwent important technological development. Until the 20th century it consisted of a fairly simple batch process whereby oil was heated until it vaporized, when the various fractions were distilled separately. Apart from improvements in the design of the stills and the introduction of continuous-flow production, the first big advance came in 1913 with the introduction of thermal cracking. This process took the less volatile fractions after distillation and subjected them to heat under pressure, thus cracking the heavy molecules into lighter molecules and so increasing the yield of the most valuable fuel, petrol or gasoline. The discovery of this ability to tailor the products of crude oil to suit the market marks the true beginning of the petrochemical industry. It received a further boost in 1936, with the introduction of catalytic cracking. By the use of various catalysts in the process, means were devised for still further manipulating the molecules of the hydrocarbon raw material. The development of modern plastics followed directly on this (see below, Plastics). So efficient had the processes of utilization become that by the end of World War II the petrochemical industry had virtually eliminated all waste materials.
All the principles of generating electricity had been worked out in the 19th century, but by its end these had only just begun to produce electricity on a large scale. The 20th century witnessed a colossal expansion of electrical power generation and distribution. The general pattern has been toward ever-larger units of production, using steam from coal- or oil-fired boilers. Economies of scale and the greater physical efficiency achieved as higher steam temperatures and pressures were attained both reinforced this tendency. Experience in the United States indicates the trend: in the first decade of the 20th century, a generating unit with a capacity of 25,000 kilowatts with pressures up to 200–300 pounds per square inch at 400–500 °F (about 200–265 °C) was considered large, but by 1930 the largest unit was 208,000 kilowatts with pressures of 1,200 pounds per square inch at a temperature of 725 °F, while the amount of fuel necessary to produce a kilowatt-hour of electricity and the price to the consumer had fallen dramatically. As the market for electricity increased, so did the distance over which it was transmitted, and the efficiency of transmission required higher and higher voltages. The small direct-current generators of early urban power systems were abandoned in favour of alternating-current systems, which could be adapted more readily to high voltages. Transmission over a line of 155 miles (250 km) was established in California in 1908 at 110,000 volts, and Hoover Dam in the 1930s used a line of 300 miles (480 km) at 287,000 volts. The latter case may serve as a reminder that hydroelectric power, using a fall of water to drive water turbines, was developed to generate electricity where the climate and topography make it possible to combine production with convenient transmission to a market. Remarkable levels of efficiency were achieved in modern plants. 
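The push to ever-higher transmission voltages described above follows directly from Ohmic loss: for a fixed power delivered, line current falls as voltage rises, and resistive loss falls with the square of the current. A minimal sketch, using illustrative line resistance and load figures (not measurements of the California or Hoover Dam lines):

```python
# Resistive loss on a transmission line carrying a fixed power P at voltage V:
#   current I = P / V, and loss = I**2 * R   (single-conductor simplification)
# P and R below are illustrative assumptions, not historical data.

def line_loss_watts(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """Power dissipated in the line resistance for a given delivered power."""
    current = power_w / voltage_v
    return current ** 2 * resistance_ohm

P = 50e6   # 50 MW delivered (assumed)
R = 10.0   # 10 ohms total line resistance (assumed)
for v in (110_000, 287_000):  # the two historical voltages mentioned above
    loss = line_loss_watts(P, v, R)
    print(f"{v:>7} V: loss = {loss / 1e6:.2f} MW ({100 * loss / P:.1f}% of delivered power)")
```

Doubling and a half the voltage cuts the loss by roughly a factor of seven, which is why alternating current, easily transformed to high voltage, displaced the early direct-current systems.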
One important consequence of the ever-expanding consumption of electricity in the industrialized countries has been the linking of local systems to provide vast power grids, or pools, within which power can be shifted easily to meet changing local needs for current.
Until 1945, electricity and the internal-combustion engine were the dominant sources of power for industry and transport in the 20th century, although in some parts of the industrialized world steam power and even older prime movers remained important. Early research in nuclear physics was more scientific than technological, stirring little general interest. In fact, from the work of Ernest Rutherford, Albert Einstein, and others to the first successful experiments in splitting heavy atoms in Germany in 1938, no particular thought was given to engineering potential. The war led the Manhattan Project to produce the fission bomb that was first exploded at Alamogordo, N.M. Only in its final stages did even this program become a matter of technology, when the problems of building large reactors and handling radioactive materials had to be solved. At this point it also became an economic and political matter, because very heavy capital expenditure was involved. Thus, in this crucial event of the mid-20th century, the convergence of science, technology, economics, and politics finally took place.
Industry and innovation
There were technological innovations of great significance in many aspects of industrial production during the 20th century. It is worth observing, in the first place, that the basic matter of industrial organization became one of self-conscious innovation, with organizations setting out to increase their productivity by improved techniques. Methods of work study, first systematically examined in the United States at the end of the 19th century, were widely applied in U.S. and European industrial organizations in the first half of the 20th century, evolving rapidly into scientific management and the modern studies of industrial administration, organization and method, and particular managerial techniques. The object of these exercises was to make industry more efficient and thus to increase productivity and profits, and there can be no doubt that they were remarkably successful, if not quite as successful as some of their advocates maintained. Without this superior industrial organization, it would not have been possible to convert the comparatively small workshops of the 19th century into the giant engineering establishments of the 20th, with their mass-production and assembly-line techniques. The rationalization of production, so characteristic of industry in the 20th century, may thus be legitimately regarded as the result of the application of new techniques that form part of the history of technology since 1900.
Improvements in iron and steel
Another field of industrial innovation in the 20th century was the production of new materials. As far as volume of consumption goes, humankind still lives in the Iron Age, with the utilization of iron exceeding that of any other material. But this dominance of iron has been modified in three ways: by the skill of metallurgists in alloying iron with other metals; by the spread of materials such as glass and concrete in building; and by the appearance and widespread use of entirely new materials, particularly plastics. Alloys had already begun to become important in the iron and steel industry in the 19th century (apart from steel itself, which is an alloy of iron and carbon). Self-hardening tungsten steel was first produced in 1868 and manganese steel, possessing toughness rather than hardness, in 1887. Manganese steel is also nonmagnetic, a property that suggested great possibilities for it in the electric power industry. In the 20th century steel alloys multiplied. Silicon steel was found to be useful because, in contrast to manganese steel, it is highly magnetic. In 1913 the first stainless steels were made in England by alloying steel with chromium, and the Krupp works in Germany produced stainless steel in 1914 with 18 percent chromium and 8 percent nickel. The importance of a nickel-chromium alloy in the development of the gas-turbine engine in the 1930s has already been noted. Many other alloys also came into widespread use for specialized purposes.
Methods of producing traditional materials like glass and concrete on a larger scale also supplied alternatives to iron, especially in building; in the form of reinforced concrete, they supplemented structural iron. Most of the entirely new materials were nonmetallic, although at least one new metal, aluminum, reached proportions of large-scale industrial significance in the 20th century. The ores of this metal are among the most abundant in the crust of the Earth, but, before the provision of plentiful cheap electricity made it feasible to use an electrolytic process on an industrial scale, the metal was extracted only at great expense. The strength of aluminum, compared weight for weight with steel, made it a valuable material in aircraft construction, and many other industrial and domestic uses were found for it. In 1900 world production of aluminum was 3,000 tons, about half of which was made using cheap electric power from Niagara Falls. Production rose rapidly thereafter.
Electrolytic processes had already been used in the preparation of other metals. At the beginning of the 19th century, Davy pioneered the process by isolating potassium, sodium, barium, calcium, and strontium, although there was little commercial exploitation of these substances. By the beginning of the 20th century, significant amounts of magnesium were being prepared electrolytically at high temperatures, and the electric furnace made possible the production of calcium carbide by the reaction of calcium oxide (lime) and carbon (coke). In another electric furnace process, calcium carbide reacted with nitrogen to form calcium cyanamide, from which a useful synthetic resin could be made.
The quality of plasticity is one that had been used to great effect in the crafts of metallurgy and ceramics. The use of the word plastics as a collective noun, however, refers not so much to the traditional materials employed in these crafts as to new substances produced by chemical reactions and molded or pressed to take a permanent rigid shape. The first such material to be manufactured was Parkesine, developed by the British inventor Alexander Parkes. Parkesine, made from a mixture of chloroform and castor oil, was “a substance hard as horn, but as flexible as leather, capable of being cast or stamped, painted, dyed or carved.” The words are from a guide to the International Exhibition of 1862 in London, at which Parkesine won a bronze medal for its inventor. It was soon followed by other plastics, but—apart from celluloid, a cellulose nitrate composition using camphor as a solvent and produced in solid form (as imitation horn for billiard balls) and in sheets (for men’s collars and photographic film)—these had little commercial success until the 20th century.
The early plastics relied upon the large molecules in cellulose, usually derived from wood pulp. Leo H. Baekeland, a Belgian American inventor, introduced a new class of large molecules when he took out his patent for Bakelite in 1909. Bakelite is made by the reaction between formaldehyde and phenolic materials at high temperatures; the substance is hard, infusible, and chemically resistant (the type known as thermosetting plastic). As a nonconductor of electricity, it proved to be exceptionally useful for all sorts of electrical appliances. The success of Bakelite gave a great impetus to the plastics industry, to the study of coal tar derivatives and other hydrocarbon compounds, and to the theoretical understanding of the structure of complex molecules. This activity led to new dyestuffs and detergents, but it also led to the successful manipulation of molecules to produce materials with particular qualities such as hardness or flexibility. Techniques were devised, often requiring catalysts and elaborate equipment, to secure these polymers—that is, complex molecules produced by the aggregation of simpler structures. Linear polymers give strong fibres, film-forming polymers have been useful in paints, and mass polymers have formed solid plastics.
The possibility of creating artificial fibres was another 19th-century discovery that did not become commercially significant until the 20th century, when such fibres were developed alongside the solid plastics to which they are closely related. The first artificial textiles had been made from rayon, a silklike material produced by extruding a solution of nitrocellulose in acetic acid into a coagulating bath of alcohol, and various other cellulosic materials were used in this way. But later research, exploiting the polymerization techniques being used in solid plastics, culminated in the production of nylon just before the outbreak of World War II. Nylon consists of long chains of carbon-based molecules, giving fibres of unprecedented strength and flexibility. It is formed by melting the component materials and extruding them; the strength of the fibre is greatly increased by stretching it when cold. Nylon was developed with the women’s stocking market in mind, but the conditions of war gave it an opportunity to demonstrate its versatility and reliability as parachute fabric and towlines. This and other synthetic fibres became generally available only after the war.
The chemical industry in the 20th century put a wide range of new materials at the disposal of society. It also succeeded in replacing natural sources of some materials. An important example of this is the manufacture of artificial rubber to meet a world demand far in excess of that which could be met by the existing rubber plantations. This technique was pioneered in Germany during World War I. In this effort, as in the development of other materials such as high explosives and dyestuffs, the consistent German investment in scientific and technical education paid dividends, for advances in all these fields of chemical manufacturing were prepared by careful research in the laboratory.
Pharmaceuticals and medical technology
An even more dramatic result of the growth in chemical knowledge was the expansion of the pharmaceutical industry. The science of pharmacy emerged slowly from the traditional empiricism of the herbalist, but by the end of the 19th century there had been some solid achievements in the analysis of existing drugs and in the preparation of new ones. The discovery in 1856 of the first aniline dye had been occasioned by a vain attempt to synthesize quinine from coal tar derivatives. Greater success came in the following decades with the production of the first synthetic antifever drugs and painkilling compounds, culminating in 1899 in the conversion of salicylic acid into acetylsalicylic acid (aspirin), which is still the most widely used drug. Progress was being made simultaneously with the sulfonal hypnotics and the barbiturate group of drugs, and early in the 20th century Paul Ehrlich of Germany successfully developed an organic compound containing arsenic—606, denoting how many tests he had made, but better known as Salvarsan—which was effective against syphilis. The significance of this discovery, made in 1910, was that 606 was the first drug devised to overwhelm an invading microorganism without offending the host. In 1935 the discovery that Prontosil, a red dye developed by the German synthetic dyestuff industry, was an effective drug against streptococcal infections (leading to blood poisoning) introduced the important sulfa drugs. Alexander Fleming’s discovery of penicillin in 1928 was not immediately followed up, because it proved very difficult to isolate the drug in a stable form from the mold in which it was formed. But the stimulus of World War II gave a fresh urgency to research in this field, and commercial production of penicillin, the first of the antibiotics, began in 1941. These drugs work by preventing the growth of pathogenic organisms. All these pharmaceutical advances demonstrate an intimate relationship with chemical technology.
Other branches of medical technology made significant progress. Anesthetics and antiseptics had been developed in the 19th century, opening up new possibilities for complex surgery. Techniques of blood transfusion, examination by X-rays (discovered in 1895), radiation therapy (following demonstration of the therapeutic effects of ultraviolet light in 1893 and the discovery of radium in 1898), and orthopedic surgery for bone disorders all developed rapidly. The techniques of immunology similarly advanced, with the development of vaccines effective against typhoid and other diseases.
Food and agriculture
The increasing chemical understanding of drugs and microorganisms was applied with outstanding success to the study of food. The analysis of the relationship between certain types of food and human physical performance led to the identification of vitamins in 1911 and to their classification into three types in 1919, with subsequent additions and subdivisions. It was realized that the presence of these materials is necessary for a healthy diet, and eating habits and public health programs were adjusted accordingly. The importance of trace elements, very minor constituents, was also discovered and investigated, beginning in 1895 with the realization that goitre is caused by a deficiency of iodine.
As well as improving in quality, the quantity of food produced in the 20th century increased rapidly as a result of the intensive application of modern technology. The greater scale and complexity of urban life created a pressure for increased production and a greater variety of foodstuffs, and the resources of the internal-combustion engine, electricity, and chemical technology were called upon to achieve these objectives. The internal-combustion engine was utilized in the tractor, which became the almost universal agent of mobile power on the farm in the industrialized countries. The same engines powered other machines such as combine harvesters, which became common in the United States in the early 20th century, although their use was less widespread in the more labour-intensive farms of Europe, especially before World War II. Synthetic fertilizers, an important product of the chemical industry, became popular in most types of farming, and other chemicals—pesticides and herbicides—appeared toward the end of the period, effecting something of an agrarian revolution. Once again, World War II gave a powerful boost to development. Despite problems of pollution that developed later, the introduction of DDT as a highly effective insecticide in 1944 was a particularly significant achievement of chemical technology. Food processing and packaging also advanced—dehydration techniques such as vacuum-contact drying were introduced in the 1930s—but the 19th-century innovations of canning and refrigeration remained the dominant techniques of preservation.
Civil engineering
Important developments occurred in civil engineering in the first half of the 20th century, although there were few striking innovations. Advancing techniques for large-scale construction produced many spectacular skyscrapers, bridges, and dams all over the world but especially in the United States. The city of New York acquired its characteristic skyline, built upon the exploitation of steel frames and reinforced concrete. Conventional methods of building in brick and masonry had reached the limits of feasibility in the 1800s in office blocks up to 16 stories high, and the future lay with the skeleton frame or cage construction pioneered in the 1880s in Chicago. The vital ingredients for the new tall buildings or skyscrapers that followed were abundant cheap steel—for columns, beams, and trusses—and efficient passenger elevators. The availability of these developments and the demand for more and more office space in the thriving cities of Chicago and New York caused the boom in skyscraper building that continued until 1931, when the Empire State Building, with its total height of 1,250 feet (381 metres) and 102 stories, achieved a limit not exceeded for 40 years and demonstrated the strength of its structure by sustaining the crash impact of a B-25 bomber in July 1945 with only minor damage to the building. The Great Depression brought a halt to skyscraper building from 1932 until after World War II.
Concrete, and more especially reinforced concrete (that is, concrete set around a framework or mesh of steel), played an important part in the construction of the later skyscrapers, and this material also led to the introduction of more imaginative structural forms in buildings and to the development of prefabrication techniques. The use of large concrete members in bridges and other structures has been made possible by the technique of prestressing: by casting the concrete around stretched steel wires, allowing it to set, then relaxing the tension in the wires, it is possible to induce compressive stresses in the concrete that offset the tensile stresses imposed by the external loading, and in this way the members can be made stronger and lighter. The technique was particularly applicable in bridge building. The construction of large-span bridges received a setback, however, with the dramatic collapse of the Tacoma Narrows (Washington) Suspension Bridge in the United States in 1940, four months after it was completed. This led to a reassessment of wind effects on the loading of large suspension bridges and to significant improvements in subsequent designs. Use of massed concrete has produced spectacular high arch dams, in which the weight of water is transmitted in part to the abutments by the curve of the concrete wall; such dams need not depend upon the sheer bulk of impervious material as in a conventional gravity or embankment dam.
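The benefit of prestressing can be put in numbers: concrete is weak in tension, so the stretched wires lock compression into the member beforehand, and external loading must first cancel that compression before any net tension appears. A minimal sketch with illustrative stress figures (not taken from any particular bridge design):

```python
# Net stress in a prestressed concrete member. Sign convention:
# tension positive, compression negative. Plain concrete cracks once
# net tension exceeds its small tensile strength.

def net_stress_mpa(prestress_compression_mpa: float, load_tension_mpa: float) -> float:
    """Tension from external loading minus the locked-in prestress compression."""
    return load_tension_mpa - prestress_compression_mpa

TENSILE_STRENGTH = 3.0  # MPa, typical order of magnitude for concrete (assumption)
PRESTRESS = 12.0        # MPa of compression induced by the stretched wires (assumption)

for load in (5.0, 10.0, 16.0):  # MPa of tension imposed by external loading
    net = net_stress_mpa(PRESTRESS, load)
    state = "cracks" if net > TENSILE_STRENGTH else "holds"
    print(f"load {load:4.1f} MPa -> net stress {net:+5.1f} MPa, member {state}")
```

With 12 MPa locked in, the member carries loads that would impose up to roughly 15 MPa of tension before cracking, which is why prestressed members can be made both stronger and lighter than ordinary reinforced ones.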
Transport
Some of the outstanding achievements of the 20th century are provided by transportation history. In most fields there was a switch from steam power, supreme in the previous century, to internal combustion and electricity. Steam, however, retained its superiority in marine transport: the steam turbine provided power for a new generation of large ocean liners beginning with the Mauretania, developing 70,000 horsepower and a speed of 27 knots (27 nautical miles, or 50 km, per hour) in 1906 and continuing throughout the period, culminating in the Queen Elizabeth, launched in 1938 with about 200,000 horsepower and a speed of 28.5 knots. Even here, however, there was increasing competition from large diesel-powered motor vessels. Most smaller ships adopted this form of propulsion, and even the steamships accepted the convenience of oil-burning boilers in place of the cumbersome coal burners with their large bunkers.
On land, steam fought a long rearguard action, but the enormous popularity of the automobile deprived the railways of much of their passenger traffic and forced them to seek economies in conversion to diesel engines or electric traction, although these developments had not spread widely in Europe by the outbreak of World War II. Meanwhile, the automobile stimulated prodigious feats of production. Henry Ford led the way in the adoption of assembly-line mass production; his spectacularly successful Model T, the “Tin Lizzie,” was first manufactured in this way in 1913, and by 1923 production had risen to nearly two million per year. Despite this and similar successes in other countries, the first half of the 20th century was not a period of great technological innovation in the motorcar, which retained the main design features given to it in the last decade of the 19th century. For all the refinements (for example, the self-starter) and multitudinous varieties, the major fact of the automobile in this period was its quantity.
The airplane is entirely a product of the 20th century, unlike the automobile, to which its development was intimately related. This is not to say that experiments with flying machines had not taken place earlier. Throughout the 19th century, to go back no further, investigations into aerodynamic effects were carried out by inventors such as Sir George Cayley in England, leading to the successful glider flights of Otto Lilienthal and others. Several designers perceived that the internal-combustion engine promised to provide the light, compact power unit that was a prerequisite of powered flight, and on Dec. 17, 1903, Wilbur and Orville Wright in their Flyer I at the Kill Devil Hills in North Carolina achieved sustained, controlled, powered flight, one of the great “firsts” in the history of technology. The Flyer I was a propeller-driven adaptation of the biplane gliders that the Wright brothers had built and learned to fly in the previous years. They had devised a system of control through elevator, rudder, and a wing-warping technique that served until the introduction of ailerons. Within a few years the brothers were flying with complete confidence, astonishing the European pioneers of flight when they took their airplane across the Atlantic to give demonstrations in 1908. Within a few months of this revelation, however, the European designers had assimilated the lesson and were pushing ahead the principles of aircraft construction. World War I gave a great impetus to this technological development, transforming small-scale scattered aircraft manufacture into a major industry in all the main belligerent countries, and transforming the airplane itself from a fragile construction in wood and glue into a robust machine capable of startling aerobatic feats.
The end of the war brought a setback to this new industry, but the airplane had evolved sufficiently to reveal its potential as a medium of civil transport, and during the interwar years the establishment of transcontinental air routes provided a market for large, comfortable, and safe aircraft. By the outbreak of World War II, metal-framed-and-skinned aircraft had become general, and the cantilevered monoplane structure had replaced the biplane for most purposes. War again provided a powerful stimulus to aircraft designers; engine performance was especially improved, and the gas turbine received its first practical application. Other novel features of these years included the helicopter, deriving lift from its rotating wings, or rotors, and the German V-1 flying bomb, a pilotless aircraft.
The war also stimulated the use of gliders for the transport of troops, the use of parachutes for escape from aircraft and for attack by paratroops, and the use of gas-filled balloons for antiaircraft barrages. The balloon had been used for pioneer aeronautical experiments in the 19th century, but its practical uses had been hampered by the lack of control over its movements. The application of the internal-combustion engine to a rigid-frame balloon airship by Ferdinand von Zeppelin had temporarily made the airship a weapon of war by 1915, although experience soon proved that it could not survive in competition with the airplane. The apparently promising prospects of the dirigible (that is, maneuverable) airship in civil transport between the wars were ended by a series of disasters, the worst of which was the destruction of the Hindenburg in New Jersey in 1937. Since then the airplane has been unchallenged in the field of air transport.
The spectacular transport revolution of the 20th century was accompanied by a communications revolution quite as dramatic, although technologically springing from different roots. In part, well-established media of communication like printing participated in this revolution, although most of the significant changes—such as the typewriter, the Linotype, and the high-speed power-driven rotary press—were achievements of the 19th century. Photography was also a proved and familiar technique by the end of the 19th century, but cinematography was new and did not become generally available until after World War I, when it became enormously popular.
The real novelties in communications in the 20th century came in electronics. The scientific examination of the relationship between light waves and electromagnetic waves had already revealed the possibility of transmitting electromagnetic signals between widely separated points, and on Dec. 12, 1901, Guglielmo Marconi succeeded in transmitting the first wireless message across the Atlantic. Early equipment was crude, but within a few years striking progress was made in improving the means of transmitting and receiving coded messages. Particularly important was the development of the thermionic valve, a device for rectifying (that is, converting a high-frequency oscillating signal into a unidirectional current capable of registering as a sound) an electromagnetic wave. This was essentially a development from the carbon-filament electric lightbulb. In 1883 Edison had found that in these lamps a current flowed between the filament and a nearby test electrode, called the plate, if the electric potential of the plate was positive with respect to the filament. This current, called the Edison effect, was later identified as a stream of electrons radiated by the hot filament. In 1904 Sir John Ambrose Fleming of Britain discovered that by placing a metal cylinder around the filament in the bulb and by connecting the cylinder (the plate) to a third terminal, a current could be rectified so that it could be detected by a telephone receiver. Fleming’s device was known as the diode, and two years later, in 1906, Lee De Forest of the United States made the significant improvement that became known as the triode by introducing a third electrode (the grid) between the filament and the plate. The outstanding feature of this refinement was its ability to amplify a signal. Its application made possible by the 1920s the widespread introduction of live-voice broadcasting in Europe and America, with a consequent boom in the production of radio receivers and other equipment.
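What rectification does to a signal can be sketched in a few lines. This is an idealized model, not circuit-level physics: current is assumed to pass only when the plate is positive with respect to the filament, so a high-frequency oscillating wave becomes a unidirectional current (half-wave rectification).

```python
import math

# Idealized model of Fleming's diode: current flows only when the plate is
# positive with respect to the filament, so an oscillating signal becomes a
# unidirectional current (half-wave rectification).

def rectify(samples):
    return [s if s > 0 else 0.0 for s in samples]

wave = [math.sin(2 * math.pi * t / 20) for t in range(40)]  # two full cycles
output = rectify(wave)

print(min(wave) < 0)      # True: the incoming wave swings both ways
print(min(output) >= 0)   # True: the rectified current flows one way only
```

It is this one-way current, varying with the strength of the incoming wave, that a telephone receiver could register as sound; De Forest's grid then made it possible to amplify the signal as well as detect it.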
This, however, was only one of the results derived from the application of the thermionic valve. The idea of harnessing the flow of electrons was applied in the electron microscope, radar (a detection device depending on the capacity of some radio waves to be reflected by solid objects), the electronic computer, and in the cathode-ray tube of the television set. The first experiments in the transmission of pictures had been greeted with ridicule. Working on his own in Britain, John Logie Baird in the 1920s demonstrated a mechanical scanner able to convert an image into a series of electronic impulses that could then be reassembled on a viewing screen as a pattern of light and shade. Baird’s system, however, was rejected in favour of electronic scanning, developed in the United States by Philo Farnsworth and by Vladimir Zworykin, the latter with the powerful backing of the Radio Corporation of America. Their equipment operated much more rapidly and gave a more satisfactory image. By the outbreak of World War II, television services were being introduced in several countries, although the war suspended their extension for a decade. The emergence of television as a universal medium of mass communication is therefore a phenomenon of the postwar years. But already by 1945 the cinema and the radio had demonstrated their power in communicating news, propaganda, commercial advertisements, and entertainment.
It has been necessary to refer repeatedly to the effects of the two World Wars in promoting all kinds of innovation. It should be observed also that technological innovations transformed the character of war itself. One weapon developed during World War II deserves a special mention. The principle of rocket propulsion was well known earlier, and its possibilities as a means of achieving speeds sufficient to escape from Earth’s gravitational pull had been pointed out by such pioneers as the Russian Konstantin Tsiolkovsky and the American Robert H. Goddard. The latter built experimental liquid-fueled rockets in 1926. Simultaneously, a group of German and Romanian pioneers was working along the same lines, and it was this team that was taken over by the German war effort in the 1930s and given the resources to develop a rocket capable of delivering a warhead hundreds of miles away. At the Peenemünde base on the island of Usedom in the Baltic, Wernher von Braun and his team created the V-2. Fully fueled, it weighed 14 tons; it was 46 feet (14 metres) long and was propelled by burning a mixture of alcohol and liquid oxygen. Reaching a height of more than 100 miles (160 km), the V-2 marked the beginning of the space age, and members of its design team were instrumental in both the Soviet and U.S. space programs after the war.
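The theoretical insight of pioneers such as Tsiolkovsky can be stated in one formula, the rocket equation: the velocity change a rocket can achieve is the exhaust velocity times the logarithm of the ratio of fueled to empty mass. The figures below are rough assumptions in the spirit of a V-2-class vehicle, for illustration only.

```python
import math

# Tsiolkovsky's rocket equation: delta-v = v_exhaust * ln(m_full / m_empty).
# The numbers below are rough, assumed V-2-class figures, not measured data.

def delta_v(exhaust_velocity_ms, m_full_kg, m_empty_kg):
    return exhaust_velocity_ms * math.log(m_full_kg / m_empty_kg)

# ~2,000 m/s exhaust velocity; ~12,500 kg fueled, ~4,000 kg dry
print(round(delta_v(2000.0, 12500.0, 4000.0)))  # 2279 m/s
```

The logarithm is the crux: doubling the fuel does not double the speed, which is why reaching orbital velocity demanded the enormous multistage rockets described later in this section.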
Technology had a tremendous social impact in the period 1900–45. The automobile and electric power, for instance, radically changed both the scale and the quality of 20th-century life, promoting a process of rapid urbanization and a virtual revolution in living through mass production of household goods and appliances. The rapid development of the airplane, the cinema, and radio made the world seem suddenly smaller and more accessible. In the years following 1945 the constructive and creative opportunities of modern technology could be exploited, although the process has not been without its problems.
The years since World War II ended have been spent in the shadow of nuclear weapons, even though they have not been used in war since that time. These weapons underwent momentous development: the fission bombs of 1945 were superseded by the more powerful fusion bombs of the early 1950s, and before 1960 rockets were shown capable of delivering these weapons at ranges of thousands of miles. This new military technology had an incalculable effect on international relations, for it contributed to the polarization of world power blocs while enforcing a caution, if not discipline, in the conduct of international affairs that was absent earlier in the 20th century.
The fact of nuclear power was by no means the only technological novelty of the post-1945 years. So striking indeed were the advances in engineering, chemical and medical technology, transport, and communications that some commentators wrote, somewhat misleadingly, of the “second Industrial Revolution” in describing the changes in these years. The rapid development of electronic engineering created a new world of computer technology, remote control, miniaturization, and instant communication. Even more expressive of the character of the period was the leap over the threshold of extraterrestrial exploration. The techniques of rocketry, first applied in weapons, were developed to provide launch vehicles for satellites and lunar and planetary probes and eventually, in 1969, to set the first men on the Moon and bring them home safely again. This astonishing achievement was stimulated in part by the international ideological rivalry already mentioned, as only the Soviet Union and the United States had both the resources and the will to support the huge expenditures required. It justifies the description of this period, however, as that of “space-age technology.”
The great power innovation of this period was the harnessing of nuclear energy. The first atomic bombs represented only a comparatively crude form of nuclear fission, releasing the energy of the radioactive material immediately and explosively. But it was quickly appreciated that the energy released within a critical atomic pile, a mass of graphite moderating (slowing) the neutrons emitted by radioactive material inserted into it, could generate heat, which in turn could create steam to drive turbines and thus convert the nuclear energy into usable electricity. Atomic power stations were built on this principle in the advanced industrial world, and the system is still undergoing refinement, although so far atomic energy has not vindicated the high hopes placed in it as an economic source of electricity and presents formidable problems of waste disposal and maintenance. Nevertheless, it seems probable that the effort devoted to experiments on more direct ways of controlling nuclear fission will eventually produce results in power engineering.
Meanwhile, nuclear physics was probing the even more promising possibilities of harnessing the power of nuclear fusion, of creating the conditions in which simple atoms of hydrogen combine, with a vast release of energy, to form heavier atoms. This is the process that occurs in the stars, but so far it has only been created artificially by triggering off a fusion reaction with the intense heat generated momentarily by an atomic fission explosion. This is the mechanism of the hydrogen bomb. So far scientists have devised no way of harnessing this process so that continuous controlled energy can be obtained from it, although researches into plasma physics, generating a point of intense heat within a stream of electrons imprisoned in a strong magnetic field, hold out some hopes that such means will be discovered in the not-too-distant future.
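The "vast release of energy" in fusion comes from a mass defect: the products weigh slightly less than the reactants, and the missing mass appears as energy according to E = mc². A short calculation with standard atomic masses for the deuterium-tritium reaction (the easiest fusion reaction to achieve) illustrates the scale.

```python
# Mass-defect calculation for the deuterium-tritium fusion reaction:
#   D + T -> He-4 + n
# Standard atomic masses in unified mass units (u); 1 u is equivalent
# to about 931.494 MeV of energy.

U_TO_MEV = 931.494

mass_in = 2.014102 + 3.016049    # deuterium + tritium
mass_out = 4.002602 + 1.008665   # helium-4 + neutron
energy_mev = (mass_in - mass_out) * U_TO_MEV

print(round(energy_mev, 1))  # 17.6 MeV released per reaction
```

Some 17.6 MeV per reaction is millions of times the energy of a chemical reaction per atom, which is why a controlled fusion source using hydrogen from seawater is so attractive a prospect.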
Alternatives to fossil fuels
It may well become a matter of urgency that some means of extracting usable power from nuclear fusion be acquired. At the present rate of consumption, the world’s resources of mineral fuels, and of the available radioactive materials used in the present nuclear power stations, will be exhausted within a period of perhaps a few decades. The most attractive alternative is thus a form of energy derived from a controlled fusion reaction that would use hydrogen from seawater, a virtually limitless source, and that would not create a significant problem of waste disposal. Other sources of energy that may provide alternatives to mineral fuels include various forms of solar cell, deriving power from sunlight, most commonly through the photovoltaic effect. Solar cells of this kind are already in regular use on satellites and space probes, where the flow of radiant energy from the Sun can be harnessed without interference from the atmosphere or the rotation of the Earth.
The gas turbine underwent substantial development after its first successful operational use at the end of World War II. The high power-to-weight ratio of this type of engine made it ideal for aircraft propulsion, so that in either the pure jet or turboprop form it was generally adopted for all large aircraft, both military and civil, by the 1960s. The immediate effect of the adoption of jet propulsion was a spectacular increase in aircraft speeds, the first piloted airplane exceeding the speed of sound in level flight being the American Bell X-1 in 1947, and by the late 1960s supersonic flight was becoming a practicable, though controversial, proposition for civil-airline users. Ever larger and more powerful gas turbines were designed to meet the requirements of airlines and military strategy, and increasing attention was given to refinements to reduce the noise and increase the efficiency of this type of engine. Meanwhile, the gas turbine was installed as a power unit in ships, railroad engines, and automobiles, but in none of these uses did it proceed far beyond the experimental stage.
The space age spawned important new materials and uncovered new uses for old materials. For example, a vast range of applications has been found for plastics, which have been manufactured in many different forms with widely varied characteristics. Glass fibre has been molded in rigid shapes to provide motorcar bodies and hulls for small ships. Carbon fibre has demonstrated remarkable properties that make it an alternative to metals for high-temperature turbine blades. Research on ceramics has produced materials resistant to high temperatures suitable for heat shields on spacecraft. The demand for iron and its alloys and for the nonferrous metals has remained high. The modern world has found extensive new uses for the latter: copper for electrical conductors, tin for protective plating of less-resistant metals, lead as a shield in nuclear power installations, and silver in photography. In most of these cases the development began before the 20th century, but the continuing increase in demand for these metals is affecting their prices in the world commodity markets.
Automation and the computer
Both old and new materials were used increasingly in the engineering industry, which has been transformed since the end of World War II by the introduction of control engineering, automation, and computerized techniques. The vital piece of equipment has been the computer, especially the electronic digital computer, a 20th-century invention the theory of which was expounded by the English mathematician and inventor Charles Babbage in the 1830s. The essence of this machine is the use of electronic devices to record electric impulses coded in the very simple binary system, using only two symbols, but other devices such as punched cards and magnetic tape for storing and feeding information have been important supplementary features. By virtue of the very high speeds at which such equipment can operate, even the most complicated calculations can be performed in a very short space of time.
The Mark I digital computer was at work at Harvard University in 1944, and after the war the possibility of using it for a wide range of industrial, administrative, and scientific applications was quickly realized. The early computers, however, were large and expensive machines, and their general application was delayed until the invention of the transistor revolutionized computer technology. The transistor is another of the key inventions of the space age. The product of research on the physics of solids, and particularly of those materials such as germanium and silicon known as semiconductors, the transistor was invented by John Bardeen, Walter H. Brattain, and William B. Shockley at Bell Telephone Laboratories in the United States in 1947. It was discovered that crystals of semiconductors, which have the capacity to conduct electricity in some conditions and not in others, could be made to perform the functions of a thermionic valve but in the form of a device that was much smaller, more reliable, and more versatile. The result has been the replacement of the cumbersome, fragile, and heat-producing vacuum tubes by the small and strong transistor in a wide range of electronic equipment. Most especially, this conversion has made possible the construction of much more powerful computers while making them more compact and less expensive. Indeed, so small can effective transistors be that they have made possible the new skills of miniaturization and microminiaturization, whereby complicated electronic circuits can be created on minute pieces of silicon or other semiconducting materials and incorporated in large numbers in computers. From the late 1950s to the mid-1970s the computer grew from an exotic accessory to an integral element of most commercial enterprises, and computers made for home use became widespread in the ’80s.
The potential for adaptation and utilization of the computer seems so great that many commentators have likened it to the human brain, and there is no doubt that human analogies have been important in its development. In Japan, where computer and other electronics technology has made giant strides since the 1950s, fully computerized and automated factories were in operation by the mid-1970s, some of them employing complete workforces of robots in the manufacture of other robots. In the United States the chemical industry provides some of the most striking examples of fully automated, computer-controlled manufacture. The characteristics of continuous production, in contrast to the batch production of most engineering establishments, lend themselves ideally to automatic control from a central computer monitoring the information fed back to it and making adjustments accordingly. Many large petrochemical plants producing fuel and raw materials for manufacturing industries are now run in this way, with the residual human function that of maintaining the machines and of providing the initial instructions. The same sort of influences can be seen even in the old established chemical processes, although not to the same extent: in the ceramics industry, in which continuous firing replaced the traditional batch-production kilns; in the paper industry, in which mounting demand for paper and board encouraged the installation of larger and faster machines; and in the glass industry, in which the float-glass process for making large sheets of glass on a surface of molten tin requires close mechanical control.
In medicine and the life sciences the computer has provided a powerful tool of research and supervision. It is now possible to monitor complicated operations and treatment. Surgery made great advances in the space age; the introduction of transplant techniques attracted worldwide publicity and interest. But perhaps of greater long-term significance is research in biology, with the aid of modern techniques and instruments, that began to unlock the mysteries of cell formation and reproduction through the self-replicating properties of the DNA molecules present in all living substances and thus to explore the nature of life itself.
Food production has been subject to technological innovation such as accelerated freeze-drying and irradiation as methods of preservation, as well as the increasing mechanization of farming throughout the world. The widespread use of new pesticides and herbicides in some cases reached the point of abuse, causing worldwide concern. Despite such problems, farming was transformed in response to the demand for more food; scientific farming, with its careful breeding, controlled feeding, and mechanized handling, became commonplace. New food-producing techniques such as aquaculture and hydroponics, for farming the sea and seabed and for creating self-contained cycles of food production without soil, respectively, are being explored either to increase the world supply of food or to devise ways of sustaining closed communities such as may one day venture forth from the Earth on the adventure of interplanetary exploration.
One industry that has not been deeply influenced by new control-engineering techniques is construction, in which the nature of the tasks involved makes dependence on a large labour force still essential, whether it be in constructing a skyscraper, a new highway, or a tunnel. Nevertheless, some important new techniques have appeared since 1945, notably the use of heavy earth-moving and excavating machines such as the bulldozer and the tower crane. The use of prefabricated parts according to a predetermined system of construction became widespread. In the construction of housing units, often in large blocks of apartments or flats, such systems are particularly relevant because they make for standardization and economy in plumbing, heating, and kitchen equipment. The revolution in home equipment that began before World War II has continued apace since, with a proliferation of electrical equipment.
Transport and communications
Many of these changes were facilitated by improvements in transport and communications. Transport developments have for the most part continued those well established in the early 20th century. The automobile continued its phenomenal growth in popularity, causing radical changes in many of the patterns of life, although the basic design of the motorcar has remained unchanged. The airplane, benefiting from jet propulsion and a number of lesser technical advances, made spectacular gains at the expense of both the ocean liner and the railroad. However, the growing popularity of air transport brought problems of crowded airspace, noise, and airfield siting.
World War II helped bring about a shift to air transport: direct passenger flights across the Atlantic were initiated immediately after the war. The first generation of transatlantic airliners consisted of aircraft developed through wartime experience from the Douglas DC-3 and the pioneering types of the 1930s, incorporating all-metal construction with stressed skin, wing flaps and slots, retractable landing gear, and other advances. The coming of the big jet-powered civil airliner in the 1950s kept pace with the rising demand for air services but accentuated the social problems of air transport. The solution to these problems may lie partly in the development of vertical takeoff and landing techniques, a concept successfully pioneered by a British military aircraft, the Hawker Siddeley Harrier. Longer-term solutions may be provided by the development of air-cushion vehicles derived from the Hovercraft, in use in the English Channel and elsewhere, and one of the outstanding technological innovations of the period since 1945. The central feature of this machine is a down-blast of air, which creates an air cushion on which the craft rides without direct contact with the sea or ground below it. The remarkable versatility of the air-cushion machine is beyond doubt, but it has proved difficult to find very many transportation needs that it can fulfill better than any craft already available. Despite these difficulties, it seems likely that this type of vehicle will have an important future. It should be remembered, however, that all the machines mentioned so far, automobiles, airplanes, and Hovercraft, use oil fuels, and it is possible that the exhaustion of these will turn attention increasingly to alternative sources of power and particularly to electric traction (electric railroads and autos), in which field there have been promising developments such as the linear-induction motor.
Supersonic flight, for nearly 30 years an exclusive capability of military and research aircraft, became a commercial reality in 1975 with the Soviet Tu-144 cargo plane; the Concorde supersonic transport (SST), built jointly by the British and French governments, entered regular passenger service early in 1976.
In communications also, the dominant lines of development continue to be those that were established before or during World War II. In particular, the rapid growth of television services, with their immense influence as media of mass communication, was built on foundations laid in the 1920s and 1930s, while the universal adoption of radar on ships and airplanes followed the invention of a device to give early warning of aerial attack. But in certain features the development of communications in the space age has produced important innovations. First, the transistor, so significant for computers and control engineering, made a large contribution to communications technology. Second, the establishment of space satellites, considered to be a remote theoretical possibility in the 1940s, became part of the accepted technological scene in the 1960s, and these have played a dramatic part in telephone and television communication as well as in relaying meteorological pictures and data. Third, the development of magnetic tape as a means of recording sound and, more recently, vision provided a highly flexible and useful mode of communication. Fourth, new printing techniques were developed. In phototypesetting, a photographic image is substituted for the conventional metal type. In xerography, a dry copying process, an ink powder is attracted to the image to be copied by static electricity and then fused by heating. Fifth, new optical devices such as zoom lenses increased the power of cameras and prompted corresponding improvements in the quality of film available to the cinema and television. Sixth, new physical techniques such as those that produced the laser (light amplification by stimulated emission of radiation) made available an immensely powerful means of communication over long distances, although these are still in their experimental stages. The laser also acquired significance as an important addition to surgical techniques and as an instrument of space weaponry. 
The seventh and final communications innovation is the use of electromagnetic waves other than light to explore the structure of the universe by means of the radio telescope and its derivative, the X-ray telescope. This technique was pioneered after World War II and has since become a vital instrument of satellite control and space research. Radio telescopes have also been directed toward the Sun’s closest neighbours in space in the hope of detecting electromagnetic signals from other intelligent species in the universe.
Military technology in the space age has been concerned with the radical restructuring of strategy caused by the invention of nuclear weapons and the means of delivering them by intercontinental ballistic missiles. Apart from these major features and the elaborate electronic systems intended to give an early warning of missile attack, military reorganization has emphasized high maneuverability through helicopter transport and a variety of armed vehicles. Such forces were deployed in wars in Korea and Vietnam, the latter of which also saw the widespread use of napalm bombs and chemical defoliants to remove the cover provided by dense forests. World War II marked the end of the primacy of the heavily armoured battleship. Although the United States recommissioned several battleships in the 1980s, the aircraft carrier became the principal capital ship in the navies of the world. Emphasis now is placed on electronic detection and the support of nuclear-powered submarines equipped with missiles carrying nuclear warheads. The only major use of nuclear power since 1945, other than generating large-scale electric energy, has been the propulsion of ships, particularly missile-carrying submarines capable of cruising underwater for extended periods.
The rocket, which has played a crucial part in the revolution of military technology since the end of World War II, acquired a more constructive significance in the U.S. and Soviet space programs. The first spectacular step was Sputnik 1, a sphere with an instrument package weighing 184 pounds (83 kilograms), launched into space by the Soviets on Oct. 4, 1957, to become the first artificial satellite. The feat precipitated the so-called space race, in which achievements followed each other in rapid succession. They may be conveniently grouped in four chronological although overlapping stages.
The first stage emphasized increasing the thrust of rockets capable of putting satellites into orbit and exploring the uses of satellites in communications, in weather observation, in monitoring military information, and in topographical and geological surveying.
The second stage was that of the manned space program. This began with the successful orbit of the Earth by the Soviet cosmonaut Yury Gagarin on April 12, 1961, in the Vostok 1. This flight demonstrated mastery of the problems of weightlessness and of safe reentry into the Earth’s atmosphere. A series of Soviet and U.S. spaceflights followed in which the techniques of space rendezvous and docking were acquired, flights up to a fortnight were achieved, and men “walked” in space outside their craft.
The third stage of space exploration was the lunar program, beginning with approaches to the Moon and going on through automatic surveys of its surface to manned landings. Again, the first achievement was Soviet: Luna 1, launched on Jan. 2, 1959, became the first artificial body to escape the gravitational field of the Earth, fly past the Moon, and enter an orbit around the Sun as an artificial planet. Luna 2 crashed on the Moon on Sept. 13, 1959; it was followed by Luna 3, launched on Oct. 4, 1959, which went around the Moon and sent back the first photographs of the side turned permanently away from the Earth. The first soft landing on the Moon was made by Luna 9 on Feb. 3, 1966; this craft carried cameras that transmitted the first photographs taken on the surface of the Moon. By this time excellent close-range photographs had been secured by the United States Rangers 7, 8, and 9, which crashed into the Moon in the second half of 1964 and the first part of 1965; and between 1966 and 1967 the series of five Lunar Orbiters photographed almost the entire surface of the Moon from a low orbit in a search for suitable landing places. The U.S. spacecraft Surveyor 1 soft-landed on the Moon on June 2, 1966; this and following Surveyors acquired much useful information about the lunar surface.

Meanwhile, the size and power of launching rockets climbed steadily, and by the late 1960s the enormous Saturn V rocket, standing 363 feet (111 metres) high and weighing 2,725 tons (2,472,000 kilograms) at lift-off, made possible the U.S. Apollo program, which climaxed on July 20, 1969, when Neil Armstrong and Edwin Aldrin clambered out of the Lunar Module of their Apollo 11 spacecraft onto the surface of the Moon. The manned lunar exploration thus begun continued with a widening range of experiments and achievements for a further five landings before the program was curtailed in 1972.
The fourth stage of space exploration looked out beyond the Earth and the Moon to the possibilities of planetary exploration. The U.S. space probe Mariner 2 was launched on Aug. 27, 1962, and passed by Venus the following December, relaying back information about that planet indicating that it was hotter and less hospitable than had been expected. These findings were confirmed by the Soviet Venera 3, which crash-landed on the planet on March 1, 1966, and by Venera 4, which transmitted data during its descent through the Venusian atmosphere on Oct. 18, 1967. Later probes of the Venera series gathered further atmospheric and surface data. The U.S. probe Pioneer Venus 1 orbited the planet for eight months in 1978, and in December of that year four landing probes conducted quantitative and qualitative analyses of the Venusian atmosphere. Surface temperatures of approximately 900 °F (480 °C) reduced the functional life of such probes to little more than one hour.
Research on Mars was conducted primarily through the U.S. Mariner and Viking probe series. During the late 1960s, photographs from Mariner orbiters demonstrated a close visual resemblance between the surface of Mars and that of the Moon. In July and August 1976, Vikings 1 and 2, respectively, made successful landings on the planet; experiments designed to detect the presence or remains of organic material on the Martian surface met with mechanical difficulty, but results were generally interpreted as negative. Photographs taken between 1979 and 1981 by the U.S. probes Voyagers 1 and 2 permitted unprecedented study of the atmospheres and satellites of Jupiter and Saturn and revealed a previously unknown configuration of rings around Jupiter, analogous to those of Saturn.
In the mid-1980s the attention of the U.S. space program was focused primarily upon the potentials of the reusable space shuttle vehicle for extensive orbital research. The U.S. space shuttle Columbia completed its first mission in April 1981 and made several successive flights. It was followed by the Challenger, which made its first mission in April 1983. Both vehicles were used to conduct myriad scientific experiments and to deploy satellites into orbit. The space program suffered a tremendous setback in 1986 when, at the outset of a Challenger mission, the shuttle exploded 73 seconds after liftoff, killing the crew of seven. The early 1990s saw mixed results for NASA. The $1.5 billion Hubble Space Telescope occasioned some disappointment when scientists discovered problems with its primary mirror after launch. Interplanetary probes, to the delight of both professional and amateur stargazers, relayed beautiful, informative images of other planets.
At the dawn of the space age it is possible to perceive only dimly its scope and possibilities. But it is relevant to observe that the history of technology has brought the world to a point in time at which humankind, equipped with unprecedented powers of self-destruction, stands on the threshold of extraterrestrial exploration.