Information Technologies in Industry and Society
LARS RAMQVIST
BENJAMIN DISRAELI ONCE SAID, “The most successful man is the one who has the best information.” This remark summarizes the business of information technologies—the production, processing, storing, communication, and use of information.
Information technologies have given rise to one of the world’s largest industries. Global production of electronic equipment in 1985 exceeded $400 billion, while consumption of semiconductors approached $25 billion (see Tables 1 and 2). By 1990 these production and consumption figures are expected to grow to at least $600 billion and $65 billion, respectively.
Today, cutting-edge technologies such as computers, software and artificial intelligence, fiber optics, networks, and standards have an immense impact on information technologies. Among the many applications of information technologies, three of particular importance are traditional telephony, mobile cellular telephony, and data processing and communication. Information technologies, in turn, affect many industries and society as a whole.
THE INFLUENCE OF CUTTING-EDGE TECHNOLOGIES
Very-Large-Scale-Integration Technology
Recent achievements in information technologies build on a rich history (see Table 3). The development of chip technology, for example, has been exceptional over the last three decades. Today, a million or more transistors can be included in one chip. In fact, the number of devices per chip has increased by 100 times per decade since 1958.
TABLE 1 Production of Electronic Equipment in 1985 (billions of U.S. dollars)
Industry        | United States | Western Europe | Japan | Rest of World | Total
Data processing |          80.4 |           21.2 |  17.3 |           9.9 | 128.8
Communications  |          28.6 |           17.7 |   7.5 |           4.6 |  58.4
Industrial      |          34.9 |           17.8 |   9.5 |           4.3 |  66.5
Consumer        |          16.2 |           10.1 |  36.1 |          12.4 |  74.8
Military        |          49.2 |           11.0 |     — |           1.0 |  61.2
Transportation  |           8.5 |            2.0 |   2.8 |           3.0 |  16.3
Total           |         217.8 |           79.8 |  73.2 |          35.2 | 406.0
If this pace of development could be applied to the automobile industry, it has been estimated that six Rolls Royces could be put on the head of a pin, and that each of them would cost about $3.00, get over 3,000 miles to a gallon of gas, and have enough power to drive the Queen Elizabeth II.
In very-large-scale-integration (VLSI) technology at the cutting edge of development, the challenges lie in feature size, design complexity, and production facilities. VLSI technology today achieves feature sizes of less than 1 μm on the chip. On a biological scale, this is in the range from red blood cells and yeast cells down to the smallest bacteria. However, feature dimensions as small as the human immunodeficiency virus (HIV), which is about 1,000 Å across, have still not been reached (see Figure 1). The smaller the feature size, the greater the processing speed and design complexity of the chip. Thus, feature size is critical to price-performance development in microelectronics.
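To put the 100-times-per-decade rule in concrete terms, the short Python sketch below projects devices per chip from a hypothetical one-device baseline in 1958; the smooth exponential and the baseline are simplifying assumptions for illustration, not figures from this chapter.

```python
# Devices per chip under the 100x-per-decade growth rule cited above.
# The 1958 one-device baseline is a simplifying assumption.

def devices_per_chip(year, base_year=1958, base_devices=1):
    """Project device count assuming 100x growth per decade."""
    decades = (year - base_year) / 10.0
    return base_devices * 100 ** decades

for year in (1958, 1968, 1978, 1988):
    print(f"{year}: ~{devices_per_chip(year):,.0f} devices per chip")
```

For 1988 the rule gives about a million devices per chip, matching the figure quoted above.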
The equivalent of hundreds of worker-years now goes into the design of a complex chip roughly 40–100 mm² in size. This implies making full use of advanced computer-aided design technology, including cell libraries and macrocells.
TABLE 2 Consumption of Semiconductor Components in 1985 (billions of U.S. dollars)
Industry        | United States | Western Europe | Japan | Rest of World | Total
Data processing |           3.7 |            0.9 |   2.6 |           0.4 |   7.6
Communications  |           1.4 |            1.2 |   0.8 |           0.2 |   3.6
Industrial      |           1.5 |            1.1 |   0.8 |           0.2 |   3.6
Consumer        |           0.7 |            1.0 |   4.0 |           0.8 |   6.5
Military        |           1.5 |            0.4 |     — |           0.0 |   1.9
Transportation  |           0.8 |            0.3 |   0.3 |           0.1 |   1.5
Total           |           9.6 |            4.9 |   8.5 |           1.7 |  24.7
TABLE 3 Key Achievements in Information Technologies (with inventors noted as appropriate)
Date  | Invention                                                 | Inventor
1455  | Gutenberg press                                           | Johannes Gutenberg
1844  | Telegraph                                                 | Samuel F. B. Morse
1876  | Telephone                                                 | Alexander Graham Bell
1889  | Strowger selector                                         |
1901  | Transatlantic wireless telegraphy                         | Guglielmo Marconi
1906  | Triode vacuum tube                                        | Lee De Forest
1910  | Teletraffic theory                                        | Agner K. Erlang
1923  | Telephotography, the iconoscope                           | Vladimir K. Zworykin
1927  | Feedback amplifier                                        | Harold S. Black
1930  | Telex, the coaxial cable                                  |
1937  | Pulse code modulation                                     | Alec H. Reeves
1937  | Xerography                                                | Chester F. Carlson
1946  | ENIAC computer                                            | John W. Mauchly and J. Presper Eckert
1947  | Transistor                                                | John Bardeen, Walter H. Brattain, and William Shockley
1958  | Integrated circuit                                        |
1962  | Telstar satellite                                         |
1965  | Stored program control switch                             |
1966  | Step-index optical fiber conception                       | K. C. Kao
1970  | Optical fiber lab test                                    | Robert D. Maurer
1971  | Microprocessor                                            |
1976  | Fiber-optic transmission                                  |
1990s | Possible software breakthrough with declarative languages |
These are tools by which defined and tested blocks, such as a processor unit or a memory function, can be used to build more complex designs. As chips become more complex, the trend is toward larger chips, even three-dimensional circuits, and toward optimization of the feature interconnect.
More limiting than feature size is the interconnect, and any dramatic advance in chip complexity will depend on solving this problem. Because transistor and gate functions can be produced at feature sizes below 1 μm, the interconnects between these features must shrink to an equivalent size. At very high packing densities, however, the narrower interconnects introduce problems such as increased resistance and decreased speed in the functions of the chip.
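A first-order resistance-capacitance estimate shows why narrower interconnects slow the chip down even as the transistors themselves get faster. The sketch below is a textbook-style lumped-RC illustration; the material constants are assumed typical values, not data from this chapter.

```python
# Lumped-RC estimate of on-chip interconnect delay. Halving the wire's
# width and thickness quadruples its resistance, so the RC delay grows
# even though the transistors at either end get faster.

RHO_AL = 2.7e-8   # resistivity of aluminum, ohm*m
C_PER_M = 2e-10   # assumed wire capacitance per unit length, F/m

def rc_delay(length_m, width_m, thickness_m):
    """Delay estimate: wire resistance times wire capacitance."""
    resistance = RHO_AL * length_m / (width_m * thickness_m)
    capacitance = C_PER_M * length_m
    return resistance * capacitance

# A 1-mm wire with a 1.0-um versus a 0.5-um square cross-section:
for w in (1.0e-6, 0.5e-6):
    print(f"cross-section {w*1e6:.1f} um: ~{rc_delay(1e-3, w, w)*1e12:.1f} ps")
```

Shrinking the cross-section from 1.0 μm to 0.5 μm roughly quadruples the delay in this simple model, which is the resistance problem referred to above.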
The frontier products of the late 1980s will be the 4- and 16-megabit (Mbit) dynamic random access memory (DRAM) chips, with the latter being only in the prototype stage over the next few years. In the next 8 to 10 years, however, DRAMs larger than 100 Mbits will be available on the market.
Producing these complex chips requires production facilities with a superclean atmosphere.

FIGURE 1 VLSI development (CMOS=complementary metal-oxide semiconductor; BIP= bipolar).
In normal hospitals, it is possible to achieve class 10,000 conditions in operating rooms, that is, 10,000 particles per cubic foot. Chip production today is normally carried out under class 100 conditions. The most advanced production areas in the VLSI industry, however, must provide class 10 conditions, a demand that will drive the industry toward fully automated production without people in the labs. Automated production will be introduced gradually during the next decade.
Nevertheless, VLSI technology still needs people; they are the key to success in this area. In the United States, for example, a relatively small number of skilled designers and engineers drives today’s $6 billion integrated circuit industry.
Computers
Computers are one area in which VLSI technology has been fully adopted. The development of digital computing started with the ENIAC project in the 1940s, followed by the machine-productivity stage ushered in by the IBM System/360. During this stage the number-crunching capacity of computers steadily increased. In the 1980s we have only begun the third stage of development: simplifying information handling and user interfaces.
This development has been made possible by advances in the processing power of the central processing unit, measured in millions of instructions per second (MIPS). A modern supercomputer, for example the Cray X-MP, has a maximum capacity of 200–240 MIPS (see Figure 2).
VLSI technology has also dramatically increased the processing power of individual workstations. Commercial workstations already deliver 3 MIPS, and 100-MIPS workstations will be available in the early 1990s.
This processing power can now be used to solve problems at the human-machine interface. In the 1990s it seems likely that 80 to 90 percent of the computer’s power will be used to support user interface functions.
New computer architectures driven by evolving software technologies will make more efficient use of the VLSI technology. Examples of such architectures are reduced-instruction-set computers (RISC) and various parallel processing arrangements, which might lead to a 10-fold increase in efficiency.
Software and Artificial Intelligence
The revolution in the software industry is still to come. We often think of the “software crisis” as a problem that can be solved only through the efforts of millions of programmers. But developments in hardware offer an opportunity to solve the software problem by combining good tools with engineering skills.
The overall development of software technology can be described as a series of discontinuities. The technology of computer languages has progressed from the machine level, where programs are written more or less as ones and zeros, to the assembler level, which affords the programmer a somewhat friendlier notation but still requires the logic of the program to be described in the same minute detail. Next come the so-called high-level languages, such as Fortran, COBOL, Pascal, and Ada, which imitate the language of mathematics, accounting, or whatever the application area may be and allow the programmer a higher level of expression. However, these languages still require the programmer to explain exactly how the computer is to solve a problem, and therefore the order of the lines of code is very important.

FIGURE 2 Digital computing development.
Finally, there are the very high level, or application-oriented, languages, such as LISP, Prolog, and other so-called fourth-generation languages (4GL), in which the programmer declares what the computer should do, not how it should do it.
All the “how” languages, up to and including Ada, are called procedural, or imperative, languages. The very high level languages are declarative, or applicative, languages. They differ from the “how” languages in clarity, suitability for parallel execution, computing power requirements, and applications.
Declarative languages are usually more concise and clearer than procedural languages. They are also intrinsically suited to parallel execution, whereas procedural languages can exploit parallelism in a problem only with great difficulty. A drawback of declarative languages, until now, has been their need for substantial computing power, or preferably a new computer architecture. Procedural languages, on the other hand, are quite efficient in traditional “von Neumann” computers.
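The contrast between the two styles can be made concrete in a few lines. Python stands in here for both camps; think of the first form as Fortran- or Pascal-like code that spells out how, and the second as a declarative statement of what is wanted.

```python
data = [3, 1, 4, 1, 5, 9, 2, 6]

# Procedural ("how"): explicit steps; the order of the lines matters.
squares_of_evens = []
for x in data:
    if x % 2 == 0:
        squares_of_evens.append(x * x)

# Declarative ("what"): state the result and leave the evaluation
# order to the language, which is also what makes this form easier
# to parallelize.
squares_of_evens_decl = [x * x for x in data if x % 2 == 0]

assert squares_of_evens == squares_of_evens_decl == [16, 4, 36]
```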
Partly because they need so many MIPS, declarative languages have found relatively few applications in industry until now. But there are reasons to believe that this is about to change—that the procedural languages are like the dinosaurs, growing larger and larger toward their extinction, and that the present declarative languages are like the first mammals, still small and hiding in the bushes, but poised ready to take over the world.
Language is only one factor that influences software efficiency and quality. The methods and tools used to support software development and handling are as important as the structure of the hardware and software.
In the telecommunications industry, large real-time systems, with software running to millions of lines of code, are needed to support our public switching system (AXE). The associated data bases total more than 400 gigabytes. Another way to grasp the size and complexity of this system is to consider that we have installed more than 10 million telephone lines in more than 50 countries.
The software content of a single AXE installation is on the order of 2–5 megabytes, and there are numerous versions to meet different market requirements. This calls for very good tools for releasing versions and updates. It is therefore vital to apply results from information technologies research. With large systems, doing so requires extremely good software management and planning, as well as new ways of structuring systems. Reusable software and different kinds of software tools for different parts of a computerized system are needed. New technologies are continually being introduced; for example, artificial intelligence could be valuable for creating a good human-machine interface to a “conventional” computer system (see Figure 3).
Artificial intelligence combines such mechanistic concepts as repetition, precision, and data handling, and applies this combination in the broader context of expert systems and knowledge engineering.

FIGURE 3 Software methodology and tools.
An expert system combines a knowledge base and data with a general problem solver. This is in sharp contrast to conventional programming, where the data are processed by programs whose application knowledge is “hardwired” into their algorithms.
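A minimal sketch of that structure follows: a knowledge base of if-then rules plus a generic inference loop (a simple forward-chaining problem solver). The telephone-fault rules are invented for illustration; only the inference loop would carry over unchanged to another application.

```python
# Hypothetical diagnostic knowledge base: (conditions, conclusion) rules.
RULES = [
    ({"no_dial_tone"}, "check_line_card"),
    ({"dial_tone", "no_ring"}, "check_switch_routing"),
    ({"check_line_card", "card_ok"}, "check_subscriber_loop"),
]

def solve(facts):
    """General problem solver: forward-chain until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(solve({"no_dial_tone", "card_ok"}))
# The engine first derives check_line_card, then check_subscriber_loop.
```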
Prototyping, together with new languages and means of specification, makes the early phases of development more concrete; this is important because fundamental system design is a major task in software development. But fourth-generation languages are not the only solution. As cutting-edge technologies advance, it becomes more and more important to adopt standards, formal or de facto, such as those put forth by standards-setting organizations and embodied in the operating systems of large manufacturers. Adhering to standards allows an organization to concentrate its resources in areas where it can add substantial value.
The next crisis in computing will be the need to handle the rapidly growing amount of information available in distributed data bases. This poses many challenges for research. For example, we need new ways of describing data, of classifying relationships between data, and of finding and retrieving data already stored in data bases.
Fiber Optics
Increasingly powerful computers, complex software, and artificial intelligence all demand advanced communications. The solution is fiber-optic transmission. The key achievements in fiber optics and related industries began in 1970 with Corning’s development of optical fiber. During the 1970s complete fiber-optic telephone networks were already up and running in the United States and Sweden.

FIGURE 4 Types of optical fibers.
The commercial breakthrough came in 1984, and by 1988 the transatlantic fiber-optic cable will be in operation. For the 1990s we expect a breakthrough in subscriber networks, the so-called local area networks (LANs), especially for high-definition television (HDTV) and data communications.
The technical development of optical fiber spans the step-index, the graded-index, and the single-mode fibers (see Figure 4). For advanced applications and broadband networks, the single-mode fiber will be used because of its remarkable capacity to carry very high bit rates with little pulse dispersion.
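Whether a step-index fiber is single-mode can be checked with the standard normalized-frequency (V-number) formula: the fiber carries only one mode when V < 2.405. The refractive indices and core radii below are illustrative values of the right magnitude, not data from this chapter.

```python
import math

def v_number(core_radius_um, wavelength_um, n_core, n_clad):
    """Normalized frequency V of a step-index fiber."""
    na = math.sqrt(n_core**2 - n_clad**2)   # numerical aperture
    return 2 * math.pi * core_radius_um / wavelength_um * na

# A 50-um-core multimode fiber versus a 9-um-core single-mode fiber,
# both at a 1.3-um operating wavelength.
for radius in (25.0, 4.5):
    v = v_number(radius, wavelength_um=1.3, n_core=1.4634, n_clad=1.4600)
    regime = "single-mode" if v < 2.405 else f"multimode (~{v * v / 2:.0f} modes)"
    print(f"core radius {radius} um: V = {v:.2f}, {regime}")
```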
Dramatic achievements have been made in the practical use of fiber-optic transmission systems. We will soon be able to carry a million or more telephone calls on one fiber over distances of more than 1,000 kilometers without repeaters (see Figure 5). By the end of this century, a fiber pair will probably handle 100–1,000 channels, each with a bandwidth of 1 gigabit per second (Gbit/s). This contrasts with coaxial cable transmission technology, which permits a maximum of 10,000 telephone calls per pair, with at most 4 kilometers between repeaters.
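The arithmetic behind these projections is straightforward once calls are carried digitally at 64 kbit/s, the standard voice rate discussed in the next section. A quick check of the low end of the projection:

```python
# Capacity check: how many 64-kbit/s voice calls fit on one fiber pair
# carrying 100 channels of 1 Gbit/s each (the low end quoted above)?

VOICE_RATE = 64_000            # bits per second per telephone call
CHANNELS = 100                 # optical channels per fiber pair
PER_CHANNEL = 1_000_000_000    # 1 Gbit/s per channel

calls = CHANNELS * PER_CHANNEL // VOICE_RATE
print(f"{calls:,} simultaneous calls")   # 1,562,500 calls
```

Even the low end of the projection thus comfortably exceeds the million-call figure cited above.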
To build a fully optical network, it is necessary to be able to switch light. One technology is the electro-optical directional coupler (EDC), with which light can be switched at rates of many gigabits per second. This coupler is made of lithium niobate (LiNbO3), in which certain optical properties, such as the refractive index, change as a function of an electric field applied to the material.

FIGURE 5 Optical fiber transmission.
Networks
With fast optical transmission tools, it is possible to build the networks of the future. Let us look at the bit rates needed for telecommunication services. All of these rates are in one way or another linked to the original 64-kilobit-per-second (kbit/s) rate of normal telephone service. Although in the future we will see HDTV and complex inter-LAN/PBX communication systems, it may be possible to accommodate their transmission needs within 140-Mbit/s systems. Sophisticated image-compression techniques are needed to keep the bandwidth below 140 Mbit/s, but that is already almost possible (see Figure 6).
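Concretely, the link back to 64 kbit/s runs through the standard European multiplex hierarchy, which builds 140 Mbit/s (strictly 139.264 Mbit/s) out of voice channels in stages. The rates below are the well-known hierarchy levels, listed for orientation rather than taken from this chapter.

```python
# The European digital multiplex hierarchy: each level bundles the one
# below it, tying every rate back to the 64-kbit/s voice channel.
HIERARCHY = [
    ("64 kbit/s voice channel", 64_000, 1),
    ("2 Mbit/s primary group", 2_048_000, 30),
    ("8 Mbit/s", 8_448_000, 120),
    ("34 Mbit/s", 34_368_000, 480),
    ("140 Mbit/s", 139_264_000, 1_920),
]

for name, rate, voice_channels in HIERARCHY:
    print(f"{name:26s} {rate / 1e6:8.3f} Mbit/s = {voice_channels:5d} voice channels")
```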
From today’s telephony and data communication systems, broadcast TV, and terminal communications, we will see the gradual emergence of the integrated services digital network (ISDN), fiber-optic cable TV, and LANs.

FIGURE 6 Bit rates for telecommunication services.

FIGURE 7 Functional distribution in future ISDN/broadband network.
Eventually, all of these systems will merge into one broadband network with a bandwidth of 140 Mbit/s, as described above.
The future ISDN/broadband network will consist of intelligent nodes, with the services for voice, data, text, and image divided among the different nodes in the network. Subscribers, however, will see the network as giving them continuous access to a fully equipped node for all the services rendered (see Figure 7).
Standards
To make full use of the information technologies discussed so far, one must communicate with other people, other companies, and other countries. Thus, standards are needed. The standards in information technology are based on open system interconnection (OSI). Figure 8 presents the outlook for future developments in information technology in terms of the seven OSI layers, from the physical layer to the application layer.
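For reference, the seven layers of the OSI model, bottom to top, can be written down as a simple structure; the one-line glosses are conventional summaries rather than text from the chapter.

```python
# The seven OSI layers spanned by Figure 8, from physical to application.
OSI_LAYERS = (
    (1, "Physical"),      # bit transmission over the physical medium
    (2, "Data link"),     # framing and error control on each hop
    (3, "Network"),       # routing and addressing
    (4, "Transport"),     # end-to-end delivery
    (5, "Session"),       # dialogue management between end systems
    (6, "Presentation"),  # data representation and encoding
    (7, "Application"),   # services visible to the user
)

for number, name in OSI_LAYERS:
    print(f"Layer {number}: {name}")
```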
APPLICATIONS OF INFORMATION TECHNOLOGIES
Powerful computers, advanced software, and fast and reliable communications represent all the basic technologies necessary for a modern information system. The ingredients of such a system will be the coexistence, connectivity, interworking, and standards of voice, data, text, and image with full reliability, availability, maintainability, and above all, ease of use. VLSI and computer technologies have made these conditions possible.
Three of the main applications of information technologies today are normal voice telephony, mobile telephony, and data communications. Whatever the plans for ISDN and broadband networks, much remains to be done in normal telephony before the world’s needs are met. For many years to come, voice telecommunication will remain a basic customer requirement.
Normal Telephony
At present, normal telephony is the driving force in information technologies. The shift from analog to digital switching and transmission will have a tremendous impact as the telephone network becomes an integrated services network. It is also important to remember that the majority of the world’s inhabitants do not yet have access to plain voice communication.
As shown in Figures 9 and 10, there were 454 million main lines in service worldwide in 1987, with 37 million lines in the local exchange market. These numbers appear large in themselves, but set against the number of potential users worldwide, they tell another story (see Table 4).
Not surprisingly, a clear relationship exists between the telephone density and the gross national product per capita for the different countries of the world. Telecommunications are a vital part of the development of the infrastructure of a country. For many decades to come, there will be a genuine need for basic services in public telephony.

FIGURE 8 Standards in information technology (open system interconnection).
TABLE 4 Number of Telephone Main Lines in Service per 100 Inhabitants Worldwide
Country or Continent | Lines/100 People
United States        | 48
Australia            | 38
Europe               | 29
South America        |  5
Asia                 |  4
Africa               |  1
Mobile Telephony
Basic voice communication has taken a big step forward with the mobile cellular telephone system, which allows customers to communicate by telephone while in transit. Mobile telephony of this kind could not be developed earlier because the necessary enabling technologies did not exist. Since the 1960s, manual open systems have developed into automatic systems, which in turn have evolved into cellular systems offering full flexibility and reliable, continuous communication wherever the subscriber moves within the areas covered by the base radio stations. This development depended entirely on advances in microprocessors, semiconductor-based digital synthesis, and stored program control switch technology (see Figure 11).
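The core of the cellular idea can be sketched in a few lines: as the subscriber moves, the network hands the call over to whichever base station currently serves it best, modeled here crudely as the nearest station along a road. All names and numbers are invented for illustration; real systems decide on measured signal strength, not distance.

```python
# Toy handoff model: base stations at fixed positions along a road (km).
BASE_STATIONS = {"A": 0.0, "B": 10.0, "C": 20.0}

def best_station(position_km):
    """Pick the station with the strongest signal (here: the nearest)."""
    return min(BASE_STATIONS, key=lambda s: abs(BASE_STATIONS[s] - position_km))

serving = None
for position in range(0, 21, 2):       # subscriber driving down the road
    station = best_station(position)
    if station != serving:             # a handoff keeps the call alive
        print(f"at {position} km: handoff to station {station}")
        serving = station
```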
In January 1987 fewer than 3 million mobile telephones were in service worldwide, with the heaviest use in North America (see Figure 12). The cumulative growth forecast for cellular phones projects a total of 7.5 million by 1990, including 2.3 million in Europe and 3.8 million in the United States.
Mobile cellular telephony is still in its developmental phase, as shown by the number of phones per 100 inhabitants in 1986 (see Table 5).

FIGURE 11 Background of cellular telephony.

FIGURE 12 Cumulative growth forecast (cellular phones).
In urban areas the penetration of cellular phones is higher. In 1986, for example, the number of cellular phones per 100 people ranged from 2.5 in Stockholm, to 1.0 in London, to 0.1 in Tokyo. The degree of penetration is, of course, a function of time, as shown in Figure 13.
It is possible that in the early 2000s, digital mobile cellular telephony will play a dominant role in telecommunications. An important step has already been taken: in September 1987 the principal telecommunication administrations in Europe agreed in Copenhagen on a standard for a Pan-European system.
Data Communications
The pace of change in information systems is increasing in all areas. All significant trends—expanding customer needs, the proliferation of workstations, and the globalization of the business environment—point to a need for increased communication facilities and integration of various systems.
TABLE 5 Number of Cellular Telephones per 100 Inhabitants
Country or Region | Phones/100 People
Scandinavia       | 1.4
Austria           | 0.2
United Kingdom    | 0.15
Japan             | <0.14
United States     | <0.14
Rest of Europe    | <0.1

FIGURE 13 Use of cellular telephones (number per 100 inhabitants) in several major cities.
Traditionally, information systems have been used to reduce administrative costs. In recent years, however, this role has been changing as leading companies adopt new strategies with information systems as central components. To carry out such strategies, the information systems must handle both technical and economic information from many systems at many geographical locations, all in real time.
The use of screen-based workstations among white-collar workers has increased dramatically over the past 5 years. Today there are more than 15 million workstations in use among the 60 million white-collar workers in the United States (see Figure 14). By 1990 the number of workstations is expected to grow to at least 35 million. The majority of these workstations are data terminals or personal computers, although the line between these categories is blurring. It is clear that in large and medium-size organizations, almost all of these workstations will be communicating with each other.
The developments in VLSI and software technologies have made workstations considerably less expensive. In 1979 the cost of a personal computer was 25 percent of the annual salary of an office worker; by 1987 it had fallen to about 6 percent (see Figure 15). Workstations, like telephones, are now considered necessary tools for the workplace.
SOCIETAL IMPACT OF INFORMATION TECHNOLOGIES
Cutting-edge technologies have driven the development of information technologies, which in turn have driven the development of society. This process can be seen in Sweden’s evolution from 1880 to 1990, from agriculture and manufacturing into an information society.

FIGURE 14 Projected workstation penetration of U.S. white-collar workers in 1990.
The entire Western world has experienced a similar development. An information society, with guiding values centered on quality of life, the meaning of life, and human relations, presents us with many new opportunities. The chief limiting factor is the need for more networking and cooperation among organizations, small teams, and individuals.

FIGURE 15 Personal computing entry cost. SOURCE: Gartner Group, Inc., Stamford, Conn.
With the new information technologies at hand, together we will be able to form a society that contradicts the frightening visions of the future described in George Orwell’s 1984 or Aldous Huxley’s Brave New World. Communication implies the dissemination of information, and thus of understanding, which will be the basis for democracy and peaceful development in the future.