4 TECHNOLOGY OPTIONS AND CAPABILITIES: WHAT DOES WHAT, HOW
Pages 115-160



From page 115...
... THE CHANGING NATURE OF TECHNOLOGY AND COMMUNICATIONS Communications services have been transformed by a long series of innovations, including copper wire, coaxial cable, microwave transmission, and optical fiber. Each has expanded the available bandwidth, and therefore carrying capacity, at reduced unit cost.
From page 116...
... But increased processing power can also often be used to greater advantage to increase flexibility and generality, attributes that are key to much of the ongoing transformation of communications technology and thus the communications industry itself. Three specific trends relating to increased flexibility and generality are relevant to the steering committee's assessment: the increasing use of software rather than hardware for implementation of functions, the increasing modularity of design, and the increasing ability to process and transform the data being transported within the communications system.
From page 117...
... A third key trend resulting from increasing processing power is the ability to process and transform data carried in the communications system. One consequence is increased interoperation among previously incompatible systems.
From page 118...
... It is these trunks that permitted the construction of the Internet and the switched packet networks such as frame relay and switched multimegabit data service, and in the past, X.25 networks. These trunks permitted the construction of on-line information service networks and the private networks that today serve almost all of the major corporations.
From page 119...
... Building Services on Each Other Just as the separation of infrastructure facilities from services permits the construction of a range of services on top of a common infrastructure, so, too, can one service be constructed by building it on top of another. This layered approach to constructing services is a consequence of the trends discussed above: increased processing power, and more modular design with defined interfaces to basic infrastructure facilities.
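The layering idea above can be made concrete with a toy sketch. This is an illustration only, not any real protocol format: each layer wraps the payload of the layer above with its own header, the way an application message rides inside a transport protocol, which rides on a bearer service, which rides on a physical link. The layer names are invented for the example.

```python
# Toy illustration of layered services: each layer prepends its own
# header to the payload handed down from the layer above, and the
# receiving side strips headers in the reverse order.

def encapsulate(payload: bytes, layers: list[str]) -> bytes:
    """Wrap payload with one toy header per layer, innermost first."""
    for layer in layers:
        payload = f"[{layer}]".encode() + payload
    return payload

def decapsulate(frame: bytes, layers: list[str]) -> bytes:
    """Strip the headers in the reverse order they were added."""
    for layer in reversed(layers):
        header = f"[{layer}]".encode()
        assert frame.startswith(header), f"expected {layer} header"
        frame = frame[len(header):]
    return frame

stack = ["transport", "bearer", "link"]   # innermost to outermost
frame = encapsulate(b"hello", stack)
print(frame)   # each layer's header wraps the one above it
assert decapsulate(frame, stack) == b"hello"
```

The point of the sketch is that each layer needs to understand only its own header and its immediate neighbor's interface, which is what lets one service be built on another without redesigning the whole stack.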
From page 120...
... Beneath these services and overlays of other services lie the physical facilities, such as the fiber trunks, the hybrid fiber coaxial cable and cable systems, the local telephone loops, and the satellites, as well as the switches that hook all of these components together. These are the building blocks on which all else must stand, and it is thus the technology and the economics of this sector that require detailed study and understanding.
From page 121...
... RESOLVING THE TENSION: THE INTERNET AS AN EXAMPLE The simple approach to separation of infrastructure from service involves defining an interface to the basic infrastructure facilities and then constructing on top of that interface both cost-reduced support for the mature services and general support for new applications. Thus, the telephone system provides interfaces directly to the high-speed trunks, and the television cable systems define an interface to the analog spectrum of the cable.
From page 122...
... See Box 4.1. Based on its assessment of industry trends, the steering committee thus concluded that the call for an open, technology-independent bearer service as a basis for emerging applications, as voiced in RTIF, was correct, and that a more concrete conclusion is now justified: the Internet standards are being widely used to enable new applications and are seen by a great majority of commercial players as the only viable option for an open, application-independent set of service interfaces at this time.
From page 124...
... The section titled "The Internet," included below in this chapter, further clarifies what the Internet really is and elaborates on some of its future directions. The Coexistence of New and Mature Services Experience with the Internet shows the power of a general bearer service as an environment in which new applications can come into existence.
From page 125...
... The standards specific to that application of course predate the Internet standards. The infrastructure and interface standards supporting the telephone system are mature and stable and have been engineered to provide very cost-effective delivery of the service.
From page 126...
... Since the eventual business structure of an unproven innovation is usually unclear, it seems reasonable by default to innovate in an open context, which the steering committee sees as maximizing the chance of success. The interface that needs to be open is the application-independent interface (the bearer service, or in specific terms the Internet protocol)
From page 127...
... For such access networks, a very important technical approach currently embraced by many cable providers and also being evaluated by telephone companies is hybrid fiber coaxial cable (HFC; often abbreviated as hybrid fiber coax)
From page 128...
... Divided among 40 homes, the downstream capacity is about 35 Mbps per home, which is sufficient to support several video streams to each home. Given the considerable interest in what sort of data services might be possible with an HFC approach, it is useful to look at what can be achieved with current products. In the past year a number of devices called cable modems were offered to the cable industry that use one or more video channels in each direction to carry bidirectional data across coax and HFC systems (Robichaux, 1995b)
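The per-home figure above can be reconstructed with rough arithmetic. The channel count and per-channel rate below are assumptions for illustration (the page does not give the aggregate): roughly 50 usable 6-MHz downstream channels, each carrying about 27 Mbps when digitized with 64-QAM-class modulation, shared among 40 homes on a coax segment.

```python
# Rough capacity arithmetic for an HFC downstream plant, under stated
# assumptions; all three input figures are illustrative, not from the report.

CHANNELS = 50            # assumed usable 6-MHz downstream channels
MBPS_PER_CHANNEL = 27    # assumed digital payload of one channel
HOMES = 40               # homes sharing the coax segment

aggregate_mbps = CHANNELS * MBPS_PER_CHANNEL   # total downstream capacity
per_home_mbps = aggregate_mbps / HOMES         # average share per home
print(f"aggregate {aggregate_mbps} Mbps -> "
      f"about {per_home_mbps:.0f} Mbps per home")
```

Under these assumptions the result lands near the "about 35 Mbps per home" in the text; a plant with fewer digitized channels, or more homes per segment, scales the figure down proportionally.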
From page 129...
... At least some of these cable modems transmit data toward the home using a frequency above that of all of the channels carried by current cable systems today. Data services thus can be added to an existing cable system without requiring the cable operator to give up any of its existing cable video offerings.
From page 130...
... that could serve as a practical infrastructure for emerging applications, or more specifically to support the Internet standards for this purpose. Despite some disagreement, a reasonable conclusion is that although the HFC systems may or may not provide enough capacity to serve a fully developed market of as yet unknown applications, there is enough capacity to explore the marketplace and to let consumers who wish to lead the market purchase enough capacity to get started.
From page 131...
... However, systems such as FTTC with coaxial cable to the home are capable of delivering very substantial data rates into the home, potentially hundreds of megabits per second, depending on the specific approach used. Thus a lack of fiber all the way to the home should not be equated with an inevitable bottleneck for bandwidth in the path.
From page 132...
... These digital loop carrier systems can be more economical than individual wire pairs to each home, but they complicate the problem of sufficient broadband access by allocating to each home only the equivalent of one voice channel of digital capacity. This constraint prevents the use of the copper pair to support higher data rates, a capability that telephone providers have sought for several years, in order to be able to support delivery of video or interactive data access.
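The "one voice channel" ceiling is easy to quantify. A standard digital telephone voice channel is sampled at 8 kHz with 8 bits per sample, i.e., 64 kbps; the video bit rate below (~1.5 Mbps, roughly MPEG-1 quality) is an illustrative assumption used only to show the size of the gap.

```python
# Why a digital loop carrier's one-voice-channel allocation blocks
# broadband: compare 64 kbps against an assumed ~1.5 Mbps video stream.

SAMPLE_RATE_HZ = 8000      # standard telephony sampling rate
BITS_PER_SAMPLE = 8        # standard companded PCM
voice_channel_bps = SAMPLE_RATE_HZ * BITS_PER_SAMPLE   # 64,000 bps

mpeg1_video_bps = 1_500_000    # illustrative compressed-video rate
shortfall = mpeg1_video_bps / voice_channel_bps

print(f"voice channel: {voice_channel_bps} bps; "
      f"video needs roughly {shortfall:.0f}x that")
```

So a home behind a digital loop carrier is short of even one compressed video stream by more than a factor of twenty, which is why telephone providers have sought ways around the constraint.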
From page 133...
... These newer forms of digital subscriber line technology could be used over the installed copper wires, where distance and other characteristics permit, for interactive data access (such as connection to the Internet) and would provide higher data rates than does ISDN.
From page 134...
... Thus, from the perspective of a telephone provider, the Internet is a service that would operate on top of one of the services listed above. While ISDN was originally defined as a switched service fully capable of carrying data across the entire telephone system, today it is important to the information infrastructure primarily as a fast method for accessing private data networks and the Internet.
From page 135...
... Asynchronous Transfer Mode The aim of ATM is to provide a flexible format for handling a mix of future traffic: on the one hand, streams of high-speed, real-time information like video, and, on the other, packetized, non-real-time information like electronic mail. ATM represents one attempt to handle both kinds of traffic efficiently based on a technical compromise- small packets or "cells" with a fixed length of 53 bytes (a 5-byte header plus 48 bytes of payload), and a connection-oriented approach built around "virtual channels" and "virtual paths" so that routes for packets are preestablished and packets within a given connection experience similar delays on their trips through the network.
From page 136...
... These issues are not trivial, and much work is currently being done to resolve the efficiency of Internet transport through an ATM system. But if these issues can be resolved, both ATM and the Internet protocols should benefit, since the telecommunications industry will acquire a new technology that it can deploy for high-speed data transfer, and the Internet will acquire a new and advanced infrastructure over which it can operate.
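One of the efficiency issues mentioned above is straightforward to quantify: the "cell tax." Every 53-byte ATM cell spends 5 bytes on its header, and an IP packet that does not fill its last cell wastes the remainder. The sketch below ignores the adaptation-layer trailer bytes for simplicity, so real overhead is slightly higher; the packet sizes chosen are common IP packet sizes, not figures from the report.

```python
import math

# ATM "cell tax": efficiency of carrying an IP packet of a given size
# in fixed 53-byte cells with 48 bytes of payload each.

CELL_BYTES = 53
PAYLOAD_BYTES = 48

def atm_efficiency(packet_bytes: int) -> float:
    """Fraction of transmitted bits that are actual packet data."""
    cells_needed = math.ceil(packet_bytes / PAYLOAD_BYTES)
    return packet_bytes / (cells_needed * CELL_BYTES)

print(f"header overhead alone: {1 - PAYLOAD_BYTES / CELL_BYTES:.1%}")
for size in (40, 576, 1500):    # common IP packet sizes
    print(f"{size}-byte packet: {atm_efficiency(size):.1%} efficient")
```

A 40-byte packet (a bare TCP acknowledgment) fits in one cell but uses only about three-quarters of it, which is one reason Internet-over-ATM efficiency attracted so much engineering attention.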
From page 137...
... ATM has always been based on switching at the hub, but both 10- and 100-Mbps Ethernet now also offer switching as a product option. Wireless Wireless communication offers a number of options for local networking as well as for advancing access to the information infrastructure.
From page 138...
... This focus reflects the relative maturity of those markets, which permits somewhat better business planning. It also reflects the fact that, given current technology options, installing wireless infrastructure that meets more general service requirements such as higher-speed interactive data access is rather more
From page 139...
... (Note, however, that some options for cellular and PCS systems involve relatively small cell sizes and small antenna structures that do not need towers.) Although the options for higher-speed wireless data transmission are
From page 140...
... This technical advance opens up a large range of spectrum for consumer devices. Other products are available that offer short-range LAN emulation and point-to-point data communications over a few miles at data rates of up to LAN speeds (a few megabits per second)
From page 141...
... The development of standards for digital television signals can permit four channels at the same resolution to be transmitted in the space where one was carried before. These new digital channels represent an asset that can be used to provide additional traditional television channels, or perhaps deflected into some new service.
From page 142...
... Terrestrial broadcast standards, which must take into account worse noise conditions, are currently using a more robust scheme, called 8-VSB, that fits 19 Mbps of data into the same channel. In contrast, satellite television broadcast, because of the extreme limits on power and resultant poor noise conditions, uses a very robust but bandwidth-consuming scheme, called QPSK, that uses four analog video channels to carry 27 Mbps of data.
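The trade-off the passage describes, robustness against noise bought at the price of spectral efficiency, shows up directly if the two schemes are reduced to bits per second per hertz. The calculation below uses the figures in the text (19 Mbps in one 6-MHz channel for 8-VSB; 27 Mbps spread over the spectrum of four such channels for QPSK); the 6-MHz channel width is the standard analog video channel assumed throughout.

```python
# Spectral efficiency implied by the two broadcast schemes in the text.

CHANNEL_HZ = 6_000_000   # one analog video channel

schemes = {
    "8-VSB (terrestrial)": (19_000_000, 1 * CHANNEL_HZ),
    "QPSK (satellite)":    (27_000_000, 4 * CHANNEL_HZ),
}

for name, (bps, hz) in schemes.items():
    print(f"{name}: {bps / hz:.2f} bits/s per Hz")
```

Terrestrial 8-VSB packs roughly three times as many bits into each hertz as satellite QPSK, which is exactly the robustness-for-bandwidth trade the harsher satellite link forces.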
From page 143...
... These wireless approaches may provide an interim way for telephone companies, broadcasters, and other players interested in video delivery to compete with current cable providers without building a complete hybrid fiber coaxial cable or fiber-to-the-curb infrastructure. The NII is intended to provide for interactivity between the consumer and the program provider.
From page 144...
... In the business communications market, very small aperture terminal (VSAT) systems are used by firms such as hotels, department store chains, and car dealerships to conduct data communications between a central office and remote sites, at variable data rates depending on the nature of the system; some have sufficient capacity to support video transmission.
From page 145...
... Reducing customers' energy usage would reduce overall power demand and enable utilities to delay investing in new power-generation facilities. As discussed in the white paper by John Cavallini et al., only a very small fraction of the capacity of a fiber network that reached into customer premises would be needed to support energy services.
From page 146...
... Indeed, a major part of the success of the Internet has been its ability to change as new technologies and service requirements emerge. At the present time, the explosive growth of the public Internet has forced a redesign of the central protocol, the Internet protocol (IP)
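The scale of the problem driving that redesign can be stated in one line of arithmetic. The redesign effort under way at the time produced what became IPv6: the original IP address is 32 bits, and the next-generation address is 128 bits.

```python
# Address-space arithmetic behind the redesign of IP: 32-bit addresses
# versus the 128-bit addresses of the next-generation protocol (IPv6).

ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128

print(f"IPv4: {ipv4_addresses:,} addresses")
print(f"IPv6: 2^128, about {ipv6_addresses:.2e} addresses")
```

Four billion addresses looked inexhaustible when IP was designed; the growth described in the text made clear it was not, and quadrupling the address width removes the ceiling for any foreseeable future.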
From page 147...
... The Internet, by its open architecture, has permitted a great number of individuals to conceive and try out innovative ideas, and some of these, like the World Wide Web, have taken root and become a basic part of the success of the Internet. The process by which Internet standards are set reflects this philosophy of open involvement.
From page 148...
... The format of the information is viewed as a higher-level problem. Of course, the Internet standards included a description of how those bytes were to be formatted for critical applications such as electronic mail.
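The electronic-mail formatting the passage refers to (RFC 822 and its successors) is simple enough to show directly: plain-text headers, a blank line, then the body. The addresses below are illustrative placeholders; the example uses the Python standard library's `email.message.EmailMessage` to build a conforming message.

```python
from email.message import EmailMessage

# Minimal example of the mail-message formatting convention defined by
# the Internet standards: named headers, a blank line, then the body.

msg = EmailMessage()
msg["From"] = "alice@example.org"      # illustrative addresses
msg["To"] = "bob@example.org"
msg["Subject"] = "Layered formats"
msg.set_content("The body is just bytes; headers give them meaning.\n")

text = msg.as_string()
print(text)
```

The division of labor matches the text: the network standards move the bytes, while a separate, higher-level standard says what the bytes mean, so either layer can evolve without the other.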
From page 150...
... Second, one objective of the design of the standards is to make it as easy as possible for networks within the Internet, both public networks of Internet service providers and private networks of corporations, institutions, and individuals, to connect together. Thus the Internet is open to providers as well as users.
From page 151...
... are coordinated by the American National Standards Institute. Standard interfaces allow new products related to information infrastructure to interoperate with each other and with existing products.
From page 152...
... In the computer industry, this trend has been driven by market competition, the rapid pace of computer technology change, and the "bandwagon" effect that leads consumers to adopt technologies that appear to be emerging as widespread standards rather than risk being left unable to interoperate with other users and systems.15 In practice, standards development exists within a continuum. Many computer industry standards are formalized in national and international standards organizations, such as the International Organization for Standardization, although these standards frequently lag the de facto processes of the market.
From page 153...
... It remains to be seen how IETF and informal Internet standards-setting processes will evolve and function in the future.20 There is strong private sector motivation for effective standards setting. Many participants at the forum and workshop said in effect that while the government should act to facilitate effective standards setting, it must not create roadblocks to such efforts by imposing government-dictated standards processes.21 Government use of private, voluntary standards in its own procurement, however, can be supportive.22 The process of setting standards is only one part of the delay in getting a new idea to market.
From page 154...
... Under this scheme, an interpreter for such a language would be installed on all the relevant computers. Once this step was taken, a new application could be written in this language and immediately transferred automatically to any prepared computer.
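The scheme described above, ship an interpreter everywhere once, then ship new applications as portable program text, is the idea behind the mobile-code languages of the period (Java applets were the best-known instance). The sketch below uses plain Python `exec` as a stand-in for such an interpreter; the application code and function names are invented for illustration.

```python
# Toy sketch of interpreter-based application delivery: a "new
# application" arrives as text and runs immediately on any machine
# that already carries the interpreter, with no per-machine porting.

new_application = """
def greet(name):
    return "hello, " + name
"""

namespace = {}
exec(new_application, namespace)   # the local interpreter runs shipped code
result = namespace["greet"]("nii")
print(result)
```

The economics follow from the sketch: the cost of deployment is paid once, for the interpreter, after which distributing an application is no harder than distributing a document. (Running code from the network this way also raises the safety questions that real mobile-code systems had to engineer around.)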
From page 155...
... This shift might help ameliorate the economic challenge of providing NII access for the less affluent. However, many are skeptical that this shift of processing power back from the end node and into the network, which runs counter to the recent history of the computer industry, will prove effective.
From page 156...
... At the same time that some are calling for more regimented approaches to Internet management and control, others argue that the Internet style of control is preferable to the model that more closely derives from the traditions of the telephone company. The current ATM standards have been criticized by some for this reason.
From page 157...
... It is thus the case that there is still a significant set of technical disagreements and uncertainties about the best approach to network management and operation, both for the maturing Internet and for the next generation of technologies for the mature services such as voice and video. An issue that is now receiving considerable attention is pricing and cost recovery in the Internet (see Chapter 3 for more discussion)
From page 158...
... 12. Internet standards are discussed and set by the Internet Engineering Task Force (IETF)
From page 159...
... and spearheaded by the American National Standards Institute, has attempted since mid-1994 to bring together a large number and variety of organizations and entities concerned with standards relating to the national and global information infrastructure. In late November 1995, the IISP issued a list of 35 "standards needs," ranging across such areas as reliability, quality of service, provision of protections (e.g., security)
From page 160...
... . It notes that "there is already existing government policy which covers the preference and advantages to government selection of voluntary standards (e.g., consensus standards)

