6 WHAT IS COMPUTER SCIENCE AND ENGINEERING?
Pages 163-216

From page 163...
... The contributions of solid-state physicists and materials scientists to the increase of computer power are undeniable; their efforts have made successive generations of electronic components ever smaller, faster, lighter, and cheaper. But the ability to organize these components into useful computer hardware (e.g., processors, storage devices, displays)
From page 164...
... The key intellectual themes in CS&E are algorithmic thinking, the representation of information, and computer programs. An algorithm is an unambiguous sequence of steps for processing information, and computer scientists and engineers tend to believe in an algorithmic approach to solving problems.
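The idea of an algorithm as an unambiguous sequence of steps can be made concrete with a small classic example, Euclid's algorithm for the greatest common divisor, sketched here in Python (the language choice is ours, not the chapter's):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: an unambiguous sequence of steps
    for computing the greatest common divisor of a and b."""
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, a mod b)
    return a

print(gcd(252, 105))  # 21
```

Every iteration is a fully specified step; nothing is left to the reader's interpretation, which is exactly what distinguishes an algorithm from an informal recipe.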
From page 165...
... into a "low-level" version that the computer can carry out; the execution of a program by a computer is what allows the algorithm to come alive, instructing the computer to perform the tasks the person has requested. Computer programs are thus the essential link between intellectual constructs such as algorithms and information representations and the computers that enable the information revolution.
From page 166...
... Algorithmic thinking, information representation, and computer programs are themes central to all subfields of CS&E research. Box 6.1 illustrates a typical taxonomy of these subfields.
From page 167...
... Those who design computer languages (item two in Box 6.1) with which people write programs also concern themselves with algorithms and information representation.
From page 168...
... . ABSTRACTIONS IN COMPUTER SYSTEMS While algorithmic thinking, computer programs, and information representation are the key intellectual themes in the study of CS&E, the design, construction, and operation of computer systems require talents from a wide range of fields, such as electrical engineering for hardware, logic and mathematical analysis for writing programs, and psychology for the design of better user interfaces, to name just a few.
From page 169...
... For example, the fastest computers today have around 100 billion transistors, while the personal computer on one's desk may have "only" a few million. People routinely use computer programs that are hundreds of thousands of lines long, and the largest software systems involve tens of millions of lines; printed at 50 lines per paper page, such a system might weigh several tons.
From page 170...
... The user-computer communication hardware for personal computers is the keyboard, mouse, and video screen, while examples of machine-machine communication hardware are telephone modems and networks such as Appletalk and Ethernet. While the characteristics of the underlying hardware are what ultimately determine the computational power of a computer system, direct manipulation of hardware would be cumbersome, difficult, and error-prone, a lesson learned in the earliest days of computing when programmers literally connected wires to program a computer.
From page 171...
... MS-DOS in IBM Personal Computers and the System in Macintoshes are examples of operating system software; Novell is the brand name of one type of widely used networking software; Dbase IV is a popular database management system; Microsoft Basic and Borland C are examples of compiler system software; and Microsoft Windows and the Macintosh Desktop are examples of user interface system software. While the services provided by system software are usually required by all users, they are not in themselves sufficient to provide computing that is useful in solving the problems of the end user, such as the secretary or the accountant or the pilot.
From page 173...
... A function implemented in software executes more slowly, but is much easier to change. Abstractions enable computer scientists and engineers to deal with large differences of scale.
From page 174...
... Querying a database of a billion records requires a strategy different from one for querying a database of a thousand records, since narrowing the search is much more difficult in the larger case; writing a program to run on a computer that performs 10 billion calculations per second most likely requires an approach different from one for a program written to run on a computer that performs one million calculations per second, since the former is likely to be a parallel computer and the latter a serial computer. Bridging the gap between today's practice and the ideal, that is, making the "ought-to-be's" come true, is the goal of much CS&E research today.
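Why search strategy must change with scale can be sketched in the simplest case: a linear scan examines every record, while a method that narrows the range (here binary search over a sorted table; the data set is invented for illustration) touches only a logarithmic number of them:

```python
import bisect

def linear_search(records, key):
    """Scan every record: work grows linearly with database size."""
    for i, r in enumerate(records):
        if r == key:
            return i
    return -1

def binary_search(sorted_records, key):
    """Narrow the search by halving the range: work grows only
    logarithmically with database size."""
    i = bisect.bisect_left(sorted_records, key)
    if i < len(sorted_records) and sorted_records[i] == key:
        return i
    return -1

data = list(range(0, 1_000_000, 2))  # a stand-in for a large sorted table
print(linear_search(data, 999_998))  # examines ~500,000 records
print(binary_search(data, 999_998))  # examines ~20 records
```

At a thousand records the difference hardly matters; at a billion, the scanning strategy is simply not viable.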
From page 175...
... Even within the design of a microelectronic chip, the task is organized around such specialties as circuit design, logic design, system organization, testing, and verification, and each of these specialties must deal with physical design and layout together with logical behavior. Design techniques similar to those employed for chips are applied at higher levels to assemble aggregates of chips on circuit boards, and to provide interfaces to disks, displays, communication networks, and other electronic or electromechanical systems.
From page 176...
... implemented electronic gates, adders, selectors, decoders, register memories, and other modules that had previously required circuit boards of discrete transistors and other electronic components. Photomasks for SSI and MSI chips were simple enough that they could be cut by hand from large sheets of plastic film before being photographically reduced.
From page 177...
... Many advances in chip design resulted from the application of lessons learned from managing the complexity of large programs. Structured design disciplines based on a layout-cell hierarchy encouraged designers to compose layouts flexibly and systematically, to focus on improvements at the high levels of algorithms and overall organization, and to abstain from certain low-level optimizations (analogous to "goto" statements in programming)
From page 178...
... The net result of these advances in design disciplines and tools is that today's state-of-the-art chips, although they approach being 100 times as complex as those of a decade ago, require a comparable design effort. Whereas a decade ago VLSI-design courses were offered in only a few universities, today they are offered in approximately 200 colleges and universities and have supplied the increasing demand for chip designers.
From page 179...
... These algorithms have been applied both to the internal design of VLSI chips and to the design of special-purpose systems such as digital signal processors, and to the programming of parallel and concurrent computers. The analyses of communication topologies have, in addition, been important to the design of the communication networks used in programmable parallel and concurrent computers.
From page 180...
... These innovations were first applied to the supercomputers of the 1960s, and they are applicable to single-chip design issues as well. Fast memories require more area per bit than do slow memories; it is accordingly not economical to implement the primary memory of a computing system entirely from fast memories.
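The economics of mixing a small fast memory with a large slow one can be sketched with the standard effective-access-time calculation (the timings and hit rate below are illustrative, not historical figures from the text):

```python
def effective_access_time(hit_rate, fast_ns, slow_ns):
    """Average access time when a small fast memory satisfies
    hit_rate of all accesses and the rest fall through to a
    larger, slower memory."""
    return hit_rate * fast_ns + (1 - hit_rate) * slow_ns

# Assumed numbers: a fast memory serving 95% of accesses at 10 ns,
# backed by a 100 ns primary memory.
print(effective_access_time(0.95, 10, 100))  # 14.5 ns on average
```

A small amount of expensive fast memory thus delivers most of its speed benefit, which is why building primary memory entirely from fast parts is not economical.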
From page 181...
... "Virtual memory" refers to the ability of a process to refer to a larger range of memory addresses than may be present in primary memory at any one time. In addition to their demand for execution cycles, processes require memory resources.
From page 182...
... For instance, if process X reserves airline seats and process Y records cancellations, X must wait for Y whenever all seats for a flight are booked. Computer scientists and engineers invented synchronization mechanisms to deal with these problems long before they became common in practice.
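The reservation example can be sketched with a monitor-style synchronization mechanism, here Python's threading.Condition (the Flight class and its operations are our illustration, not the text's):

```python
import threading

class Flight:
    """Sketch of the synchronization described in the text: a
    reservation (process X) must wait for a cancellation
    (process Y) whenever all seats are booked."""
    def __init__(self, seats):
        self.free = seats
        self.cond = threading.Condition()

    def reserve(self):               # process X
        with self.cond:
            while self.free == 0:    # all seats booked:
                self.cond.wait()     # X waits for Y
            self.free -= 1

    def cancel(self):                # process Y
        with self.cond:
            self.free += 1
            self.cond.notify()       # wake one waiting reservation
```

The wait/notify pattern here descends directly from the synchronization primitives (semaphores, monitors) that the text credits computer scientists with inventing well before multiprogramming was commonplace.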
From page 183...
... Data Communications and Networking Computer technology is becoming increasingly decentralized; the personal computers on office desks are an example of this trend. Similarly, centralized corporate control is passing down the hierarchy, and in many cases the hierarchy itself is flattening.
From page 184...
... While nationwide packet networks such as the ARPANET met the needs of the 1970s, a new requirement arose from the proliferation of personal computers in the 1980s. Personal computers presented a rapidly rising demand for connectivity among themselves, peripherals, and servers in a local office environment.
From page 185...
... Computer science researchers in the 1960s created the relational data model to represent data in a simple way. Computer engineers worked through the 1970s on techniques to implement this model.
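The relational model's central idea, data as tables combined by generic operations such as join, can be sketched in a few lines (the tables and column names below are invented for illustration):

```python
# Two relations (tables), each a list of rows keyed by column name.
employees = [
    {"id": 1, "name": "Ada",   "dept": 10},
    {"id": 2, "name": "Grace", "dept": 20},
]
departments = [
    {"dept": 10, "dept_name": "Research"},
    {"dept": 20, "dept_name": "Engineering"},
]

def join(left, right, key):
    """A natural join: pair up rows that agree on the shared key column."""
    index = {row[key]: row for row in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

for row in join(employees, departments, "dept"):
    print(row["name"], row["dept_name"])
```

The point of the model is that queries like this join are expressed over the data's logical structure, with no reference to how the rows are stored, which is precisely what freed users from low-level database programming.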
From page 186...
... Without the early seed work by computer engineers, the implementation feasibility would not have been demonstrated. And without these two advances, we would likely still be programming databases in low-level programming languages.
From page 187...
... In addition, consistent with the application-oriented trend of computer science in general, there is considerable interest in databases designed specifically for certain problems: geographic databases, text databases, image databases, and scientific databases. Programming Languages, Compilers, and Software Engineering Advances in science often lead to the invention of notations to express new concepts or to simplify and unify old ones.
From page 188...
... new kind of notation for expressing algorithms in terms more appropriate to human discourse, while nevertheless meeting the exacting mechanical needs of computers. The development of programming languages introduced new challenges.
From page 189...
... Third, a conflict between transparency of notation and efficiency of execution was introduced; programs have to be reasonably efficient as well as perspicuous. Since the dawn of the computer age, many developments in programming language design have emerged; these developments are reflected in a succession of programming languages that make it easier to express algorithms precisely in various problem domains.
From page 190...
... can and have been built. Without many of the software engineering tools that have been developed in the last 30 years, the construction of large software systems would not be possible.
From page 191...
... so that there is a trail of implementation responsibility comparable to the change history found on engineering drawings. "Discovery tools" are an important adjunct to source code control.
From page 192...
... Major packages of routines for statistics and other fields of mathematics and engineering are also available. Programming tools, as exemplified by Unix, foster the construction of systems out of many small and independent generic components coupled by a "shell" language.
From page 193...
... However, by taking advantage of certain features of a relatively new programming language, Ada, it is possible, under certain circumstances, to implement conveniently an algorithm ensuring that all tasks will be completed on time without knowing the precise details of the execution sequence. This algorithm, the rate-monotonic scheduling algorithm, also enables the convenient accommodation of changes in task specification without re-doing the entire time line, and is starting to come into use in some real-time aerospace applications.
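The guarantee behind rate-monotonic scheduling can be sketched with the Liu-Layland utilization test: a set of periodic tasks, each given priority by rate, is certain to meet all deadlines if total processor utilization stays under a bound that depends only on the number of tasks. (The task sets below are hypothetical; real systems require a fuller analysis.)

```python
def rms_schedulable(tasks):
    """Liu-Layland test for rate-monotonic scheduling.
    tasks: list of (computation_time, period) pairs.
    Schedulability is guaranteed when total utilization is
    at most n * (2**(1/n) - 1)."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return utilization <= bound

# Three assumed periodic tasks: (computation ms, period ms).
print(rms_schedulable([(1, 8), (2, 10), (2, 16)]))  # utilization 0.45: fits
print(rms_schedulable([(5, 8), (4, 10)]))           # utilization 1.025: does not
```

Because the test depends only on aggregate utilization, a changed task specification can be re-checked in one line rather than by rebuilding an entire hand-crafted time line, which is the convenience the text describes.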
From page 194...
... Chapter 1 noted that the solution of certain partial differential equations has been speeded up by many orders of magnitude since 1945. A large part of this speedup is due to the development of faster algorithms, as outlined in Table 6.1.
From page 195...
... In many problems faced by industry and commerce, other types of algorithms are often relevant, including linear programming (LP) algorithms and algorithms to solve the traveling salesman problem.
From page 196...
... In manufacturing electronic components, the drilling of holes in circuit boards
From page 197...
... The Turing machine is also a vehicle through which it can be shown that all computational problems have an intrinsic difficulty that theoretical computer scientists call complexity: an inescapable minimum degree of growth of solution time with problem size. The solution time for a problem with high complexity grows very rapidly with problem size, whereas the solution time for a problem with low complexity grows slowly with problem size.
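The contrast between low and high complexity becomes vivid when step counts are tabulated side by side (the two growth rates below are illustrative, not tied to any particular problem):

```python
# A polynomial-time problem (n**2 steps) versus an
# exponential-time problem (2**n steps), as problem size grows.
for n in (10, 20, 30, 40):
    low = n ** 2    # low complexity: grows slowly
    high = 2 ** n   # high complexity: grows explosively
    print(f"n={n:>2}  low={low:>6}  high={high:>15}")
```

By n = 40 the exponential count exceeds a trillion steps while the polynomial count is still 1,600, which is why intrinsic complexity, not machine speed, ultimately limits what can be computed.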
From page 198...
... Computational Complexity The study of the complexity of computations, referred to as computational complexity theory, has revealed remarkably rich connections between problems. Good examples of complexity theory arise in the context of linear programming and traveling salesman scheduling problems.
From page 199...
... Good solutions have been obtained for practical problems with 10,000 to 1 million cities. Besides its practical relevance, computational complexity provides beautiful examples of how the computing paradigm has permitted computer scientists to give precise formulations to relevant problems that previously could be discussed only in vague, nonscientific terms.
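"Good but not provably optimal" solutions to problems like the traveling salesman problem typically come from heuristics; a minimal example is the nearest-neighbor rule, sketched here with invented city coordinates:

```python
import math

def nearest_neighbor_tour(cities):
    """Heuristic for the traveling salesman problem: starting from
    city 0, repeatedly visit the closest unvisited city. Fast and
    usually good, but not guaranteed optimal."""
    unvisited = set(range(1, len(cities)))
    tour = [0]
    while unvisited:
        last = cities[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, cities[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

cities = [(0, 0), (0, 1), (5, 0), (1, 1)]
print(nearest_neighbor_tour(cities))
```

Practical tours for thousands or millions of cities come from more sophisticated heuristics in the same spirit: trade the guarantee of optimality for running time that grows manageably with problem size.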
From page 200...
... Design, construction, testing, and measurement of computer programs are the major elements of the process by which AI researchers validate models and mechanisms of intelligent action. The original research paradigm in AI involved the following steps: · Identify a significant problem that everyone would agree requires "intelligence." · Identify the information needed to produce a solution (conceptualization)
From page 201...
... The impact of AI in manufacturing manifests itself in at least two ways: in the automation of assembly, quality control, and scheduling operations, and in the use of AI techniques in engineering design.

TABLE 6.2 Examples of Expert Systems in Use

Company          Expert System              Purpose
Schlumberger     Dipmeter Advisor           Interpret data from oil wells for oil prospecting
Kodak            Injection Molding Advisor  Diagnose faults and suggest repairs for plastic molding mechanisms
Hewlett-Packard  Trouble Shooting Advisor   Diagnose faults in semiconductor manufacturing
Xerox            PRIDE                      Design paper-handling systems
Hazeltine        OPGEN                      Plan assembly instructions for PC boards

SOURCE: Data from Raj Reddy, "Foundations and Grand Challenges of Artificial Intelligence," AI Magazine, Volume 9(4)
From page 202...
... Many concepts in programming languages and environments arose directly from attempts by AI researchers to build AI software. AI languages like Lisp, Prolog, and Scheme are used in many parts of CS&E.
From page 203...
... For example, a checker-playing program would generate possible moves up to a certain play and would select the move using an evaluation function like "choose a move that results in the largest piece gain." The evaluation functions are heuristic: they are rules of thumb that work most but not all of the time. This work in heuristic search borrows from operations research, statistics, and theoretical computer science.
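The evaluation-function idea can be sketched as a one-ply heuristic search (the move representation and the "piece gain" scoring rule are hypothetical, standing in for a real checkers program's machinery):

```python
def choose_move(moves, evaluate):
    """One-ply heuristic search: pick the move whose resulting
    position scores best under a rule-of-thumb evaluation function."""
    return max(moves, key=evaluate)

# Hypothetical candidate moves, each tagged with the piece gain
# it would yield.
moves = [
    {"name": "advance",     "gain": 0},
    {"name": "jump",        "gain": 1},
    {"name": "double jump", "gain": 2},
]

best = choose_move(moves, evaluate=lambda m: m["gain"])
print(best["name"])
```

The evaluation function is a heuristic in exactly the text's sense: "prefer the largest piece gain" works most of the time but can be fooled, for instance by a sacrifice that wins material later.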
From page 204...
... Theoretical AI should lead to the development of a new breed of approximate but nearly optimal algorithms that obtain inputs as an ongoing process during a computation and that cannot precommit to a sequence of steps before embarking on a course of action. And finally, AI will continue its quest for theories about how the human mind processes information made available to it directly via the senses and indirectly via education and training. Computer Graphics and User Interfaces Graphics Computer graphics has its roots in data plotting for science and engineering.
From page 205...
... WIMP graphical user interfaces have three major advantages over the use of command languages. They simplify computer operation for novices, because they provide a constant reminder of what the allowable actions are at any given moment.
From page 206...
... In the early 1970s, the Xerox Palo Alto Research Center (PARC) pioneered bitmap raster graphics workstations with graphics in the user interface, both for office automation tools such as word processors and illustration programs and for software development (e.g., Smalltalk)
From page 207...
... As users have required more and more interactive capability with their computers, the popularity of graphical user interfaces and graphical displays has grown by leaps and bounds. Now, millions of users have graphical interfaces on their desks in the office or at home, and it is often possible for users to operate new applications with little or no reference to a manual.
From page 208...
... But human beings also make use of sound, touch, and gestures to communicate information. Computer scientists and engineers have thus developed a variety of
From page 209...
... Intellectual Challenges Computer graphics and interactive computing pose many intellectual challenges for CS&E. In the earliest days of computing, most of the computational power in a computer system could be devoted
From page 210...
... An orthographic diagram provides front, top, and side views of an object. In some sense, the orthographic diagram is "equivalent" to the metal bracket in hand, since the engineering diagram can be used to manufacture the bracket.
From page 211...
... These representations are limited only by the imagination of their creator, since any representation that can be imagined can be implemented. Thus computer graphics can be said to have spawned a new set of concepts for depicting information that were not practical prior to the computer.
From page 212...
... (The 80X86 processor is the heart of IBM-compatible personal computers.) In designing the 80286 processor chip, Intel engineers had to use seven different brands of computers depending on the step of the design process.
From page 213...
... Hence the designers of the Intel 80486 and 80586 (as well as their customers) have benefited and will continue to benefit from advances in hardware technology, operating systems, compilers, user interfaces, design software, and theory of algorithms.
From page 214...
... A grammar checker builds on a word-processing program, and thus the word processor plays a "system software" role for the grammar checker. But the word processor builds upon the operating system, so that the word processor is applications software from the perspective of the operating system.
From page 215...
... An area of active investigation today within theoretical computer science concerns the use of different models that differ in the fidelity with which they match actual computers. For example, some theoretical computer scientists consider so-called random access machine models, on the grounds that these models can access any memory location in a constant number of steps, while a Turing machine can access a given memory location only by stepping through all preceding locations.
From page 216...
... , the matrix multiplication requires on the order of N^2.81 arithmetic operations. While such an algorithm involves some additional overhead, it turns out that for sufficiently large values of N
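The exponent 2.81 is log2 7: Strassen's method forms the product of two N x N matrices from seven (rather than eight) recursive products of half-size blocks. Assuming the text's algorithm is Strassen's, the multiplication count can be sketched for N a power of two:

```python
import math

def strassen_mults(n):
    """Scalar multiplications used by Strassen-style recursion on an
    n x n matrix (n a power of 2): 7 products of half-size blocks."""
    if n == 1:
        return 1
    return 7 * strassen_mults(n // 2)

for n in (64, 256, 1024):
    classical = n ** 3                  # ordinary row-by-column method
    strassen = strassen_mults(n)        # equals n ** log2(7), about n ** 2.81
    print(n, classical, strassen)
```

The recursion yields exactly n^(log2 7) multiplications, so the gap over the classical N^3 count widens as N grows, which is why the method pays off only for sufficiently large matrices.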

