  Computer Science  
  A computer is a programmable electronic device that processes data and performs calculations and other symbol manipulation tasks. There are three types: the digital computer, which manipulates information coded as binary numbers (the digits 0 and 1, representing two different signals, on and off, with an individual digit being known as a binary digit or bit); the analog computer, which works with continuously varying quantities; and the hybrid computer, which has characteristics of both analog and digital computers.  
  There are four types of digital computer, corresponding roughly to their size and intended use. Microcomputers are the smallest and most common, used in small businesses, at home, and in schools. They are usually single-user machines. Minicomputers are found in medium-sized businesses and university departments. They may support from around ten to two hundred users at once. Mainframes, which can often service several hundred users simultaneously, are found in large organizations, such as national companies and government departments. Supercomputers are mostly used for highly complex scientific tasks, such as analyzing the results of nuclear physics experiments and weather forecasting.  
  The first mechanical computer was conceived by Charles Babbage in 1835. He designed a general-purpose mechanical computing device for performing different calculations according to a program input on punched cards. In the 1880s Herman Hollerith devised the first device for high-volume data processing, a mechanical tabulating machine. In 1943 Thomas Flowers built Colossus, the first electronic computer. John von Neumann's computer, EDVAC, built in 1949, was the first to use binary arithmetic and to store its operating instructions internally. His design still forms the basis of today's computers.  
  Parts of a Computer  
  central processing unit (CPU)  
  At the heart of a computer is the central processing unit, which executes individual program instructions and controls the operation of other parts. It is sometimes called the central processor or, when contained on a single integrated circuit, a microprocessor.  
  The CPU has three main components: the arithmetic and logic unit (ALU), where all calculations and logical operations are carried out; a control unit, which decodes, synchronizes, and executes program instructions; and the immediate access memory, which stores the data and programs on which the computer is currently working. All these components contain registers, which are memory locations reserved for specific purposes.  
  The central processing unit communicates with the component parts of the computer and/or its peripherals via a number of electrical pathways, known as buses. Physically, a bus is a set of parallel tracks that can carry digital signals; it may take the form of copper tracks laid down on the computer's printed circuit boards (PCBs), or of an external cable or connection.  




  central processing unit The relationship between the three main areas of a computer's central processing unit. The arithmetic and logic unit (ALU) does the arithmetic, using the registers to store intermediate results, supervised by the control unit. Input and output circuits connect the ALU to external memory, input, and output devices.
  Disks are the most common medium for storing large volumes of computer data. A magnetic disk is rotated at high speed in a disk-drive unit in the computer as a read/write (playback or record) head passes over its surfaces to read or record the magnetic variations that encode the data on the disk's surface. The more recent optical disks, which have a much larger capacity, are read by a laser-scanning device.  
  hard disk The hard disk is a rigid metal disk coated with a magnetic material. A hard disk may be permanently fixed into the disk drive (a fixed hard disk) or in the form of a disk pack that can be removed and exchanged with a different pack (a removable hard disk). A fixed disk cannot be removed: once it is full, data must be deleted to free space, or a complete new disk drive must be added to the computer system to increase storage capacity. Removable disks can be taken out of the drive unit and kept for later use. By swapping such disks around, a single hard disk drive can be made to provide a potentially unlimited storage capacity. However, access speeds and capacities tend to be lower than those associated with large fixed hard disks.  
  microcomputer The component parts of the microcomputer: the system unit contains the hub of the system, including the central processing unit (CPU), information on all of the computer's peripheral devices, and often a fixed disk drive. The monitor (or video display terminal) displays text and graphics, the keyboard and mouse are used to input data, and the floppy disk and CD-ROM drives read data stored on disks.
  disk A hard disk. Data is stored in sectors within cylinders and is read by a head that passes over the spinning surface of each disk.
  disk drive A floppy disk drive. As the disk is inserted into the drive, its surface is exposed to the read/write head, which moves over the spinning disk surface to locate a specific track.
  Hard disks vary from large units with capacities of more than 3,000 megabytes, intended for use with mainframe computers, to small units with capacities as low as 20 megabytes, intended for use with microcomputers.  
  floppy disk The floppy disk is a light, flexible disk enclosed in a plastic jacket. Floppy disks are the most common form of backing store for microcomputers. They are much smaller in size and capacity than hard disks, either 13.34 cm/5.25 in or 8.9 cm/3.5 in in diameter, and they normally hold 0.5–2 megabytes of data. Floppy disks are inexpensive, and light enough to send through the mail, but have slower access speeds and are more fragile than hard disks.  
  optical disk Plastic-coated metal disks, such as a CD-ROM (compact-disk read-only memory) and WORM (write once, read many times), are optical disks. Data is recorded as binary digital information etched on the disk surface in the form of microscopic pits; this is read by a laser-scanning device. Optical disks have an enormous capacity—about 550 megabytes on a compact disk, and thousands of megabytes on a full-size optical disk. CD-ROMs are used in distributing large amounts of text, graphics, audio, and video, such as encyclopedias, catalogs, technical manuals, and games.  
  Standard CD-ROMs cannot have information written onto them by computer, but must be manufactured from a master, although recordable CDs, called CD-R disks, have been developed for use as computer disks. A compact disk, CD-RW, that can be overwritten repeatedly by a computer has also been developed.  
  The memory of a computer is the part of the system used to store data and programs either permanently or temporarily. There are two main types: immediate access memory and backing storage (see central processing unit illustration). Memory capacity is measured in bytes (sufficient memory to store a single character of data) or, more conveniently, in kilobytes (units of 1,024 bytes) or megabytes (units of 1,024 kilobytes).  
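As a rough sketch, the relationships between these units can be expressed in a few lines of Python (illustrative only; the 2-megabyte capacity is an assumed example):

```python
# Memory units, defined as in the text: powers of 1,024.
BYTE = 1                     # enough memory for a single character
KILOBYTE = 1024 * BYTE
MEGABYTE = 1024 * KILOBYTE

# Example: a 2-megabyte capacity expressed in smaller units.
capacity = 2 * MEGABYTE
print(capacity)              # 2097152 bytes
print(capacity // KILOBYTE)  # 2048 kilobytes
```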
  Immediate access memory, or internal memory, describes the memory locations that can be addressed directly and individually by the central processing unit. It is either read-only (stored in ROM, PROM, and EPROM chips) or read/write (stored in RAM chips). Read-only memory stores information that must be constantly available and is unlikely to be changed. It is nonvolatile; that is, it is not lost when the computer is switched off. Read/write memory is volatile: it stores programs and data only while the computer is switched on.  
  CD-ROM drive Data is obtained by the CD-ROM drive by converting the reflections from a disk's surface into digital form.
  Backing storage, or external memory, is nonvolatile memory, located outside the central processing unit, used to store programs and data that are not in current use. Backing storage is provided by such devices as magnetic disks (floppy and hard disks), magnetic tape (tape streamers and cassettes), optical disks (such as CD-ROM), and bubble memory. By rapidly switching blocks of information between the backing storage and the immediate-access memory, the limited size of the immediate-access memory may be increased artificially. When this technique is used to give the appearance of a larger internal memory than physically exists, the additional capacity is referred to as virtual memory.  
  motherboard The position of a motherboard within a computer's system unit. The motherboard contains the central processing unit, random-access memory (RAM) chips, read-only memory (ROM), and a number of expansion slots.
  The motherboard is a large printed circuit board (PCB, an internal electrical circuit laid out on insulating board) that contains the main components of a microcomputer. The power, memory capacity, and capability of the microcomputer may be enhanced by adding expansion boards to the motherboard. A computer may contain several printed circuit boards for various purposes; expansion boards and adaptors are examples of PCBs. Each PCB, in turn, may accommodate one or more integrated circuits (I.C.s), or silicon chips. These are miniaturized electronic circuits produced on a single crystal, or chip, of a semiconducting material, usually silicon.  
  integrated circuit An integrated circuit (I.C.), or silicon chip. The I.C. is a piece of silicon, about the size of a child's fingernail, on which the components of an electrical circuit are etched. The I.C. is packed in a plastic container with metal legs that connect it to the printed circuit board.
  printed circuit board A typical microcomputer printed circuit board (PCB). The PCB contains sockets for the integrated circuits, or chips, and the connecting tracks.
  operating system (O.S.)  
  The operating system is a program that controls the basic operation of a computer. A typical O.S. controls the peripheral devices such as printers, organizes the filing system, provides a means of communicating with the operator, and runs other programs.  
  Many operating systems were written to run on specific computers, but some are available from third-party software houses and will run on machines from a variety of manufacturers. Examples include Microware's OS/9, Microsoft's MS-DOS, and UNIX. UNIX is the standard on workstations, minicomputers, and supercomputers; it is also used on desktop PCs and mainframes.  
  A DOS (disk operating system) is a computer operating system specifically designed for use with disk storage; also used as an alternative name for Microsoft's operating system, MS-DOS.  
  Peripheral Devices  
  A peripheral device is any item connected to a computer's central processing unit (CPU). Typical peripherals include the keyboard, mouse, video display terminal (VDT), and printer. Users who enjoy playing games might add a joystick or a trackball; others might connect a modem, scanner, or Integrated Services Digital Network (ISDN) terminal to their machines.  
Major Operating Systems
Operating system | Developer and date | Interface | Use | Features
DOS | Microsoft and IBM, 1981 | command line | IBM-compatible PCs | most widely used system
OS/2 | IBM and Microsoft, late 1980s | GUI | IBM-compatible computers | multitasking
Mac Operating System | Apple, 1984 | GUI (pioneered) | Macintosh | allows plug-and-play peripherals
UNIX | AT&T, late 1960s | complex commands | from mainframes to PCs | multiuser, multitasking
Windows NT | Microsoft | GUI | predominantly PCs | networking and multitasking
Windows 95/98 | Microsoft, 1995/98 | GUI | dominates the PC market | multitasking and multimedia





  peripheral device Some of the types of peripheral devices that may be connected to a computer include printers, scanners, and modems.
  input device  
  An input device is used for entering information into a computer. Input devices include keyboards, joysticks, mice, light pens, touch-sensitive screens, scanners, graphics tablets, speech-recognition devices, and vision systems.  
  keyboard The keyboard is the most frequently used input device, resembling a typewriter keyboard. Keyboards are used to enter instructions and data via keys. There are many variations on the layout and labeling of keys. Extra numeric keys may be added, as may special-purpose function keys, whose effects can be defined by programs in the computer.  
  keyboard A standard 102-key keyboard. As well as providing a QWERTY typing keyboard, the function keys (labeled F1–F12) may be assigned tasks specific to a particular system.
  graphics tablet The graphics tablet is an input device, also known as a bit pad, in which a stylus or cursor is moved by hand over a flat surface. The computer can keep track of the position of the stylus, enabling the operator to input drawings or diagrams into the computer. A graphics tablet is often used with a form overlaid for users to mark boxes in positions that relate to specific registers in the computer, although recent developments in handwriting recognition may increase its future versatility.  
  graphics tablet A graphics tablet enables images drawn freehand to be translated directly to the computer screen.
  joystick The joystick is a hand-held lever that signals to a computer the direction and extent of displacement required by the user. It is similar to the joystick used to control the flight of an aircraft. Joysticks are sometimes used to control the movement of a cursor (marker) across a display screen, but are much more frequently used to provide fast and direct input for moving the characters and symbols that feature in computer games. Unlike a mouse, which can move a pointer in any direction, simple games joysticks are often capable only of moving an object in one of eight different directions.  
  joystick The directional and other controls on a conventional joystick may be translated to a joy pad, which enables all controls to be activated by buttons.
  light pen Light pens, resembling ordinary pens, are used to indicate locations on a computer screen. With certain computer-aided design (CAD) programs, a light pen can be used to instruct the computer to change the shape, size, position, and colors of sections of a screen image.  
  mouse Input device used to control a pointer on a computer screen. It is a feature of graphical user interface (GUI) systems. The mouse is about the size of a pack of playing cards. It is connected to the computer by a wire, and incorporates one or more buttons that can be pressed. Moving the mouse across a flat surface causes a corresponding movement of the pointer. In this way, the operator can manipulate objects on the screen and make menu selections.  
  Mice work either mechanically (with electrical contacts to sense the movement in two planes of a ball on a level surface), or optically (photocells detecting movement by recording light reflected from a grid on which the mouse is moved).  




  magnetic-ink character recognition An example of one of the uses of magnetic ink in automatic character recognition. Because of the difficulty of forging magnetic-ink characters, and the speed with which they can be read by computer systems, MICR is used extensively in banking.
  scanner The scanner is a peripheral device that produces a digital image of a document for input and storage in a computer, using technology similar to that of a photocopier. Small scanners can be passed over the document surface by hand; larger versions have a flat bed, like that of a photocopier, on which the input document is placed and scanned. Scanners are widely used to input graphics for use in desktop publishing.  
  Various forms of scanners are used to read and capture large volumes of data very rapidly. They include document readers for magnetic-ink character recognition (MICR), optical character recognition (OCR), and optical mark recognition (OMR).  
  modem A modem (modulator/demodulator) is a device for transmitting computer data over telephone lines. Such a device is necessary because the digital signals produced by computers cannot, at present, be transmitted directly over the telephone network, which uses analog signals. The modem converts the digital signals to analog, and back again.  
  modem Modems are available in various forms: microcomputers may use an external device connected through a communications port, or an internal device, which takes the form of an expansion board inside the computer. Notebook computers use an external modem connected via a special interface card.
  Modems are used for linking remote terminals to central computers and enable computers to communicate with each other anywhere in the world. In 1997 the fastest standard modems transmitted data at a nominal rate of about 33,600 bps (bits per second), often abbreviated to 33.6 K. 56 K modems launched in 1997 achieve higher speeds by using a digital connection to the user's computer, while using a conventional analog connection in the other direction. In theory the downstream link can transfer data at 56 Kbps but in practice, speeds are usually 45–50 K or less, depending on the quality of the phone line and other factors.  
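These nominal speeds translate into rough transfer times. The sketch below is illustrative Python; the one-megabyte file size is an assumed example, and real links add protocol overhead:

```python
# Rough transfer time over an ideal modem link (no protocol overhead).
def transfer_seconds(file_bytes, bits_per_second):
    # Each byte is 8 bits.
    return file_bytes * 8 / bits_per_second

one_megabyte = 1024 * 1024
print(round(transfer_seconds(one_megabyte, 33_600)))  # 250 seconds at 33.6 K
print(round(transfer_seconds(one_megabyte, 56_000)))  # 150 seconds at a nominal 56 K
```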
  output device  
  An output device is used for displaying, in a form intelligible to the user, the results of processing carried out by a computer. The most common output devices are the VDT (video display terminal, or screen) and the printer. Other output devices include graph plotters, speech synthesizers, and COM (computer output on microfilm/microfiche).  
  port The socket that enables a computer processor to communicate with an external device is called the port. It may be an input port (such as a joystick port), or an output port (such as a printer port), or both (an i/o port).  
  port The two types of communications port in a microcomputer. The parallel port enables up to eight bits of data to travel through it at any one time; the serial port enables only one.
  Microcomputers may provide ports for cartridges, televisions and/or monitors, printers, and modems, and sometimes for hard disks and musical instruments (MIDI, the musical instrument digital interface). Ports may be serial or parallel.  
  printer The printer is an output device for producing printed copies of text or graphics. Types include the daisywheel printer, which produces good-quality text but no graphics; the dot matrix printer, which produces text and graphics by printing a pattern of small dots; the ink jet printer, which creates text and graphics by spraying a fine jet of quick-drying ink onto the paper; and the laser printer, which uses electrostatic technology very similar to that used by a photocopier to produce high-quality text and graphics.  




  dot matrix printer Characters and graphics printed by a dot matrix printer are produced by a block of pins which strike the ribbon and make up a pattern using many small dots.
  ink-jet printer High-quality print images are produced by ink-jet printers by squirting ink through a series of nozzles.
  laser printer A laser printer works by transferring tiny ink particles contained in a toner cartridge to paper via a rubber belt. The image is produced by laser on a light-sensitive drum within the printer.
  Printers may be classified as impact printers (such as daisywheel and dot matrix printers), which form characters by striking an inked ribbon against the paper, and non-impact printers (such as ink jet and laser printers), which use a variety of techniques to produce characters without physical impact on the paper.  
  A further classification is based on the basic unit of printing, and categorizes printers as character printers, line printers, or page printers, according to whether they print one character, one line, or a complete page at a time.  
  video display terminal (VDT) A VDT is a computer terminal consisting of a keyboard for inputting data and a screen, or monitor, for displaying output. The oldest and most popular type of VDT screen is the cathode ray tube (CRT), which uses essentially the same technology as a television screen. Other types use plasma display technology and liquid crystal displays.  
  pixel Computer screen images are made up of a number of pixels ("dots"). The greater the number of pixels, the greater the resolution of the image; most computer screens are set at 640 × 480 pixels, although higher resolutions are available.
  Screen resolution (the quality of the display) is dependent on the number of pixels (picture elements, or single dots on a computer screen) used; the smaller the pixel, the higher the resolution. Within the limits of the resolution supported by the screen itself, the display quality can be altered by the user to suit particular applications. In the same way, the number of screen colors supported by a VDT can be selected by the user.  
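The relationship between resolution, color depth, and display memory can be sketched as follows (illustrative Python; the color depths chosen are assumptions for the example, not figures from the text):

```python
# Display memory required for a given resolution and color depth.
def framebuffer_kilobytes(width, height, bits_per_pixel):
    # Total pixels, times bits per pixel, converted from bits to kilobytes.
    return width * height * bits_per_pixel / 8 / 1024

print(640 * 480)                            # 307200 pixels on a standard screen
print(framebuffer_kilobytes(640, 480, 8))   # 300.0 KB at 256 colors
print(framebuffer_kilobytes(640, 480, 24))  # 900.0 KB at 16.7 million colors
```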
  Types of Computer  
  microcomputer or personal computer (PC)  
  A microcomputer is a small desktop or portable computer, typically designed to be used by one person at a time, although individual computers can be linked in a network so that users can share data and programs. Its central processing unit is a microprocessor, contained on a single integrated circuit, or chip. The first microprocessor appeared in 1971, designed for a pocket calculator manufacturer; it led to a dramatic fall in the size and cost of computers and heralded the introduction of the microcomputer.  
  Microcomputers are the smallest of the four classes of computer. Since the appearance in 1975 of the first commercially available microcomputer, the Altair 8800, micros have become ubiquitous in commerce, industry, and education.  
  laptops, notebooks, or portables Laptops are portable microcomputers that are small enough to be used on the operator's lap. The laptop consists of a single unit, incorporating a keyboard, floppy disk and hard disk drives, and a screen. The screen often forms a lid that folds back in use. It uses a liquid crystal or gas plasma display, rather than the bulkier and heavier cathode ray tubes found in most display terminals. A typical laptop computer measures about 210 × 297 mm/8.3 × 11.7 in, is 5 cm/2 in deep, and weighs less than 3 kg/6 lb 9 oz.  
  In the 1990s the notebook format became the standard for portable PCs and Apple PowerBooks. Some manufacturers also offer smaller portable PCs called subnotebooks or palmtops.  
  A minicomputer is a multiuser computer with a size and processing power between those of a mainframe and a microcomputer. Nowadays almost all minicomputers are based on microprocessors. Minicomputers are often used in medium-sized businesses and in university departments handling database or other commercial programs and running scientific or graphical applications.  
  mainframe computer  
  A mainframe computer is a large computer used for commercial data processing and other large-scale operations. Because of the general increase in computing power, the differences between the mainframe, supercomputer, minicomputer, and microcomputer (personal computer) are becoming less marked.  
  Mainframe manufacturers include IBM, Amdahl, Fujitsu, and Hitachi. Typical mainframes have from 128 MB to 4 GB of memory and hundreds of gigabytes of disk storage.  
  The supercomputer is the fastest, most powerful type of computer, capable of performing its basic operations in picoseconds (thousand-billionths of a second), rather than nanoseconds (billionths of a second), like most other computers.  
  To achieve these extraordinary speeds, supercomputers use several processors working together, along with techniques such as cooling processors down to nearly absolute zero so that their components conduct electricity many times faster than normal. Supercomputers are used in weather forecasting, fluid dynamics, and aerodynamics. Manufacturers include Cray Research, Fujitsu, and NEC.  
  notebook computer The component parts of a notebook computer. Although a notebook is as powerful as a desktop microcomputer, its battery pack enables it to be used while traveling.
  Of the world's 500 most powerful supercomputers, 232 are in the United States, 109 in Japan, and 140 in Europe, 23 of them in the U.K.  
  Computer Programming  
  Computer programmers write instructions in a programming language (see below). These instructions are translated into machine code (a set of instructions that a computer's central processing unit can understand and obey directly) by a compiler or interpreter program, enabling the user to operate the computer and perform specific tasks.  
  computer A mainframe computer. Functionally, it has the same component parts as a microcomputer, but on a much larger scale. The central processing unit is at the hub, and controls all the attached devices.




  computer coding systems  
  All modern digital computers are directly operated by machine codes based on binary numbers (using combinations of the digits 1 and 0), which represent instructions and data. The values of the binary digits, or "bits," are stored or transmitted as, for example, open/closed switches or high/low voltages in circuits.  
  ASCII (American standard code for information interchange) Almost all mini- and microcomputers use ASCII, a coding system in which numbers are assigned to letters, digits, and punctuation symbols. Although computers work in code based on the binary number system, ASCII numbers are usually quoted as decimal or hexadecimal numbers. For example, the decimal number 45 (binary 0101101) represents a hyphen, and 65 (binary 1000001) a capital A. The first 32 codes are used for control functions, such as carriage return and backspace.  
  Strictly speaking, ASCII is a 7-bit binary code, allowing 128 different characters to be represented, but an eighth bit is often used to provide parity (the state of being even or odd, see below) or to allow for extra characters. The system is widely used for the storage of text and for the transmission of data between computers.  
character binary code
A 1000001
B 1000010
C 1000011
D 1000100
E 1000101
F 1000110
G 1000111
H 1001000
I 1001001
J 1001010
K 1001011
L 1001100
M 1001101
N 1001110
O 1001111
P 1010000
Q 1010001
R 1010010
S 1010011
T 1010100
U 1010101
V 1010110
W 1010111
X 1011000
Y 1011001
Z 1011010
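The codes in the table above can be reproduced in a few lines of Python, whose ord function returns a character's ASCII value (formatting to seven binary digits matches the table's notation):

```python
# Deriving ASCII codes and their 7-bit binary forms, as in the table.
for ch in ("A", "C", "-"):
    code = ord(ch)                        # decimal ASCII value
    print(ch, code, format(code, "07b"))  # character, decimal, 7-bit binary
# A 65 1000001
# C 67 1000011
# - 45 0101101
```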


  Parity is the term referring to the number of 1s in the binary codes used to represent data. A binary representation has even parity if it contains an even number of 1s and odd parity if it contains an odd number of 1s.  
  For example, the binary code 1000001, commonly used to represent the character "A," has even parity because it contains two 1s, and the binary code 1000011, commonly used to represent the character "C," has odd parity because it contains three 1s. A parity bit (0 or 1) is sometimes added to each binary representation so that all representations have the same parity; a validation check can then be carried out each time data are transferred from one part of the computer to another. So, for example, the codes 1000001 and 1000011 could have parity bits added and become 01000001 and 11000011, both with even parity. If any bit in these codes were altered in the course of processing, the parity would change and the error would be quickly detected.  
character binary code parity base-ten representation
A 1000001 even 65
B 1000010 even 66
C 1000011 odd 67
D 1000100 even 68
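The parity-bit scheme described above can be sketched in a few lines of Python (illustrative; the function name is invented for the example):

```python
# Add an even-parity bit in front of a 7-bit code, as described above.
def with_even_parity(code7):
    ones = bin(code7).count("1")   # number of 1s in the 7-bit code
    parity_bit = ones % 2          # 1 only if the code has odd parity
    return (parity_bit << 7) | code7

print(format(with_even_parity(0b1000001), "08b"))  # 01000001 ('A': already even)
print(format(with_even_parity(0b1000011), "08b"))  # 11000011 ('C': odd, so bit set)
```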


  computer programming languages  
  Programming languages are designed to be easy for people to write and read, but must be capable of being mechanically translated (by a compiler or an interpreter) into the machine code that the computer can execute. Programming languages may be classified as high-level languages or low-level languages.  
  assembly language This is a low-level computer-programming language closely related to a computer's internal codes. It consists chiefly of a set of short sequences of letters (mnemonics), which are translated, by a program called an assembler, into machine code for the computer's central processing unit (CPU) to follow directly. In assembly language, for example, "JMP" means "jump" and "LDA" means "load accumulator." Assembly code is used by programmers who need to write very fast or efficient programs.  
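The translation an assembler performs can be sketched as a simple lookup from mnemonic to numeric opcode (illustrative Python; the opcode values and the assemble function are invented for the example and do not describe any particular CPU):

```python
# Toy assembler: translate a mnemonic and its operand into machine-code bytes.
# The opcode numbers below are invented for illustration.
OPCODES = {"LDA": 0xA9, "JMP": 0x4C}

def assemble(mnemonic, operand):
    # One instruction becomes an opcode byte followed by its operand byte.
    return [OPCODES[mnemonic], operand]

print(assemble("LDA", 0x10))  # [169, 16]
print(assemble("JMP", 0x20))  # [76, 32]
```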
  Because they are much easier to use, high-level languages are normally used in preference to assembly languages. An assembly language may still be used in some cases, however, particularly when no suitable high-level language exists or where a very efficient machine-code program is required.  

Programming Languages
Language | Main uses | Description
assembly languages | defense applications; jobs needing detailed control of the hardware, fast execution, and small program size | fast and efficient, but require considerable effort and skill
ALGOL (algorithmic language) | mathematical work | high-level with an algebraic style; no longer in current use, but has influenced languages such as Ada and PASCAL
BASIC (beginners' all-purpose symbolic instruction code) | mainly in education, business, and the home, and among nonprofessional programmers, such as engineers | easy to learn; early versions lacked the features of other languages
C | systems and general programming | fast and efficient; widely used as a general-purpose language; especially popular among professional programmers
C++ | systems and general programming; commercial software development | developed from C, adding the advantages of object-oriented programming
COBOL (common business-oriented language) | business programming | strongly oriented toward commercial work; easy to learn but very verbose; widely used on mainframes
FORTH | control applications | reverse Polish notation language
FORTRAN (formula translation) | scientific and computational work | based on mathematical formulae; popular among engineers, scientists, and mathematicians
Java | originally developed for consumer electronics; used for many interactive Web sites | multipurpose, cross-platform, object-oriented language with similar features to C and C++ but simpler
LISP (list processing) | artificial intelligence | symbolic language with a reputation for being hard to learn; popular in the academic and research communities
LOGO | teaching of mathematical concepts | high-level; popular with schools and home computer users
Modula-2 | systems and real-time programming; general programming | highly structured; intended to replace PASCAL for "real-world" applications
OBERON | general programming | small, compact language incorporating many of the features of PASCAL and Modula-2
PASCAL (named after the mathematician Blaise Pascal) | general-purpose language | highly structured; widely used for teaching programming in universities
Perl (practical extraction and report language) | systems programming and the Web | easy manipulation of text, files, and processes, especially in the UNIX environment
PROLOG (programming in logic) | artificial intelligence | symbolic-logic programming system, originally intended for theorem solving but now used more generally in artificial intelligence
  compiler A compiler is a computer program that translates programs written in a high-level language into machine code (the form in which they can be run by the computer). The compiler translates each high-level instruction into several machine-code instructions—in a process called compilation—and produces a complete independent program that can be run by the computer as often as required, without the original source program being present.  
  Different compilers are needed for different high-level languages and for different computers. In contrast to using an interpreter, using a compiler adds slightly to the time needed to develop a new program, because the machine-code program must be recompiled after each change or correction. Once compiled, however, the machine-code program will run much faster than an interpreted program.  
  compiler The process of compilation: a program written in a high-level language is translated into a program that can be run without the original source being present.
  fourth-generation language This is a type of programming language designed for the rapid programming of applications but often lacking the ability to control the individual parts of the computer. Such a language typically provides easy ways of designing screens and reports, and of using databases.  
  high-level language This is a programming language designed to suit the requirements of the programmer; it is independent of the internal machine code of any particular computer. High-level languages are used to solve problems and are often described as problem-oriented languages—for example, BASIC was designed to be easily learned by first-time programmers; COBOL is used to write programs solving business problems; and FORTRAN is used for programs solving scientific and mathematical problems. In contrast, low-level languages, such as assembly languages, closely reflect the machine codes of specific computers, and are therefore described as machine-oriented languages.  
  Unlike low-level languages, high-level languages are relatively easy to learn because the instructions bear a close resemblance to everyday language, and because the programmer does not require a detailed knowledge of the internal workings of the computer. Each instruction in a high-level language is equivalent to several machine-code instructions. High-level programs are therefore more compact than equivalent low-level programs. However, each high-level instruction must be translated into machine code—by either a compiler or an interpreter program—before it can be executed by a computer. High-level languages are designed to be portable—programs written in a high-level language can be run on any computer that has a compiler or interpreter for that particular language.  
  interpreter This is a computer program that translates and executes a program written in a high-level language. Unlike a compiler, which produces a complete machine-code translation of the high-level program in one operation, an interpreter translates the source program, instruction by instruction, each time that program is run.  
  Because each instruction must be translated each time the source program is run, interpreted programs run far more slowly than do compiled programs. However, unlike compiled programs, they can be executed immediately without waiting for an intermediate compilation stage.  
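The translate-as-you-go behavior described above can be sketched in a few lines. This is a minimal illustration, not a real interpreter; the tiny instruction set (LOAD, ADD, PRINT) is invented for the example.

```python
def interpret(source):
    """Translate and execute a tiny accumulator language, one
    instruction at a time, each time the program is run."""
    accumulator = 0
    output = []
    for line in source.splitlines():
        op, _, arg = line.partition(" ")
        if op == "LOAD":          # translated and executed immediately
            accumulator = int(arg)
        elif op == "ADD":
            accumulator += int(arg)
        elif op == "PRINT":
            output.append(accumulator)
        else:
            raise SyntaxError("unknown instruction: " + op)
    return output

program = "LOAD 2\nADD 3\nPRINT"
print(interpret(program))  # [5]
```

Note that the translation work is repeated on every run, which is exactly why interpreted programs are slower than compiled ones.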
  interpreter The sequence of events when running
an interpreter on a high-level language program.
Instructions are translated one at a time, making the
process a slow one; however, interpreted programs do
not need to be compiled and may be executed immediately.




Page 569
  low-level language This is a programming language designed for a particular computer and reflecting its internal machine code; low-level languages are therefore often described as machine-oriented languages. They cannot easily be converted to run on a computer with a different central processing unit, and they are relatively difficult to learn because a detailed knowledge of the internal working of the computer is required. Since they must be translated into machine code by an assembler program, low-level languages are also called assembly languages.  
  A mnemonic-based low-level language replaces binary machine-code instructions, which are very hard to remember, write down, or correct, with short codes chosen to remind the programmer of the instructions they represent. For example, the binary-code instruction that means "store the contents of the accumulator" may be represented with the mnemonic STA.  
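An assembler's core job, translating mnemonics such as STA into binary machine-code values, can be sketched as a simple table lookup. The opcode values below are invented for illustration; real opcodes depend on the particular processor.

```python
# Hypothetical opcode table: each mnemonic stands for one binary
# machine-code instruction of the target processor.
OPCODES = {
    "LDA": 0b0001,  # load the accumulator
    "ADD": 0b0011,  # add to the accumulator
    "STA": 0b0010,  # store the contents of the accumulator
}

def assemble(mnemonics):
    """Translate a list of mnemonics into machine-code values."""
    return [OPCODES[m] for m in mnemonics]

print(assemble(["LDA", "ADD", "STA"]))  # [1, 3, 2]
```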
  computer programs  
  A computer program is a set of instructions that controls the operation of a computer. There are two main kinds: applications programs, which carry out tasks for the benefit of the user, for example, word processing; and systems programs, which control the internal workings of the computer. A utility program is a systems program that carries out specific tasks for the user, for example converting the format of a data file so that it can be accessed by a different applications program. Programs can be written in any of a number of programming languages but are always translated into machine code before they can be executed by the computer.  
  communications programs or comms programs  
  These are general-purpose programs for accessing older online systems and bulletin board systems which use a command-line interface; also known as terminal emulators.  
  resolution An example of typical resolutions of
screens and printers. The resolution of a screen
image when printed can only be as high as the
resolution supported by the printer itself.
  Most operating systems include a trimmed-down comms program, but full-featured programs include facilities to store phone numbers and settings for frequently called services, address books, and the ability to write scripts to automate logging on. Popular comms programs include ProComm, Smartcom, Qmodem, and Odyssey.  
  computer graphics programs These are application programs that enable a computer to display and manipulate information in pictorial form. Input may be achieved by scanning an image, by drawing with a mouse or stylus on a graphics tablet, or by drawing directly on the screen with a light pen.  
  The output may be as simple as a pie chart, or as  
  bit map The difference in close-up between a bit-mapped and vector font.
As separate sets of bit maps are required for each different type size,
scaleable vector graphics (outline) is the preferred medium for fonts.




Page 570
  computer graphics  Some examples of the kinds of graphic design that can be achieved using
computers. Text and graphics may be combined within an illustration package, and sophisticated
three-dimensional drawings can be created using a computer-aided design (CAD) system.
  complex as an animated sequence in a science-fiction film, or a seemingly three-dimensional engineering blueprint. The drawing is stored in the computer as raster graphics or vector graphics.  
  Vector graphics are stored in the computer memory by using geometric formulae. They can be transformed (enlarged, rotated, stretched, and so on) without loss of picture resolution. It is also possible to select and transform any of the components of a vector-graphics display because each is separately defined in the computer memory. In these respects vector graphics are superior to raster graphics. They are typically used for drawing applications, allowing the user to create and modify technical diagrams such as designs for houses or cars.  
  Raster graphics are stored in the computer memory by using a map to record data (such as color and intensity) for every pixel (picture element, or individual dot) that makes up the image. When transformed (enlarged, rotated, stretched, and so on), raster graphics become ragged and suffer loss of picture resolution, unlike vector graphics. They are typically used for painting applications, which allow the user to create artwork on a computer screen much as if they were painting on paper or canvas.  
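The contrast between the two storage methods can be sketched with invented data: a vector shape is scaled by recomputing its geometry, with no loss of resolution, while a raster image is scaled by duplicating pixels, producing the ragged, blocky enlargement described above.

```python
def scale_vector(points, factor):
    """Scale geometric coordinates exactly; resolution is preserved."""
    return [(x * factor, y * factor) for x, y in points]

def scale_raster(pixels, factor):
    """Enlarge a pixel grid by repeating each row and column of pixels."""
    return [
        [value for value in row for _ in range(factor)]
        for row in pixels
        for _ in range(factor)
    ]

triangle = [(0, 0), (1, 0), (0, 1)]
print(scale_vector(triangle, 3))   # [(0, 0), (3, 0), (0, 3)]

image = [[1, 0],
         [0, 1]]
print(scale_raster(image, 2))      # each pixel becomes a 2x2 block
```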
  Computer graphics are increasingly used in computer-aided design (CAD), and to generate models and simulations in engineering, meteorology, medicine and surgery, and other fields of science. Recent developments in software mean that designers on opposite sides of the world will soon be able to work on complex three-dimensional computer models using ordinary personal computers (PCs) linked by telephone lines rather than powerful graphics workstations.  




Page 571
Major Graphics and Design Programs
Software Manufacturer Description
Adobe Illustrator Adobe professional Draw program with PostScript graphics
Adobe Photoshop Adobe professional Paint program offers image creation and manipulation
AutoCAD AutoDesk industry standard and leading CAD software
ClarisDraw Claris paint and draw with smart tools; strong on presentations
Corel Draw Corel paint and draw with OCR, animation and presentation
DesignCAD PMS (Instruments) range of programs for CAD and modeling
FreeHand Macromedia established Draw program with built in effects and color support
Painter Fractal Design suite of creative artist's tools; provides tutorials and stunning results
Paint Shop Pro JASC popular, easy-to-use image-editing program; with shareware option
Simply 3D Visual Software paint program with full camera animation


  database programs These are programs used to create databases; they enable a user to define the database structure by selecting the number of fields, naming those fields, and allocating the type and amount of data that is valid for each field. To sort records within a database, one or more sort fields may be selected, so that when the data is sorted, it is ordered according to the contents of these fields. A key field is used to give a unique identifier to a particular record. Database programs also determine how data can be viewed on screen or extracted into files.  
  A database-management system (DBMS) program ensures that the integrity of the data is maintained by controlling the degree of access of the applications programs using the data.  
Major Database Programs
Software Manufacturer Description
Access Microsoft features wizards and macros; included in Microsoft Office Professional
Approach Lotus easy-to-use; requires no programming; with multiple database formats
dBASE Borland powerful and flexible relational database
Filemaker Pro Claris versatile and easy-to-use application with relational features
FoxPro Microsoft relational database with programming language


  database An example of the type of information that may be stored
on a database. The information may be stored in various formats,
enabling it to be sorted and output to other software programs.




Page 572
Major Desktop Publishing Software Programs
Software Manufacturer Description
FrameMaker Frame Technologies strong on technical reports and book production
PageMaker Adobe powerful professional tool; strong layout and color capabilities
PagePlus Serif good value; includes vector drawing program and bitmap editor
Publisher Microsoft low-level for beginners; provides wizards and clip art gallery
QuarkXpress Quark industry standard; numerous enhancement modules available


  Ultimate Electronic Publishing Resource
  Includes reviews of a number of DTP packages, as well as a large library of clip art and a number of links to sites on related topics, including Java, fonts, and graphics programs.  
  desktop publishing (DTP) programs These are application programs that enable small-scale typesetting and page make-up to be performed on a microcomputer. DTP packages use a graphical interface to import text and graphics from other packages; run text as columns, over pages, and around artwork and other insertions; enable a wide range of fonts; and allow accurate positioning of all elements required to make a page.  
  DTP systems are capable of producing camera-ready pages (pages ready for photographing and printing), made up of text and graphics, with text set in different typefaces and sizes. The page can be previewed on the screen before final printing on a laser printer.  
  spreadsheets These are application programs that mimic a sheet of ruled paper, divided into columns down the page, and rows across. The user enters values into cells within the sheet, then instructs the program to perform some operation on them, such as totaling a column or finding the average of a series of numbers. Highly complex numerical analyses may be built up from these simple steps.  
  Columns and rows in a spreadsheet are labeled; although different programs use different methods, columns are often labeled with alpha characters, and rows with numbers. This way, each cell has its own reference, unique within that spreadsheet. For example, A5 would be the cell reference for the fifth row in the first column. Cells can also be grouped using references; the range H9:H30 groups together all the cells in column H between (and including) rows 9 and 30. Single references or cell ranges may be used when inputting formulae into cells.  
  When a cell containing a formula is copied and pasted within a spreadsheet, the formula is said to be relative, meaning the cell references it takes its values from are relative to its new position. An absolute reference does not change.  
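The copy-and-paste behavior of references can be sketched as follows. This is a simplified illustration, assuming single-letter columns and the common convention that a leading dollar sign marks an absolute reference.

```python
import re

def shift_reference(ref, d_col, d_row):
    """Shift a relative reference like 'A5' by a copy offset;
    a reference starting with '$' is absolute and never changes."""
    if ref.startswith("$"):
        return ref  # absolute reference: unchanged when copied
    col, row = re.match(r"([A-Z])(\d+)", ref).groups()
    new_col = chr(ord(col) + d_col)   # single-letter columns only
    return new_col + str(int(row) + d_row)

# Copy a formula one column to the right and two rows down.
print(shift_reference("A5", 1, 2))    # B7
print(shift_reference("$H$9", 1, 2))  # $H$9 (unchanged)
```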
  The pages of a spreadsheet can be formatted to make them easier to read; the height of rows, the width of columns, and the typeface of the text may all be changed. Number formats may also be changed to display, for example, fractions as decimals or numbers as integers.  
  Spreadsheets are widely used in business for forecasting and financial control. The first spreadsheet program, Software Arts' VisiCalc, appeared in 1979. The best known include Lotus 1–2–3 and Microsoft Excel.  
  word processors These are programs that allow the input, amendment, manipulation, storage, and retrieval of text; also computer systems that run such software. Since word-processing programs became available to microcomputers, the method has largely replaced the typewriter for producing letters or other text. Typical facilities include insert, delete, cut and paste, reformat, search and replace, copy, print, mail merge, and spelling check.  
  The leading word-processing programs include Microsoft Word, the market leader, Lotus WordPro, and Corel WordPerfect.  
  A fault or mistake, either in the software or on the part of the user, can cause a program to stop running (crash) or produce unexpected results. Program errors, or bugs, are largely eliminated in the course of the programmer's initial testing procedure, but some will remain in most programs. All computer operating systems are designed to produce an error message (on the display screen, or in an error file or printout) whenever an error is detected, reporting that an error has taken place and, wherever possible, diagnosing its cause.  
  Errors can be categorized into several types: syntax errors are caused by the incorrect use of the programming language, and include spelling and keying mistakes. These errors are detected when the compiler or interpreter fails to translate the program into machine code (instructions that a computer can understand directly); logical errors are faults in the program  




Page 573
  word processing A word processing software package enables text to be manipulated
in a variety of ways, such as copying and pasting and changing its size and typeface.
  design—for example, in the order of instructions. They may cause a program to respond incorrectly to the user's requests or to crash completely; execution errors, or run-time errors, are caused by combinations of data that the programmer did not anticipate. A typical execution error is caused by attempting to divide a number by zero. This is impossible, and so the program stops running at this point. Execution errors occur only when a program is running, and cannot be detected by a compiler or interpreter.  
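The division-by-zero case described above can be reproduced directly: the statement passes translation without complaint, and the error appears only when the program runs with the unanticipated data.

```python
def average(total, count):
    # No syntax error here; the fault only appears at run time
    # when count happens to be zero.
    return total / count

try:
    average(10, 0)            # execution error: division by zero
except ZeroDivisionError:
    result = "execution error caught"

print(result)
```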
  Computers are designed to deal with a set range of numbers to a given range of accuracy. Many errors are caused by these limitations: overflow error occurs when a number is too large for the computer to deal with; an underflow error occurs when a number is too small; rounding and truncation errors are caused by the need to round off decimal numbers, or to cut them off (truncate them) after the maximum number of decimal places allowed by the computer's level of accuracy.  
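These limitations can be demonstrated with ordinary floating-point arithmetic, used here as the example number system:

```python
# Rounding error: 0.1 has no exact binary representation, so the sum
# is rounded and does not compare equal to 0.3.
print(0.1 + 0.2 == 0.3)                # False
print(abs((0.1 + 0.2) - 0.3) < 1e-9)   # True: compare with a tolerance

# Truncation error: cutting off decimal places loses information.
print(int(2.999))                      # 2

# Overflow error: a result too large for the number format.
try:
    10.0 ** 400
except OverflowError:
    print("overflow: number too large")
```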
  flow chart  
  A flow chart is a diagram often used in computing to show the possible paths that data can take through a system or program.  
  A system flow chart, or data flow chart, is used to describe the flow of data through a complete data-processing system. Different graphic symbols represent the clerical operations involved and the different input, storage, and output equipment required. Although the flow chart may indicate the specific programs used, no details are given of how the programs process the data.  
  A program flow chart is used to describe the flow of data through a particular computer program, showing the exact sequence of operations performed by that program in order to process the data. Different graphic symbols are used to represent data input and output, decisions, branches, and subroutines.  
  A computer virus is a piece of software that can replicate and transfer itself from one computer to another, without the user being aware of it. Some viruses are relatively harmless, but others can damage or destroy data.  
  Viruses are written by anonymous programmers, often maliciously, and are spread on floppy disks, CD-ROMs, and via networks. Most viruses hide in the boot sectors of floppy or hard disks or infect program files, but more recently there has been a huge growth in macro viruses that infect Microsoft Word or Excel.  




Page 574
Millennium Bug: Preparing Computers for the Year 2000
  By Scott Kirsner  
digital disarray
It sounds like a bad riddle: how will two missing digits create a $600 billion industry when the calendar flips from 1999 to 2000?
Unfortunately, it's not a riddle; it's reality. As the 20th century draws to a close, an expensive computer problem dubbed "the millennium bug" has emerged. In brief, most computers handle dates using a two-digit shorthand: 98 instead of the four-digit 1998. When presented with a date like 00, they become hopelessly confused. Since they're missing the two digits that indicate what millennium and century the date is in (19 or 20), computers tend to assume that 00 is actually 1900. So they'll either begin making errors of calculation or they'll stop working altogether. Reprogramming them to be capable of comprehending dates in the new millennium is expected to cost as much as $600 billion worldwide.
The origins of the problem are simple. First, the programmers who wrote software in the 1960s for the first generation of commercial mainframe computers were shortsighted. They didn't imagine that the programs they were creating—or the machines they were creating them for—would still be in service in the far-off year of 2000. So they conserved the computers' memory by using a two-digit shorthand for the year. Every byte of memory was precious in those days, and lopping off 19 from dates was an obvious way to save a few bytes here and there.
Banks were among the first institutions to notice the downside to that approach. When they began writing long-term mortgages and approving loans that lasted past 1999, they were forced to confront the millennium bug. But the problem received little widespread attention until the mid-1990s. Technology consultants began writing articles and giving speeches about "the year 2000 problem." Business executives began to take notice, and even consumers couldn't ignore the problem when credit card issuers and driver's license organizations began to renew cards and licenses for shorter periods of time, because their systems couldn't handle an expiration date past 1999.
How might businesses and consumers be affected by the millennium bug? Computers, both new and old, in every industry are vulnerable. They might shut down as a result of being asked to process the date 00. Some say that is the best-case scenario, because at least businesses will know something is wrong. Worse would be if computers continued to operate, making numerous date-related errors that would be difficult to identify and fix.
The areas of greatest concern are defense, health care, transportation, telecommunications, financial services, and national and local governments. Technology experts warn of the hazards of air travel if the Federal Aviation Administration's computers can't manage data properly, the danger of hospital stays if the computers that monitor patients go awry, and the possibility of social unrest if the federal government can't provide services in 2000.
date expansion or windowing?
Fixing the millennium bug is a labor-intensive endeavor. An organization can opt to replace its systems entirely with new ones that can function in the 21st century, or it may pursue one of two basic repair strategies—"date expansion" or "windowing."
Date expansion involves changing the two-digit dates to four. That entails converting all of the data an organization has stored from one format to the other, and reprogramming systems to handle four-digit dates like 2001.
Windowing is considered a simpler, less expensive solution, but it's only a temporary patch. Rather than converting all of a company's data, the windowing approach merely adds logic to a program to help it determine whether a two-digit date belongs in the 20th century or the 21st. Programmers might create a "window" of time—from 00 to 30, for example—and then instruct the computer to assume that those dates should all be preceded by 20, whereas dates between 31 and 99 should be preceded by 19. But when 2031 rolls around, that hypothetical company would have a new problem on its hands. Windowing assumes that an organization will either replace its older systems before the window of time closes, or reprogram them yet again.
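The windowing repair amounts to a few lines of added logic. The sketch below uses the 00–30 window from the example above; a real project would choose the pivot year to suit its own data.

```python
PIVOT = 30  # window boundary: 00-30 read as 20xx, 31-99 as 19xx

def expand_year(two_digit):
    """Expand a two-digit year using a fixed window."""
    if two_digit <= PIVOT:
        return 2000 + two_digit
    return 1900 + two_digit

print(expand_year(0))   # 2000
print(expand_year(99))  # 1999
print(expand_year(31))  # 1931 -- the new problem once 2031 arrives
```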
Eradicating the millennium bug is a multistage process. An organization must first assess which of its systems will be unable to handle dates in the 21st century. Then, it must convert those systems, either through expansion or windowing. Finally, it has to test the systems to ensure that they will work after the clock ticks past midnight on December 31, 1999.
ripple effects
But even if companies successfully repair their own systems, they're still vulnerable to what has been dubbed "the ripple effect." One of their suppliers or customers, or a government regulator, could send them unconverted data and contaminate their systems. Or even worse, a key supplier might be unable to provide services or raw materials as a result of the bug, hamstringing its customers. For those reasons, organizations must make sure that everyone else with whom they do business is solving their own year 2000 problems. Certain sectors of the economy, like the financial services arena, are even coordinating massive, interorganizational tests to make sure that stock exchanges, banks, regulators, and clearing houses will be able to work together in the new millennium.
And waiting in the wings are the lawyers. If software or hardware fails, they'll be scrutinizing contracts to see who is liable. If a conversion project turns out to have been defective, they





Page 575
may bring litigation against the service provider that was contracted to perform the fix. And if a company's stock takes a dive as a result of year 2000-related failures, lawyers may file negligence lawsuits against the Board of Directors. Once litigation and damages are figured into the cost of the millennium bug, some analysts believe the total worldwide cost could skyrocket to as much as $3.6 trillion.
The sudden emergence of the year 2000 problem has created an entire mini-economy. Programmers and technology managers are finding that they can demand and receive higher salaries, computer consultants have more work than they can handle, and software companies have begun to market tools aimed at making assessment, conversion, and testing more efficient. There are dozens of Web sites and books devoted to the problem. The American Stock Exchange has even created an options index that enables investors to speculate on the fortunes of 18 companies selling software or services intended to solve the year 2000 problem.
Few participants in this mini-economy are willing to speculate about the extent to which the world will be affected by the millennium bug. Will January 1, 2000 arrive without a hitch, or will, as some technology experts predict, the front pages of every major newspaper be filled with stories about date-related computer crises? All that's certain is that programmers and their technology managers won't be among the celebrants on New Year's Eve 1999; they'll be huddled over their mainframes, fingers crossed.


  Antivirus software can be used to detect and destroy well-known viruses, but new viruses continually appear and these may bypass existing antivirus programs.  
  Computer viruses may be programmed to operate on a particular date, such as Dark Avenger's Michelangelo Virus, which was triggered on March 6, 1992 (the anniversary of the birth of Italian artist Michelangelo) and erased hard disks. An estimated 5,000–10,000 PCs were affected.  
  The Computer—Human Interface  
  user interface  
  The term "user interface" describes the procedures and methods through which the user operates a computer program. These might include menus, input forms, error messages, and keyboard procedures. A graphical user interface (GUI or WIMP) is one that makes use of icons (small pictures) and allows the user to make menu selections with a mouse.  
  graphical user interface A typical graphical user interface (GUI), where the user moves
around the system by clicking on representative buttons or icons using the mouse.




Page 576
  desktop A typical graphical desktop, showing the menu system, icons, programs, and applications available to the user.  
  A command line interface is a character-based interface in which a prompt is displayed on the screen at which the user types a command, followed by carriage return, at which point the command, if valid, is executed. An example of a command line interface is the DOS prompt.  
  A menu-driven interface presents various options to the user in the form of a list, from which commands may be selected. Types of menu include the menu bar, which displays the top level options available to the user as a single line across the top of the screen; selecting one of these options displays a pull-down menu. Programs such as Microsoft Word use menus in this way.  
  In a graphical user interface programs and files appear as icons, user options are selected from pull-down menus, and data are displayed in windows (rectangular areas), which the operator can manipulate in various ways. The operator uses a pointing device, typically a mouse, to make selections and initiate actions. The graphical user interface is now available on many types of computer—most notably as Windows, an operating system for IBM PC-compatible microcomputers developed by the software company Microsoft.  
  Computer Applications  
  artificial intelligence (AI)  
  Artificial intelligence is concerned with creating computer programs that can perform actions comparable with those of an intelligent human. Current AI research covers such areas as planning (for robot behavior), language understanding, pattern recognition, and knowledge representation.  
  The possibility of artificial intelligence was first proposed by the English mathematician Alan Turing in 1950. Early AI programs, developed in the 1960s, attempted simulations of human intelligence or were aimed at general problem-solving techniques. By the mid-1990s, scientists were concluding that AI was more difficult to create than they had imagined. It is now thought that intelligent behavior depends as much on the knowledge a system possesses as on its reasoning power. Present emphasis is on knowledge-based systems, such as expert systems, while research projects focus on neural networks, which attempt to mimic the structure of the human brain.  
  On the Internet, small bits of software that automate common routines or attempt to predict human likes or behavior based on past experience are called intelligent agents or bots.  
  fuzzy logic Fuzzy logic is a form of knowledge representation suitable for notions (such as "hot" or "loud") that cannot be defined precisely but depend on their context. Fuzzy logic enables computerized devices to reason more like humans, receiving complex messages from their control panels and sensors and responding effectively to these messages.  




Page 577
  Fuzzy logic has been largely ignored in Europe and the United States, but was taken up by Japanese manufacturers in the mid-1980s and has since been applied to hundreds of electronic goods and industrial machines. For example, a vacuum cleaner launched in 1992 by Matsushita uses fuzzy logic to adjust its sucking power in response to messages from its sensors about the type of dirt on the floor, its distribution, and its depth.  
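The central idea, that a notion like "hot" is a matter of degree rather than a true/false value, can be sketched as a membership function. The temperature range used here is an assumption chosen purely for illustration.

```python
def hot_membership(temp_c, cold=15.0, hot=35.0):
    """Degree (0 to 1) to which a temperature counts as 'hot'."""
    if temp_c <= cold:
        return 0.0
    if temp_c >= hot:
        return 1.0
    return (temp_c - cold) / (hot - cold)

print(hot_membership(10))  # 0.0 -- definitely not hot
print(hot_membership(25))  # 0.5 -- somewhat hot
print(hot_membership(40))  # 1.0 -- definitely hot
```

A fuzzy controller would then respond in proportion to such degrees (for example, more suction for dirtier floors) rather than switching between fixed on/off states.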
  neural network  This is an artificial network of processors that attempts to mimic the structure of nerve cells (neurons) in the human brain. Neural networks may be electronic, optical, or simulated by computer software.  
  A basic network has three layers of processors: an input layer, an output layer, and a "hidden" layer in between. Each processor is connected to every other in the network by a system of "synapses"; every processor in the top layer connects to every one in the hidden layer, and each of these connects to every processor in the output layer. This means that each nerve cell in the middle and bottom layers receives input from several different sources; only when the amount of input exceeds a critical level does the cell fire an output signal.  
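The fully connected, threshold-firing structure just described can be sketched as follows. The weights and threshold are invented for illustration; real networks learn their weights from data.

```python
def layer(inputs, weights, threshold=0.5):
    """One fully connected layer of threshold units: each unit sums
    its weighted inputs and fires (outputs 1) only if the sum
    exceeds the critical level."""
    outputs = []
    for unit_weights in weights:      # one row of weights per unit
        total = sum(w * x for w, x in zip(unit_weights, inputs))
        outputs.append(1 if total > threshold else 0)
    return outputs

hidden_weights = [[0.6, 0.2], [0.3, 0.9]]  # 2 inputs -> 2 hidden units
output_weights = [[0.7, 0.7]]              # 2 hidden -> 1 output unit

hidden = layer([1, 1], hidden_weights)
print(layer(hidden, output_weights))  # [1]
```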
  The chief characteristic of neural networks is their ability to sum up large amounts of imprecise data and decide whether they match a pattern or not. Networks of this type may be used in developing robot vision, matching fingerprints, and analyzing fluctuations in stock-market prices. However, it is thought unlikely by scientists that such networks will ever be able to accurately imitate the human brain, which is very much more complicated; it contains around ten billion nerve cells, whereas current artificial networks contain only a few hundred processors.  
  robotics This term is applied to any computer-controlled machine that can be programmed to move or carry out work. Robots are often used in industry to transport materials or to perform repetitive tasks. For instance, robotic arms, fixed to a floor or workbench, may be used to paint machine parts or assemble electronic circuits. Other robots are designed to work in situations that would be dangerous to humans—for example, in defusing bombs or in space and deep-sea exploration. Some robots are equipped with sensors, such as touch sensors and video cameras, and can be programmed to make simple decisions based on the sensory data received.  
  computer-aided design and manufacturing  
  CAD (computer-aided design) The use of computers in creating and editing design drawings. CAD also allows such things as automatic testing of designs and multiple or animated three-dimensional views of designs. CAD systems are widely used in architecture, electronics, and engineering, for example in the motor-vehicle industry, where cars designed with the assistance of computers are now commonplace. With a CAD system, picture components are accurately positioned using grid lines. Pictures can be resized, rotated, or mirrored without loss of quality or proportion.  
  CAM (computer-aided manufacturing) The use of computers to control production processes; in particular, the control of machine tools and robots in factories. In some factories, the whole design and production system has been automated by linking CAD (computer-aided design) to CAM.  
  Linking flexible CAD/CAM manufacturing to computer-based sales and distribution methods makes it possible to produce semicustomized goods cheaply and in large numbers.  
  computer games or video games  
  Computers can be used for leisure and entertainment as well as more serious purposes, and computer games represent an important and fast-growing area of use. There are a wide variety of computer-controlled games in which the computer (sometimes) opposes the human player. Computer games typically employ fast, animated graphics on a VDT (video display terminal) and synthesized sound.  
  Doomgate—Where it all Begins
  Clearing house of information for the computer game Doom. Frequently updated, it contains advice, frequently asked questions, technical details about graphics and specifications, add-on utilities, and links to the large number of Doom newsgroups.  
  Commercial computer games became possible with the advent of the microprocessor in the mid-1970s and rapidly became popular as amusement arcade games, using dedicated chips. Available games range from chess to fighter-plane simulations.  
  Some of the most popular computer games in the early 1990s were id Software's Wolfenstein 3D and Doom, which were designed to be played across networks including the Internet. A whole subculture built up around those particular games, as users took advantage of id's help to create their own additions to the game.  
  Kasparov v. Deep Blue—The Rematch
  Official site of the team that produced the first computer able to beat a world chess champion. This is a complete account of the tussle between Deep Blue and Garry Kasparov. There are some thought-provoking articles on the consequences of Deep Blue's victory. There is also some video footage of the games.  




Page 578
  computer simulation  
  In this type of application, a real-life situation is represented in a computer program. For example, the program might simulate the flow of customers arriving at a bank. The user can alter variables, such as the number of cashiers on duty, and see the effect.  
  More complex simulations can model the behavior of chemical reactions or even nuclear explosions, and the behavior of solids and liquids at high temperatures can be simulated using quantum simulation. Simulations can also stand in for real machines—for example, a flight simulator models the behavior of a real aircraft and allows pilot training to take place in safety. Computer simulations are very useful when it is too dangerous, time consuming, or simply impossible to carry out a real experiment or test.  
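The bank example above can be sketched as a toy simulation in Python. The arrival and service rates, and the comparison between staffing levels, are invented purely for illustration:

```python
import random

def simulate_bank(n_customers, n_cashiers, seed=42):
    """Toy single-line bank simulation: customers arrive at random
    intervals and are served by whichever cashier is free first.
    Returns the average time a customer waits in line."""
    random.seed(seed)
    free_at = [0.0] * n_cashiers   # time at which each cashier is next free
    clock = 0.0
    total_wait = 0.0
    for _ in range(n_customers):
        clock += random.expovariate(1.0)    # next customer arrives
        service = random.expovariate(0.5)   # service takes about 2 time units
        i = min(range(n_cashiers), key=free_at.__getitem__)
        start = max(clock, free_at[i])      # wait if every cashier is busy
        total_wait += start - clock
        free_at[i] = start + service
    return total_wait / n_customers

# Altering the number of cashiers on duty shows its effect on waiting time:
print(simulate_bank(1000, 2), simulate_bank(1000, 4))
```

Running the sketch with two and then four cashiers on duty shows the average wait falling as staff are added, which is exactly the kind of what-if question such simulations answer.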
  virtual reality Virtual reality is an advanced form of computer simulation, in which a participant has the illusion of being part of an artificial environment. The participant views the environment through two tiny television screens (one for each eye) built into a visor. Sensors detect movements of the participant's head or body, causing the apparent viewing position to change. Gloves (datagloves) fitted with sensors may be worn, which allow the participant seemingly to pick up and move objects in the environment.  
  What Is Virtual Reality?
  Text-based introduction to VR and an information resource list. The site covers all major aspects of the subject, and also provides a great many literature and Internet references for further reading.  
  The technology is still under development but is expected to have widespread applications, for example in military and surgical training, architecture, and home entertainment.  
  databases Databases are structured collections of data, which may be manipulated to select and sort desired items of information. For example, an accounting system might be built around a database containing details of customers and suppliers. A telephone directory stored as a database might allow all those people whose names start with the letter B to be selected by one program, and all those living in Chicago, Illinois, by another. Databases are normally used by large organizations as an effective and fast way of handling large amounts of information in various ways via mainframes or minicomputers.  
  There are three main types (or "models") of database: hierarchical, network, and relational, of which relational is the most widely used. In a relational database data are viewed as a collection of linked tables. A free-text database is one that holds the unstructured text of articles or books in a form that permits rapid searching. A collection of databases is known as a databank.  
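The telephone-directory example can be sketched with Python's built-in sqlite3 module, a small relational database; the table layout and the entries are invented for illustration:

```python
import sqlite3

# An in-memory relational database: one table of directory entries.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE directory (name TEXT, city TEXT, phone TEXT)")
con.executemany("INSERT INTO directory VALUES (?, ?, ?)", [
    ("Baker", "Chicago", "555-0101"),
    ("Adams", "Boston",  "555-0102"),
    ("Brown", "Chicago", "555-0103"),
    ("Clark", "Chicago", "555-0104"),
])

# One query selects everyone whose name starts with the letter B ...
starts_with_b = con.execute(
    "SELECT name FROM directory WHERE name LIKE 'B%' ORDER BY name").fetchall()

# ... and another selects everyone living in Chicago.
in_chicago = con.execute(
    "SELECT name FROM directory WHERE city = 'Chicago' ORDER BY name").fetchall()

print(starts_with_b)  # [('Baker',), ('Brown',)]
print(in_chicago)     # [('Baker',), ('Brown',), ('Clark',)]
```

Each query is a single relational selection over the same table, which is the essence of the relational model: the data are stored once and viewed in whatever way a question requires.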
  Communications and the Internet  
  electronic mail (e-mail) E-mail messages are sent electronically from computer to computer via network connections such as Ethernet or the Internet, or via telephone lines to a host system. Once sent, messages are stored on the network or by the host system until the recipient picks them up. As well as text, messages may contain enclosed text files, artwork, or multimedia clips.  
  Subscribers to an electronic mail system type messages in ordinary letter form on a word processor or microcomputer and "drop" the letters into a central computer's memory bank by means of a computer/telephone connector (a modem). The recipient "collects" the letter by calling up the central computer and feeding a unique password into the system. Because of its high speed of delivery, electronic mail is cheaper than an equivalent telephone call or fax.  
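As a sketch of the letter-plus-enclosure message described above, Python's standard email library can compose one (the addresses and file contents here are hypothetical); a mail program would then hand the finished message to the host system, which stores it until the recipient collects it:

```python
from email.message import EmailMessage

# Compose a plain-text letter with an enclosed file.
msg = EmailMessage()
msg["From"] = "alice@example.com"   # hypothetical addresses
msg["To"] = "bob@example.com"
msg["Subject"] = "Meeting notes"
msg.set_content("The notes from today's meeting are enclosed.")
msg.add_attachment(b"(file contents here)",
                   maintype="application", subtype="octet-stream",
                   filename="notes.txt")

print(msg["Subject"])           # Meeting notes
print(msg.get_content_type())   # multipart/mixed: a letter plus an enclosure
```

The message becomes "multipart" the moment an enclosure is added, which is how a single e-mail can carry text, artwork, or multimedia clips together.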
  Integrated Services Digital Network (ISDN)  
  The ISDN is an internationally developed telecommunications system for sending signals in digital format. It involves converting the "local loop"—the link between the user's telephone (or private automatic branch exchange) and the digital telephone exchange—from an analog system into a digital system, thereby greatly increasing the amount of information that can be carried. The first large-scale use of ISDN began in Japan in 1988.  
  ISDN has advantages in higher voice quality, better quality faxes, and the possibility of data transfer between computers faster than current modems. With ISDN's Basic Rate Access a multiplexer divides one voice telephone line into three channels: two B bands and a D band. Each B band offers 64 kilobits per second and can carry one voice conversation or 50 simultaneous data calls at 1,200 bits per second. The D band is a data-signaling channel operating at 16 kilobits per second. With Primary Rate Access ISDN provides 30 B channels.  
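The channel arithmetic quoted above can be checked directly, using the figures as given in the text:

```python
# ISDN channel capacities, in bits per second, as quoted above.
B_CHANNEL = 64_000
D_CHANNEL = 16_000

basic_rate = 2 * B_CHANNEL + D_CHANNEL   # Basic Rate Access: 2B + D
primary_rate = 30 * B_CHANNEL            # Primary Rate Access: 30 B channels

print(basic_rate)    # 144000 bits per second over one divided line
print(primary_rate)  # 1920000 bits per second of B-channel capacity
```

So a single Basic Rate line carries 144 kilobits per second in total, and Primary Rate Access offers 1.92 megabits per second of usable B-channel capacity, before any signaling overhead.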
  the Internet  
  The Internet is a global computer network connecting governments, companies, universities, and many other networks and users. Electronic mail, conferencing, and  




Page 579
  e-mail system The basic structure of an electronic mail system. A message is sent
via a telephone line and stored in a central computer. The message remains there
until the recipient calls up the central computer and collects the message.
  chat services are all supported across the network, as is the ability to access remote computers and send and retrieve files. In 1997 around 55 million adults had access to the Internet in the United States alone.  
  The technical underpinnings of the Internet were developed as a project funded by the Advanced Research Projects Agency (ARPA) to research how to build a network that could withstand bomb damage. The Internet itself began in 1984, with funding from the U.S. National Science Foundation, as a means to allow  
  analog-to-digital converter An analog-to-digital converter, or ADC,
converts a continuous analog signal produced by a sensor to
a digital ("off and on") signal for computer processing.




Page 580
  remote terminal Remote computer terminals communicate with the central mainframe
via modems and telephone lines. The controller allocates computer time to the terminals
according to predetermined priority rules. The multiplexer allows more than one terminal
to use the same communications link at the same time (multiplexing).
  U.S. universities to share the resources of five regional supercomputing centers. The number of users grew quickly, and in the early 1990s access became cheap enough for domestic users to have their own links on home personal computers. As the amount of information available via the Internet grew, indexing and search services such as Gopher, Archie, Veronica, and WAIS were created by Internet users to help both themselves and others. The newer World Wide Web allows seamless browsing across the Internet via hypertext.  
  Internet Starter Kit
  If you are struggling online then there are worse places to start than this—the full text of a recent book that covers everything from advice on getting connected to recommendations about the best shareware. It also includes some information on how to start creating Web pages yourself.  
  In April 1998 the U.S. vice president Al Gore announced plans for Internet2, which will run on a second network, Abilene, operated by private contractors. It will provide a high-speed data communications backbone serving the main U.S. research universities, enabling them to bypass congestion on the Internet. It should be operational by 1999.  
  Parents' Guide to the Internet  
  Useful electronic booklet aiming to bridge the gap between children's and parents' knowledge of the Internet. The site introduces the main features of the "information superhighway" and argues for the benefits of getting connected to the Internet at home. Several sections provide navigation to assist parents' first steps on the Net and give them tips on safe traveling, and advice on how to encourage children's activities at home and at school.  
  Internet hosts Charts showing the number of hosts on the Internet by category (as of July 1996) and the number of hosts by year. Source: Key Note.  





Page 581
  Internet The Internet is accessed by users via a modem to the service provider's hub,
which handles all connection requests. Once connected, the user can access a whole
range of information from many different sources, including the World Wide Web.
  Internet Service Provider (ISP) Dial-up access to the Internet is sold by an Internet Service Provider. Several types of company provide Internet access, including online information services such as CompuServe and America Online (AOL), electronic conferencing systems such as the WELL and Compulink Information eXchange, and local bulletin board systems (BBSs). More recently founded ISPs, such as Demon Internet and PIPEX, offer direct access to the Internet without running additional members-only services of their own.  
  Such companies typically work out cheaper for their users, as they charge a low flat rate for unlimited usage. By contrast, commercial online services typically charge by the hour or minute.  
  World Wide Web (WWW)  
  The World Wide Web is a hypertext system for publishing information on the Internet. World Wide Web documents ("Web pages") are text files coded using HTML (hypertext mark-up language) to include text and graphics, and are stored on a Web server connected to the Internet. Web pages may also contain dynamic objects and Java applets for enhanced animation, video, sound, and interactivity. The Web server can be any computer, from the simplest Apple Macintosh to the largest mainframe, provided Web server software is available.  
  Every Web page has a URL (Uniform Resource Locator)—a unique address (usually starting with http://www) which tells a browser program (such as Netscape Navigator or Microsoft Internet Explorer) where to find it. An important feature of the World Wide Web is that most documents contain links enabling readers to follow whatever aspects of a subject interest them most. These links may connect to different computers all over the world. Interlinked or  
  Worldwide Web Workbook
  Guide for novice Web surfers (limited to users of PCs and MS Windows). Topics include hypertext, graphics, hypergraphics, imagemaps, and thumbnails. Once the basics have been covered, users are offered a short tour with the help of Spot, the mascot webdog.  




Page 582
Some Internet Terms
acceptable use set of rules enforced by a service provider or backbone network restricting the use to which their facilities may be put
access provider another term for Internet Service Provider
ack radio-derived term for "acknowledge," used on the Internet as a brief way of indicating agreement with or receipt of a message or instruction
alt hierarchy "alternative" set of newsgroups on USENET, set up so that anyone can start a newsgroup on any topic
anonymous remailer service that allows Internet users to post to USENET and send e-mail without revealing their true identity or e-mail address
Archie software tool for locating information on the Internet
bang path list of routing that appears in the header of a message sent across the Internet, showing how it traveled from the sender to its destination
Big Seven hierarchies original seven hierarchies of newsgroups on USENET. They are: comp.–computing; misc.–miscellaneous; news.–newsgroups; rec.–recreation; sci.–science; soc.–social issues; and talk.–debate
blocking software any of various software programs that work on the World Wide Web to block access to categories of information considered offensive or dangerous
blue-ribbon campaign campaign for free speech launched to protest against moves toward censorship on the Internet
'bot (short for robot) automated piece of software that performs specific tasks on the Internet. 'Bots are commonly found on multi-user dungeons (MUDs) and other multi-user role-playing game sites, where they maintain a constant level of activity even when few human users are logged on
bozo filter facility to eliminate messages from irritating users
browser any program that allows the user to search for and view data; Web browsers allow access to the World Wide Web
bulletin board center for the electronic storage of messages; bulletin board systems are usually dedicated to specific interest groups, and may carry public and private messages, notices, and programs
cancelbot automated software program that cancels messages on USENET; Cancelbot is activated by the CancelMoose, an anonymous individual who monitors newsgroups for complaints about spamming
crawler automated indexing software that scours the Web for new or updated sites
crossposting practice of sending a message to more than one newsgroup on USENET
cybersex online sexual fantasy spun by two or more participants via live, online chat
cyberspace the imaginary, interactive "worlds" created by networked computers; often used interchangeably with "virtual world"
cypherpunk passionate believer in the importance of free access to strong encryption on the Internet, in the interests of guarding privacy and free speech
digital city area in cyberspace, either text-based or graphical, that uses the model of a city to make it easy for visitors and residents to find specific types of information
e-zine (contraction of electronic magazine) periodical sent by e-mail. E-zines can be produced very cheaply as there are no production costs for design and layout, and minimal costs for distribution
FAQ (abbreviation for frequently asked questions) file of answers to commonly asked questions on any topic
firewall security system built to block access to a particular computer or network while still allowing some types of data to flow in and out onto the Internet
flame angry public or private electronic mail message used to express disapproval of breaches of netiquette or the voicing of an unpopular opinion
follow-up post publicly posted reply to a USENET message; unlike a personal e-mail reply, follow-up post can be read by anyone
FurryMUCK popular MUD site where the players take on the imaginary shapes and characters of furry, anthropomorphic animals
Gopher menu-based server on the Internet that indexes resources and retrieves them according to user choice via any one of several built-in methods such as FTP or Telnet. Gopher servers can also be accessed via the World Wide Web and searched via special servers called Veronicas
Gopherspace name for the knowledge base composed of all the documents indexed on all the Gophers in the world
hit request sent to a file server. Sites on the World Wide Web often measure their popularity in numbers of hits
home page opening page on a particular site on the World Wide Web
hop intermediate stage of the journey taken by a message traveling from one site to another on the Internet
HTTP (abbreviation for Hypertext Transfer Protocol) protocol used for communications between client (the Web browser) and server on the World Wide Web
hypermedia system that uses links to lead users to related graphics, audio, animation, or video files in the same way that hypertext systems link related pieces of text
in-line graphics images included in Web pages that are displayed automatically by Web browsers without any action required by the user
Internet Relay Chat (IRC) service that allows users connected to the Internet to chat with each other over many channels
Internet Service Provider (ISP) any company that sells dial-up access to the Internet
Jughead (acronym for Jonzy's Universal Gopher Hierarchy Excavation and Display) search engine enabling users of the Internet server Gopher to find keywords in Gopherspace directories





Page 583
killfile file specifying material that you do not wish to see when accessing a newsgroup. By entering names, subjects or phrases into a killfile, users can filter out tedious threads, offensive subject headings, spamming, or contributions from other subscribers
link image or item of text in a World Wide Web document that acts as a route to another Web page or file on the Internet
lurk read a USENET newsgroup without making a contribution
MBONE (contraction of multicast backbone) layer of the Internet designed to deliver packets of multimedia data, enabling video and audio communication
MIME (acronym for Multipurpose Internet Mail Extensions) standard for transferring multimedia e-mail messages and World Wide Web hypertext documents over the Internet
moderator person or group of people that screens submissions to certain newsgroups and mailing lists before passing them on for wider circulation
MUD (acronym for multi-user dungeon) interactive multi-player game, played via the Internet or modem connection to one of the participating computers. MUD players typically have to solve puzzles, avoid traps, fight other participants, and carry out various tasks to achieve their goals
MUSE (abbreviation for multi-user shared environment) type of MUD
MUSH (acronym for multi-user shared hallucination) a MUD (multi-user dungeon) that can be altered by the players
netiquette behavior guidelines evolved by users of the Internet including: no messages typed in upper case (considered to be the equivalent of shouting); new users, or new members of a newsgroup, should read the frequently asked questions (FAQ) file before asking a question; and no advertising via USENET newsgroups
net police USENET readers who monitor and "punish" postings which they find offensive or believe to be in breach of netiquette. Many newsgroups are policed by these self-appointed guardians
newbie insulting term for a new user of a USENET newsgroup
newsgroup discussion group on the Internet's USENET. Newsgroups are organized in seven broad categories: comp.–computers and programming; news.–newsgroups themselves; rec.–sports and hobbies; sci.–scientific research and ideas; talk.–discussion groups; soc.–social issues; and misc.–everything else. In addition, there are alternative hierarchies such as the wide-ranging and anarchic alt. (alternative). Within these categories there is a hierarchy of subdivisions
newsreader program that gives access to USENET newsgroups, interpreting the standard commands understood by news servers in a simple, user-friendly interface
news server computer that stores USENET messages for access by users. Most Internet Service Providers (ISPs) offer a news server as part of the service
off-line browser program that downloads and copies Web pages onto a computer so that they can be viewed without being connected to the Internet
off-line reader program that downloads information from newsgroups, FTP servers, or other Internet resources, storing it locally on a hard disk so that it can be read without running up a large phone bill
Pretty Good Privacy (PGP) strong encryption program that runs on personal computers and is distributed on the Internet free of charge
proxy server server on the World Wide Web that "stands in" for another server, storing and forwarding files on behalf of a computer which might be slower or too busy to deal with the request itself
pseudonym name adopted by someone on the Internet, especially to participate in USENET or discussions using IRC (Internet Relay Chat)
signature (or .sig) personal information appended to a message by the sender of an e-mail message or USENET posting in order to add a human touch
spamming advertising on the Internet by broadcasting to many or all newsgroups regardless of relevance
spider program that combs the Internet for new documents such as Web pages and FTP files. Spiders start their work by retrieving a document such as a Web page and then following all the links and references contained in it
surfing exploring the Internet. The term is rather misleading: the glitches, delays, and complexities of the system mean the experience is more like wading through mud
sysop (contraction of system operator) the operator of a bulletin board system (BBS)
trolling mischievously posting a deliberately erroneous or obtuse message to a newsgroup in order to tempt others to reply—usually in a way that makes them appear gullible, intemperate, or foolish
URL (abbreviation for Uniform Resource Locator) series of letters and/or numbers specifying the location of a document on the World Wide Web. Every URL consists of a domain name, a description of the document's location within the host computer, and the name of the document itself, separated by periods and forward slashes
USENET (acronym for users' network) the world's largest bulletin board system, which brings together people with common interests to exchange views and information. It consists of e-mail messages and articles organized into newsgroups
vertical spam on USENET, spam which consists of many, often repetitive, messages per day posted to the same newsgroup or small set of newsgroups. The effect is to drown out other, more useful, conversation in the newsgroup
wAreZ slang for pirated games or other applications that can be downloaded using FTP
Web authoring tool software for creating Web pages. The basic Web authoring tool is HTML, the source code that determines how a Web page is constructed and how it looks
Web browser client software that allows you to access the World Wide Web
Webmaster system administrator for a server on the World Wide Web
Web page hypertext document on the World Wide Web
webzine magazine published on the World Wide Web, instead of on paper
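As a sketch of the URL structure described in the glossary above, Python's standard urllib.parse module splits an address into its parts; the address itself is hypothetical:

```python
from urllib.parse import urlparse

# A hypothetical Web address, broken into the parts a browser uses.
parts = urlparse("http://www.example.com/guides/internet/starter.html")

print(parts.scheme)   # http  (the protocol; see HTTP above)
print(parts.netloc)   # www.example.com  (the domain name)
print(parts.path)     # /guides/internet/starter.html  (location and document name)
```

A browser performs this same decomposition to work out which computer to contact and which document to request from it.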





Page 584
  web page An example of how pages on the World Wide Web may
be linked to take the user to additional pages of information.
  browser Two popular World Wide Web browsers, Netscape Navigator and Microsoft Internet
Explorer, which provide the user with a straightforward method of accessing information online.




Page 585
  nested Web pages belonging to a single organization are known as a Web site.  
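A small sketch of how such links are coded and followed: a Web page is plain text with HTML tags, and the <a href=...> element names the page a link leads to. Python's standard html.parser module can pull those targets out of a hand-written page (the page and addresses are invented for illustration):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the target of every <a href=...> hypertext link."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [value for name, value in attrs if name == "href"]

# A tiny hand-written Web page containing two hypertext links.
page = """<html><body>
<p>See the <a href="http://www.example.com/guide.html">guide</a> or
go <a href="page2.html">to the next page</a>.</p>
</body></html>"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['http://www.example.com/guide.html', 'page2.html']
```

A browser does essentially this when it renders a page, then fetches whichever target the reader selects, which may sit on a different computer anywhere in the world.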
  Web browser Web browsers allow access to the World Wide Web. Netscape Navigator and Microsoft's Internet Explorer were the leading Web browsers in 1996–97. They act as a graphical interface to information available on the Internet—they read HTML documents and display them as graphical documents which may include images, video, sound, and hypertext links to other documents.  
  Browsers using graphical user interfaces became widely available from 1993 with the release of Mosaic, written by Marc Andreessen and Eric Bina. For some specialist applications, such as viewing the virtual reality sites beginning to appear on the Web, a special virtual reality modeling language (VRML) browser is needed.  
  networking Networking is the term for connecting computers so that they can share data and peripheral devices, such as printers. The main types of network are classified by the pattern of the connections—star or ring networks, for example—or by the degree of geographical spread allowed: local area networks (LANs) for communication within a room or building, and wide area networks (WANs) for more remote systems. The Internet is the computer network that connects major institutions throughout the world, with more than 12 million users. Janet (Joint Academic Network) is a British academic network connected to the Internet. SuperJanet, launched in 1992, is an extension of Janet that can carry 1,000 million bits of information per second.  
  One of the most common networking systems is Ethernet, developed in 1973 (released in 1980) at Xerox's Palo Alto Research Center, California, by R. M. Metcalfe and D. R. Boggs.  
  Computers and the Future  
  fifth-generation computer  
  The fifth-generation computer is an anticipated new type of computer based on emerging microelectronic technologies, with high computing speeds and parallel processing (see below). The development of very large-scale integration (VLSI) technology, which can put many more circuits onto an integrated circuit (chip) than is currently possible, together with developments in computer hardware and software design, may produce computers far more powerful than those in current use.  
  It has been predicted that such a computer will be able to communicate in natural spoken language with its user; store vast knowledge databases; search rapidly through these databases, making intelligent inferences and drawing logical conclusions; and process images and "see" objects in the way that humans do.  
  In 1981 Japan's Ministry of International Trade and Industry launched a ten-year project to build the first fifth-generation computer, the "parallel inference machine," consisting of over a thousand microprocessors operating in parallel with each other. By 1992, however, the project was behind schedule and had only produced 256 processor modules. It has since been suggested that research into other technologies, such as neural networks, may present more promising approaches to artificial intelligence.  
  parallel processing  
  Parallel processing is an emerging computer technology that allows more than one computation to be carried out at the same time. Although in the 1980s this technology enabled only a small number of computer processor units to work in parallel, in theory thousands or millions of processors could be used at the same time.  
  Parallel processing, which involves breaking down computations into small parts and performing thousands of them simultaneously, rather than in a linear sequence, offers the prospect of a vast improvement in working speed for certain repetitive applications.  
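The break-down-and-distribute idea can be sketched with Python's multiprocessing module. Here a large summation is split into four parts computed by separate worker processes; the chunk size is arbitrary:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Compute one small piece of the overall summation."""
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n = 1_000_000
    # Break the computation into chunks and hand them to worker processes,
    # rather than summing the whole range in one linear sequence.
    chunks = [(i, i + 250_000) for i in range(0, n, 250_000)]
    with Pool(4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total == sum(range(n)))  # True: same answer as the linear sequence
```

The speedup comes only when the pieces are genuinely independent, which is why parallel processing suits repetitive applications best.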
  smart card  
  A smart card is a plastic card with an embedded microprocessor and memory. It can store, for example, personal data, identification, and bank account details, to enable it to be used as a credit or debit card. The card can be loaded with credits, which are then spent electronically, and reloaded as needed. Possible other uses range from hotel door "keys" to passports.  
  The smart card was invented by French journalist Roland Moreno in 1974. It is expected that by the year 2000 it will be possible to make cards with as much computing power as the leading personal computers of 1990.  
  Computer Museum
  Well-designed interactive museum, examining the history, development, and future of computer technology. As well as plenty of illustrations and detailed explanations, it is possible to change your route through the museum by indicating whether you are a kid, student, adult, or educator.  




Page 586
  Computer Science Chronology  
100 Greek mathematician and inventor Hero of Alexandria devises a method of representing numbers and performing simple calculating tasks using a train of gears—a primitive computer.
1623 German inventor Wilhelm Schickard of Tübingen builds an early adding machine.
1642 French mathematician Blaise Pascal invents the first calculating machine.
1673 German mathematician Gottfried Wilhelm Leibniz presents a calculating machine to the Royal Society. It is the most advanced yet, capable of multiplication, division, and extracting roots.
1679 Leibniz introduces binary arithmetic, in which only two symbols are used to represent all numbers. It will eventually pave the way for computers.
1805 French inventor Joseph-Marie Jacquard develops a loom that uses punched cards to control the weaving of cloth.
c. 1835 English inventor Charles Babbage devises his analytical engine, "a mechanical device designed to combine basic arithmetic operations." Never completed because the necessary production techniques were not yet available, it nevertheless embodies most of the basic elements of modern computers: program control, memory, arithmetic processing, and automatic printout.
1843 English mathematician Ada Byron, Countess Lovelace, writes a program for Charles Babbage's analytical engine—the first computer program.
1847 The English mathematician George Boole publishes The Mathematical Analysis of Logic, in which he shows that the rules of logic can be treated mathematically. Boole's work lays the foundation of computer logic.
1876 Scottish physicist William Thomson (Lord Kelvin) develops the first analog computer, the "Harmonic Analyser," which he uses to solve the differential equations needed to predict tides.
1907 U.S. physicist Lee De Forest invents the "audion tube," a triode vacuum tube with a third electrode, shaped like a grid, between the cathode and anode that controls the flow of electrons and permits the amplification of sound. It is an essential element in the development of computers.
1915 U.S. physicist Manson Benedicks discovers that a germanium crystal can convert alternating current to direct current. It leads to the development of the microchip.
1930 U.S. electrical engineer Vannevar Bush builds the differential analyzer, an analog computer used to solve differential equations. It is a forerunner of modern computers.
1936 British mathematician Alan Turing supplies the theoretical basis for digital computers by describing a machine, now known as the Turing machine, capable of universal rather than special-purpose problem solving.
1937 U.S. mathematician George Stibitz builds the first binary circuit that can add two binary numbers, based on Boolean algebra. Consisting of batteries, lights, and wires, it is instrumental in the development of subsequent electromechanical computers.
1937–39 U.S. mathematician and physicist John V. Atanasoff invents a digital computer for solving systems of linear equations. It uses punched cards for input and is the first calculating device to use electronic vacuum tubes.
1938 German inventor Konrad Zuse constructs the first binary calculator, using Boolean algebra; it is among the first working computers.
April 1939 U.S. physicists George Stibitz and Samuel B. Williams of Bell Laboratories build a computer consisting of over 400 relays connected to a teletype machine for input and output of data, thus introducing the idea of operating a computer via a terminal. Called the Complex Number Calculator, it is demonstrated on 8 January 1940.
1943 Colossus, the first electronic computer and code-breaker, is developed at Bletchley Park, England, to break German codes. Designed by Thomas Flowers, M. H. A. Newman, and English mathematician Alan Turing, it has 1,500 vacuum tubes and is the first all-electronic calculating device.
1944 U.S. mathematician Howard Aiken builds the Harvard University Mark I, or Automatic Sequence Controlled Calculator. The first program-controlled computer, it is 15 m/50 ft long and 2.4 m/8 ft high, and its operations are controlled by a sequence of instruction codes on punched paper tape that operate electromechanical switches. Simple multiplication takes 4 seconds and division 11 seconds.
1946 British scientist Maurice Wilkes writes the first assembly language—a mnemonic code using alphabetic symbols that translates instructions into computer machine languages.
1946 ENIAC (acronym for Electronic Numerical Integrator, Analyzer, and Calculator), the first general-purpose, fully electronic digital computer, is completed at the University of Pennsylvania for use in military research. It uses 18,000 vacuum tubes instead of mechanical relays, and can make 4,500 calculations a second. It is 24 m/80 ft long and is built by electrical engineers John Presper Eckert and John Mauchly, with input from John V. Atanasoff.
1947 Hungarian-born U.S. mathematician John Von Neumann introduces the idea of the stored-program computer, in which both instruction codes and data are held in the computer's memory.
1948 The magnetic drum for storage of computer data is introduced; data are recorded magnetically on the surface of a rapidly spinning drum.
1948 Manchester University, England, demonstrates a computer with a simple memory, which permits some software development. The stored-program electronic computer, Mark I, designed by Tom Kilburn, is the first to use Von Neumann architecture and stores data in a type of cathode-ray tube (the Williams tube).
Aug 1949 BINAC (acronym for Binary Automatic Computer) is built by U.S. scientists John W. Mauchly and John Presper Eckert. It is the first electronic stored-program computer to store data on magnetic tape.
1949 EDSAC (acronym for Electronic Delay Storage Automatic Calculator) is constructed at Cambridge University, England; one of the first stored-program computers, it uses 3,000 vacuum tubes and is nearly six times faster than other computers; data are stored in mercury delay lines.
1949 U.S. engineer John W. Mauchly develops Short Code, the first high-level programming language, which allows computers to recognize two-digit mathematical codes.
1950 Dr. Yoshiro Nakamatsu of the Imperial University, Tokyo, Japan, develops the floppy disk and licenses it to International Business Machines (IBM).
1950 EDVAC (Electronic Discrete Variable Automatic Computer) is constructed at the University of Pennsylvania in Philadelphia, Pennsylvania. Its instructions, or programs, are stored within the computer in numerical form.
June 1951 U.S. engineers John W. Mauchly and John Presper Eckert build UNIVAC 1 (Universal Automatic Computer), the first commercially available electronic digital computer, in Philadelphia, Pennsylvania. Built for the U.S. Bureau of the Census by the Remington Rand Corporation, it uses vacuum tubes, is the first to handle both numeric and alphabetic information easily, has a memory of 1.5 kilobytes, and stores data on magnetic tape.
1951 U.S. computer scientist Grace Hopper develops the first compiler. It translates programmers' codes into the binary machine codes used by computers.
1953 IBM introduces its first computer, the IBM 701, which competes with Remington Rand's UNIVAC. It has a memory of 4 kilobytes.
Feb 1955 IBM introduces the IBM 705 computer, the first commercially successful business computer to use magnetic core memory.
1955 U.S. firm IBM develops SABRE (Semi-Automatic Business Research Environment) for American Airlines passenger reservations. It consists of more than 1,000 teletypewriters connected to a central database, the first computer network.
1956 IBM introduces RAMAC (Random Access Method of Accounting and Control), the first hard-disk storage of data. Indexes are used to locate the data on the disk.
1956 Univac initiates the second generation of computers when it introduces the first commercially successful computer using transistors instead of vacuum tubes.
1956 U.S. computer programmer John Backus at IBM invents FORTRAN (formula translation), the first widely used high-level programming language. It is used primarily by scientists and mathematicians.
Sept 12, 1958 U.S. electrical engineer Jack Kilby demonstrates the first integrated circuit. It consists of transistors, resistors, and capacitors contained within a single slice of semiconductor material. It leads to the third generation of computers.
1958 The U.S. telecommunications company Bell Laboratories invents the first modem, which allows telephone lines to transmit binary data.
1959 U.S. computer programmer Grace Hopper leads the development of COBOL (Common Business Oriented Language), a computer language for business use.
1959 U.S. computer scientist John McCarthy develops LISP (List Processor), a computer language used in artificial intelligence applications.
1959 U.S. engineer Jean Hoerni of Fairchild Semiconductor Corporation designs the planar or "flat" transistor and U.S. engineer Robert Noyce discovers a way to join the circuits by printing, eliminating hundreds of hours in their production. Their work leads to the creation of the first microchip, which stimulates the computer industry with its sharply reduced size and cost and leads to the third generation of computers.
Nov 1960 U.S. computer scientist Kenneth Olsen, at Digital Equipment Corporation, introduces the PDP-1 computer. It has a memory of 4,096 18-bit words and is the first to use a monitor and keyboard. It is the forerunner of the minicomputer.
1962 IBM builds the 7030 computer for the Los Alamos Laboratories, New Mexico. It contains 169,100 transistors and is 30 times faster than IBM's 704 mainframe computer.
1962 Magnetic disks begin to replace magnetic tape as the main means of storing computer data.
1964 IBM introduces the IBM 2250 graphics display terminal, the first CAD (computer-aided design) system.
1965 U.S. computer scientists John Kemeny and Thomas Kurtz develop BASIC (Beginner's All-purpose Symbolic Instruction Code), a simplified computer-programming language used in schools, businesses, and microcomputers.
1965 U.S. Digital Equipment Corporation (DEC) introduces the PDP-8 (Programmed Data Processor) computer. The first minicomputer, it has 4 kilobytes of memory, is easy to use, and costs $18,000. It stimulates the growth of computers in business and education.
1966 The seven-bit American Standard Code for Information Interchange (ASCII) receives widespread acceptance as a means of transmitting the high volumes of data generated by business machines.
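ASCII assigns each character a numeric code that fits in seven bits (values 0–127). As an illustrative sketch (the sample text is arbitrary), modern Python exposes these code points directly:

```python
# ASCII maps each character to a seven-bit code in the range 0-127.
# Python's ord() and chr() expose these code points directly.
message = "IBM"
codes = [ord(c) for c in message]          # character -> numeric code
bits = [format(n, "07b") for n in codes]   # seven-bit binary form

print(codes)  # [73, 66, 77]
print(bits)   # ['1001001', '1000010', '1001101']

# Decoding reverses the mapping:
assert "".join(chr(n) for n in codes) == message
```

Seven bits allow 128 distinct codes, enough for the upper- and lowercase Latin alphabet, digits, punctuation, and the control characters used by teleprinters.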
1967 U.S. scientist Gene Amdahl proposes the use of parallel processors in computers to produce faster processing speeds.
1968 U.S. computer scientist Douglas Engelbart demonstrates the first computer mouse.
1968 U.S. firms Control Data, NCR, and Burroughs introduce the first commercial computers that use integrated circuits. The first of the "third generation" of computers, they are faster and have a greater capacity than previous machines.
1968 U.S. scientist Edward Feigenbaum and U.S. geneticist Joshua Lederberg develop DENDRAL, an expert system (which duplicates human decision-making processes) for identifying chemical substances in compounds based on mass-spectrographic results. Its success spurs the development of other expert systems, especially in medicine.
1969 The U.S. Department of Defense establishes ARPANET, a computer network that is the basis of the Internet.
1970 IBM develops the floppy disk for storing computer data.
1970 U.S. computer programmers Kenneth Thompson and Dennis Ritchie develop the UNIX computer operating system. It becomes the standard operating system for multiuser, multitasking computer systems.
1971 Dot matrix printers are first introduced.
1971 Swiss programmer Niklaus Wirth develops the computer language PASCAL. It is designed as a teaching tool for computer programming and allows errors to be discovered quickly.
c. 1971 A technique known as large-scale integration (LSI) is developed in the United States which makes it possible to pack thousands of transistors, diodes, and resistors on a silicon chip less than 5 mm/0.2 in square; it makes possible the development of microprocessors and microcomputers.
c. 1971 The programming language C is developed by Dennis Ritchie and Kenneth Thompson at Bell Laboratories; it is the preferred language of professional programmers and is widely used for writing software packages.
1972 SMALLTALK, one of the first object-oriented computer languages, is developed by U.S. scientist Alan Kay. It is especially adapted to graphics and uses icons.
1972 The computer language PROLOG (Programming in Logic) is developed by French computer scientist Alain Colmerauer; it has applications in artificial intelligence.
1972 U.S. company Telenet Communications Corporation establishes a public packet-switched computer network.
1972 U.S. computer scientist Nolan Bushnell invents "Pong," one of the first commercially successful video games.
April 1974 Intel introduces the 8-bit 8080 microprocessor; it has 5,000 transistors.
1974 The first word processors are introduced by the Xerox corporation.
1975 The first "personal computer," the Altair 8800, is marketed in the United States; it has no keyboard or screen but uses toggle switches to input data and flashing lights for output.
1975 IBM introduces the laser printer.
1976 Daisywheel printers are introduced; they can print between 30 and 55 characters per second.
1976 IBM computers are built with chips with 16 kilobits (16,384 bits) of memory.
1976 IBM develops the ink-jet printer.
1977 Apple Computer launches the Apple II personal computer; owners must use their own television screens and store data on audiocassette tapes. It is the first mass-produced personal computer sold in assembled form.
1977 The U.S. firms Commodore Business Machines and Tandy Corporation introduce computers with built-in monitors although data and programs are still stored on tape cassettes; other computers use separate television screens.
1978 Apple Computer introduces personal computers with disk drives.
1978 Intel introduces the 16-bit 8086 microprocessor, starting the x86 line of microprocessors. It has 29,000 transistors and runs at up to 10 MHz.
1978 The U.S. company DEC introduces the VAX (virtual address extension) computers; able to run very large programs, it becomes an industry standard for scientific and technical applications.
1978 U.S. computer programmer John Barnaby develops the word-processing program WordStar; it becomes the most popular word processor in the early 1980s.
1979 Motorola introduces the 8 MHz 68000 microprocessor; the first with 32-bit registers, it becomes the basis of the Macintosh computer.
1979 The Dutch company Philips and the Japanese company Sony work collaboratively to develop the compact disk (CD); tiny pits on the plastic are read by laser to reproduce sound or other information. CDs are first marketed in 1982.
1979 The first spreadsheet program for personal computers, VisiCalc, leads to the expansion in business use of PCs.
1980 The database software package dBase II is developed by U.S. computer scientist Wayne Ratliff; later versions of it become the principal filing system for personal computers.
Aug 12, 1981 IBM launches its personal computer, using the Microsoft disk-operating system (MS-DOS).
1981 The Japanese introduce computer chips with 64 kilobits of memory.
1981 U.S. firm 3M develops the erasable optical disk, enabling disks to be reused.
1982 U.S. company Intel introduces the 16-bit 80286 microprocessor; it has 130,000 transistors and runs at speeds up to 12 MHz.
1982 U.S. firms Columbia Data Products and Compaq produce the first "clones" of an IBM personal computer; they use the same operating system as the IBM personal computer.
Feb 1983 IBM introduces the PC-XT personal computer, the first to have a built-in hard disk drive, which stores 10 megabytes of information. It is supplied with DOS 2.0, which allows an unlimited number of files and subdirectories to be created.
March 29, 1983 The Tandy Corporation markets the first laptop computer in the United States. The TRS-80 Model 100 weighs less than 2 kg/4 lb and runs on 4 small batteries; prices range from $799 to $999.
1983 British computer company Inmos develops a computer capable of parallel processing: memory, logic, and control operations are carried out simultaneously, considerably increasing overall processing speed.
1983 Japan launches the "fifth generation" computer project, aimed at producing a machine capable of a billion computations per second.
1983 Apple introduces the Lisa, the first personal computer sold with a mouse and pull-down menus.
Dec 20, 1984 The development of a 1-megabit random-access memory (RAM) chip is announced by Bell Laboratories; it is capable of storing four times as much data as any chip currently available.
1984 Computers are used to generate the 25 minutes of space-battle scenes in the film The Last Starfighter; it is the first film to make extensive use of computers.
1984 Japanese firm NEC produces computer chips with 256 kilobits of memory; similar ones are manufactured in the United States the following year.
1984 The Dutch company Philips and Japanese firm Sony introduce the CD-ROM, a laser-read, read-only disk.
1984 Apple launches the Macintosh personal computer in the United States; it is the first successful graphics-based microcomputer using icons and a mouse.
c. 1984 Computer "viruses" such as "Friday 13th," "Trojan Horse," "Holland Girl," and "Christmas Tree" begin to appear.
1985 A chip that operates on fuzzy logic is developed at AT&T Bell Labs by Masaki Togai and Hiroyuki Watanabe.
1985 U.S. computer chip manufacturer Intel launches the 32-bit 20 MHz 80386 microprocessor; it has 275,000 transistors.
1985 U.S. firm Cray Research introduces the Cray 2, a supercomputer with four processors and a 2-billion-byte memory that can perform 1 billion floating-point operations per second.
1985 U.S. firm Microsoft develops Windows for the IBM PC.
1986 IBM introduces its first laptop computer, the PC Convertible, in the United States.
Nov 2, 1988 Serious damage is done to more than 6,000 computer systems worldwide when the "Internet worm," a self-replicating program developed by Cornell University graduate student Robert Morris, is released onto the Internet.
1988 At Fujitsu laboratories in Japan, T. Kotani and coworkers develop a microprocessor that incorporates a Josephson junction; it works hundreds of times faster than conventional computer chips.
1988 U.S. computer scientists John Gustafson, Gary Montry, and Robert Benner develop a method of parallel processing that speeds up the processing of complex problems by a factor of 1,000; 100 times was thought to be the limit of this method.
1988 U.S. firm Motorola introduces a RISC (Reduced Instruction Set Computing) microprocessor; by processing fewer instruction types it can operate much faster than other processors, handling up to 17 million instructions per second.
1988 U.S. researcher Dana Anderson invents a holographic computer, capable of generating three-dimensional images.
1989 U.S. computer microchip manufacturer Intel launches the 25 MHz 80486 microprocessor; it has 1.2 million transistors and includes a math coprocessor and cache memory.
1989 U.S. computing innovator Jaron Lanier makes the experience of virtual reality possible with his design of a headset and special gloves, which will allow a user to experience and manipulate a computer-generated world.
c. 1989 Developments in desktop publishing make high-quality print production more generally accessible.
Jan 29, 1990 U.S. scientist Alan Huang and his colleagues at Bell Laboratories demonstrate the first all-optical processor; calculations are performed optically using lasers, lenses, and fast light switches.
1991 British firm Virtuality launches its first commercial virtual reality products: games machines in arcades where players wear head-mounted displays.
1991 Japanese electronics companies Sega and Nintendo compete for the lucrative console games market. Sega's "Sonic the Hedgehog" is matched against Nintendo's "Super Mario Brothers."
1991 Several U.S. companies introduce local area networks (LANs), which use nondirectional microwaves to transmit data as fast as fiber optic cables.
1991 The British Science Museum constructs Charles Babbage's second difference engine, demonstrating that it would have worked had the materials then been available. It evaluates polynomials up to the seventh power, with 30-figure accuracy.
1991 U.S. computer manufacturer Apple introduces "System 7," an intuitive, easy-to-use interface, with icons, windows, and a mouse.
1991 U.S. firm Cray Research introduces the Cray Y-MP 90 computer, which is capable of 16 billion calculations a second.
Nov 1992 The U.S. national on-line information service Delphi becomes the first national U.S. service to open a gateway to the Internet.
1992 The Japanese firm Fujitsu announces the launch of the first computer capable of performing 300 billion calculations a second.
1993 Fujitsu Corporation announces the development of a 256-megabit memory chip.
1993 Mosaic, the first graphical browser that allows pictures from the Internet to be seen, is developed at the National Center for Supercomputing Applications at the University of Illinois, United States.
1993 Personal computers based on Intel's Pentium chip, a 32-bit processor with a 64-bit data bus, go on sale in the United States.
1994 The World Wide Web, a computer network that allows users to utilize graphical interfaces through Web "browsers," makes the Internet much more accessible to general users and permits a freedom of information distribution not previously possible.
Nov 1995 The Pentium Pro is launched by Intel. It is a microprocessor containing 5.5 million transistors, compared with the Pentium's 3.1 million, and can execute 166 million instructions per second.
1995 U.S. firm Sun Microsystems develops the computer-programming language "Java," which is used to construct World Wide Web sites.
Feb 10, 1996 IBM's Deep Blue computer beats Russian grand master Garry Kasparov in a game of chess, becoming the first computer to defeat a grand master. However, this is the first game of a six-game match, which Kasparov goes on to win 4–2.
June 3, 1997 U.S. computer scientists announce the construction of logic gates from DNA (deoxyribonucleic acid) which simulate the functions of OR and AND gates. Rather than responding to an electronic signal, the DNA gates respond to nucleotide sequences.
1997 An attempt in the United States to bring in legislation to control the Internet, intended to prevent access to sexual material, is rejected as unconstitutional.
April 14, 1998 U.S. vice president Al Gore announces plans for Internet2, a high-speed data-communications network which will serve the main U.S. research universities and bypass the congestion on the Internet. It should be operational by 1999.


  Aiken, Howard Hathaway (1900–1973) U.S. mathematician and computer pioneer. In 1939, in conjunction with engineers from IBM, he started work on the design of an automatic calculator using standard business-machine components. In 1944 the team completed one of the first computers, the Automatic Sequence Controlled Calculator (known as the Harvard Mark I), a programmable computer controlled by punched paper tape and using punched cards.  
  The Harvard Mark I was principally a mechanical device, although it had a few electronic features; it was 15 m/49 ft long and 2.5 m/8 ft high, and weighed more than 30 metric tons. Addition took 0.3 seconds, multiplication 4 seconds. It was able to manipulate numbers of up to 23 decimal places and to store 72 of them. The Mark II, completed in 1947, was a fully electronic machine, requiring only 0.2 seconds for addition and 0.7 seconds for multiplication. It could store 100 ten-digit figures and their signs.  
  Andreessen, Marc (1972– ) U.S. systems developer and coauthor of the first widely available graphical browser for the World Wide Web, Mosaic. He wrote Mosaic with fellow researcher Eric Bina while working at the National Center for Supercomputing Applications (NCSA), based at the University of Illinois. In 1994 both moved to the start-up company Netscape Communications Corporation to work on the next generation of browser software. This included Netscape Navigator, which was made freely available on the Internet and contributed to the explosive growth of the World Wide Web in the mid-1990s.  
  Babbage, Charles (1792–1871) English mathematician who devised a precursor of the computer. He designed an analytical engine, a general-purpose mechanical computing device for performing different calculations according to a program input on punched cards (an idea borrowed from the Jacquard loom). This device was never built, but it embodied many of the principles on which digital computers are based.  
  In 1822 he began early work on a mechanical calculator, or difference engine, which could compute squares to six places of decimals, and was commissioned by the British Admiralty to work on an expanded version of the engine. But this project was abandoned in favor of the analytical engine, which he worked on for the rest of his life. The difference engine could perform only one function once it was set up. The analytical engine was intended to perform many functions; it was to store numbers and be capable of working to a program.  
  Babbage, Charles
  Extended biography of the visionary mathematician, industrialist, and misanthrope. This very full and entertaining account of his life is supported by a number of pictures of Babbage, his plan of the difference engine, and the actual difference engine built by the British Science Museum in 1991. There is a full bibliography.  
  Barlow, John Perry (1948– ) U.S. writer and cofounder in 1991 of the Electronic Frontier Foundation, a non-profitmaking organization concerned with protecting civil liberties, in particular freedom of speech, on the Internet. His writings about cyberspace issues, such as "Crime and Puzzlement" (1991) and "A Declaration of the Independence of Cyberspace" (1996), have circulated widely and influentially on the Net.  
  Electronic Frontier Foundation
  U.S.-based non-profit organization that aims to protect free speech on the Internet. This site includes a lot of technical legal jargon and the full text of Supreme Court decisions relating to their campaigns. However, there is also a lot of news-style pieces on issues such as encryption, privacy, and free speech which are more accessible to the casual browser.  
  Berners-Lee, Tim(othy) (1955– ) English inventor of the World Wide Web in 1990. He developed the Web whilst working as a consultant at CERN. He currently serves as director of the W3 Consortium, a neutral body that manages the Web. In 1996 the British Computing Society (BCS) gave him a Distinguished Fellow award.  
  His parents, both mathematicians, worked on England's first commercial computer, the Ferranti Mark 1, in the 1950s.  
  Boole, George (1815–1864) English mathematician. His work The Mathematical Analysis of Logic (1847) established the basis of modern mathematical logic, and his Boolean algebra can be used in designing computers. His system is essentially two-valued. By subdividing objects into separate classes, each with a given property, his algebra makes it possible to treat different classes according to the presence or absence of the same property. Hence it involves just two numbers, 0 and 1—the binary system used in the computer.  
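Boole's two-valued system can be sketched in a few lines of modern code; the function names below are illustrative, not Boole's own notation. Class intersection becomes multiplication and class union an addition corrected for overlap, and both operations stay within the values 0 and 1:

```python
# A minimal sketch of Boole's two-valued algebra: every quantity is 0 or 1.
def AND(x, y):
    # Intersection of two classes: 1 only if both properties are present.
    return x * y

def OR(x, y):
    # Union of two classes: 1 if either property is present.
    return x + y - x * y

# The identity x*x = x holds only for the numbers 0 and 1,
# which is why the algebra is inherently binary.
for x in (0, 1):
    assert AND(x, x) == x
    assert OR(x, x) == x
```

These are exactly the operations realized electrically in the logic gates of a digital computer.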
  Boole, George
  Extensive biography of the mathematician. The site contains a clear description of his working relationship with his contemporaries, and also includes the title page of his famous book Investigation of the Laws of Thought. Several literature references for further reading on the mathematician are also listed, and the Web site also features a portrait of Boole.  
  Byron, (Augusta) Ada, Countess of Lovelace (1815–1852) English mathematician, a pioneer in writing programs for Charles Babbage's analytical engine. In 1983 a new, high-level computer language, Ada, was named for her.  
  Cerf, Vinton (1943– ) U.S. inventor of part of the TCP/IP protocols on which the Internet is based. Known throughout the industry as the "Father of the Internet," Cerf is president of the Internet Society and was a principal developer of the ARPANET.  
  Cerf is senior vice president of data architecture for MCI Telecommunications Corporation's Data Services Division in Reston, Virginia. He was awarded the U.S. National Medal of Technology in 1997.  
  Clark, Jim (James) U.S. founder of Silicon Graphics Inc. in 1982 and the Netscape Communications Corporation in 1994. As an associate professor at Stanford University, California, he and a team of graduate students developed the initial technology upon which Silicon Graphics's first products were built. He resigned as chair of Silicon Graphics early in 1994 to start up Netscape, of which he is chair.  
  Cray, Seymour
  Lengthy interview with Seymour Cray, carried out on behalf of the National Museum of American History, at the Smithsonian Institution.  
  Cray, Seymour Roger (1925–1996) U.S. computer scientist and pioneer in the field of supercomputing. In 1960 he designed one of the earliest computers to contain transistors. In 1972 he formed Cray Research to build supercomputers; its first, the Cray-1, was released in 1976. Its success led to the production of further supercomputers, including the Cray-2 in 1985, the Cray Y-MP, a multiprocessor design, in 1988, and the Cray-3 in 1989.  
  Eckert, John Presper
  Part of an archive containing the biographies of the world's greatest mathematicians, this site is devoted to the life and contributions of John Eckert.  
  Eckert, John Presper, Jr. (1919–1995) U.S. electronics engineer and mathematician who collaborated with John Mauchly on the development of the early ENIAC (1946) and UNIVAC 1 (1951) computers.  
  The ENIAC (Electronic Numerical Integrator, Analyzer, and Calculator) weighed many tons and, although it lacked a stored-program memory, could store a limited amount of information and perform mathematical functions. It was used for calculating ballistic firing tables and for meteorological and research problems. It was superseded by BINAC (Binary Automatic Computer), also designed in part by Eckert, and in the early 1950s Eckert's group began to produce computers for the commercial market with the construction of the UNIVAC 1 (Universal Automatic Computer). Its chief advance was the capacity to store programs.  
  The Eckert-Mauchly Computer Corporation, formed in 1947, was absorbed into Remington Rand in 1950 and subsequently came under the control of the Sperry Rand Corporation.  
  Gates, Bill (William) Henry, III (1955– ) U.S. businessman and computer programmer. He cofounded the Microsoft Corporation in 1975 and was responsible for supplying MS-DOS, the operating system and the Basic language that IBM used in the IBM PC. In 1997 Gates controlled a $39.8 billion stockholding in Microsoft, making him the world's richest individual.  
  When the IBM deal was struck in 1980, Microsoft did not actually have an operating system, but Gates bought one from another company, renamed it MS-DOS, and modified it to suit IBM's new computer. Microsoft also retained the right to sell MS-DOS to other computer manufacturers, and because the IBM PC was not only successful but easily copied by other manufacturers, MS-DOS found its way onto the vast majority of PCs. The revenue from MS-DOS helped Microsoft to expand into other areas of software, guided by Gates.  
  In 1994 he invested $10 million into a biotechnology company, Darwin Molecular, with Microsoft cofounder Paul Allen.  
  In 1997 Gates and his wife, Melinda French Gates, formed the non-profitmaking Gates Library Foundation whose aim is to provide computers and software to public libraries. This initiative expanded Microsoft's program Libraries Online supporting libraries in the United States and Canada.  
  Herzog, Bertram (1929– ) German-born computer scientist, one of the pioneers in the use of computer graphics in engineering design. He has alternated academic posts with working in industry. In 1963 he joined the Ford Motor Company as engineering methods manager, where he extensively applied computers to tasks involved in planning and design. In 1965 he became professor of industrial engineering at the University of Michigan. Two years later he became professor of electrical engineering and computer science at the University of Colorado.  
  Hollerith, Herman (1860–1929) U.S. inventor of a mechanical tabulating machine, the first device for high-volume data processing. Hollerith's tabulator was widely publicized after being successfully used in the 1890 census. The firm he established, the Tabulating Machine Company, was later one of the founding companies of IBM.  
  While working on the 1880 U.S. census, he saw the need for an automated recording process for data, and had the idea of punching holes in cards or rolls of paper. By 1889 he had developed machines for recording, counting, and collating census data. The system was used in 1891 for censuses in several countries, and was soon adapted to the needs of government departments and businesses that handled large quantities of data.  
  Hollerith, Herman
  Part of an archive containing the biographies of the world's greatest mathematicians, this site is devoted to the life and contributions of inventor Herman Hollerith.  
  Hopper, Grace (1906–1992) U.S. computer pioneer who created the first compiler and helped invent the computer language COBOL. She also coined the term "debug."  
  In 1945 she was ordered to Harvard University to assist Howard Aiken in building a computer. One day a breakdown of the machine was found to be due to a moth that had flown into the computer. Aiken came into the laboratory as Hopper was dealing with the insect. "Why aren't you making numbers, Hopper?" he asked. Hopper replied: "I am debugging the machine!"  
  Hopper's main contribution was to create one of the first high-level computer languages, together with the compiler needed to translate the instructions into a form that the computer could work with. In 1959 she was invited to join a Pentagon team attempting to create and standardize a single computer language for commercial use. This led to the development of COBOL, still one of the most widely used languages.  
  Jacquard, Joseph Marie (1752–1834) French textile manufacturer. He invented a punched-card system for programming designs on a carpetmaking loom. In 1801 he constructed looms that used a series of punched cards to control the pattern of longitudinal warp threads depressed before each sideways passage of the shuttle. On later machines the punched cards were joined to form an endless loop that represented the "program" for the repeating pattern of a carpet. Jacquard-style punched cards were used in the early computers of the 1940s–60s.  
  Jobs, Steven Paul (1955– ) U.S. computer entrepreneur. He cofounded Apple Computer Inc with Steve Wozniak in 1976, and founded NeXT Technology Inc in 1985. In 1986 he bought Pixar Animation Studios, the computer animation studio spin-off from George Lucas's LucasFilm.  
  Jobs has been involved with the creation of three different types of computer: the Apple II personal computer in 1977, the Apple Macintosh in 1984—marketed as "the computer for the rest of us"—and the NeXT workstation in 1988.  
  The NeXT was technically the most sophisticated and powerful design, but it was a commercial disaster, and in 1993 NeXT abandoned hardware manufacturing to concentrate on its highly regarded UNIX-based object-oriented operating system, NextStep. Apple Computer bought NeXT at the end of 1996 to obtain NextStep, and Jobs returned to Apple in an advisory capacity. In 1997 he took over as acting chief executive officer of the struggling firm.  
  Kahle, Brewster (1960– ) U.S. computing entrepreneur who is best known for inventing WAIS, a software system for publishing material on the Internet. Early in his career, Kahle founded Thinking Machines Corporation, a company that designed supercomputers. He sold his second company, WAIS Inc, to America Online in 1995. In 1996 he continued to coordinate WAIS, and set up an Internet archive which aims to keep a copy of every item on the Net.  
  Kapor, Mitchell (1951– ) U.S. entrepreneur and software designer who founded Lotus Development Corporation, a leading business software company, in 1982. In 1991 he cofounded the Electronic Frontier Foundation, a non-profitmaking organization concerned with protecting civil liberties, in particular freedom of speech on the Internet. Kapor is also a professor of media arts and sciences at the Massachusetts Institute of Technology.  
  Kay, Alan U.S. computing expert and a key figure in the development of graphical user interfaces (later popularized by the Apple Macintosh) and object-oriented languages (Smalltalk) while working at Xerox's Palo Alto Research Center (Parc) throughout the 1970s. Kay also came up with the inspirational idea of the DynaBook, a sort of computer-based personal digital assistant. Kay spent 1984–96 as an Apple Fellow, working mainly on future-oriented projects with children. In 1996 he joined Walt Disney Imagineering as a Disney Fellow.  
  Mauchly, John William (1907–1980) U.S. physicist and engineer who, in 1946, constructed the first general-purpose computer, the ENIAC (Electronic Numerical Integrator, Analyzer, and Calculator), in collaboration with John Eckert. Their company was bought by Remington Rand (later Sperry Rand) in 1950, and they built the UNIVAC 1 computer (Universal Automated Computer) in 1951 for the U.S. census. Mauchly was a consultant to Remington Rand 1950–59 and again from 1973, after setting up his own consulting company in 1959.  
  The work on ENIAC was carried out by Mauchly and Eckert during World War II, and was commissioned to automate the calculation of artillery firing tables for the U.S. Army. In 1949 the two partners designed a small-scale binary computer, BINAC, which was faster and cheaper to use. Punched cards were replaced with magnetic tape, and the computer stored programs internally.  
  Mitnick, Kevin (1963– ) U.S. computer criminal, known as "the world's most wanted hacker" during the three years he spent on the run before being caught in 1994. He was a compulsive hacker who specialized in penetrating communications systems including MCI, Pacific Bell, the Manhattan telephone system, and a Pentagon defence computer.  
  Moore, Gordon (1928– ) U.S. cofounder, with the late Robert Noyce, of the microchip manufacturer Intel in 1968. In 1965, when writing an article for the 35th anniversary edition of Electronics magazine, Moore formulated what has since been named Moore's Law: the number of components that could be squeezed onto a silicon chip would double every year. Moore updated this prediction in 1975 from doubling every year to doubling every two years. These observations proved remarkably accurate—the processing technology of 1996, for example, was some 8 million times more powerful than that of 1966—partly because chip manufacturers tried to keep up with Moore's Law so as to avoid falling behind their rivals.  
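The arithmetic behind these figures is easy to check. The sketch below (illustrative, not from the original entry) compares the growth multiples implied by the two forms of the prediction over the 30 years from 1966 to 1996; the quoted 8-million figure (about 2^23) falls between the two, corresponding to a doubling period of roughly 15 to 16 months.

```python
# Growth factor implied by Moore's Law over a 30-year span (1966-1996),
# under the original (doubling yearly) and revised (doubling every two
# years) forms of the prediction.
def moore_factor(years, doubling_period):
    """Growth multiple after `years`, doubling once per `doubling_period`."""
    return 2 ** (years / doubling_period)

span = 1996 - 1966  # 30 years

yearly = moore_factor(span, 1)    # original 1965 prediction
biennial = moore_factor(span, 2)  # revised 1975 prediction

print(f"doubling every year:      x{yearly:,.0f}")
print(f"doubling every two years: x{biennial:,.0f}")
```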
  Negroponte, Nicholas U.S. founder and the director of the MIT Media Lab and columnist for Wired magazine. In 1996 he published Being Digital, in which he made an analogy between bits of data ("the DNA of information") and atoms of matter. Negroponte also predicted that mass media such as newspapers and television will give way to consumer-led electronic media in which people will take only the information they need.  
  Nelson, Ted (Theodore) (1937– ) U.S. computer scientist who coined the term hypertext in 1965 to propose a type of literature that used links embedded in text to connect readers to sources of further information. He went on to develop a global electronic publishing project called Xanadu, and was appointed professor of environmental information at Keio University, Japan, in 1996.  
  Noyce, Robert Norton (1927–1990) U.S. scientist and inventor, with Jack Kilby, of the integrated circuit (chip), which revolutionized the computer and electronics industries in the 1970s and 1980s. In 1968 he cofounded the Intel Corporation with Gordon Moore, and it became one of the leading U.S. semiconductor manufacturers.  
  Noyce was awarded a patent for the integrated circuit in 1959. In 1957 he had cofounded his first company, Fairchild Semiconductor Corporation, around which Silicon Valley was to grow. The company was the first in the world to understand and exploit the commercial potential of the integrated circuit. It quickly became the basis for such products as the personal computer, the pocket calculator, and the programmable microwave oven. At the time of his death, he was president of Sematech Incorporated, a government-industry research consortium created to help U.S. firms regain a lead in semiconductor technology that they had lost to Japanese manufacturers.  
  Sinclair, Clive Marles (1940– ) British electronics engineer. He produced the first widely available pocket calculator, pocket and wristwatch televisions, a series of home computers, and the innovative but commercially disastrous C5 personal transport (a low cyclelike three-wheeled vehicle powered by a washing-machine motor).  
  Turing, Alan Mathison (1912–1954) English mathematician and logician. In 1936 he described a "universal computing machine" that could theoretically be programmed to solve any problem capable of solution by a specially designed machine. This concept, now called the Turing machine, foreshadowed the digital computer.  
  Turing is believed to have been the first to suggest (in 1950) the possibility of machine learning and artificial intelligence. His test for distinguishing between real (human) and simulated (computer) thought is known as the Turing test: with a person in one room and the machine in another, an interrogator in a third room asks questions of both to try to identify them. When the interrogator cannot distinguish between them by questioning, the machine will have reached a state of humanlike intelligence.  
  During World War II Turing worked on the Ultra project in the team that cracked the German Enigma cipher code. After the war he worked briefly on the project to design the general computer known as the Automatic Computing Engine, or ACE, and was involved in the pioneering computer developed at Manchester University from 1948.  
  Alan Turing Home Page
  Authoritative illustrated biography of the computer pioneer, plus links to related sites. This site contains information on his origins and his code-breaking work during World War II, as well as several works written by Turing himself.  
  Von Neumann, John (originally Johann) (1903–1957) Hungarian-born U.S. scientist and mathematician, a pioneer of computer design. He invented his "rings of operators" (called Von Neumann algebras) in the late 1930s, and also contributed to set theory, game theory, quantum mechanics, cybernetics (with his theory of self-reproducing automata, called Von Neumann machines), and the development of the atomic and hydrogen bombs.  
  He designed and supervised the construction of the first computer able to use a flexible stored program (named MANIAC-1) at the Institute for Advanced Study at Princeton 1940–52. This work laid the foundations for the design of all subsequent programmable computers.  
  Von Neumann, John
  Biographical feature on this pioneer of computer design, plus quotations, and a bibliography. It is a text-based site, but contains plenty of information about Von Neumann.  
  Wang, An (1920–1990) Chinese-born U.S. engineer, founder of Wang Laboratories in 1951, one of the world's largest computer companies in the 1970s. In 1948 he invented the computer memory core, the most common device used for storing computer data before the invention of the integrated circuit (chip).  
  Wang emigrated to the United States in 1945. He developed his own company with the $500,000 he received from IBM for the sale of his patent. His company took off in 1964 with the introduction of a desktop calculator. Later, Wang switched with great success to the newly emerging market for word-processing systems based on cheap silicon chips, turning Wang Laboratories into a multibillion-dollar company. However, with the advent of the personal computer, the company fell behind and had to seek protection from its creditors. It staged a comeback, doubling in size during 1994–97 to achieve annual revenues of $1.3 billion.  
  Wiener, Norbert (1894–1964) U.S. mathematician, credited with the establishment of the science of cybernetics in his book Cybernetics (1948). In mathematics, he laid the foundation of the study of stochastic processes (those dependent on random events), particularly Brownian motion. He devoted much of his efforts to methodology, developing mathematical approaches that could usefully be applied to continuously changing processes.  
  During World War II, Wiener worked on the control of anti-aircraft guns (which required him to consider factors such as the machinery itself, the gunner, and the unpredictable evasive action on the part of the target's pilot), on filtering "noise" from useful information for radar, and on coding and decoding. His investigations stimulated his interest in information transfer and processes such as information feedback.  
  Wilkes, Maurice Vincent (1913– ) English mathematician who led the team at Cambridge University that built the EDSAC (Electronic Delay Storage Automatic Calculator) in 1949, one of the earliest British electronic computers. He chose the serial mode, in which the information in the computer is processed in sequence (and not several parts at once, as in the parallel type). This design incorporated mercury delay lines (developed at the Massachusetts Institute of Technology, United States) as the elements of the memory.  
  In May 1949 the EDSAC ran its first program and became the first delay-line computer in the world. From early 1950 it offered a regular computing facility to the members of Cambridge University, the first general-purpose computer service. Much time was spent by the research group on programming and on the compilation of a library of programs. The EDSAC was in operation until 1958.  
  EDSAC II came into service in 1957. This was a parallel-processing machine and the delay line was abandoned in favour of magnetic storage methods.  
  AI
abbreviation for artificial intelligence.
  algorithm
the logical sequence of operations to be performed by a program. A flow chart is a visual representation of an algorithm.
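As a concrete illustration (not part of the original entry), Euclid's method for finding the greatest common divisor of two numbers is one of the oldest algorithms, and shows the "logical sequence of operations" idea directly:

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    until the remainder is zero; the last nonzero value is the GCD."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 36))  # 12
```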
  analog
(of a quantity or device) changing continuously; by contrast a digital quantity or device varies in a series of distinct steps. For example, an analog clock measures time by means of a continuous movement of hands around a dial, whereas a digital clock measures time with a numerical display that changes in a series of discrete steps.
  applet
small software application. Examples of applets include Microsoft WordPad, the simple word processor in Windows 95, and the single-purpose applications that in 1996 were beginning to appear on the World Wide Web, written in Java. These include small animations such as a moving ticker tape of stock prices.
  applications program
a program or job designed for the benefit of the end user. Examples of general purpose application programs include word processors, desktop publishing programs, databases, spreadsheet packages, and graphics programs. Application-specific programs include payroll and stock-control systems. Applications may also be custom designed to solve a specific problem not catered for in other types of application.
  The term is used to distinguish such programs from those that control the computer (systems programs) or assist the programmer, such as a compiler.  
  baud
a unit of electrical signaling speed equal to one pulse per second, measuring the rate at which signals are sent between electronic devices such as telegraphs and computers; 300 baud is about 300 words a minute.  
  binary number system
a system of numbers to base two, using combinations of the digits 1 and 0. Codes based on binary numbers are used to represent instructions and data in all modern digital computers, the values of the binary digits (contracted to "bits") being stored or transmitted as, for example, open/closed switches, magnetized/unmagnetized disks and tapes, and high/low voltages in circuits.
  bit
(contraction of binary digit) a single binary digit, either 0 or 1. A bit is the smallest unit of data stored in a computer; all other data must be coded into a pattern of individual bits. A byte represents sufficient computer memory to store a single character of data, and usually contains eight bits. For example, in the ASCII code system used by most microcomputers the capital letter A would be stored in a single byte of memory as the bit pattern 01000001.
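The bit patterns quoted in this and the following entries can be reproduced directly; this short sketch (illustrative, not from the original text) prints the 8-bit ASCII pattern for any character:

```python
# Each character's ASCII code, written out as a zero-padded 8-bit pattern.
def bit_pattern(ch):
    return format(ord(ch), "08b")

print(bit_pattern("A"))  # 01000001
print(bit_pattern("F"))  # 01000110
```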
  boot, or bootstrap,
the process of starting up a computer. Most computers have a small, built-in boot program that starts automatically when the computer is switched on—its only task is to load a slightly larger program, usually from a hard disk, which in turn loads the main operating system.
  browser
any program that allows the user to search for and view data. Browsers are usually limited to a particular type of data, so, for example, a graphics browser will display graphics files stored in many different file formats. Browsers usually do not permit the user to edit data, but are sometimes able to convert data from one file format to another.
  bubble memory
a memory device based on the creation of small "bubbles" on a magnetic surface. Bubble memories typically store up to 4 megabits (4 million bits) of information. They are not sensitive to shock and vibration, unlike other memory devices such as disk drives, yet, like magnetic disks, they are nonvolatile and do not lose their information when the computer is switched off.
  bug
an error in a program. It can be an error in the logical structure of a program or a syntax error, such as a spelling mistake. Some bugs cause a program to fail immediately; others remain dormant, causing problems only when a particular combination of events occurs. The process of finding and removing errors from a program is called debugging.
  byte
sufficient computer memory to store a single character of data. The character is stored in the byte of memory as a pattern of bits (binary digits), using a code such as ASCII. A byte usually contains eight bits—for example, the capital letter F can be stored as the bit pattern 01000110.
  cache memory
a reserved area of the immediate access memory used to increase the running speed of a computer program.
  CAD
acronym for computer-aided design.
  CAL
(acronym for computer-assisted learning) the use of computers in education and training: the computer displays instructional material to a student and asks questions about the information given; the student's answers determine the sequence of the lessons.
  CAM
acronym for computer-aided manufacturing.
  CD-ROM
(abbreviation for compact-disk read-only memory) a computer storage device developed from the technology of the audio compact disk.
  chat
real-time exchange of messages between users of a particular system. Chat allows people who are geographically far apart to type messages to each other which are sent and received instantly. The biggest chat system is Internet Relay Chat (IRC), which is used for the exchange of information and software as well as for social interaction.
  chip, or silicon chip,
another name for an integrated circuit, a complete electronic circuit on a slice of silicon (or other semiconductor) crystal only a few millimetres square.
  client—server architecture
a system in which the mechanics of looking after data are separated from the programs that use the data. For example, the "server" might be a central database, typically located on a large computer that is reserved for this purpose. The "client" would be an ordinary program that requests data from the server as needed.
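A minimal sketch of the idea, with an in-process object standing in for the central database; all names and data here are invented for illustration, and a real system would put the server on a separate machine behind a network protocol:

```python
class DatabaseServer:
    """The 'server': owns the data and the mechanics of looking after it."""
    def __init__(self):
        self._records = {"alice": 1200, "bob": 850}  # illustrative data

    def query(self, key):
        return self._records.get(key)

    def update(self, key, value):
        self._records[key] = value

# The 'client' never touches the data store directly - it only sends
# requests to the server and works with the replies.
server = DatabaseServer()
balance = server.query("alice")
server.update("alice", balance + 100)
print(server.query("alice"))  # 1300
```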
  clip art
small graphics used to liven up documents and presentations. Many software packages such as word processors and presentation graphics packages come with a selection of clip art.
  clone
copy of hardware or software that may not be identical to the original design but provides the same functions. All personal computers (PCs) are to some extent clones of the original IBM PC and PC AT launched by IBM in 1981 and 1984, respectively—including IBM's current machines. Cloning a disk drive or workstation, however, means making an exact copy of all the files or software so that the new drive or machine functions identically to the original one.
  command language
a set of commands and the rules governing their use, by which users control a program. For example, an operating system may have commands such as SAVE and DELETE, or a payroll program may have commands for adding and amending staff records.
  computer art
art produced with the help of a computer. Since the 1950s the aesthetic use of computers has been increasingly evident in most artistic disciplines, including film animation, architecture, and music. Computer graphics has been the most developed area, with the "paint-box" computer liberating artists from the confines of the canvas. It is now also possible to program computers in advance to generate graphics, music, and sculpture, according to "instructions" which may include a preprogrammed element of unpredictability.
  computer-assisted learning
use of computers in education and training; see CAL.
  computer crime
broad term applying to any type of crime committed via a computer, including unauthorized access to files. Most computer crime is committed by disgruntled former employees or subcontractors. Examples include the releasing of viruses, hacking, and computer fraud. Many countries, including the United States and the U.K., have specialized law enforcement units to supply the technical knowledge needed to investigate computer crime.  
  computer generation
any of the five broad groups into which computers may be classified: first generation, the earliest computers, developed in the 1940s and 1950s and made from valves and wire circuits; second generation, from the early 1960s, based on transistors and printed circuits; third generation, from the late 1960s, using integrated circuits and often sold as families of computers, such as the IBM 360 series; fourth generation, using microprocessors, large-scale integration (LSI), and sophisticated programming languages, still in use in the 1990s; and fifth generation, based on parallel processing and very large-scale integration, currently under development.
  control character
any character produced by depressing the control key (Ctrl) on a keyboard at the same time as another (usually alphabetical) key. The control characters form the first 32 ASCII characters and most have specific meanings according to the operating system used. They are also used in combination to provide formatting control in many word processors, although the user may not enter them explicitly.
  corruption of data
introduction or presence of errors in data. Most computers use a range of verification and validation routines to prevent corrupt data from entering the computer system, or to detect corrupt data that are already present.
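One simple verification technique is to store a checksum alongside the data and recompute it before use. The sketch below uses a crude additive checksum for illustration; real systems use stronger codes such as CRCs:

```python
def checksum(data: bytes) -> int:
    """Sum of all byte values, modulo 256 - a crude integrity check."""
    return sum(data) % 256

original = b"hello world"
check = checksum(original)       # stored alongside the data

# Later, verify the data before using it:
assert checksum(original) == check   # passes: data is intact

corrupted = b"hellp world"           # one byte altered in transit
print(checksum(corrupted) == check)  # False - corruption detected
```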
  CPU
abbreviation for central processing unit.
  data
(singular datum) facts, figures, and symbols, especially as stored in computers. The term is often used to mean raw, unprocessed facts, as distinct from information, to which a meaning or interpretation has been applied.
  data compression
techniques for reducing the amount of storage needed for a given amount of data. They include word tokenization (in which frequently used words are stored as shorter codes), variable bit lengths (in which common characters are represented by fewer bits than less common ones), and run-length encoding (in which a repeated value is stored once along with a count).
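Run-length encoding, the simplest of the three techniques, can be sketched in a few lines (illustrative):

```python
from itertools import groupby

def rle_encode(text):
    """Store each run of repeated characters as a (character, count) pair."""
    return [(ch, len(list(run))) for ch, run in groupby(text)]

def rle_decode(pairs):
    """Reverse the encoding by repeating each character `count` times."""
    return "".join(ch * count for ch, count in pairs)

encoded = rle_encode("AAAABBBCCD")
print(encoded)                              # [('A', 4), ('B', 3), ('C', 2), ('D', 1)]
print(rle_decode(encoded) == "AAAABBBCCD")  # True
```

Run-length encoding pays off only when long runs are common, as in simple bitmap images; on ordinary text it can even expand the data.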
  digit
any of the numbers from 0 to 9 in the decimal system. Different bases have different ranges of digits. For example, the hexadecimal system has digits 0 to 9 and A to F, whereas the binary system has two digits (or bits), 0 and 1.
  digital recording
technique whereby the pressure of sound waves is sampled more than 30,000 times a second and the values converted by computer into precise numerical values. These are recorded and, during playback, are reconverted to sound waves.
  directory
a list of file names, together with information that enables a computer to retrieve those files from backing storage. The computer operating system will usually store and update a directory on the backing storage to which it refers. So, for example, on each disk used by a computer a directory file will be created listing the disk's contents.
  The term is also used to refer to the area on a disk where files are stored; the main area, the root directory, is at the topmost level, and may contain several separate subdirectories.  
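A short sketch (with invented file names) showing a root directory, a subdirectory, and the contents each level records:

```python
import os
import tempfile

# Build a tiny directory tree, then walk it the way an operating system
# records directories and subdirectories.
root = tempfile.mkdtemp()                # the root directory
os.makedirs(os.path.join(root, "docs"))  # one subdirectory
open(os.path.join(root, "readme.txt"), "w").close()
open(os.path.join(root, "docs", "notes.txt"), "w").close()

# Each directory lists the subdirectories and files it holds.
for dirpath, dirnames, filenames in os.walk(root):
    print(os.path.relpath(dirpath, root), sorted(dirnames), sorted(filenames))
```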
  document reader
an input device that reads marks or characters, usually on preprepared forms and documents. Such devices are used to capture data by optical mark recognition (OMR), optical character recognition (OCR), and mark sensing.
  DOS
(acronym for disk operating system) a computer operating system specifically designed for use with disk storage; also used as an alternative name for a particular operating system, MS-DOS.
  DTP
abbreviation for desktop publishing.
  Dynamic HTML
the fourth version of hypertext mark-up language (HTML), the language used to create Web pages. It is called Dynamic HTML because it enables dynamic effects to be incorporated in pages without the delays involved in downloading Java applets and without referring back to the server.
  EBCDIC
(abbreviation for extended binary-coded decimal interchange code) a code used for storing and communicating alphabetic and numeric characters. It is an 8-bit code, capable of holding 256 different characters, although only 85 of these are defined in the standard version. It is still used in many mainframe computers, but almost all mini- and microcomputers now use ASCII code.
  EEPROM
(abbreviation for electrically erasable programmable read-only memory) computer memory that can record data and retain it indefinitely. The data can be erased with an electrical charge and new data recorded. Some EEPROM must be removed from the computer and erased and reprogrammed using a special device. Other EEPROM, called flash memory, can be erased and reprogrammed without removal from the computer.
  e-mail
abbreviation for electronic mail.
  EPROM
(abbreviation for erasable programmable read-only memory) computer memory device in the form of an integrated circuit (chip) that can record data and retain it indefinitely. The data can be erased by exposure to ultraviolet light, and new data recorded. Other kinds of computer memory chips are ROM (read-only memory), PROM (programmable read-only memory), and RAM (random-access memory).
  expansion board, or expansion card,
printed circuit board that can be inserted into a computer in order to enhance its capabilities (for example, to increase its memory) or to add facilities (such as graphics).
  expert system
computer program for giving advice (such as diagnosing an illness or interpreting the law) that incorporates knowledge derived from human expertise. It is a kind of knowledge-based system containing rules that can be applied to find the solution to a problem. It is a form of artificial intelligence.
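A toy sketch of the rule-based idea; the rules and conclusions below are invented for illustration, and a real expert system would have hundreds of rules plus an inference engine that chains them together:

```python
# A toy rule base: each rule maps a set of observed symptoms to advice.
RULES = [
    ({"fever", "cough"}, "possible flu - rest and fluids"),
    ({"sneezing", "itchy eyes"}, "possible allergy - avoid the trigger"),
]

def advise(symptoms):
    """Fire the first rule whose conditions are all present."""
    for conditions, conclusion in RULES:
        if conditions <= symptoms:  # subset test: all conditions observed?
            return conclusion
    return "no rule matched - refer to a human expert"

print(advise({"fever", "cough", "headache"}))
```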
  fax
common name for facsimile transmission or telefax, the transmission of images over a telecommunications link, usually the telephone network. When placed on a fax machine, the original image is scanned by a transmitting device and converted into coded signals, which travel via the telephone lines to the receiving fax machine, where an image is created that is a copy of the original. Photographs as well as printed text and drawings can be sent. The standard transmission takes place at 4,800 or 9,600 bits of information per second.
  feedback
general principle whereby the results produced in an ongoing reaction become factors in modifying or changing the reaction; it is the principle used in self-regulating control systems, from a simple thermostat and steam-engine governor to automatic computer-controlled machine tools. A fully computerized control system, in which there is no operator intervention, is called a closed-loop feedback system. A system that also responds to control signals from an operator is called an open-loop feedback system.
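A closed-loop system such as a thermostat can be sketched as a small simulation; the temperatures and step sizes here are invented for illustration:

```python
def thermostat_step(temp, target):
    """One cycle of a closed-loop controller: the measured result
    (the temperature) feeds back to modify the action (heater switching)."""
    heater_on = temp < target            # feedback: compare output to goal
    temp += 0.5 if heater_on else -0.3   # heat when on, drift down when off
    return temp

temp = 15.0  # starting temperature
for _ in range(20):
    temp = thermostat_step(temp, 20.0)
print(round(temp, 1))  # oscillates close to the 20.0 target
```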
  font, or fount,
complete set of printed or display characters of the same typeface, size, and style (bold, italic, underlined, and so on). Fonts used in computer setting are of two main types: bit-mapped and outline. Bit-mapped fonts are stored in the computer memory as the exact arrangement of pixels or printed dots required to produce the characters in a particular size on a screen or printer. Outline fonts are stored in the computer memory as a set of instructions for drawing the circles, straight lines, and curves that make up the outline of each character. Font sizes are measured in points, a point being approximately 0.35 mm (1/72 inch).
  gigabyte
in computing, a measure of memory capacity, equal to 1,024 megabytes. It is also used, less precisely, to mean one billion bytes.
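The gap between the two senses of the unit is visible in plain arithmetic (illustrative):

```python
KB = 1024
MB = 1024 * KB
GB = 1024 * MB          # binary sense of a gigabyte
GB_DECIMAL = 10 ** 9    # looser decimal sense: one billion bytes

print(GB)               # 1073741824
print(GB - GB_DECIMAL)  # 73741824 bytes - roughly a 7% difference
```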
  hacking
unauthorized access to a computer, either for fun or for malicious or fraudulent purposes. Hackers generally use microcomputers and telephone lines to obtain access. In computing, the term is used in a wider sense to mean using software for enjoyment or self-education, not necessarily involving unauthorized access. The most destructive form of hacking is the introduction of a computer virus.
  hardware
the mechanical, electrical, and electronic components of a computer system, as opposed to the various programs, which constitute software.
  hertz
SI unit (symbol Hz) of frequency (the number of repetitions of a regular occurrence in one second). Radio waves are often measured in megahertz (MHz), millions of hertz, and the clock rate of a computer is usually measured in megahertz. The unit is named for Heinrich Hertz.
  hexadecimal number system, or hex,
a number system to the base 16, used in computing. In hex the decimal numbers 0–15 are represented by the characters 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, A, B, C, D, E, F.
  Hexadecimal numbers are easy to convert to the computer's internal binary code and are more compact than binary numbers.  
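The conversion is a digit-for-digit substitution, since each hex digit corresponds to exactly four binary digits; for example (illustrative):

```python
# Each hexadecimal digit maps to exactly four bits, so conversion is a
# digit-by-digit substitution rather than repeated division.
value = 0xF3                 # hexadecimal F3
print(format(value, "d"))    # 243 in decimal
print(format(value, "08b"))  # 11110011 - F -> 1111, 3 -> 0011

# Round trip: binary back to hexadecimal
print(format(0b11110011, "X"))  # F3
```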
  holography
method of producing three-dimensional (3-D) images, called holograms, by means of laser light. Holography uses a photographic technique (involving the splitting of a laser beam into two beams) to produce a picture, or hologram, that contains 3-D information about the object photographed. Some holograms show meaningless patterns in ordinary light and produce a 3-D image only when laser light is projected through them, but reflection holograms produce images when ordinary light is reflected from them (as found on credit cards).
  HTML
(Hypertext Markup Language) the standard for structuring and describing a document on the World Wide Web. The HTML standard provides labels for constituent parts of a document (for example headings and paragraphs) and permits the inclusion of images, sounds, and "hyperlinks" to other documents. A browser program is then used to convert this information into a graphical document on screen. The specifications for HTML version 4, called Dynamic HTML, were adopted at the end of 1997.
  Lightning HTML Editor
  HTML editor that teaches you about HTML as you use it. This is a well organized guide to the intricacies of designing Web pages. There is a good list of frequently asked questions and tips and tricks.  
  hypertext
system for viewing information (both text and pictures) on a computer screen in such a way that related items of information can easily be reached. For example, the program might display a map of a country; if the user clicks (with a mouse) on a particular city, the program will display information about that city.
  icon
a small picture on the computer screen, or VDT, representing an object or function that the user may manipulate or otherwise use. It is a feature of graphical user interface (GUI) systems. Icons make computers easier to use by allowing the user to point to and click with a mouse on pictures, rather than type commands.
  information technology
collective term for the various technologies involved in processing and transmitting information. They include computing, telecommunications, and microelectronics.
  integrated circuit (IC),
popularly called silicon chip, a miniaturized electronic circuit produced on a single crystal, or chip, of a semiconducting material—usually silicon. It may contain many millions of components and yet measure only 5 mm/0.2 in square and 1 mm/0.04 in thick. The IC is encapsulated within a plastic or ceramic case, and linked via gold wires to metal pins with which it is connected to a printed circuit board and the other components that make up such electronic devices as computers and calculators.
  interactive video (IV)
computer-mediated system that enables the user to interact with and control information (including text, recorded speech, or moving images) stored on video disk. IV is most commonly used for training purposes, using analog video disks, but has wider applications with digital video systems such as CD-I (Compact Disk Interactive, from Philips and Sony) which are based on the CD-ROM format derived from audio compact disks.
  interface
the point of contact between two programs or pieces of equipment. The term is most often used for the physical connection between the computer and a peripheral device, which is used to compensate for differences in such operating characteristics as speed, data coding, voltage, and power consumption. For example, a printer interface is the cabling and circuitry used to transfer data from a computer to a printer, and to compensate for differences in speed and coding.
  ISDN
(abbreviation for Integrated Services Digital Network) a telecommunications system.
  kilobyte (K or KB)
a unit of memory equal to 1,024 bytes. It is sometimes used, less precisely, to mean 1,000 bytes.
  knowledge-based system (KBS)
computer program that uses an encoding of human knowledge to help solve problems. It was discovered during research into artificial intelligence that adding heuristics (rules of thumb) enabled programs to tackle problems that were otherwise difficult to solve by the usual techniques of computer science.
  LCD
abbreviation for liquid-crystal display.
  LED
abbreviation for light-emitting diode.
  light-emitting diode (LED)
an electronic component that converts electrical energy into light or infrared radiation in the range of 550 nm (green light) to 1300 nm (infrared). LEDs are used for displaying symbols in electronic instruments and devices. An LED is a diode made of semiconductor material, such as gallium arsenide phosphide, that glows when electricity is passed through it. The first digital watches and calculators had LED displays, but many later models use liquid-crystal displays.
  liquid-crystal display (LCD)
display of numbers (for example, in a calculator) or pictures (such as on a pocket television screen) produced by molecules of a substance in a semiliquid state with some crystalline properties, so that clusters of molecules align in parallel formations. The display is a blank until the application of an electric field, which "twists" the molecules so that they reflect or transmit light falling on them. The two main types of LCD are passive matrix and active matrix.
  LSI
(abbreviation for large-scale integration) the technology that enables whole electrical circuits to be etched into a piece of semiconducting material just a few millimeters square.
  machine code
a set of instructions that a computer's central processing unit (CPU) can understand and obey directly, without any translation. Each type of CPU has its own machine code.
  magnetic-ink character recognition (MICR)
a technique that enables special characters printed in magnetic ink to be read and input rapidly to a computer. MICR is used extensively in banking because magnetic-ink characters are difficult to forge and are therefore ideal for marking and identifying checks.
  magnetic tape
narrow plastic ribbon coated with an easily magnetizable material on which data can be recorded. It is used in sound recording, audiovisual systems (videotape), and computing. For mass storage on commercial mainframe computers, large reel-to-reel tapes are still used, but cartridges are becoming popular. Various types of cartridge are now standard on minis and PCs, while audio cassettes are sometimes used with home computers.
  megabyte (MB)
a unit of memory equal to 1,024 kilobytes. It is sometimes used, less precisely, to mean 1 million bytes.
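The gap between the two senses can be checked directly; a minimal sketch:

```python
# The two senses of "megabyte": the precise binary definition
# (1,024 kilobytes, each of 1,024 bytes) versus the looser use
# meaning 1 million bytes.
KILOBYTE = 1024                      # bytes
MEGABYTE = 1024 * KILOBYTE           # binary megabyte, in bytes

print(MEGABYTE)                      # 1048576
print(MEGABYTE - 1_000_000)          # 48576 bytes larger than "1 million"
```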
  menu
a list of options, displayed on screen, from which the user may make a choice—for example, the choice of services offered to the customer by a bank cash dispenser: withdrawal, deposit, balance, or statement. Menus are used extensively in graphical user interface (GUI) systems, where the menu options are often selected using a mouse.
  Action 2000
  Practical U.K. government advice on how to avoid computer chaos at midnight on 31 December 1999. The "busting the bug" campaign warns of the consequences of inaction and advises how to check your software and take preventive measures. Further sources of information are provided.  
  Millennium Bug
a crisis facing some computer systems in the year 2000, arising because computers may be unable to operate normally when faced with an unfamiliar date format. Information about the year has typically been stored in a two-digit rather than a four-digit field in order to save memory space, which means that after 1999 ends the year will appear as "00." Systems may interpret this as 1900, or may not recognize it at all and crash, resulting in data corruption.
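The failure mode can be sketched in a few lines. The `age_in_years` function below is a hypothetical stand-in for the kind of date arithmetic many legacy systems performed on two-digit year fields.

```python
# Minimal sketch of the Millennium Bug: a record that stores the year
# in two digits cannot distinguish 2000 from 1900.
def age_in_years(birth_yy, current_yy):
    """Age computed from two-digit years, as many legacy systems did it."""
    return current_yy - birth_yy

# Someone born in 1960, checked in 1999: works as expected.
print(age_in_years(60, 99))   # 39

# The same check in 2000, when the year field rolls over to "00":
print(age_in_years(60, 0))    # -60, a nonsense negative age
```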
  monitor, or screen,
output device on which a computer displays information for the benefit of the user—usually in the form of a graphical user interface such as Windows. The commonest type is the cathode-ray tube (CRT), which is similar to a television screen. Portable computers often use liquid-crystal display (LCD) screens. These are harder to read than CRTs but require less power, making them suitable for battery operation.
  multimedia
computerized method of presenting information by combining audio and video components using text, sound, and graphics (still, animated, and video sequences). For example, a multimedia database of musical instruments may allow a user not only to search and retrieve text about a particular instrument but also to see pictures of it and hear it play a piece of music. Multimedia applications emphasize interactivity between the computer and the user.
  multitasking, or multiprogramming,
a system in which one processor appears to run several different programs (or different parts of the same program) at the same time. All the programs are held in memory together and each is allowed to run for a certain period.
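The time-slicing idea can be sketched with generators standing in for programs; the round-robin scheduler and all names below are invented for the example.

```python
# A minimal sketch of time-sliced multitasking: several "programs"
# (here, Python generators) share one processor, each running for one
# slice before control passes to the next.
from collections import deque

def program(name, steps):
    for i in range(steps):
        yield f"{name} step {i}"         # each yield ends this program's slice

def run_round_robin(programs):
    """Give each program one time slice in turn until all finish."""
    queue = deque(programs)
    trace = []
    while queue:
        current = queue.popleft()
        try:
            trace.append(next(current))  # run one slice
            queue.append(current)        # not finished: back of the queue
        except StopIteration:
            pass                         # finished: drop it
    return trace

print(run_round_robin([program("A", 2), program("B", 2)]))
# ['A step 0', 'B step 0', 'A step 1', 'B step 1']
```

From the outside the programs appear to run at the same time, even though only one instruction stream ever executes at once.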
  online system
originally, a system that allows the computer to work interactively with its users, responding to each instruction as it is given and prompting users for information when necessary. Since almost all computers now work this way, "online system" is now used to refer to large database, electronic mail, and conferencing systems accessed via a dial-up modem. These often have tens or hundreds of users from different places—sometimes from different countries—"on line" at the same time.
  optical character recognition (OCR)
a technique for inputting text to a computer by means of a document reader. First, a scanner produces a digital image of the text; then character-recognition software makes use of stored knowledge about the shapes of individual characters to convert the digital image to a set of internal codes that can be stored and processed by computer.
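The "stored knowledge about the shapes of individual characters" step can be sketched as template matching. The 3 x 3 bitmaps below are invented toy shapes; real OCR systems are far more sophisticated.

```python
# A toy sketch of character recognition: compare a scanned 3x3 bitmap
# against stored templates and pick the closest match.
templates = {
    "I": (0, 1, 0,
          0, 1, 0,
          0, 1, 0),
    "L": (1, 0, 0,
          1, 0, 0,
          1, 1, 1),
}

def recognize(bitmap):
    """Return the template letter with the fewest differing pixels."""
    def distance(letter):
        return sum(a != b for a, b in zip(bitmap, templates[letter]))
    return min(templates, key=distance)

scanned = (0, 1, 0,
           0, 1, 0,
           0, 1, 1)                 # an "I" with one noisy pixel
print(recognize(scanned))           # I
```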
  optical fiber
very fine, optically pure glass fiber through which light can be reflected to transmit images or data from one end to the other. Although expensive to produce and install, optical fibers can carry more data than traditional cables and are less susceptible to interference. Standard optical fiber transmitters can send up to 10 billion bits of information per second by switching a laser beam on and off.
  optical mark recognition (OMR)
a technique that enables marks made in predetermined positions on computer-input forms to be detected optically and input to a computer. An optical mark reader shines a light beam onto the input document and is able to detect the marks because less light is reflected back from them than from the paler, unmarked paper.
  personal computer (PC)
another name for a microcomputer. The term is also used, more specifically, to mean the IBM Personal Computer and computers compatible with it. The first IBM PC was introduced in 1981; it had 64 kilobytes of random access memory (RAM) and one floppy-disk drive. It was followed in 1983 by the XT (with a hard-disk drive) and in 1984 by the AT (based on a more powerful microprocessor). Many manufacturers have copied the basic design, which is now regarded as a standard for business microcomputers. Computers designed to function like an IBM PC are IBM-compatible computers.
  pixel
(derived from picture element) a single dot on a computer screen. All screen images are made up of a collection of pixels, with each pixel being either off (dark) or on (illuminated, possibly in color). The number of pixels available determines the screen's resolution. Typical resolutions of microcomputer screens vary from 320 x 200 pixels to 640 x 480 pixels, but screens with 1,024 x 768 pixels are now common for high-quality graphic (pictorial) displays.
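The relationship between resolution and memory follows directly from the pixel count; a short sketch, assuming the simplest case of one bit per pixel:

```python
# How many pixels each screen mode holds, and how much memory a
# 1-bit (on/off) display of that size needs.
for width, height in [(320, 200), (640, 480), (1024, 768)]:
    pixels = width * height
    bytes_needed = pixels // 8      # 8 one-bit pixels fit in one byte
    print(f"{width} x {height}: {pixels} pixels, {bytes_needed} bytes")
```

Color displays store several bits per pixel, multiplying the memory requirement accordingly.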
  printed circuit board (PCB)
electrical circuit created by laying (printing) "tracks" of a conductor such as copper on one or both sides of an insulating board. The PCB was invented in 1936 by Austrian scientist Paul Eisler, and was first used on a large scale in 1948.
  processor
another name for the central processing unit or microprocessor of a computer.
  PROM
(abbreviation for programmable read-only memory) a memory device in the form of an integrated circuit (chip) that can be programmed after manufacture to hold information permanently. PROM chips are empty of information when manufactured, unlike ROM (read-only memory) chips, which have information built into them. Other memory devices are EPROM (erasable programmable read-only memory) and RAM (random-access memory).
  protocol
an agreed set of standards for the transfer of data between different devices. These standards cover transmission speed, the format of the data, and the signals required to synchronize the transfer.
  RAM
(acronym for random-access memory) a memory device in the form of a collection of integrated circuits (chips), frequently used in microcomputers. Unlike ROM (read-only memory) chips, RAM chips can be both read from and written to by the computer, but their contents are lost when the power is switched off.
  real-time system
a program that responds to events in the world as they happen. For example, an automatic-pilot program in an aircraft must respond instantly in order to correct deviations from its course. Process control, robotics, games, and many military applications are examples of real-time systems.
  repetitive strain injury (RSI)
inflammation of tendon sheaths, mainly in the hands and wrists, which may be disabling. It is found predominantly in factory workers involved in constant repetitive movements, and in those who work with computer keyboards. The symptoms include aching muscles, weak wrists, tingling fingers, and, in severe cases, pain and paralysis. Some victims have successfully sued their employers for damages.
  Repetitive Strain Injury
  Unofficial, but very informative and extensively linked, site produced by a sufferer of RSI. It includes a section on the symptoms, as well as diagrams and photos of how to type without straining your back or hands. The site is largely focused on prevention rather than cure.  
  RISC
(abbreviation for reduced instruction-set computer) a microprocessor (processor on a single chip) that carries out fewer instructions than other (CISC) microprocessors in common use in the 1990s. Because of the low number and the regularity of machine code instructions, the processor carries out those instructions very quickly.
  ROM
(abbreviation for read-only memory) a memory device in the form of a collection of integrated circuits (chips), frequently used in microcomputers. ROM chips are loaded with data and programs during manufacture and, unlike RAM (random-access memory) chips, can subsequently only be read, not written to, by computer. However, the contents of the chips are not lost when the power is switched off, as happens in RAM.
  RSI
(abbreviation for repetitive strain injury) a condition affecting workers, such as typists, who repeatedly perform certain movements with their hands and wrists.
  search engine
a remotely accessible program to help users find information on the Internet. Commercial search engines such as AltaVista and Lycos comprise databases of documents, URLs, USENET articles, and more, which can be searched by keying in a keyword or phrase. The databases are compiled by a mixture of automated agents (spiders) and webmasters registering their sites.
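At the heart of such a database is an index from keywords to the documents containing them. A minimal sketch, using invented documents and URLs:

```python
# An inverted index: map each keyword to the set of documents (URLs)
# containing it, then answer queries by intersecting those sets.
documents = {
    "http://example.org/a": "optical fibers carry data as light",
    "http://example.org/b": "magnetic tape stores data on plastic ribbon",
    "http://example.org/c": "fibers of glass transmit light",
}

# Build the index: word -> set of URLs.
index = {}
for url, text in documents.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

def search(*words):
    """Return the URLs of documents containing every query word."""
    results = [index.get(w, set()) for w in words]
    return set.intersection(*results) if results else set()

print(sorted(search("fibers", "light")))
# ['http://example.org/a', 'http://example.org/c']
```

Real search engines add ranking, stemming, and phrase matching on top of this basic structure.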
  SGML
(Standard Generalized Markup Language) International Standards Organization standard describing how the structure (features such as headers, columns, margins, and tables) of a text can be identified so that it can be used, probably via filters, in applications such as desktop publishing and electronic publishing. HTML and VRML are both types of SGML.
  shareware
software distributed free via the Internet or on disks given away with magazines. Users have the opportunity to test its functionality and ability to meet their requirements before paying a small registration fee directly to the author.
  silicon chip
integrated circuit with microscopically small electrical components on a piece of silicon crystal only a few millimeters square.
  software
a collection of programs and procedures for making a computer perform a specific task, as opposed to hardware, the physical components of a computer system. Software is created by programmers and is either distributed on a suitable medium, such as a floppy disk, or built into the computer in the form of firmware. Examples of software include operating systems, compilers, and applications programs such as payrolls or word processors. No computer can function without some form of software.
  source language
the language in which a program is written, as opposed to machine code, which is the form in which the program's instructions are carried out by the computer. Source languages are classified as either high-level languages or low-level languages, according to whether each notation in the source language stands for many or only one instruction in machine code.
  speech recognition, or voice input,
any technique by which a computer can understand ordinary speech. Spoken words are divided into "frames," each lasting about one-thirtieth of a second, which are converted to a wave form. These are then compared with a series of stored frames to determine the most likely word. Research into speech recognition started in 1938, but the technology did not become sufficiently developed for commercial applications until the late 1980s.
  speech synthesis, or voice output,
computer-based technology for generating speech. A speech synthesizer is controlled by a computer, which supplies strings of codes representing basic speech sounds (phonemes); together these make up words. Speech-synthesis applications include children's toys, car and aircraft warning systems, and talking books for the blind.
  standard
any agreed system or protocol that helps different pieces of software or different computers to work together. If computers are to communicate over a network, standards must be coordinated: the World Wide Web, for example, works because everybody who uses it agrees to follow the same conventions, such as using HTML to build Web documents. Other standards, like SMTP—the procedure for sending e-mail—exist to make cross-platform communication (for example, between a UNIX machine and a Macintosh) possible. Bodies involved with this process include the Internet Architecture Board, the W3 Consortium, and the International Standards Organization.
  systems analysis
the investigation of a business activity or clerical procedure, with a view to deciding if and how it can be computerized. The analyst discusses the existing procedures with the people involved, observes the flow of data through the business, and draws up an outline specification of the required computer system. The next step is systems design.
  systems design
the detailed design of an applications package. The designer breaks the system down into component programs, and designs the required input forms, screen layouts, and printouts. Systems design forms a link between systems analysis and programming.
  teletext
broadcast system of displaying information on a television screen. The information—typically about news items, entertainment, sport, and finance—is constantly updated. Teletext is a form of videotext, pioneered in Britain by the British Broadcasting Corporation (BBC) with Ceefax and by Independent Television with Teletext.
  terminal
a device consisting of a keyboard and display screen (VDT) to enable the operator to communicate with the computer. The terminal may be physically attached to the computer or linked to it by a telephone line (remote terminal). A "dumb" terminal has no processor of its own, whereas an "intelligent" terminal has its own processor and takes some of the processing load away from the main computer.
  tree-and-branch filing system
a filing system where all files are stored within directories, like folders in a filing cabinet. These directories may in turn be stored within further directories. The root directory contains all the other directories and may be thought of as equivalent to the filing cabinet. Another way of picturing the system is as a tree with branches from which grow smaller branches, ending in leaves (individual files).
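The tree picture can be sketched with nested dictionaries standing in for directories; the directory and file names below are invented for the example.

```python
# A tree-and-branch filing system as nested dictionaries: a dict is a
# directory (branch), a string marks a file (leaf). The outermost dict
# plays the role of the root directory.
root = {
    "letters": {"bank.txt": "file", "landlord.txt": "file"},
    "accounts": {"1998": {"ledger.dat": "file"}},
}

def list_files(directory, path=""):
    """Walk the tree and return the full path of every file (leaf)."""
    paths = []
    for name, entry in directory.items():
        full = f"{path}/{name}"
        if isinstance(entry, dict):          # a branch: descend into it
            paths.extend(list_files(entry, full))
        else:                                # a leaf: record the file
            paths.append(full)
    return paths

print(sorted(list_files(root)))
# ['/accounts/1998/ledger.dat', '/letters/bank.txt', '/letters/landlord.txt']
```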
  UNIX
multiuser operating system designed for minicomputers but becoming increasingly popular on microcomputers, workstations, mainframes, and supercomputers.
  VDT
abbreviation for video display terminal.
  video adapter
an expansion board that allows display of graphics and color. Commonly used video adapters for IBM PC-based systems are Hercules, CGA, EGA, VGA, XGA, and SVGA.
  videotext
system in which information (text and simple pictures) is displayed on a television (video) screen. There are two basic systems, known as teletext and viewdata. In the teletext system information is broadcast with the ordinary television signals, whereas in the viewdata system information is relayed to the screen from a central data bank via the telephone network. Both systems require the use of a television receiver (or a connected VTR) with a special decoder.
  voice modem
a modem which handles voice as well as data communications, so that it can be used to add the capabilities of a voice mail system to a personal computer. Primarily aimed at small and home-based businesses, voice modems typically also include fax facilities.
  VRAM
(video random-access memory) a form of RAM that allows simultaneous access by two different devices, so that graphics can be handled at the same time as data are updated. VRAM improves graphic display performance.
  VRML
(Virtual Reality Modeling Language) method of displaying three-dimensional images on a Web page. VRML, which functions as a counterpart to HTML, is a platform-independent language that creates a virtual reality scene which users can "walk" through and follow links much like a conventional Web page. In some contexts, VRML can replace conventional computer interfaces with their icons, menus, files, and folders.
  Windows
graphical user interface (GUI) from Microsoft that has become the standard for IBM PCs and clones. Windows 95, updated to Windows 98, is designed for homes and offices and retains maximum compatibility with programs written for the MS-DOS operating system. Windows NT is a 32-bit multiuser and multitasking operating system designed for business use, especially on workstations and server computers, where it is seen as a rival to UNIX. Windows CE is a small operating system that supports a subset of the Windows applications programming interface. It is designed for handheld personal computers (HPCs) and consumer electronics products.
  World Wide Web
a hypertext system for publishing information on the Internet. The original program was created in 1990 for internal use at CERN, the Geneva-based physics research center, by Tim Berners-Lee and Robert Cailliau. The system was released on the Internet in 1991, but only caught on in 1993, following the release of Mosaic, an easy-to-use PC-compatible browser. From the 600-odd Web servers in existence in December 1993, the number grew to around 2,000,000 by the end of 1997. According to an estimate by the search engine AltaVista, there were 100–150 million Web pages in 1997.
  History of the Web
  Transcript of Birthplace of the Web by Eric Berger, Office of Public Affairs at FermiLab. The text covers the origins of the web as a means of communication between scientists at CERN and at FermiLab, and describes how one person's idea in 1991 has brought about a social and cultural revolution in just a few years.  
  Further Reading  
  Angelides, Marios C., and Dustdar, Schahram Multimedia Information Systems (1997)  
  Beynon-Davies, Paul Database Systems (1996)  
  Bines, W. J. (ed.) Microcomputer Applications Handbook (1989)  
  Campbell-Kelly, Martin, and Aspray, William Computer: A History of the Information Machine (1996)  
  Clements, A. Principles of Computer Hardware (1992, second edition)  
  Collin, Simon E-Mail: a Practical Guide (1995)  
  Dorf, Richard Carl Modern Control Systems (1995, seventh edition)  
  Edwards, John, and Finlay, Paul N. Decision Making with Computers: the Spreadsheet and Beyond (1997)  
  Farkas, Bart, and Breen, Christopher The Macintosh Bible Guide to Games (1996)  
  Foley, James D. Introduction to Computer Graphics (1994)  
  Galitz, Wilbert O. The Essential Guide to User Interface Design: An Introduction to GUI Design Principles and Techniques (1997)  
  Hennessy, John L., and Patterson, David A. Computer Organization and Design: the Hardware/Software Interface (1998, second edition)  
  Herz, J. C. Surfing on the Internet (1995)  
  Hillman, David Multimedia Technology and Applications (1998)  
  Horstmann, Cay Computing Concepts with Java Essentials (1998)  
  Kamin, Jonathan Understanding Hard Disc Management on the PC (1989)  
  Kaplan, Randy M. Intelligent Multimedia Systems: A Handbook for Creating Applications (1997)  
  Knuth, Donald E. The Art of Computer Programming (1998)  
  Korolenko, Michael Writing for Multimedia: A Guide and Sourcebook for the Digital Writer (1997)  
  Lathrop, Olin The Way Computer Graphics Works (1997)  
  Mandel, Theo The Elements of User Interface Design (1997)  
  Maybury, Mark T. Intelligent Multimedia Information Retrieval (1997)  
  Mulholland, Dawn Desktop Publishing: A Complete Course (1996)  
  Nutt, Gary J. Operating Systems: A Modern Perspective (1997)  
  Oxborrow, Elizabeth A. Databases and Database Systems: Concepts and Issues (1989, second edition)  
  Parker, Roger C. Desktop Publishing and Design for Dummies (1995)  
  Peck, Dave D. Multimedia: A Hands-On Introduction (1997)  
  Pimentel, Ken, and Teixeira, Kevin Virtual Reality: Through the New Looking Glass (1993)  
  Rathbone, Andy Multimedia and CD ROMs for Dummies (1995, second edition)  
  Reid, T. R. Microchip: The Story of a Revolution and the Men who Made It (1985)  
  Ritchie, Colin Operating Systems (1997, third edition)  
  Ross, Sheldon M. Simulation (1997, second edition)  
  Slade, Robert M. Robert Slade's Guide to Computer Viruses: How to Avoid Them, How to Get Rid of Them, How to Get Help (1994)  
  Wood, John M. Desktop Magic: Electronic Publishing, Document Management, and Workgroups (1995)  
  Zorkoczy, Peter Information Technology: An Introduction (1995, fourth edition)