Nineteen years of evolution have brought the personal computer from being the plaything of the computer hobbyist to the brink of acceptance as a commodity product. Those years have also witnessed a major shift in how the personal computer is perceived by nearly all of those who use computers on a professional basis.
The personal computer, or microcomputer as it was once known, was perceived as nothing more than an amusing toy which would never be able to challenge the mighty mainframe computer in the real world. Developments in distributed computing, harnessing the combined power of networked microcomputers, are only one of the challenges now facing both the minicomputer and the mainframe.
That the domain of even the supercomputer is no longer sacrosanct is highlighted by a recent product announcement from Mips on their R8000 microprocessor-based chip set. The R8000 performs at "300 MFLOPS, which is equivalent to the performance of the Cray Y-MP" (Pountain, 1994). Digital Equipment Corporation (DEC), with their Alpha 21164, have raised the performance boundaries even higher. The 21164 is capable of "330 SPECint92 and 500 SPECfp92" (Ryan, 1994), compared to the R8000's peak of "108 SPECint92 and 310 SPECfp92" (Pountain, 1994).
The power of these microprocessors, which generally find their way into workstations, is not typical of current personal computer practice, which is based on systems built around the Intel series microprocessors (80486 and Pentium) or Apple systems built around the Motorola 68040 microprocessor. Apple and Motorola, having joined forces with IBM, have now entered the RISC camp and developed systems around the PowerPC 601 microprocessor.
The drive for more power in desktop computers is now coming from the demands of audio-visual oriented multimedia and GUIs. Full-screen motion video with sound, at high resolution and full-colour photographic quality, requires huge amounts of data to be decompressed on the fly and displayed at a minimum rate of around 25 frames per second. This all adds up to a requirement for a fast and efficient microcomputer system.
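To give a rough sense of scale, a back-of-the-envelope calculation shows the uncompressed data rate such video implies. The figures used here are illustrative assumptions, not values from any cited source: a 640 x 480 display, 24-bit colour, 25 frames per second.

    # Rough uncompressed data rate for full-screen motion video.
    # Illustrative assumptions: 640 x 480 pixels, 24-bit colour, 25 frames per second.
    width, height = 640, 480
    bytes_per_pixel = 3                 # 24-bit colour
    frames_per_second = 25

    bytes_per_frame = width * height * bytes_per_pixel       # ~0.9 MB per frame
    bytes_per_second = bytes_per_frame * frames_per_second   # ~23 MB per second

    print(f"{bytes_per_frame / 1e6:.1f} MB per frame")
    print(f"{bytes_per_second / 1e6:.1f} MB per second, uncompressed")

Even this modest resolution implies tens of megabytes per second before compression, which is why on-the-fly decompression and an efficient path from storage to screen matter so much.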
Furthermore, if such a system is to gain commodity status, which it must if it is to be a part of the emerging "affordable interactive multimedia-information superhighway" scenario, then it must be designed so that non-technical consumers find it easy to set up, reliable in operation and easy to fit with new technology-based products as they become available, so called "Plug and Play" (abbreviated as PnP).
This PnP philosophy demands that any re-configuration of the user's system to facilitate memory access, microprocessor function, data storage devices and other system resources be achieved automatically or at least by a simply implemented system reset.
Globally, the two most prevalent personal computer standards, standard being a term which should be treated with some caution, are those of the Apple Macintosh family (which could also include the PowerMac) and the descendants of the IBM PC. Of these two only the Apple Macintosh standard can lay any claims to being capable of PnP.
In the United Kingdom there is an indigenous multimedia-capable microcomputer, the Acorn A series and its successor the RiscPC, all built around versions of the ARM RISC microprocessor. These very capable microcomputers, best known in educational circles, require only a simple system reset when expansion cards are fitted (increasingly not even that). Acorn has one of the longest histories of any company in RISC microprocessor design and utilisation. Acorn desktop machines using RISC CPUs predate the Apple PowerMac by almost seven years.
The ARM is based on a quite different design philosophy, that of small and efficient as opposed to the large, power-hungry, performance-at-all-costs approach of the high-end microprocessors from Mips and DEC. This makes the ARM ideal for hardware applications where physically small size, cost and/or power consumption, and hence heat dissipation, are the overriding considerations, with multiprocessor configurations being available when raw performance counts.
As for the IBM PC's descendants, true PnP, as defined by the Plug and Play Association (Microsoft and Intel), remains a mirage. Turning it into reality is going to be a slow and painful process requiring some drastic replacement surgery. All the major components of a PC system will need to be changed, including the ROM BIOS, operating system, hardware devices and even the application software. This will come as a shock to many a PC buyer who will discover that their machine was obsolete before it was purchased. A recent Byte article provided a rationale for this:
"Faster microprocessors, bigger hard drives and more memory have unquestionably led to more powerful PCs, but underneath it all lies the same foundation that IBM defined for the PC in 1981 and extended for the AT in 1984. Without a single defining leader...........the world's leading computer platform has been propelled forward by sheer market momentum. Meanwhile, the foundation has been slowly cracking under the weight of more and heavier hardware and software." (HalfHill, 1994).
A useful way of explaining how the PC has arrived at such a watershed is to trace the history of the personal computer, which begins some time before the introduction of the IBM PC.
In 1972 the Intel Corporation produced an 8-bit microprocessor, the 8008, as a successor to the 4-bit 4004. At several hundred dollars per unit, Intel's microprocessors were too expensive for the computer hobbyist until the MITS company introduced a desktop computer, the Altair, in 1975. The Altair, built around the 8008's successor, the 8080, came as a kit for self-assembly with 2k of memory and a serial TTY (Teletype) interface, at a price cheaper than the going rate for a single 8080. Purchasing an Altair was often used as a cheap way to get hold of the microprocessor itself.
Once assembled, the Altair bore little resemblance to a personal computer as we think of it today: there were no keyboards, no monitors and no software. A panel of switches and lights was used to load data directly into memory locations. In this manner a small program was created which could utilise a teletype (typically an ASR-33) for input/output, and thereupon load software from punched paper tape.
Such systems, the province of electronics engineers and programmers, were not suited to use by scientists and other professionals. It wasn't long before other developers produced systems with small bootstrap programs in ROM to initialise the system at power-on. Soon video-display based systems with QWERTY keyboards appeared, especially those based around the Intel 8080 8-bit microprocessor or the functionally similar Zilog Z-80. Many of these industrial models employed a legacy from the Altair, the S-100 expansion bus, which, together with the CP/M operating system, set one industry de-facto standard until the arrival of the IBM PC in 1981. CP/M's popularity was largely on account of its support for disc drives for data and program storage. CP/M formed the basis for the development of PCDOS (a.k.a. MS-Dos), which became the operating system for the IBM PC.
Meanwhile, an electronics engineer and programmer at Hewlett-Packard, Steve Wozniak, had developed an interest in microcomputers through the Homebrew Computer Club. Wozniak, whose real forte was simplifying circuits by making components fulfil more than one function, proceeded to design a small computer with video-terminal capability around the MOS Technology 6502 microprocessor (because at $25 it was the only one he could afford).
Electronics engineer and entrepreneur Steve Jobs, the son of a salesman, had, whilst working for Atari, engaged the talents of Wozniak in hard-wiring the video game Breakout. Wozniak reduced the chip count to 44 (when normal chip counts for this type of game were 150-170), but the design was too complex for Atari's engineers to understand and a pre-production redesign was required. Wozniak's talent for minimalist design was a vital factor in keeping construction costs down, and hence in commercial viability.
Demonstrations of Wozniak's microcomputer at the Homebrew Club quickly led to firm orders for production machines. Hewlett-Packard, having declined an interest in the microcomputer designed by their employee, gave Wozniak a legal release, opening the way for the formation of Apple Computer and sales of the Apple I. Soon Jobs and Wozniak realised that what the world wanted was a personal computer that only required connecting to a domestic TV to be made ready for use, so Wozniak designed a successor.
The Apple II, also based on the MOS Technology 6502 microprocessor, scored immediate success with its coloured (or should it be "colored"?) graphics and its use of discs for data and program storage; the operating system, also supplied on disc, required loading into RAM during startup. Sales of this machine accelerated fast enough for third parties to become interested in developing hardware expansions and software.
The spreadsheet, as an aid to financial planning and what-if analysis, was the concept of Dan Bricklin, who first produced a demonstration version written in BASIC on an Apple II. It was with this demonstrator that the use of the slash character to initiate a command first appeared, a feature which became so familiar in spreadsheet software including Lotus 1-2-3. Bricklin teamed up with Bob Frankston and created the Software Arts company to produce the full assembly-language version, VisiCalc. The combination of VisiCalc and a disc system was so successful that many Apple IIs were sold into the business community, enhancing Apple's credibility. Indeed, IBM held up the announcement of the IBM PC until a VisiCalc conversion was ready.
Thus, in the early nineteen-eighties, there were two de-facto standards for personal computers: the S-100 bus–CP/M camp and the Apple following. There were many other proprietary machine architectures, each with a unique combination of microprocessor, bus, memory architecture and file-system format; transfer of data between formats was all but impossible.
When the IBM Personal Computer (PC) was introduced in August 1981, little did the world suspect that a standard was being set, a standard which would continue to make its existence strongly felt as far into the future as 1994, and possibly beyond.
IBM brought their first PC to market in a little over thirteen months from inception. This was remarkable for a company whose project gestation period was usually measured in years. A major factor in the short development time was that IBM had taken the unprecedented step, for an industry giant noted for developing products which were proprietary through and through, of using existing hardware components from external vendors. The use of many design elements of IBM's earlier System/23 DataMaster was also a major factor in the speed of development, and expansion cards for this system could be used in the new machine.
More remarkable was the adoption of an operating system from the relatively small Microsoft company, who also supplied a BASIC language interpreter. IBM also offered CP/M-86, a 16-bit version of the 8-bit CP/M from Gary Kildall's Digital Research. The first DOS was 86-DOS, written for S-100 computers upgraded with the Intel 8086 CPU; 86-DOS was produced by Seattle Computer Products, the rights to which were bought by Microsoft.
The well thought out synthesis of the best features from existing microcomputers and the close compatibility with CP/M systems ensured marketing success beyond even IBM's expectations. The non-proprietary nature of the PC system architecture encouraged many other manufacturers to begin building compatibles. This development had far-reaching consequences for the way in which the personal computer industry developed.
Although the 8088 microprocessor operates on 16 bits internally, it communicates with the other components of the PC over an 8-bit bus. The advantage of this design strategy was that DataMaster features and 8-bit logic chips, which were plentiful and cheap, could be used.
Accessing memory over an 8-bit bus caused a bottleneck. In 1983 IBM introduced the PC/XT, which was also built around the Intel 8088 microprocessor. The PC/XT used version 2.0 of PCDOS, which for the first time used a hierarchical filing system capable of dealing with the much larger capacity hard discs with which the XT could be equipped.
Meanwhile Apple had embarked on a new project, code-named Lisa, to develop a new office computer. In search of funds, Apple's Steve Jobs approached Xerox, whereupon he and other members of the Lisa team visited Xerox's Palo Alto Research Center (PARC), where they were shown the Alto. The Apple team were so impressed with the Alto's sharp graphics, displaying a virtual desktop complete with usable documents and small on-screen pictures called icons, that they decided the Lisa would be the Alto for the masses.
The Apple team's enthusiasm and ideas so impressed Larry Tesler, a member of the Alto's design team, that he joined Apple. High-resolution graphics demand fast microprocessors and ample RAM, both very expensive commodities in 1983, and the resultant high unit price was the major factor in the Lisa's commercial failure. Undeterred, the Apple team carried on with development and launched a scaled-down version in 1984 under the name Macintosh, based on the Motorola 68000 microprocessor.
The Macintosh, with its WIMP-based GUI and its lack of program modality, was a revelation to a world used to a command-line, or at best menu-driven, interface. The Macintosh was to have a profound effect on the future development of personal computers. The 'Mac', as it became known, with its GUI-fronted operating system, was presumably what Byte's Editor in Chief had in mind when he wrote:
"I'd buy an operating system any day that takes a long time to run a given program but which makes me more productive by communicating with me in useful ways." (Morgan, 1981).
In 1984 IBM introduced the IBM PC/AT, built around the new Intel 80286 16-bit microprocessor. The 80286, apart from being capable of faster throughput than previous models, offered some advanced features. Amongst these were processing parallelism and hardware-implemented task switching with program protection.
Unfortunately, in maintaining backward compatibility with version 2, the new 3.0 version of PCDOS did not support either multitasking or multiuser environments. Thus, the AT was primarily used as a more efficient PC/XT and could still only make use of a maximum of 640k of user memory in "real address mode". Users wishing to take advantage of the possible 16 megabytes of memory, as well as the multitasking and multiuser capability, were expected to wait for the forthcoming XENIX operating system.
A major architectural feature of the original PC was the use of an expansion bus equipped with connectors, or slots, to take adaptor cards for interfacing with peripherals, in particular visual display units (VDUs) and hard drives. The expansion slot data width is one factor which determines how quickly data flows between the microprocessor and the peripheral. The original PC slots had an 8-bit wide data path, increased to 16 bits on the AT to match the data width of the 80286. Both the 8-bit and the 16-bit bus specifications are known as the ISA (Industry Standard Architecture) bus.
Compatible makers continued to build enhanced versions of the PC for sale at competitive prices, introducing all manner of compatibility issues as a result. IBM, when developing a 32-bit bus to suit the new 32-bit Intel 80386, and realising that they had lost control of their architecture, produced the Micro Channel Architecture (MCA). MCA used many proprietary methods and components with which IBM hoped to defeat the compatible makers. The first systems with MCA were the PS/2 range, launched in 1987, which included models built around the 80386, 80286 and 8086. PC-compatible manufacturers were allowed to use the MCA architecture providing they paid IBM substantial royalties.
To avoid paying such royalties, a consortium of compatible makers, led by Compaq, responded by developing the 32-bit EISA bus which, apart from being faster and cheaper to implement, had the added advantage of maintaining compatibility with existing 8MHz ISA bus adaptors.
The launch of the Macintosh had focused the computer world's attention on the user interface, and where Apple led many were soon to follow. Digital Research produced GEM, which could run under CP/M-86 or TOS on the Atari ST, and Microsoft produced Windows for the PC, both of which incurred Apple's wrath for being too close to the look and feel of the Mac interface. Commodore, having bought out a small company called Amiga, launched a computer of that name using Intuition as its GUI.
IBM with Microsoft produced a new 16-bit operating system, OS/2, for the PS/2 range. OS/2 was designed to give a GUI, Presentation Manager, a head start by clearing away the 640k memory limitations of MS-Dos. Meanwhile the Unix camp were evolving many flavours of GUI of their own, such as Motif, DECwindows, Open Look, Open Desktop and NextStep. NextStep is the user interface developed to run under Unix on the Next Computer, the product Steve Jobs nurtured some time after leaving Apple.
The move towards GUIs was generally welcomed by the computing fraternity, but there was one big drawback, especially for users of PC systems running MS-Dos. GUIs such as Windows, with their resolutions of 640 × 480 pixels or larger as opposed to the typical 24 lines of 80 characters, dramatically increased the amount of video traffic. Furthermore, if RAM is limited, large volumes of data need to be swapped out to disc frequently. When this is achieved over the 8MHz ISA bus then systems can become sluggish and unresponsive, no matter how fast and capable the microprocessor.
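A crude comparison makes the point. The figures below are illustrative assumptions rather than values from the text: an 80 x 25 text screen at two bytes per character cell versus a 640 x 480 display at 8-bit colour.

    # Crude comparison of video data per full screen redraw.
    # Illustrative assumptions: 80 x 25 text cells at 2 bytes each (character + attribute)
    # versus 640 x 480 pixels at one byte per pixel (8-bit colour).
    text_screen_bytes = 80 * 25 * 2     # 4,000 bytes
    gui_screen_bytes = 640 * 480 * 1    # 307,200 bytes

    print(f"Text mode: {text_screen_bytes / 1024:.1f} KB per redraw")
    print(f"GUI mode : {gui_screen_bytes / 1024:.1f} KB per redraw")
    print(f"Roughly {gui_screen_bytes / text_screen_bytes:.0f} times more video traffic")

On these assumptions a single GUI redraw moves roughly 77 times as much data as a text screen, and higher resolutions or colour depths only widen the gap.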
With the introduction of the fast 80486 CPU, overcoming the ISA bus bottleneck became a high priority, especially where graphics adaptors running Windows, hard drives, and the SCSI interfaces often used by CD-ROM drives and scanners were concerned. One answer appeared in the concept of the local bus, with which peripherals are connected directly to the CPU and/or RAM. A number of manufacturers, including Compaq, Dell and Hewlett-Packard, developed their own proprietary local bus systems which, although technically adequate, restricted adaptor choice to specifically designed cards.
The first widely adopted local bus standard was the 32-bit VESA local bus (VLB or VL bus). This bus could be driven as fast as the clock of a 33MHz processor, higher speeds requiring the implementation of wait states. The VLB has a maximum rated throughput of 128-132 MB/s compared to the ISA bus maximum of 8 MB/s. VESA is a voluntary standard which some manufacturers have only partially implemented, and it is in the process of being updated to cope with the 64-bit wide data path of the Pentium CPU.
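The quoted figures follow from multiplying the bus width by the clock rate. A minimal sketch of that arithmetic, under the common assumptions that the VL bus can burst one 32-bit word per 33MHz clock while a 16-bit ISA transfer takes at least two cycles of an 8MHz clock:

    # Peak bus throughput = bytes moved per transfer x transfers per second.
    # Assumptions: VLB bursting one 32-bit word per clock at 33 MHz;
    # ISA moving one 16-bit word every two cycles at 8 MHz.
    vlb_peak = 4 * 33_000_000        # 132,000,000 bytes/s (~132 MB/s)
    isa_peak = (2 * 8_000_000) / 2   # 8,000,000 bytes/s   (~8 MB/s)

    print(f"VL bus peak: {vlb_peak / 1e6:.0f} MB/s")
    print(f"ISA peak   : {isa_peak / 1e6:.0f} MB/s")

These are theoretical peaks; real transfers involve set-up cycles and wait states, which is why sustained rates fall short of the headline numbers.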
PCI (Peripheral Component Interconnect) is an Intel-initiated local bus standard which has been slow to gain wide acceptance because of its late introduction. With the increasing number of Pentium-based systems, PCI is becoming more widely accepted.
Many of these more recent bus technologies overcome the DMA and IRQ configuration troubles which continue to dog the majority of systems which still use the ISA bus. Many of these problems will not disappear until Windows dispenses with the services of the archaic Dos operating system:
"The 640KB of RAM that once seemed so luxurious is now choked with contentious device drivers and TSR programs. IRQs (interrupt requests), DMA channels, I/O memory ports, and other system resources are now being fought over like the last pebbles of ore in a played-out gold mine". (Halfhill, 1994).
Adaptor card timing problems can also cause much slot swapping and a rapidly growing collection of mutually exclusive cards.
The Apple Macintosh and the Acorn 32-bit RISC computers do not suffer from any such problems; their systems were designed from the ground up with a more efficient and extensible combined operating system and GUI.
Computer-based Learning and Training covers a wide and diverse range of activities.
Schools have quite different needs from commerce and industry and require computers which have the following main attributes:
1. have a shallow learning curve for the interface,
2. be easily set up by non technical staff,
3. be flexible in operation and capable of easy and cheap expansion,
4. be frugal in the use of resources such as RAM and media storage,
5. be reliable and robust in use i.e. suffer few compatibility problems,
6. be responsive to user actions and provide meaningful feedback,
7. have a favourable price/performance ratio,
8. have an economic maintenance profile, including hardware and software upgrades.
With the ad-hoc development of its architecture, the PC compatible computer compares badly with the alternatives, particularly with respect to categories 2, 4, 5 and 8. Indeed, few schools have a large enough budget to employ specialists with the skills and experience required for configuring, upgrading and even installing or removing software on a PC compatible. This becomes particularly apparent as networked computers are more widely adopted, since network cards require the use of expansion slots.
With respect to the cost of software and the ongoing costs of software upgrades, the Acorn computer range has a calculable advantage over the Mac or PC compatible. Hardware upgrades to keep pace with recent developments, and improve performance, are available and affordable. In a report for St. Vincent College, IT analysts for the accountants Brooking Knowles & Lawrence reported:
"We believe that the continued use of the Archimedes in the education arena offers significant advantages over the use of IBM compatible PCs." (Milford and Holroyd, 1994).
Additionally, and by design, Acorn computers have a decided advantage with respect to RAM and hard drive storage requirements because the OS kernel and many of its modules are stored in ROM and (with one exception) only outline font technology is used. Furthermore, the ARM instruction set produces a code density untypical of RISC designs:
"Because of its CISC-like code density, software for an ARM processor requires less memory than software for RISC architectures targeted at the workstation market." (Tesler, 1993).
The use of PC compatible computers for learning and training may be acceptable in higher education and for corporate training, where the equipment and interface more closely match the systems the student is either using for work or will shortly be using. This is not the case for schools, which are in the business of providing a broad-based education, not specific training.
What is an industry standard computer?
"These days there's no such thing as a "standard" PC - instead we've got a range of different disk sizes, graphics adaptors and hardware add-ons, such as mice, modems and memory boards." (PC PLUS Product Facts Panel Guide, 1993).
And finally: what form will the generic computer have taken by the time today's infants become employable adults?
"....while applications from platform to platform will become more and more similar, the platforms themselves will be splitting off in divergent directions. The message is: don't get too fond of your interface. It won't last." (Malone, 1993).