IBM PC–compatible

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Ae-a (talk | contribs) at 02:13, 17 January 2005 (→‎History: New subsection - Standards, design-flaws, and more compatability issues). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

IBM PC compatible refers to the class of computers that make up the vast majority of smaller computers (microcomputers) on the market today. They are based (without IBM's participation) on the original IBM PC, use the Intel x86 architecture (or an architecture made to emulate it), and are capable of using interchangeable commodity hardware. These computers were formerly referred to as PC clones, and nowadays simply as PCs.


History

Origins

File:5150-1b.jpg
One of the first PCs from IBM — the IBM PC. Early models had a CRT monitor that looked like a fish bowl.

The platform originated in IBM's 1981 decision to bring a personal computer to market as quickly as possible, in response to Apple Computer's rapid success (roughly 50% market share) in the burgeoning microcomputer market. In August 1981, the first IBM PC went on sale.

In licensing an operating system from Microsoft, IBM's agreements allowed Microsoft to sell MS-DOS for non-IBM platforms (the IBM version was called PC-DOS). Also, in creating the platform, IBM used only one proprietary component: the BIOS.

Columbia Data Products produced the first IBM PC compatible in 1982. Compaq Computer Corp. produced an early IBM PC compatible (which was also the first sewing-machine-sized portable PC) a few months later in 1982 — the Compaq Portable. Compaq could not directly copy the BIOS as a result of the court decision in Apple v. Franklin, but it could reverse-engineer the IBM BIOS and then write its own BIOS using clean room design. Compaq became a very successful PC manufacturer and was acquired by Hewlett-Packard in 2002.

Compatibility issues

Simultaneously, many manufacturers such as Xerox, Digital, and Sanyo introduced PCs that, although x86- and MS-DOS-based, were not completely hardware-compatible with the IBM PC. While such decisions seem foolish in retrospect, it is not always appreciated just how rapid the rise of the IBM clone market was, or the degree to which it took the industry by surprise. Later, in 1987, IBM itself would launch the PS/2 line of personal computers, which was only software-compatible with the PC architecture; it too was hugely unsuccessful.

Microsoft's intention, and the mindset of the industry from 1981 to as late as the mid-1980s, was that application writers would write to the APIs in MS-DOS, and in some cases to the firmware BIOS, and that these components would form what would now be called a hardware abstraction layer. Each computer would have its own OEM version of MS-DOS, customized to its hardware, and any piece of software written for MS-DOS would run on any MS-DOS computer, regardless of variations in hardware design.

During this time MS-DOS was sold only as an OEM product. There was no Microsoft-branded MS-DOS, MS-DOS could not be purchased directly from Microsoft, and the manual's cover had the corporate color and logo of the PC vendor. Bugs were to be reported to the OEM, not to Microsoft. However, in the case of the clones, it soon became clear that the OEM versions of MS-DOS were virtually identical, except perhaps for the provision of a few utility programs.

MS-DOS provided adequate support for character-oriented applications, such as those that could have been implemented on a minicomputer and a Digital VT100 terminal. Had the bulk of commercially important software fallen within these bounds, hardware compatibility might not have mattered. However, from the very beginning, many significant pieces of popular commercial software wrote directly to the hardware, for a variety of reasons:

  • Communications software directly accessed the UART chip, because the MS-DOS API and the BIOS did not provide full support for the chip's capabilities.
  • Graphics capability was not taken seriously; it was considered an exotic or novelty function. MS-DOS did not have an API for graphics, and the BIOS included only the most rudimentary graphics functions (such as changing screen modes and plotting single points); having to make a BIOS call for every point drawn or modified also added considerable overhead, making the BIOS interface notoriously slow. Because of this, line drawing, arc drawing, and blitting had to be performed by the application, usually by bypassing the BIOS and accessing video memory directly. Games, of course, used graphics, and performed any machine-dependent trick the programmers could think of in order to gain speed. Thus games were machine-dependent — and games turned out to be important in driving PC purchases.
  • Even for staid business applications, speed of execution was a significant competitive advantage. This was shown dramatically by Lotus 1-2-3's competitive knockout of rival spreadsheet Context MBA. The latter, now almost forgotten, preceded Lotus to market, included more functions, was written in Pascal, and was highly portable. It was also too slow to be really usable on a PC. Lotus was written in pure assembly language and performed some machine-dependent tricks. It was so much faster that Context MBA was dead as soon as Lotus arrived.
  • Disk copy-protection schemes, popular at the time, accessed the disk drive hardware directly, precisely in order to write nonstandard data patterns: patterns that were illegal from the point of view of the OS and therefore could not be produced by standard OS calls.
  • The microcomputer programming culture at the time was hacker-like, and enjoyed discovering and exploiting undocumented properties of the system.
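The direct-to-hardware programming described above mostly came down to address arithmetic. As an illustration (not period code, and assuming CGA's well-documented 320×200 four-colour mode), this sketch shows the computation a DOS-era program performed to plot a single pixel by writing video memory directly, instead of making the slow BIOS INT 10h call for every point:

```python
# Illustrative sketch: the address arithmetic for one pixel in CGA
# 320x200 4-colour mode, where video memory lives at real-mode
# segment 0xB800 and a program wrote to it directly for speed.

CGA_SEGMENT = 0xB800   # real-mode segment of CGA video memory
BYTES_PER_ROW = 80     # 320 pixels * 2 bits per pixel / 8

def cga_pixel_address(x, y):
    """Return (offset, bit_shift) for pixel (x, y) in mode 4.

    CGA interleaves scanlines: even rows start at offset 0x0000,
    odd rows at offset 0x2000; each byte packs four 2-bit pixels,
    with the leftmost pixel in the high bits.
    """
    offset = (y % 2) * 0x2000 + (y // 2) * BYTES_PER_ROW + (x // 4)
    shift = (3 - (x % 4)) * 2
    return offset, shift

print(cga_pixel_address(0, 0))      # (0, 6): first byte, top bits
print(cga_pixel_address(319, 199))  # (16191, 0): last pixel on screen
```

On the real machine the program would then read-modify-write the byte at `CGA_SEGMENT:offset`; the point here is only that each pixel required several arithmetic steps the application had to hard-code against one specific video adapter.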

At first, other than Compaq's models, few "compatibles" really lived up to their claim. "95% compatibility" was seen as excellent. Gradually vendors discovered not only how to emulate the IBM BIOS, but also the places where they needed to use identical hardware chips to perform key functions within the system. Reviewers and users developed suites of programs to test compatibility, generally including Lotus 1-2-3 and Microsoft Flight Simulator, the two most popular "stress tests." Meanwhile, IBM damaged its own franchise by failing to appreciate the importance of "IBM compatibility" when it introduced products such as the IBM Portable (essentially a Compaq Portable knockoff) and later the PCjr, which had significant incompatibilities with the mainline PCs. Eventually, the Phoenix BIOS and similar commercially available products permitted computer makers to build essentially 100%-compatible clones without having to reverse-engineer the IBM PC BIOS themselves.

By the mid-to-late 1980s buyers began to regard PCs as commodity items, and became skeptical as to whether the security blanket of the IBM name warranted the price differential. Meanwhile, of course, the incompatible Xeroxes and Digitals and Wangs were left in the dust. Nobody cared that they ran MS-DOS; the issue was that they did not run off-the-shelf software written for IBM compatibles.

The domination of PCs and the declining influence of IBM

Since 1982, IBM PC compatibles have conquered both the home and business markets of commodity computers, so that the only notable remaining competition comes from Apple Macintosh computers, with a market share of only a few per cent. Meanwhile, IBM has long since lost its leadership role in the market for IBM PC compatibles (a loss that may have had to do with the failure of other manufacturers to adopt the new features of the IBM PS/2); currently the leading players include Dell and Hewlett-Packard.

Despite advances in computer technology, all current IBM PC compatibles remain very much compatible with the original IBM PC computers, although most of the components implement the compatibility in special backward compatibility modes used only during a system boot.

Expandability

One of the strengths of the PC compatible platform is its modular design: if a component becomes obsolete, only that component needs to be upgraded, not the whole computer, as was the case with many of the microcomputers of the time. As long as applications used operating system calls and did not write to the hardware directly, they would continue to work. However, MS-DOS (the dominant operating system of the time) lacked calls for much multimedia hardware, and the BIOS was also inadequate. Various attempts were made to standardise the interfaces, but in practice many of these attempts were either flawed or ignored. Even so, there were many expansion options, and the PC compatible platform advanced much faster than competing platforms of the time.

"IBM PC Compatible" becomes "Wintel"

In the 1990s, IBM's influence on PC architecture became increasingly irrelevant. Instead of focusing on staying compatible with the IBM PC, vendors began to focus on compatibility with the evolution of Microsoft Windows. As of 2004, no vendor dares to be incompatible with the latest version of Windows, and Microsoft's annual WinHEC conferences provide a setting in which Microsoft can lobby for and in some cases dictate the pace and direction of the hardware side of the PC industry.

The term "IBM PC Compatible" is on the wane. Ordinary consumers simply refer to the machines as "PCs," while programmers and industry writers are increasingly using the term "Wintel architecture" ("Wintel" being a contraction of "Windows" and "Intel") to refer to the combined hardware-software platform.

The breakthrough in entertainment software

The original IBM PC was not designed with games in mind. Its monochrome graphics and very simple sound made it unsuitable for multimedia applications. That, together with the fact that it was priced out of the entertainment market, made it seem unlikely that the PC platform would ever be used for games.

As the technology of the PC advanced, games started to appear for it. At first, these were inferior to the games for other platforms, but thanks to the modular design, the technology behind the PC advanced rapidly: what PC games lacked in multimedia capabilities, they made up for in raw speed. A few years later, VGA cards appeared, offering 256-colour graphics from a palette of 262,144 colours. At around the same time, sound cards appeared, replacing the beeps of the PC speaker with much richer sound.
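The VGA figures above follow from simple arithmetic, sketched here in Python (assuming the standard 320×200 256-colour mode 13h, the mode most DOS games used):

```python
# The 262,144-colour palette comes from VGA's digital-to-analogue
# converter having 6 bits per red/green/blue channel:
levels_per_channel = 2 ** 6        # 64 intensity levels per channel
palette_size = levels_per_channel ** 3
print(palette_size)                # 262144

# Mode 13h exposed a simple linear framebuffer at segment 0xA000,
# one byte per pixel, so plotting pixel (x, y) was a single write:
def mode13h_offset(x, y):
    return y * 320 + x

print(mode13h_offset(160, 100))    # 32160: centre of the 320x200 screen
```

This one-byte-per-pixel linear layout was far simpler than the earlier CGA/EGA schemes, which is part of why VGA made the PC attractive to game programmers.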

Even by the time the PC had hardware superior to the competing platforms of the time, it still was not taken seriously as a games machine. This could have been caused by its higher price; by the fact that video game consoles, rather than personal computers, were now starting to attract gamers; or by the hardware being very awkward to program for, requiring the development of different drivers for all the multimedia hardware on the market. Another theory is that the PC platform never created a cult following the way other platforms had. There was a demo scene on the PC, but it was small, did not appear until many years after the original IBM PC, and produced demos that were few and far between. The lack of a demo scene meant there were few programmers with the intimate knowledge required to squeeze the last drop of performance out of the machine.

One thing PCs did have in their favour was plenty of raw processing power, which made them suitable for 3D games. It is not clear exactly when the PC made its breakthrough as a games machine, but the latest candidate for the decisive title is Doom, released in 1993. Its graphics were remarkable for the time, and its gameplay compelling. Because networking hardware was widespread on PCs, Doom also offered multiplayer support across a network, something few games offered at the time. Doom finally established the PC as a games machine.

Standards, design flaws, and more compatibility issues

Although the original IBM PC was designed for expandability, even its designers could not anticipate the hardware developments of the 1980s. By the late 1980s IBM, the creator of the IBM PC, no longer had much say, and many other companies were trying to push their own standards.

To make things worse, IBM, Intel and Microsoft introduced several design flaws along the way which created hurdles for developing the PC compatible platform. One example was the 640 KB barrier (memory below 640 KB is known as conventional memory). This arose partly from the way IBM mapped the memory of the PC, and the memory management of DOS (the most widely used operating system) dealt with it in a way that made things worse. To give programs access to more memory, the Expanded Memory Specification (EMS) was devised, which bank-switched extra memory into a page frame within the first megabyte of address space. Once Intel released the 80286 processor, which could address memory above one megabyte directly, an alternative memory management scheme was introduced: the Extended Memory Specification (XMS). EMS and XMS were originally incompatible, so anyone writing software that needed more than conventional memory had to support both systems.
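The numbers behind the 640 KB barrier fall out of the 8086's real-mode addressing, sketched below (an illustration; the 0xD000 page-frame segment is just a typical example, not a fixed requirement):

```python
# Illustrative sketch of real-mode 8086 addressing, which produced
# the limits discussed above: 20 address lines give 1 MB total, and
# IBM's memory map reserved everything from 640 KB up for video
# memory, adapter ROMs and the BIOS.

def physical_address(segment, offset):
    """8086 real mode: physical = segment * 16 + offset."""
    return (segment << 4) + offset

one_megabyte = 2 ** 20
print(one_megabyte)                      # 1048576

# Conventional memory ends where IBM placed the first video segment:
conventional_limit = physical_address(0xA000, 0)
print(conventional_limit == 640 * 1024)  # True: the "640 KB barrier"

# EMS worked around the limit by bank-switching expansion-card memory
# into a 64 KB page frame in the upper memory area, e.g. segment 0xD000:
page_frame = physical_address(0xD000, 0)
print(hex(page_frame))                   # 0xd0000, still below 1 MB
```

XMS, by contrast, relied on the 80286's protected mode to reach memory at physical addresses above `one_megabyte`, which is why the two schemes required entirely different code in applications.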

Graphics cards suffered from their own incompatibilities. Once graphics cards advanced to SVGA level, there was no longer a clear standard for accessing them. Each manufacturer developed its own way of numbering the new graphics modes and accessing the screen memory, so manufacturers had to supply software device drivers that allowed the SVGA modes to be used by programs accessing the graphics card at the driver level. Unfortunately, there was no standard for device drivers that all manufacturers followed. An attempt at a standard, VESA, was made, but not all manufacturers adhered to it. To make things worse, the manufacturers' drivers often had bugs; to work around them, application developers had to write their own drivers for the cards with buggy drivers.
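Part of what made SVGA programming painful was bank switching: framebuffers larger than 64 KB had to be accessed through a small real-mode window. The arithmetic can be sketched as follows (an illustration assuming a 64 KB window, the arrangement the VESA BIOS Extensions later standardised; the actual bank-select call was vendor-specific):

```python
# Illustrative sketch of SVGA bank-switching arithmetic: the video
# memory visible in real mode was a 64 KB window, so a pixel's linear
# framebuffer address had to be split into a bank number and an
# offset within the currently mapped bank.

WINDOW_SIZE = 64 * 1024  # size of the video-memory window

def bank_and_offset(x, y, width, bytes_per_pixel=1):
    """Split a pixel's linear address into (bank, offset).

    Before writing, the program had to ask the card to map `bank`
    into the window, using whatever mechanism that card provided.
    """
    linear = (y * width + x) * bytes_per_pixel
    return linear // WINDOW_SIZE, linear % WINDOW_SIZE

# In 800x600 at one byte per pixel, a bank boundary falls mid-screen:
print(bank_and_offset(0, 81, 800))   # (0, 64800): still in bank 0
print(bank_and_offset(0, 82, 800))   # (1, 64): next row needs bank 1
```

Because each vendor numbered modes and selected banks differently, this tiny computation had to be wrapped in per-card driver code, which is exactly the burden the paragraph above describes.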

Programming the PC was consequently a nightmare. It put many hobbyists off, and may have been responsible for the slow take-off of the PC as a multimedia platform. When developing for the PC, a large test suite of various hardware combinations was needed to make sure the software was compatible with as many PC configurations as possible.

Eventually, a new interface was devised — the DOS Protected Mode Interface (DPMI). It gave programs a flat memory model and made life for programmers a lot easier.

Meanwhile, consumers were confused and overwhelmed by the many different combinations of hardware on offer. To give them some idea of what sort of PC was required to run a given piece of software, the Multimedia PC (MPC) standard was set in 1990: a PC meeting the minimum MPC specification could be marketed as an MPC, and software that could run on a minimal MPC-compliant PC was guaranteed to run on any MPC. The MPC Level 2 and MPC Level 3 standards were set later, but the term "MPC compliant" never caught on. After MPC Level 3 in 1996, no further MPC standards were set.

The rise and rise of Windows

Windows 3.0, the first version of Microsoft Windows to achieve broad commercial success, was seen as a massive change in the way users interacted with the PC. IBM had already released OS/2 in 1987, which seemed to be the superior software product. However, much as VHS prevailed over Betamax, because Microsoft had the market share of MS-DOS and everyone already had it preinstalled on their systems, Windows 3.0 became the GUI of choice for most people and OS/2 failed to catch on. Although it initially sat on top of the MS-DOS environment and was slow and relatively difficult to use, Windows 3.0 was hailed as the way forward for the majority of home PC users.

Although Windows 3.0 borrowed heavily from Apple's Mac OS and IBM's OS/2, it revolutionised the way users operated the PC. Where in the past users had typed commands into the MS-DOS interface, they could now perform operations intuitively through a GUI, using icons.

Windows 3.0 was followed by Windows 3.1, and Microsoft eventually cottoned on to the fact that users wanted to network their PCs, including standard network protocols in the newer 3.11 version.

Adding more and more features and standardised protocols, and building on hardware support, Microsoft produced Windows 95. Before Windows 95, games and gaming were a totally MS-DOS experience: users had to put up with rebooting into DOS, fiddling with memory (see the 640 KB barrier above) and generally reconfiguring their PC every time they wanted to load a game. Windows 95 provided a system called DirectX, which gave programmers a standard API for video and sound card calls from within Windows, revolutionising the games arena. For the first time, a PC programmer could benefit from Windows 95's memory management capabilities and extended functionality, and still have API access to the many different graphics and sound cards and their drivers. 3D graphics were possible from within Windows (for those with 3dfx cards), and networked multiplayer 3D games were now within the reach of almost every programmer.

Windows 95 was soon succeeded by Windows 98, Windows ME (often regarded as the worst version of Windows ever because of its memory and stability problems) and, currently, Windows XP.

A branch of Windows meant for servers and workstations, Windows NT and its successors Windows 2000 (Workstation and Server versions) and XP have also proved popular.

Suffice it to say, Windows has dominated the desktop PC market, and almost every PC distributed by the major manufacturers comes with one flavour of Windows or another.

The PC today

A modern PC case, fancier than the traditional beige box cases used throughout the late '80s and '90s.

Nowadays, the original IBM PC is long forgotten; the term PC compatible is not used much, and usually just PC is used. Processor speeds and memory sizes are many orders of magnitude greater than on the original IBM PC, yet any well-behaved program for the original IBM PC that does not call the hardware directly can still run on a modern PC, although such programs have long since been made obsolete by the plethora of software available today. Some say that the desire for backward compatibility may have hindered the development of the PC, but many believe the ability to run legacy software is what has kept the platform alive.

The modular design makes it possible to choose every component of a PC from a variety of manufacturers and to buy only what is needed for the tasks the computer is intended to carry out. In practice, not all buyers know exactly what specification of components they need, and the evolving nature of the PC platform soon makes hardware obsolete; but thanks to the expandability of the PC, upgrades are easy. It is also possible to choose which operating system the PC runs, and what software to run on it.

Software compatibility amongst different PCs, and hardware compatibility within the PC platform, is no longer a major issue (although there are still a few glitches). The dominance of the PC platform means that most people have computers that can run the same software. Other platforms still exist today (mostly the Apple Macintosh), but they are a minority.

Thanks to more intuitive user interfaces and the information-gathering and communications capabilities of the Internet, the computer has finally escaped the domain of computer professionals and hobbyists and become mainstream.

The design of computer cases has become more elaborate, reflecting the change in the demographics of PC users, and in some cases users modify the cases themselves (known as case modding); even so, the plain beige box design that has been around since the 1980s is still produced.

There is a thriving demo scene, and a huge community of people willing to write free software; in fact, there is an entire operating system that's free — Linux.

Hardware

Configurations

A PC can come in one of the following configurations:

A desktop computer sits on top of a desk. Portability is not part of the design, so desktop computers tend to be too heavy and too large to carry. This has the advantage that the components do not need to be miniaturised, and are therefore cheaper.

File:CompaqPortable.jpg
The Compaq Portable — the first portable PC compatible computer.

Not long after the first IBM PC came out, Compaq produced the Compaq Portable — the first portable PC compatible computer. Weighing in at 28 pounds, it was more of a "luggable" than a "portable".

The Portable computer evolved into the laptop.

A modern laptop

A laptop (also known as a notebook) is a PC that has been miniaturised so that it is easy to carry and can fit into a small space. It uses a flat LCD screen that folds down onto the keyboard to create a slab-shaped object. Carrying a laptop around is easy, but the increased portability comes at a cost: to reduce size and mass, a special design with smaller components is used, and these components are more expensive than regular ones. The design is more integrated, meaning it is less expandable, although the RAM and the hard drive can usually be upgraded. Laptops are also battery powered, so as well as being smaller, the components need to have low power consumption.

File:ToshibaLibretto1100.jpg
The Libretto 1100

In 1996, Toshiba produced the Libretto range of sub-notebooks (mini-notebooks). The first model, the Libretto 20, had a volume of 821.1 cm³ and weighed just 840 g. Unlike PDAs, they were fully PC compatible. Several models were produced in the Libretto range; Librettos are no longer produced.

Components

The modular design of the PC has played a great part in making it the leading computer platform. A PC can be upgraded by adding new expansion cards or replacing existing ones.

Operating systems

The PC compatible platform did not come with a built-in operating system. Instead, an operating system was meant to be booted from a disk. Over the years, there have been several operating systems for the PC.

DOS was the first operating system for the PC compatible platform to gain widespread use. It was a quick and messy affair (the variant MS-DOS is sometimes colloquially referred to as "Messy DOS"). The operating system offered a hardware abstraction layer that, although adequate for developing character-based applications, was woefully inadequate for accessing most of the hardware (such as the graphics hardware). This led to application programmers accessing the hardware directly. As a result, each application had to ship with its own set of device drivers for the various types of hardware on offer (different printers, and so on), and when new hardware was released, its manufacturer had to make sure that device drivers for the popular applications became available. There were several variants of DOS. The most widely used was MS-DOS from Microsoft, which was also integrated with several versions of Microsoft Windows. Another variant was PC-DOS, distributed only with IBM PCs (most other PC compatibles were distributed with MS-DOS); for its early years it was almost identical to MS-DOS. More recently, free versions of DOS such as FreeDOS and OpenDOS have appeared.

The fact that MS-DOS was one of the first operating systems for the PC, that MS-DOS compatible programs were made well into the 1990s, and that it was integrated into several versions of Windows (early versions of Windows were just a graphical shell for MS-DOS) meant that MS-DOS was often considered the native operating system of the PC compatible platform.

Originally a graphical shell for MS-DOS, Windows went on to become an operating system in its own right. By the mid '90s, most PCs ran a version of Windows. Today, Windows is the most widely used OS on the PC compatible platform.

Linux is a clone of the UNIX operating system for the PC compatible platform, first released in 1991. It is distributed freely along with its source code, and its open-source nature means that anybody can contribute to it. At first it was used by only a handful of hobbyists, but by the late 1990s Linux started to make a serious dent in OS usage statistics, and it now has a significant (albeit small) share of PC compatible OS usage.


Software


See also

PC Resources

Buying a PC

Building a PC

Software downloads

Support and optimising

Miscellaneous