Wetherby Computers

CALL OR TEXT - 07876 - 51 00 43 - WE COME TO YOU


Brief History of the PC

How the PC came to be

In August 1981 IBM officially announced the Personal Computer (PC). At the time, many doubted that IBM could succeed in selling computers to the public through its stores and distributors. Some IBM staff even predicted that, because of the PC's questionable and unreliable design, and because it had not undergone IBM's usual quality tests and design procedures, no one would want to buy the computer.

One of the best things about IBM's personal computer was the open-system attitude of its designers, who reasoned that by making a full listing of the system BIOS, along with internal design and specification schematics, readily available, third-party manufacturers and designers could build expansion cards, peripheral devices and even clone systems. This would make the PC a much more viable tool in a field already swamped with 8-bit computers.

IBM took measures to ensure that the hardware design remained theirs, copyrighting the ROM BIOS and protecting other hardware. The PC was shipped with an operating system called PC DOS, ported to the PC from QDOS, which Microsoft (then just Bill Gates and Paul Allen) had purchased from Seattle Computer Products.


The first PCs were shipped in desktop cases housing full-sized motherboards with an Intel 8088 processor able to address up to 1 MB of memory (though far less was actually fitted), a BIOS but no real time clock (RTC) or battery-backed CMOS, an expansion card for output to a monitor (usually monochrome green or amber, although colour was supported), a keyboard controller and keyboard, a single-sided double-density 5.25" floppy drive and floppy disk controller, a parallel port and two serial ports, and a copy of DOS. The motherboard offered five expansion slots for input/output devices, although some would already be in use.


Printers were also available. The most common were daisy-wheel printers, which struck characters against a typewriter ribbon to leave an impression on the paper, which was fed through a roller by sprockets aligning with holes along the edges of continuous paper known as fanfold.


Everything about the PC was focused mainly on an office environment and mimicked the functions and procedures of tasks carried out by the workforce: typing letters (word processing), keeping records and files (databases) and working with tables or charts (spreadsheets and graphics). Systems analysts and programmers would create a logical representation of the flow, inputs, processes and outputs of the business and build a working computerised model of it.

As the PC and its software gained popularity, more software and hardware manufacturers produced for it. Two of the most useful expansion cards for the original PC were the real time clock and the hard disk controller. The real time clock allowed the computer to keep track of the date and time once booted into the operating system. Hard disk drives, paired with controller cards (early examples built by Western Digital) to operate them, enabled operating systems, applications and data to be stored on a single disk. Once the hard disk appeared for the PC, software houses supplied software that could be installed to the hard disk instead of running from the floppy drive: no more floppy disk swapping.


The first breakthrough to make a real difference to users was more RAM: from 1 MB to a possible 16 MB of system memory. Two main standards, expanded memory and extended memory, competed to become the de facto standard; each was incompatible with the other and, in the beginning, they could not operate at the same time in the same system. Software houses supplied versions of programs that utilised this extra memory, often as data workspace, so the PC was able to run more powerful applications and work with more data, faster, because there was no need to swap or fetch it from disk.


Keyboards started to appear with separate numeric keypads and, shortly after the release of Microsoft Windows, pointing devices such as trackballs, joypads and mice became very popular. Earlier versions of Windows were not as useful as the enhanced mode offered by Windows 3.0, which allowed the processor to switch to a mode in which multiple applications could execute simultaneously, sharing system devices, resources and eventually data. This meant that a word processor could be started alongside a database application and so on, as long as there were enough resources. Nor was memory restricted to the physical RAM fitted, as the hard disk could be used to page (move) working data to and from the drive as required.


One of the best features of the Microsoft Windows front end was that it allowed hardware manufacturers to supply a driver that only needed to work with one program, namely Windows, which software writers could then access from within their applications without needing to know how to program that device's low-level functions. As long as a programmer could tell Windows what to print or display, the driver would know how to instruct the hardware. No more loading drivers into every application that accessed a piece of hardware.


Sound cards soon appeared as the next useful addition that would take up a free slot and an IRQ; adding multimedia capability made the PC all singing and all dancing. Around the same time, graphics cards and monitors with higher resolutions and deeper colour depths were appearing. The expansion slots, and the cards that populated them, advanced from 8-bit cards to 32-bit bus-mastering devices capable of talking to any other device, the system RAM or the operating system without the need for the CPU's intervention.


A pattern emerged in which the hardware demands of software were pushing advances in design and manufacturing, and obviously boosting sales of computer hardware. Every computer manufacturer in every sector was striving to be the company that emerged with the next mind-blowing, got-to-have-it, voted-to-be industry standard computer upgrade (the same went for software houses).


As a result of all the new technology changes, consumers became paranoid when purchasing, for fear of being sold "old technology", and when Microsoft changed the naming convention of its software from version numbers to the year of release (Windows 95), it drove home the true meaning of "out of date".


Another problem to emerge from the rapid advances was dead-end systems, which had limited or no upgradeability. Every time Intel released a new processor, a new motherboard and various support components were necessary, causing frustration to system upgraders. Purchasing a new motherboard also meant buying new RAM, a new graphics card and so on if the new technology was to be fully exploited (some manufacturers maintained more backward compatibility than others, depending on the market they were selling to).


The original message from IBM, that your personal computer would never be obsolete, was based on the fundamental principle of backward compatibility (but not necessarily forward compatibility). A good thing and a bad thing? New technological advances sometimes mean breaking away from the way things are normally done, ending backward compatibility unless careful thought is put into the planning and implementation.


Backward compatibility is a good thing if newer personal computers can use old hardware; however, it hinders breakthrough ideas if the only choice is to keep extending the original IBM PC design. Manufacturers have been caught out in the past, trying to lead the way only to be bowled over or left behind by rapid advances; designs were out of date before they were even completed. For a long time it was traditional to follow Intel, who determined the speed and direction of the personal computer, but all along, in the background, was the consumer, who could make or break an upgrade.


Being the first to release to consumers also meant the oldest technology, design and features, and, more importantly, that the components used in manufacturing were old. Some designs chose the latest, riskier, untested components; others stuck with what they knew. Success became more dependent on how many people used the product. A new trend appeared: in their rush to release, manufacturers were putting products into the consumer market that had been neither properly tested nor rigorously proven alongside existing designs (again, the same went for software houses). Now you see how things can go wrong with the best of intentions.


In more recent times, hard drive capacities have increased along with speed and reliability, allowing more business-critical applications and data to be kept on the PC. Software houses added more functionality to their programs, and installing them onto a computer often meant feeding dozens of floppy disks into the drive: an incentive to adopt the CD-ROM technology already used by the music recording industry.


Bigger and better software products soon emerged. Some merely included massive movies and rolling intros, but who could blame them, given all that space? It was not long before CD writers became popular and soon everyone was copying their own data to CD. Distributing data on this form of media became the standard adopted by most publishers.


Flatbed scanners also advanced drastically in design and, as a result, became more cheaply and readily available as a useful upgrade for importing data from the physical world into electronic form. Optical Character Recognition (OCR) became commonplace and, all of a sudden, you had the capability to turn a scanned page from a book into editable text.


Printer designs changed from impact designs to inkjet and laser technology. Sound cards have added three-dimensional sound and a realism to multimedia that is startling. The AGP graphics card has once again put the PC in front of the games consoles (and always will, if manufacturers strive to maintain backward compatibility with every new design).

Today, present memory technology and design cannot feed processors fast enough and is the major bottleneck in the PC system.


What is next? Who can really say? Consumers are striving for "future-proofing", which just isn't possible. Who can say what will be the industry standard speed, performance or capacity of, say, a hard disk drive in two or three years' time?

The future isn't written yet, and consumers are steering the entire industry. At present the future looks like the futuristic world of wireless connections: Bluetooth, Blu-ray, Skype and the like.


People want the future, now. Video conversations at real-time speeds are possible with high-speed broadband internet connections, which are about to hit 24 megabits per second. What does the consumer want next year? Figure that one out and you could make a lot of money, just as IBM did when it guessed at the future over twenty-five years ago.

Call Us: 01937 584136 - Visit Us: Mon - Fri 9.00-6.00 - Sat 10.00 - 4.00