


Microprocessor History and Background

 


The CPU ("central processing unit," synonymous with "microprocessor" or simply "processor") is often referred to as the "brain" of the computer. Together with the motherboard, the processor is the part that of your machine that will most define its capabilities and limitations.

Choosing the correct processor is vital to the success of your homebuilt computer project. The processor's speed, number of cores, architecture, amount of on-board cache memory, power consumption, heat production, and reputation for reliability, among many other factors, all figure into that decision.

In addition, you have to consider the processor and the motherboard together when planning your computer. A processor can only perform as well as the motherboard it's mounted on. So don't rush this stage of your planning, and if your budget requires you to cut costs, look somewhere else to save money. The motherboard and the processor are not parts you want to scrimp on if you have any choice in the matter.

Here's a little background about the history of microprocessors.

 

In the beginning (sort of), there was Intel

Close-up of an Intel Core i9 processor

It all began in 1971, when Intel invented the microprocessor. Or perhaps more accurately, they invented the term "microprocessor." An earlier 8-bit chip, the Four-Phase Systems AL1, had been invented by Lee Boysel in 1969 as part of a multi-chip CPU. But it wasn't called a "microprocessor" until a court case in the 1990s, when it was demonstrated that the AL1 could function as the core of a computer.

But I digress. For all practical purposes, the age of the personal computer began with Intel's first microprocessor, a tiny device that nonetheless contained roughly as much processing power as ENIAC, the room-filling machine that had been the world's first general-purpose electronic computer. The world's first commercially viable microprocessor was dubbed the Intel 4004. It was succeeded less than a year later by the 8008, which was twice as powerful.

In 1978, Intel released the 16-bit 8086 processor. The 8088, also a 16-bit chip, followed about a year later. The 8088 used an 8-bit external data bus, which let it work with the less expensive 8-bit support chips that were still in wide use at the time. IBM chose the 8088 to power their original Personal Computer, and paired it with an operating system called DOS, made by Microsoft, to make their computer do useful things.

And so it was that IBM, Intel, and a little startup company called Microsoft brought computing to the masses.

In 1985, Intel released the i386 processor. The 386 was the first 32-bit x86 microprocessor. For the first time, it made practical multitasking (running more than one program at a time) possible on IBM-compatible desktop computers. The i486, released in 1989, added an onboard math co-processor, improved data transfer, and an onboard memory cache, all of which were stunning advances in technology in that era.

The Intel Pentium processor, released in 1993, was the first x86 microprocessor capable of executing two instructions in a single clock cycle. Later releases in the Pentium line revolutionized everything from the way data is moved around on the chip to the way multimedia content is handled, most visibly with the MMX instruction set.

Those improvements have since been eclipsed, however, by Intel's line of multi-core, 64-bit processors, such as the popular Core i7 series and the staggeringly powerful Core i9 series, which boast capabilities that were unheard of only a few years ago. For a comparison of the features of Intel's most current processors, please click here.

Advanced Micro Devices (AMD)

Close-up of an AMD Ryzen Threadripper processor

Intel may have been the biggest player in the microprocessor market, but they weren't the only one.

Another manufacturer of computer chips, Advanced Micro Devices (AMD), had actually been a second-source manufacturer of Intel chips. That agreement ended in 1986, whereupon AMD started exploring new avenues for expansion.

In 1991, AMD released the first in a line of reverse-engineered processors, the Am386 series, which could perform the same tasks as Intel's processors at a much lower cost. These less-expensive chips helped bring personal computing to a lot of people who otherwise would not have been able to afford to buy computers.

AMD also developed an excellent reputation for quality control, and their chips were excellent values in terms of performance per dollar. Nonetheless, for several years after the introduction of the Am386 line, they continued to be regarded as a manufacturer of inexpensive (albeit high-quality) chips for lower-end machines. Their business model at the time was to wait until Intel set the standard with a new line of chips, then reverse-engineer it to match its performance at a lower price.

That all changed in 2003.


In 2003, AMD took the technology world by storm by introducing the world's first 64-bit x86 processors, the AMD Opteron and the Athlon 64. The Opteron was marketed more toward the high-end server market, and the Athlon 64 toward desktops and workstations.

The decision that AMD made to start the 64-bit party without inviting Intel took a lot of guts. AMD64 was a new 64-bit extension of the x86 architecture for which little software existed at the time. To this day, many popular programs still ship as 32-bit builds; back in 2003, virtually everything did. AMD was taking the "if we build it, they will come" approach.
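As an aside: if you're ever curious whether a program on your own machine is a 32-bit or a 64-bit build, here's a minimal sketch in Python (assuming you have Python installed; any recent version will do) that reports the word size of the running interpreter and the machine architecture the operating system reports:

    # Report whether this Python build is 32-bit or 64-bit, plus the
    # machine architecture as the operating system reports it.
    import platform
    import struct

    # A pointer ("P") occupies 4 bytes in a 32-bit build, 8 bytes in a 64-bit build.
    bits = struct.calcsize("P") * 8
    print(f"This Python build is {bits}-bit")
    print(f"Machine architecture: {platform.machine()}")

Fittingly, a 64-bit Windows machine will report its architecture as "AMD64," a lasting reminder of who got x86 to 64 bits first.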

AMD's gamble paid off. The roar in the IT community was deafening. It wasn't only that 64-bit computing had finally arrived that excited the tech world; it was that AMD had beaten Intel. It solidified the company's credibility as a manufacturer of innovative, state-of-the-art processors. For the first time in the history of microprocessors, it was Intel who was playing catch-up.

AMD's chips have always been popular with DIY computer builders because they are reasonably priced, perform well, and are well-supported both by AMD and by a host of motherboard manufacturers. The company itself also has a history of reaching out to the DIY market, whereas until recently, Intel marketed almost exclusively to commercial computer manufacturers, business IT departments, and the server market.

Today's AMD CPUs run the gamut from inexpensive embedded chips used in Internet appliances and other dedicated devices to extreme high-end processors like the AMD Ryzen Threadripper series, which includes some of the finest and most powerful microprocessors ever made.

For the most current information on AMD's CPU offerings, please click here.


