
  • Difference between 8-bit and 16-bit Microprocessor

8-bit vs 16-bit Microprocessor

Before explaining the difference between an 8-bit and a 16-bit microprocessor, let me first clarify what “8-bit” and “16-bit” actually mean. An 8-bit architecture is one whose CPU and ALU are built around registers, address buses, or data buses of that width. “8-bit” is also a label for the generation of microcomputers in which 8-bit microprocessors were the norm, and likewise for “16-bit”.

Let's look at some interesting concepts that lead us to the difference between an 8-bit and a 16-bit microprocessor.

What is the difference between 8-bit and 16-bit Microprocessor?

In June 1978, Intel introduced a revolutionary new processor called the 8086. It was one of the first 16-bit processor chips on the market; at that time, almost all other processors had an 8-bit architecture.

    Intel 8086 Characteristics

    3 um process
    29k transistors
    5-10 MHz
    16-bit word size
16-bit internal and external data bus
    40-pin DIP package

The 8086 had 16-bit internal registers, and people started running new software that used 16-bit instructions. It also had a 16-bit external data bus, which means it could transfer data to memory 16 bits at a time.

And the address bus was 20 bits wide, which means the 8086 could address a full 1MB (2^20 bytes) of memory. This was a stark contrast to most other chips of the time, which had 8-bit internal registers, an 8-bit external data bus, and a 16-bit address bus allowing a maximum of only 64KB (2^16 bytes) of RAM.
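To see where the 1MB and 64KB figures come from, here is a small illustrative Python snippet (the function name `addressable_bytes` is just for illustration, not anything from a real toolchain):

```python
def addressable_bytes(bus_width_bits):
    """Maximum memory addressable by a CPU with the given address bus width.

    Each address line doubles the number of distinct addresses,
    so an n-bit address bus can select 2^n byte locations.
    """
    return 2 ** bus_width_bits

# 8086: 20-bit address bus
print(addressable_bytes(20))  # 1048576 bytes = 1MB

# Typical 8-bit chips of the era: 16-bit address bus
print(addressable_bytes(16))  # 65536 bytes = 64KB
```

This is why the 8086 could use sixteen times as much RAM as its 8-bit contemporaries, even before considering how fast it could move that data.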

Unfortunately, most of the personal computer world at that time was using 8-bit processors, which ran 8-bit operating systems and software. Board and circuit designs of the day were mostly 8-bit as well, so building a full 16-bit microprocessor and memory system would be very costly.
The cost was high because the 8086 needed a 16-bit data bus rather than a less expensive 8-bit one. Since nearly all systems available at the time were 8-bit, sales of the 8086 were low, and this led Intel to conclude that people weren't willing to pay for the extra performance of the full 16-bit design.

As a result, Intel introduced a deliberately crippled version of the 8086, called the 8088. The 8088 essentially deleted 8 of the 16 data bus lines, making it an 8-bit chip as far as data input and output were concerned. However, because it retained the full 16-bit internal registers and the 20-bit address bus, the 8088 still ran 16-bit software and could address a full 1MB of RAM.
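The trade-off the 8088 made can be sketched in a few lines of Python. This is a simplified illustration, not a cycle-accurate model of the real bus logic; the function names are made up for this example:

```python
def transfers_needed(word_bits, bus_bits):
    """Bus cycles needed to move one word over a bus of the given width."""
    return (word_bits + bus_bits - 1) // bus_bits  # round up

def split_word(word):
    """Split a 16-bit word into the low and high bytes that an
    8-bit data bus would have to carry in two separate transfers."""
    return word & 0xFF, (word >> 8) & 0xFF

# 8086: a 16-bit word crosses its 16-bit bus in one transfer
print(transfers_needed(16, 16))  # 1

# 8088: the same word needs two transfers over its 8-bit bus
print(transfers_needed(16, 8))   # 2

lo, hi = split_word(0x1234)
print(hex(lo), hex(hi))          # 0x34 0x12
```

Internally both chips handle 16 bits at once; the 8088 simply pays an extra bus cycle whenever a full word has to leave or enter the chip, which is what made it cheap enough for 8-bit-era board designs.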

So the main practical difference is throughput: a 16-bit processor can move twice as much data per bus cycle as an 8-bit one, getting more work done in the same number of clock cycles.

The following two microprocessors, both introduced by Intel, led to the evolution of the 16-bit microprocessor.

First Intel processor after the 4004 series

    Intel 8008 Characteristics

    10 um process
    3500 transistors
    500 – 800 kHz
    8-bit word size
    18-pin DIP package

    Intel 8080 Characteristics

    6 um process
    4500 transistors
    2 MHz
    8-bit word size
    40-pin DIP package

    Posted under: Computers