4

I have always wondered why computer architecture uses $2^n$ bits. We have 8 / 16 / 32 / 64-bit microprocessors, and for that matter other parts of the computer also come in power-of-2 bit widths.

The only reasoning I could come up with is that the design process usually starts from a smaller number of bits. For example: say I want to design a full adder to add 16-bit numbers. I would first design a digital circuit that adds two bits (one from number A and the other from number B), then replicate this circuit 16 times, which gives me a 16-bit full adder.
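To illustrate that replication idea in software (a minimal C sketch of the logic, not the actual gate-level design; `full_adder` and `ripple_add16` are just illustrative names), a 16-bit ripple-carry adder is exactly sixteen copies of a one-bit full adder chained through the carry:

```c
#include <stdint.h>
#include <stdio.h>

/* One-bit full adder: adds bits a and b plus carry-in,
   returns the sum bit and writes the carry-out. */
static unsigned full_adder(unsigned a, unsigned b, unsigned cin, unsigned *cout)
{
    *cout = (a & b) | (a & cin) | (b & cin);  /* carry-out (majority)   */
    return a ^ b ^ cin;                       /* sum bit                */
}

/* 16-bit ripple-carry adder: the same one-bit cell replicated 16 times,
   each stage feeding its carry into the next. */
static uint16_t ripple_add16(uint16_t a, uint16_t b)
{
    uint16_t sum = 0;
    unsigned carry = 0;
    for (int i = 0; i < 16; i++) {
        unsigned bit = full_adder((a >> i) & 1u, (b >> i) & 1u, carry, &carry);
        sum |= (uint16_t)(bit << i);
    }
    return sum;
}

int main(void)
{
    /* 12345 + 6789 = 19134, same as the ordinary + operator */
    printf("%u\n", (unsigned)ripple_add16(12345, 6789));
    return 0;
}
```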

Is my reasoning correct? Is there some other reason as well?

SimpleGuy
  • 215
  • 5
  • 11

2 Answers

2

Not necessarily. For instance, look at the memory bus widths in modern-day GPUs: they can have values of 192, 384, or 768 bits (while 128, 256, etc. are common).

One could argue that 192 is a sum of powers of 2 (2^7+2^6), sure, but what number isn't?

That being said, any memory module directly associated with a microprocessor or microcontroller will have a capacity that is a power of 2. Traditionally, registers are 8, 16, 32 or 64 bits long (though 10-bit and other non-conventional widths exist).

A processor (or controller) with an n-bit address register can address up to 2^n locations (i.e. addresses 0 to 2^n - 1). Hence, without changing the processor, one can extend the memory associated with it, although adding memory beyond the address space the processor can manage is pointless.
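A quick way to see how the address space grows (a small C sketch of the arithmetic only, not any particular processor's behaviour):

```c
#include <stdio.h>

/* Number of distinct addresses reachable with an n-bit address:
   2^n locations, numbered 0 .. 2^n - 1. */
int main(void)
{
    for (unsigned n = 8; n <= 32; n += 8) {
        unsigned long long locations = 1ULL << n;
        printf("%2u-bit address -> %llu locations (0 .. %llu)\n",
               n, locations, locations - 1);
    }
    return 0;
}
```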

In many DSPs, the memory associated with the processor comes in 2^n multiples, as many signal-processing algorithms can be made very efficient when working with such sizes.
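One common illustration of this (assuming the kind of trick the answer has in mind is something like circular-buffer addressing; `BUF_SIZE` and `next_index` below are hypothetical names) is that a power-of-2 buffer length lets the wrap-around be a single bit mask instead of a modulo:

```c
#include <stdio.h>

#define BUF_SIZE 256u                 /* power of two */
#define BUF_MASK (BUF_SIZE - 1u)      /* 0xFF */

/* Circular (ring) buffer index update.  Because BUF_SIZE is a power of
   two, the wrap-around is a single AND with a mask instead of a costly
   modulo/division -- one reason DSP code favours 2^n buffer lengths. */
static unsigned next_index(unsigned i)
{
    return (i + 1u) & BUF_MASK;       /* equivalent to (i + 1) % BUF_SIZE */
}

int main(void)
{
    printf("%u\n", next_index(254));  /* 255 */
    printf("%u\n", next_index(255));  /* wraps back to 0 */
    return 0;
}
```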

d34df3tu5
  • 21
  • 1
-1

Some say that the IBM STRETCH (1961) introduced 8-bit bytes:

http://en.wikipedia.org/wiki/IBM_7030_Stretch

Until the late 70s, computers used other sizes and multiples, for example a 36-bit word = 6 characters of 6 bits. Early computers used only uppercase letters and numbers, so using 6 bits per character was more efficient.

Anyway, ask Wikipedia: http://en.wikipedia.org/wiki/12-bit, http://en.wikipedia.org/wiki/18-bit...

There can be several explanations for the fact that all modern computers use 8-bit bytes and multiples thereof. All monolithic microprocessors used them (8080, 6502...), and 7- or 8-bit ASCII was necessary for text in uppercase and lowercase with internationalisation. Using powers of two all the way down must also be appealing to computer designers...

Grabul
  • 1,900
  • 10
  • 12