Monday 3 July 2017

A Brief History of Computer Architectures

Modern computers started their journey with 8-bit processors. You might still remember the great 8085 microprocessor, which was completely programmable and was quite a craze among scientists and engineers. Then came 16-bit computing. Those were the days of console gaming. While everything else was 16-bit as well, we remember 16-bit applications mostly for the console games and the mass adoption of personal computers in large organisations. As there was no standardisation, and even Microsoft Windows was yet to get a proper foothold in the market, organisations developed their own 16-bit applications, or software, to use internally.

Presently, we have 32-bit and 64-bit architectures, with the 32-bit architecture slowly becoming outdated. We can see how Microsoft encourages using the 64-bit versions of its operating systems. 128-bit and 256-bit computing are not too far behind either: we can already see 128-bit registers being used in certain computing environments, such as SIMD instruction sets, and 256-bit encryption is the standard for digital security. Soon, the whole world will move to these architectures.
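
If you are curious which of these architectures your own machine uses, a quick check is easy. Here is a minimal Python sketch that reports whether the running interpreter is a 32-bit or 64-bit build, and what the underlying hardware reports itself as:

    import platform
    import struct

    # Pointer size is 4 bytes on a 32-bit build and 8 bytes on a 64-bit build.
    bits = struct.calcsize("P") * 8
    print("This Python interpreter is %d-bit" % bits)

    # The underlying hardware architecture, e.g. 'x86_64' or 'AMD64'.
    print("Machine:", platform.machine())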

Anyway, remember the 16-bit applications that large companies built for their internal use? These applications can't run in a modern environment; 64-bit versions of Windows, for instance, dropped the subsystem that ran 16-bit programs. So the companies are left with two choices: either stick with a completely outdated 16-bit computing system or upgrade the 16-bit application to a modern architecture. The first option leaves the company vulnerable to cyber attacks and also prevents it from taking advantage of the progress the computer industry has made over the last decade. Thus, upgrading the application is the wise choice.
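
Before planning such an upgrade, a company first has to find out which of its old binaries are actually 16-bit. As a rough illustration (a header heuristic, not a full parser), the following Python sketch distinguishes 16-bit NE/MZ executables from modern 32/64-bit PE files:

    import struct
    import sys

    def executable_format(path):
        # Classify a DOS/Windows executable by its header signature.
        # A rough heuristic for sorting old binaries, not a full parser.
        with open(path, "rb") as f:
            header = f.read(64)
            if len(header) < 64 or header[:2] != b"MZ":
                return "not an MZ executable"
            # e_lfanew (at offset 0x3C) points to the 'new' header, if any.
            e_lfanew = struct.unpack_from("<I", header, 0x3C)[0]
            f.seek(e_lfanew)
            sig = f.read(4)
        if sig == b"PE\x00\x00":
            return "PE (modern 32/64-bit Windows)"
        if sig[:2] == b"NE":
            return "NE (16-bit Windows 3.x)"
        return "plain DOS MZ (16-bit)"

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            print(path, "->", executable_format(path))

Running it over a directory of legacy .exe files gives a quick inventory of which binaries would need porting.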
