History of Processors
01001100 01100101 01110100 00100111 01110011 00100000 01100101 01111000 01110000 01101100 01101111 01110010 01100101 00100000 01110100 01101000 01100101 00100000 01010000 01110010 01101111 01100011 01100101 01110011 01110011 01101111 01110010 01110011 00100000 01100110 01100001 01101101 01101111 01110101 01110011 00100000 01100110 01101111 01110010 01101101 01110011 00100000 01110100 01101000 01110010 01101111 01110101 01100111 01101000 00100000 01110100 01101001 01101101 01100101 00100001
Hover over Bit or click here to see what he says! Let's explore the Processor's famous forms through time!
Vacuum Tubes
Before the invention of the transistor in 1947, the first computer had already made its appearance. The Electronic Numerical Integrator and Computer, or simply ENIAC, was the first general-purpose computer. The University of Pennsylvania, specifically its Moore School of Electrical Engineering, completed the build in December 1945, just after World War II had finished. Its purpose: making calculations and solving complex numerical problems. The ENIAC relied on a massive list of components to perform its calculations, the most important being the 18,000 vacuum tubes which performed the switching task that transistors do nowadays. However, these tubes would burn out fairly often, making them less reliable. The majority of the failures happened when the tubes were warming up or cooling down, the moments when they were under the most thermal stress. "We had a tube fail about every two days and we could locate the problem within 15 minutes."
The ENIAC was a big unit! The panels alone took up a space of 9m (30ft) by 15m (50ft). That's two average school classrooms combined!


Look, I may be a smart Byte, but even I don't fully understand this ENIAC program and how it works... Hats off to these amazing women who stood at the starting line of programming; I am eternally grateful to you all! Please, a round of applause for Kathleen McNulty Mauchly Antonelli, Jean Jennings Bartik, Frances (Betty) Snyder Holberton, Marlyn Wescoff Meltzer, Frances Bilas Spence, and Ruth Lichterman Teitelbaum!
Learn more about Computers using Vacuum Tubes
Von Neumann Architecture


EDVAC
The Electronic Discrete Variable Automatic Computer was the first computer built using the new Von Neumann Architecture. After the design was finished, the government signed a construction contract in April 1946, ordering the build of the system. Ultimately, in 1949 the EDVAC was delivered to the Ballistic Research Laboratory. It became the first binary serial computer, capable of addition, subtraction, multiplication, and division, with a memory capacity of 1,024 44-bit words. In modern-day terms, that is equal to about 5.6 kilobytes, roughly the size of a small text file. The EDVAC was much smaller, yet already stronger, than the ENIAC. With almost 6,000 vacuum tubes and 12,000 diodes, it consumed just 56 kilowatts of power. That is almost one third of what the ENIAC had in components and power consumption. Not only that, it also used less space, with just 45m2 (485ft2), and weighed in at almost 8 tons. Furthermore, the EDVAC was capable of floating-point arithmetic after receiving a few upgrades in 1958, which means it could calculate things such as 2.2 * 3.25. Before that, only whole integers could be used in its calculations. In the end, the EDVAC ran from 1949 until 1962, when it was retired and replaced by the Ballistic Research Laboratories Electronic Scientific Computer. Read More about the EDVAC on Wikipedia...
EDSAC
The Electronic Delay Storage Automatic Calculator was an early computer made by Cambridge University and was heavily inspired by the First Draft of a Report on the EDVAC. Maurice Wilkes and his team started the construction of the EDSAC in 1947 and finished it in early 1949, with its first program running on May 6th, 1949. It used a different technique to store data in memory; instead of vacuum tubes, they opted for mercury delay lines. For the logic, they still used vacuum tubes. By drastically reducing the number of vacuum tubes, they were able to cut the power consumption back to only 11 kilowatts of electricity. In the end, the EDSAC ran from 1949 until 1958, when it was superseded by the EDSAC 2. Unlike the ENIAC and EDVAC, the Electronic Delay Storage Automatic Calculator was primarily used in university research, supporting the Cambridge University Mathematical Laboratory. This system played a pivotal role in the research of many renowned scientists. It helped Nobel Prize winners John Kendrew and Max Perutz (Chemistry, 1962), Andrew Huxley (Medicine, 1963), and Martin Ryle (Physics, 1974), who all acknowledged the role that the EDSAC's computing power had played in their research. Currently, there is a project going on at the National Museum of Computing to replicate the original EDSAC. If you love a bit of history, and you live in the UK, it is definitely worth a look! If you don't live in the UK, don't worry, the museum has a webpage set up with more information on the project and its current state. Read More about the EDSAC on Wikipedia...
From Printed Circuit Boards
To Integrated Circuits
With vacuum tube computers already in use by major universities and government institutions, the demand for these mathematical geniuses outside academia and government rose drastically. However, the cost of maintenance and tubes could not realistically be covered by most private individuals or even commercial companies. That is even without taking the sheer size of these computers into consideration. Fortunately, during the 50s, a new type of computer entered the playing field which enabled small to mid-sized companies to purchase computing power. Two decades later, a new advancement in processor technology would pave the path for the home computing that we all know, love, and use today. At the center of these advancements is the transistor, invented in 1947. This quicker and more power-efficient component would dominate the processor industry for years to come.
Transistors on a Printed Circuit Board
Around the mid-50s, two major companies developed the next generation of computers by replacing the vacuum tubes with transistors. They created modules: printed circuit boards with transistors, resistors, capacitors, and other electronic components soldered directly onto them. Each type of module had a specific feature or role within the computer. A system would be made up of these modules, and such a system could easily be expanded in functionality, memory, or processing capacity by adding more modules. One of the companies, IBM, even created the IBM Standard Modular System, used in the IBM 1401 introduced in 1959, which remained in use with legacy systems well into the 70s. Digital Equipment Corporation was a fierce competitor of IBM at the time. They introduced their first fully transistorised computer, the Programmed Data Processor-1 (or PDP-1), in 1959. This machine became pivotal in computer history. Many applications that we deem the norm today, such as Spacewar! (one of the earliest video games), were first created on or for this system.
The PDP-1 is famous as the computer at the heart of the early hacker culture at MIT. The first reference to malicious hacking came from MIT's student newspaper, The Tech, talking about "telephone hackers". Apparently, a few student hackers from MIT were tying up the telephone lines between MIT and Harvard, configuring the PDP-1 to make free calls, and accumulating large phone bills.
Transistors in an Integrated Circuit
During the continued development of minicomputers, a shift in electronic circuit technology happened. Since the late 50s, around the time when the first transistor computers came to market, pioneers like Robert Noyce at Fairchild Semiconductor had been experimenting with a new way of creating an electronic circuit; one so tiny that it cannot be seen by the naked eye. This new technology would be the breakthrough that made computers more affordable and smaller and, over time, brought them to our wrists and pockets. Integrated circuits ushered in a new era of computing technology. So, what is an integrated circuit? The science and technology behind it is very complex, but here is an attempt at a simplified explanation. Integrated circuits are like teeny tiny cities with buildings and roads made in silicon. These buildings can be transistors, resistors, and capacitors, with little roads interconnecting them. Electricity travels between and through the buildings, resulting in a miniature, fully working electrical circuit. Essentially, it is the same as a printed circuit board with the electronic components soldered onto it, but much, much, much smaller. Nowadays, we cannot live without this technology. The majority of electronics that we use today contain these integrated circuits, and not only processors; ICs are used for a wide variety of components. Do you play video games using a graphics card? Yep, that has ICs in it. Saving pictures and files to your SSD? You bet that's made of ICs. Cameras, whether on your phone or as a dedicated camera body, are also littered with ICs. Maybe, one day, I will attempt to explain in much more detail how integrated circuits work and the technology used to make them. For now, just know that an integrated circuit is a miniature electrical circuit. However, if you want to learn more about it, and don't want to wait for me to tell you, I have a few resources that you might find interesting. The materials are complex, just so you know what you're getting into. Learn more about integrated circuits, the metal-oxide-semiconductor field-effect transistor, the applications of MOSFETs in the integrated circuit industry, and ASML's role in the semiconductor industry.
A Timeline of Microprocessors
Since the introduction of the first general-purpose microprocessor in 1971 by Integrated Electronics (aka Intel), processor technology has advanced at a rapid rate. Gordon Moore, one of the founders of Intel, observed that processors would double their transistor count roughly every two years, increasing in speed and operating capacity with each cycle. This observation was later named after him: Moore's Law. Processor technology hasn't disappointed, and still mostly adheres to this observation. Hence, to keep track of the microprocessors from history, this timeline highlights the most popular CPUs of the past 50 years.
4-bit
Important CPUs: Intel 4004 (1971), Intel 4040 (1974)
The Intel 4004 is generally considered the first commercially available general-purpose microprocessor, made possible by integrated circuit technology. With an astounding 2,300 transistors fitted onto a 12mm2 die, roughly the size of a small nail head, having a computer at home suddenly became a possibility. The chip was fairly small and still powerful enough to cater to the users' needs of that time. While it wasn't used much in the personal computer sphere, it did show the potential that integrated circuits would provide to make this a reality. The Intel 4040 was released three years later as an improvement upon the Intel 4004. Both these chips were designed by the same person, Federico Faggin, who would design a few more impressive CPUs in the coming years.
What was your first Processor or Computer? Let me know in the comments!
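If you'd like to see how quickly that doubling adds up, here is a little back-of-the-envelope sketch in Python (purely illustrative): it starts from the 4004's roughly 2,300 transistors in 1971 and simply doubles the count every two years. Real chips wobble around this curve, so treat the output as a rough projection rather than a measurement.

```python
# Rough Moore's Law projection: start at the Intel 4004 (1971, ~2,300 transistors)
# and double the transistor count every two years. Purely illustrative.

START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> int:
    """Project a transistor count for `year` by doubling every two years."""
    doublings = (year - START_YEAR) // DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,} transistors")

# 2021 comes out around 77 billion, which is in the same ballpark as the
# biggest consumer chips of the early 2020s. That is why Moore's Law has
# held up so remarkably well as an observation.
```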
8-bit
Important CPUs: Intel 8008 (1972), Intel 8080 (1974), Motorola 6800 (1974), MOS 6502 (1975), Zilog Z80 (1976)
Not long after the release of the Intel 4004, the Intel 8008 took the stage as the first consumer-grade 8-bit CPU. Originally, this processor was not intended to be released as a standalone CPU for the market. Computer Terminals Corporation commissioned Intel to create a CPU for their Datapoint 2200, because they didn't have the capacity to design and manufacture processors themselves. However, since the processor didn't meet the requirements, CTC allowed Intel to use the design for themselves. After rebranding it, Intel released the Intel 8008 in 1972. The performance of the chip was dreadful, but it was the first 8-bit processor nonetheless.
The Origin Story of the Intel 8008, the first commercial 8-bit processor.
The Seed of the Personal Computer
In 1968, Computer Terminals Corporation (later renamed Datapoint) made its entry into the computer world. They wanted to bring a more advanced terminal system to smaller companies and even the homes of people, so they could experience the power of the computer. With three prototypes of the Datapoint 3300, they took the market by storm, to the point where they had to use other manufacturers, even one that made motorcycle helmets, to get the production of these systems flowing consistently. A year later, CTC wanted to improve upon their design. The founders, Phil Ray and Gus Roche, envisioned an even more intelligent terminal system. Two of their employees started working on a new instruction set architecture for a processor to power the new Datapoint 2200. However, CTC didn't have the capacity to come up with a CPU design that would fit the developed architecture. Since they were already using chips from Intel and Texas Instruments for their previous computer, the most logical step was to go to these companies.
Creative Business Practices
However, the company didn't have a lot of money on hand. In fact, they had outstanding invoices with the two chipmaking giants. Fortunately, that didn't deter Ray and Roche from coming up with an interesting gamble. The two arranged a dinner with the president of Intel, Robert Noyce, and the president of Texas Instruments. Before this meeting, Ray drew the schematics for a microprocessor and the new instruction set architecture on two postcards, and during the dinner he gave one to each president with a bet: the first company to create a computer on one chip would forgive CTC's outstanding debts. But why should the winner of the bet forgive these debts? CTC had bought, and still would need, a lot of chips and components from these tech giants. Without the funds to pay for it all, they had racked up some serious debt. So, CTC wasn't just looking for a way to source a new computer, they were also looking for a way to cut costs. The idea here was that the company would do business with the winner, meaning that whichever company could produce the chip would also be the one selling these chips to CTC. Ray and Roche believed that this would be the future of computers and that there was a huge market for this type of system, which would bring a lot of profit to the winner as well. Initially, Noyce questioned this approach. At the time, Intel was selling a lot of computer components; putting all the components in one processor would surely reduce sales, and therefore Intel would undoubtedly lose money on it. Not entirely a wrong assumption for that time, but we all know how it turned out in the end. Still, both companies agreed to the bet and got to work. Texas Instruments was the first to back out of the bet: by late 1970 or early 1971, they couldn't deliver a stable and reliable processor. By this time, Intel still had not produced a demo product, so CTC decided to go down the transistor computer route instead, which they could do themselves. This is the reason why the Intel 8008 instruction set was first used in a transistor computer, rather than a microprocessor, for the first year of its existence.
Rebranding a "Failed" Product
By the end of 1971, Intel managed to come up with a processor that was capable of running the code. However, CTC decided not to go through with the CPU Intel had made. In the demonstration, the CTC 1201, as this processor was called at the time, did not meet the performance requirements of CTC management. Furthermore, it needed a lot of extra accessory chips to make it work, and CTC wouldn't be able to meet the release date for the new Datapoint 2200. However, CTC had effectively contracted Intel to make the chip design and to produce it. So, to avoid paying the costs of the commission, they released Intel from their contract, allowing them to use the CPU design to make a product themselves. Within a few months, Intel had rebranded the processor to the Intel 8008 and marketed it as the first 8-bit CPU of its time. Noyce's concerns at the time were absolutely valid. No one could ever have predicted how popular the successors of the Intel 8008 would become. For all they knew, they would lose a lot of business with fewer chips needed in computers. Fortunately for Intel, they became the leading company in the advancement of processor technology, which has definitely contributed to their solid position in the CPU market to this day.
Did you know that the WDC 65C02 by Western Design Center is still in active use to this day? This little 8-bit CPU (and its bigger 16-bit brother, the WDC 65C816) protects the lives of many people with embedded heart defibrillation and pacing systems! They consume a small amount of power and are very stable, making them the perfect candidates for healthcare applications. Not only that, many hobbyists love toying with these chips! Just like our writer here, who wants to Build a Computer From Scratch.
The 8-bit processors laid a foundation in the computer industry that no other CPU can match to this day. They started the revolution which gave us the home computer and the gaming console, which ultimately led to the advancement of the technology as a whole. Without these chunky CPUs, we never would have had smartwatches or smartphones. These devices are much faster and smarter than our 8-bit friends, but they wouldn't have existed without them.
Want to learn even more about 8-bit CPUs and Computers?
I see you want to learn more about 8-bit CPUs and Computers, that's awesome! I have accumulated some resources for you. Some are reading materials, others are YouTube channels that cover this type of information and projects. Have a look!
Microprocessor (Wikipedia); this covers all microprocessors, not just 8-bit, but it is still just as interesting!
Microprocessor Chronology (Wikipedia); an extensive list of all the microprocessors created to this day (or roughly so...).
The 8-Bit Guy; an awesome YouTuber covering just about everything there is to know about old hardware.
Adrian's Digital Basement; another awesome YouTuber covering old hardware.
There are probably a great many more resources, but this should get you started at least.
- ShadowPhoenix, the Writer
Intel x86
In 1978, Intel released their now-famous Intel 8086. This processor marks the start of the Intel x86 series, which powers most modern personal computers to this day. The Intel 8086 was designed to be compatible at the assembly level with its 8-bit predecessors. This meant that programs written for the Intel 8008, 8080, and 8085 (1977) could be translated to run on the Intel 8086 without rewriting them from scratch. It was one of the earliest cases of backwards compatibility in computers. A year later, the Intel 8088 was released and became hugely popular because it was the CPU of choice for one of the very first personal computers: the IBM PC. With their subsequent releases of the 80186, 80188, 80286, and the 80386, their first 32-bit CPU, Intel became the most dominant CPU player in the PC market thanks to the processor family's backwards compatibility.
Did you know that the first computer of Dimitris, the founder of World Anvil, featured the Intel 80286? Its standard clock was 6MHz and it had a turbo up to 12MHz, which was really fast for its time. To give an idea of how fast computers are nowadays, the Intel Core i5-13400 (an average Intel Core i5) natively runs at 3.3GHz, over 500x faster than the 80286's standard clock, and has a turbo up to 4.6GHz. The 80286 had 120,000 transistors, whereas the i5-13400 has 14,200 million transistors; that's over 100,000 times more transistors!
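For the curious, the ratios in that comparison are easy to check yourself. Here is a quick Python sketch using exactly the figures quoted above, so any imprecision in those figures carries over:

```python
# Quick check of the 80286 vs. Core i5-13400 comparison above,
# using the figures quoted in the text.

mhz_80286 = 6                    # 80286 standard clock (MHz)
ghz_i5 = 3.3                     # i5-13400 base clock (GHz)
transistors_80286 = 120_000
transistors_i5 = 14_200_000_000  # 14,200 million, as quoted above

clock_ratio = (ghz_i5 * 1000) / mhz_80286
transistor_ratio = transistors_i5 / transistors_80286

print(f"Clock speed: ~{clock_ratio:.0f}x faster")         # ~550x
print(f"Transistors: ~{transistor_ratio:,.0f}x as many")  # ~118,333x
```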
Intel's Extra Peripherals
Rise of Motorola
Shortly after 16-bit processors became available for consumers to purchase, the first 32-bit CPUs started to appear. One of the most prominent 32-bit processors came from Motorola. Their 8-bit M6800 was hugely popular at the time, but sales were already dwindling, especially with fierce, cheaper competitors (primarily the Zilog Z80 and MOS 6502) taking over a good chunk of the market. By 1976, Motorola was aware of Intel and Zilog working on 16-bit CPUs, the Intel 8086 and the Z8000 respectively. Instead of trying to compete with them directly in the 16-bit space, the company opted to aim directly for a 32-bit processor to get ahead in the game. Early on in the development, Motorola abandoned the idea of backwards compatibility with the M6800; they felt that its 8-bit design would be too limiting for a new and innovative 32-bit processor. Their primary goal was winning back computer vendors, such as Apple and AT&T, by immensely improving the performance, aiming for a speed of 1 MIPS. Free from the 8-bit limitations, they presented the world with the Motorola MC68000, otherwise known as the Motorola 68k or the m68k, in 1979, competing directly against the Intel 8088 used in the IBM PC. The m68k powered many systems, such as the iconic Apple Lisa and the Apple Macintosh, two of the first mass-market systems to run an operating system with a graphical user interface. Starting in the early 1990s, the use of the m68k family in new systems declined as the PowerPC family, another 32-bit CPU made in collaboration between Apple, IBM, and Motorola, made its way into new iterations of Apple computers throughout the 90s.
Did you know that at one point there were more MC68020 processors in embedded equipment than Intel Pentiums in PCs?
Improvements on the M68k
Intel and AMD; From Partnership to Competitors
In the meantime, Intel didn't sit still. While they were still primarily riding on the success of the 16-bit processors of the x86 family in the early 80s, they had started the development of a 32-bit version of the same instruction set architecture. Three years after the release of their latest 16-bit CPU, the Intel 80286 in 1982, the company introduced the Intel 80386, which was later renamed the i386. It followed in the footsteps of the Intel 8088, powering many personal desktops and company workstations as well. The i386 was widely adopted, not just in personal computers, but also in embedded systems. The processor, or one of its many derivatives, is common in aerospace technology and electronic musical instruments, among others. Its successor, the i486 from 1989, was the first x86 processor to exceed the one-million-transistor mark on its CPU die.
Did you know that the i386's production was discontinued in 2007? This chip had been produced for over 20 years! The Linux kernel supported this chip the longest, up until the very end of 2012.
Up until this point, Intel had a partnership with AMD and a few other companies, dating back to the early 80s. In order to seal the deal with IBM to put the Intel 8088 in their flagship personal computer, Intel had to accept one important condition: IBM wanted to make sure there was a second-source manufacturer capable of producing the processor. Back in those early days, it was a common practice to license other companies to manufacture and sell components originally designed by another company. This was done to make sure that there would always be someone capable of producing the necessary components, so a popular and profitable product could always be made, even if, for example, the primary production facility burned down. Thus, AMD had the designs of the Intel microprocessors on hand and was allowed to make them. While the processor designs of the early x86 family were licensed, Intel decided not to do this for their new flagship line of CPUs: the Intel Pentium. From 1993 until 2003, the Intel Pentium series sealed the domination of Intel in the personal and business computing market. Ranging from personal desktops to the first laptops, and from workstations to early handheld devices: nearly everyone was using Intel microprocessors. Over the course of these 10 years, their CPUs increased in complexity, which is determined by the number of transistors, and capability, which is determined by the number of instructions a CPU can execute per second (aka MIPS), by at least three orders of magnitude. Intel may have dominated the market during this time, but they still had good competition from IBM and Motorola, whose PowerPC line was powering many of Apple's machines. Furthermore, AMD wasn't keeping quiet either. Even though they didn't have any licenses for the Intel Pentium series, they still had the previous x86 designs as a blueprint for their new designs. In 1991, just before the release of the Intel Pentium, AMD introduced a new CPU that was compatible with the i386, the Am386, designed by their own engineers. From this point, AMD became, and still is, a direct competitor of Intel, forcing the company to innovate and to create better microprocessors. The AMD K5 (1996) and AMD K6 (1997) were the predecessors that led the team towards the fastest processor of its time: the iconic Athlon line, launched in June 1999, which was the fastest x86 CPU for two years and broke the 1GHz speed barrier.
64-bit
Important CPUs: Intel Pentium 4, AMD Athlon 64, Intel Core Duo, AMD Athlon 64 X2, Intel Core Series
During the early 1990s, a few companies were already experimenting with a 64-bit architecture for their CPUs. The majority of these companies focussed on delivering microprocessors for large servers and workstations in big companies. Digital Equipment Corporation was one of the first to introduce the next generation of server CPUs; DEC unveiled the Alpha 21064 to the world during the 39th International Solid-State Circuits Conference in mid-February 1992. A few years later, Sun (now part of Oracle) launched the UltraSPARC, and in 1997 IBM released their 64-bit processor, the RS64. Only big corporations could realistically purchase these products, since they were specifically designed for workstations and servers, which often required significantly more computing power than normal consumers needed.
Imagine this: the CPU has a storage space with many shelves next to the calculator. It uses the shelves to store values: the inputs for the calculator and the output. These shelves can hold 32 bits of data each. To give you an idea, with 32 bits, 32 zeros and ones, you can create 4,294,967,296 combinations. These combinations form the letters we see on screens, the logic of things, just about everything. Now you have decided to upgrade your CPU with a new and shiny one that is 64-bit. When you walk into the storage space, you see shelves that can hold 64 bits! 64 zeros and ones, which allow you to create 18,446,744,073,709,551,616 different combinations! Not only that, you even have twice as many shelves as before, meaning you can not only store more on each shelf, but you also have more space to store other data! Think of all the new possibilities, the resources new programs can use now to do more advanced things!
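If you want to check those shelf numbers yourself, a couple of lines of Python will do it; this just computes 2 to the power of the register width, nothing more:

```python
# A register (shelf) of n bits can hold 2**n different combinations of zeros and ones.
for bits in (8, 16, 32, 64):
    print(f"{bits:>2}-bit shelf: {2 ** bits:,} possible combinations")

# 32-bit shelf:              4,294,967,296 possible combinations
# 64-bit shelf: 18,446,744,073,709,551,616 possible combinations
```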
Learn more about ARM processors here!
RISC
As noted before, Intel has the x86 family, all originating from the ISA created for the Intel 8086. The x86 ISA is a very large and complex architecture with many instructions at its disposal, which has its pros and cons. Having a wider variety of instructions, some of which bundle multiple smaller operations into one, means that programmers have to write relatively little code to make an Intel CPU execute a complex process. However, there is a price to be paid: big, chunky instructions slow down the CPU. Furthermore, they also create a fair bit of overhead, as a complex instruction may do more work than is strictly needed to get the result. On the opposite side of the instruction set architecture landscape stands RISC, or Reduced Instruction Set Computer. As the name suggests, RISC opts to offer fewer instructions to the programmer, but every instruction is much simpler. The biggest con to this is that the programmer has to write more code to make a RISC-based CPU do the same thing as an Intel x86-based CPU. That being said, the simplicity of the instructions makes them faster to perform. Therefore, RISC hopes to offset the need to process more instructions with sheer speed. The simplicity of the code, albeit longer but faster, makes RISC-based CPUs perfect candidates for embedded systems, and the most popular of all are ARM processors.
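To make that trade-off a bit more concrete, here is a toy illustration in Python (not real x86 or ARM code, just the idea): one "complex" multiply instruction versus building the same result out of many simple additions, the way a reduced instruction set would.

```python
# Toy illustration of the CISC vs. RISC idea. This is NOT real x86 or ARM code.
# A "complex" multiply instruction does the whole job in one step, while a
# "reduced" instruction set only offers simple steps (add, compare, loop),
# so the same multiplication takes several of them.

def cisc_multiply(a: int, b: int) -> int:
    """One big instruction: the hardware does everything internally."""
    return a * b

def risc_multiply(a: int, b: int) -> int:
    """Many small instructions: build the product from repeated additions."""
    result = 0
    for _ in range(b):   # each loop pass stands in for a few simple instructions
        result += a
    return result

assert cisc_multiply(6, 7) == risc_multiply(6, 7) == 42
# Same answer either way; the trade-off is one slow, complex step
# versus many fast, simple ones.
```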
Acorn RISC Machine

Advanced RISC Machines
Over the years, many new versions of ARM CPUs have taken a steady hold in the computing industry. They are quick and light, making them perfectly suitable for mobile applications. ARM licenses their CPU designs to other companies, such as Samsung, Apple, and many more. The majority of phones run on an ARM processor, and very recently, these chips have emerged in more complex applications as well. Apple introduced their first fully ARM-based processors in their flagship laptops in November 2020, with the third generation of Apple Silicon, the M3 chips, released this year (2023).
Yesterday's Future is Tomorrow's History
So, what can we expect to change in the processor market in the future? It is hard to say what the Research and Development departments at Intel and AMD are focussing on at the moment. From this point forward, it is a guessing game what will become the next big thing in the processor space.
128-bit Processor
So far, CPUs have always doubled in register size, from 4-bit all the way to 64-bit, so it would seem logical that 128-bit CPUs are around the corner; in fact, 128-bit registers already exist in consumer CPUs, but only for specialised vector (SIMD) instructions. That said, it is unlikely that fully 128-bit processors will reach consumers anytime soon, since we hardly need more than what 64 bits give us.
Smaller... and Smaller
Companies continue to strive for smaller transistors in order to fit more on a single processor die. The biggest player by far in this game is ASML, a Dutch company that builds the lithography machines used to produce the wafers, each consisting of many CPU dies. Their newest generation of EUV machines is expected to go to market in 2025, bringing 2nm technology to the fabrication of processors.
A 128-bit register provides us with an enormous number of combinations... 39 digits long... that's like writing 123,456,789 four times and then adding another 123 at the end... that's... more than enough, don't you think?
A relatively obscure Dutch company
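That 39-digit claim is easy to verify with a tiny Python check (nothing assumed here beyond the maths itself):

```python
# A 128-bit register can hold 2**128 different values.
combinations = 2 ** 128
print(f"{combinations:,}")     # 340,282,366,920,938,463,463,374,607,431,768,211,456
print(len(str(combinations)))  # 39, so the number really is 39 digits long
```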
Artificial Intelligence
There are rumours that hint towards more AI integration directly into the CPU hardware to optimise the execution of programs based on the user's preferences and usage. With many people and companies already using their processors for AI workloads, it is only logical that processors more optimised for AI will appear.
Change in Materials
Currently, wafers of silicon form the basis of the processor, onto which all the transistors and internal electrical pathways are etched. However, new studies have found a different material that could improve the speed of CPUs: graphene. It can handle the heat created by the electrical charge much better, potentially removing one of the processor's biggest bottlenecks: overheating.
Our Writer looks up to the work done at ASML, if you couldn't tell that already. Maybe we can bully her into... I mean, ask her nicely to explain how lithography works, the process that makes the computer chips powering all our electronic devices. That way she gets to learn even more about ASML, and we get to learn more stuff too! Let our dear writer know in the comments!
A great deal of the information regarding the ENIAC was collected from Wikipedia and Penn Engineering at the University of Pennsylvania.
Holy moly. What a beefy article. Thank you, ShadowPhoenix. Computer Adventures is one of the few settings I follow which makes me feel smarter with each entry.
Heh, yeah, there's a lot of information on processors. ^,^" I'm so glad you like the setting, hopefully you'll learn loads more things in the articles to come!