I didn't know how CPUs worked, so I wrote a software simulator
A few months ago, I was suddenly struck by the thought that I had no idea how computer hardware works at a fundamental level. I still don't know how modern computers work.
I read the book "But How Do It Know?" by J. Clark Scott, which describes a simple 8-bit computer in detail: from logic gates, RAM, and CPU registers up to the arithmetic logic unit (ALU) and input/output operations. And I wanted to implement all of this in code.
I'm not that interested in the physics of microchips, and the book glides along smoothly, beautifully explaining the electrical circuits and how the bits move around the system; the reader needs no electrical-engineering background. But textual descriptions aren't enough for me. I have to see things in action and learn from my inevitable mistakes, so I started implementing the circuitry in code. The path was thorny, but instructive.
The result of my work lives in the simple-computer repository: a simple computer. It's simple, and it computes.
The processor code is implemented as a horrible mess of logic gates turning on and off, but it works. I wrote unit tests, and we all know that unit tests are irrefutable proof that a program works.
The code takes keyboard input and draws text on a display, using a painstakingly hand-crafted set of glyphs for a professional font I call Daniel Code Pro. The only cheat: to get keystrokes in and the picture out, I had to wire up channels through GLFW, but otherwise it is a fully software simulation of the electrical circuit.
I even wrote a crude assembler, which was an eye-opener, to say the least. It's not perfect. In fact it's a little crappy, but it showed me problems that other people solved many, many years ago.
But why are you doing this?
"Thirteen-year-olds are building processors in Minecraft. Call me when you can make a real CPU out of telegraph relays."
My mental model of how a CPU works was stuck at the level of introductory computer science textbooks. The CPU in the Gameboy emulator I wrote back in 2013 doesn't really resemble a modern one. Even though an emulator like that is just a state machine, it doesn't model state at the level of logic gates: almost everything can be implemented with nothing but a switch statement and some register state.
I want to understand how things really work, because I don't know, for example, what the L1/L2 caches and pipelining are, and I'm not entirely sure I understand the articles about the Meltdown and Spectre vulnerabilities. Someone told me they optimize code to make better use of the CPU cache, but I have no way to check that other than taking their word for it. I'm not quite sure what all the x86 instructions mean. I don't understand how people hand work off to GPUs or TPUs (and what is a TPU, anyway?). I don't know how to use SIMD instructions.
All of this rests on a foundation that has to be learned first, which means going back to basics and building something simple. Clark Scott's aforementioned book describes a simple computer, so that's where I started.
Great Scott! It works!
Scott's computer is an 8-bit CPU connected to 256 bytes of RAM, all over an 8-bit system bus. It has 4 general-purpose registers and 17 machine instructions. Someone has built a visual simulator for the web: it's really great. It's scary to think how long it must have taken to track down all the states in the circuit!

Diagram of all the components of the Scott CPU. Copyright 2009-2016, Siegbert Filbinger and John Clark Scott.
The book walks you along a route from humble logic gates to bits in memory and registers, then keeps layering on components until you end up with something like the diagram above. I highly recommend reading it, even if you are already familiar with the concepts. Just not the Kindle version, because the diagrams are sometimes hard to zoom in on and examine in the reader. In my opinion, that's a long-standing Kindle problem.
My computer differs from Scott's version only in that I upgraded it to 16 bits to increase the amount of addressable memory: storing just the glyphs for the ASCII table takes up most of Scott's 8-bit machine, leaving very little room for actual code.
My development process
In general, development followed this pattern: read the text, study the diagrams, then try to implement them in a general-purpose programming language, definitely not in any specialized integrated-circuit design tool. I wrote the simulator in Go simply because I know the language a little. Skeptics may say: "Fool! You could have learned VHDL or Verilog, or used Logisim, or something else." But by then I had already written my bits, bytes, and logic gates and was in too deep. Maybe next time I'll learn those languages and find out how much time I wasted, but that's my problem.
At the largest scale, a computer is just a bunch of boolean values being passed around, so any language comfortable with Boolean algebra will do.
Layering a scheme on top of those booleans helps us (the programmers) derive meaning from them and, most importantly, decide which byte order the system will use and make sure every component puts data on the bus in the right order.
Getting that right was very hard. Somewhat arbitrarily, I chose a little-endian representation, and then while testing the ALU I couldn't figure out why the wrong numbers kept coming out. My cat heard many, many unprintable expressions.
Development was not fast: it took perhaps a month or two of my free time. But the moment the CPU executed its first operation successfully, I was in seventh heaven.
Everything went smoothly until it came to I/O. The book proposes a system design with a simple keyboard and display interface for getting data into the machine and showing results. Well, having come this far, there was no point stopping halfway: I set myself the goal of typing on the keyboard and seeing the letters appear on the display.
Peripheral devices use an adapter pattern as the hardware interface between the CPU and the outside world. It's easy to guess that software design later borrowed this pattern.

How the I/O adapters connect to a GLFW window

With this separation, it turned out to be fairly simple to hook a keyboard and display up to a window managed by GLFW. In fact, I pulled most of that code straight out of my emulator and reworked it a bit so that Go channels act as the I/O signals.
Starting the computer
This was probably the hardest part, or at least the most cumbersome. Writing assembly with such a limited instruction set is hard, and in my crude assembler it's even worse, because you can't fool anyone but yourself.
The biggest problem was juggling the four registers: keeping track of them, pulling data out of them, and temporarily stashing it in memory. Along the way I remembered that the Gameboy CPU has a stack pointer register to make pushing and popping registers convenient. Unfortunately, this computer has no such luxury, so data had to be constantly moved to and from memory by hand.
I took the time to implement just one pseudo-instruction, CALL, which calls a function and then returns to the call site. Without it, calls are possible only one level deep.
On top of that, since the machine doesn't support interrupts, I had to write terrible code that polls the keyboard state. The book does discuss the steps needed to implement interrupts, but they would seriously complicate the circuit.
But enough whining: I still wrote four programs, and most of them share some common code for rendering fonts, reading keyboard input, and so on. It's not an operating system, but it gives a feel for what a simple OS does.
That was not easy. The hardest part of the text-writer program was correctly figuring out when to wrap to a new line, and what happens when you press Enter.
The main text-writer program loop
main-getInput:
    CALL ROUTINE-io-pollKeyboard
    CALL ROUTINE-io-drawFontCharacter
    JMP main-getInput
I didn't bother implementing the Backspace key or modifier keys. But I did come to realize how much work goes into developing a text editor, and how tedious that work is.
It was a fun and very useful project for me. In the thick of assembly programming, I almost forgot about the logic gates below; I had climbed up to the higher levels of abstraction.
Although this CPU is very simple and a far cry from the one in my laptop, I feel the project taught me a lot, in particular:
- How bits move across the bus between all components.
- How a simple ALU works.
- What a simple fetch-decode-execute cycle looks like.
- That a machine without a stack pointer register, or even the concept of a stack, sucks.
- That a machine without interrupts also sucks.
- What an assembler is and what it does.
- How peripherals communicate with a simple CPU.
- How simple fonts work and how to draw them on a display.
- What a simple operating system might look like.
So what's next? The book says that nobody has built computers like this since 1952, which means I have 67 years of material to catch up on. That will take me a while. I see the x86 manual runs to 4,800 pages: just long enough for some pleasant, easy bedtime reading.
Maybe I'll dabble a little in operating systems and the C language, kill an evening with a PiDP-11 kit and a soldering iron, and then abandon the whole thing. I don't know; we'll see.
Seriously, though, I'm thinking of exploring the RISC architecture, possibly RISC-V. It probably makes sense to start with the early RISC processors to understand their origins. Modern CPUs have many more features (caches and so on), and I want to understand those too. There is a lot to learn.
Will this knowledge be useful in my day job? Maybe, though probably not. Either way, I'm enjoying it, so it doesn't matter. Thank you for reading!