I know a compiler takes source code and turns it into a binary object (ones and zeros). That object gets loaded into a computer's RAM. What I am trying to understand is: what part of the computer interprets this object code and turns on the binary "1" bits? In this case, turning on the rightmost bit: 0000 0001.
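To make the question concrete, here is the kind of statement I have in mind, together with an illustrative guess (not actual compiler output) at the AVR instructions it might become:

    unsigned char flags = 0;  // one byte sitting in RAM: 0000 0000
    flags |= 0x01;            // turn on the rightmost bit: 0000 0001

    // An AVR compiler might emit something along these lines:
    //   lds  r24, flags   ; load the byte from RAM into a CPU register
    //   ori  r24, 0x01    ; OR-immediate: the decoder recognizes this opcode
    //   sts  flags, r24   ; the ALU sets bit 0, then the result is stored back

Whatever fetches and decodes that opcode and makes bit 0 become a "1" is the part I am trying to pin down.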
To understand what happens in a PC when it executes a program, what happens when data is read from memory, what happens when an instruction is fetched or executed, and so on, it is necessary to study computer architecture (there are wonderful books, like the classic Tanenbaum) and to study a CPU's assembly language.
Obviously, you also need to know digital electronics and programming. It takes a long time both to study and to "digest" the knowledge acquired.
I would advise you to take a different, less theoretical, more experimental approach. Treat the boards, sensors, and actuators as simple blocks to assemble to accomplish a given task. Look at simple projects that have already been built, read the component datasheets, try to understand what the circuit does and what the code does, make your own modifications to the circuit and the code, and see what happens...
Gradually, even the theory will seem simpler. Take one step at a time.
The trick is always to have fun and always be curious.
Thanks for the advice. I just ordered the Tanenbaum book, but I'm not confident it's going to answer my question about turning on the "1" bit in my example. Fifty years ago I was an assembler-language programmer on IBM mainframes; I recently took an interest in the hardware side. I can't remember how many times I got called in at 2 AM to debug a system dump that had stopped a stream of sequential programs from running, so I used to know how to read hex like the back of my hand. In those days of virtual-storage constraint, we used each of the 8 bits in a byte as logical on/off switches (see the short C++ sketch after the list below). My background:
- Lots of assembler application programming
- A little channel programming (DASD seek commands)
- Systems programming, including some protect-key-zero code that analyzed usage of the pageable link pack area so modules could be removed to free up virtual storage
- Tried some Arduino sensor stuff: flex sensors, photo sensors, turning an LED on (pinMode / digitalWrite)
- Can build gates and test them against truth tables on a breadboard using a voltage supply, switches, resistors, transistors, and LEDs; I know you can also use CMOS chips like the CD4011 (quad NAND gate)
- Know about buses, registers, and the ALU
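In C++ terms, the bit-switch technique I mentioned looked roughly like this (the flag names here are invented for illustration):

    #include <cstdio>

    // Hypothetical switches: 8 independent on/off flags packed into one byte.
    const unsigned char FLAG_READY = 0x01;  // 0000 0001
    const unsigned char FLAG_ERROR = 0x02;  // 0000 0010

    int main() {
        unsigned char switches = 0;   // all switches off
        switches |= FLAG_READY;       // OR turns a bit on
        switches &= ~FLAG_ERROR;      // AND with the complement turns a bit off
        if (switches & FLAG_READY)    // AND tests a bit
            std::printf("ready switch is on\n");
        return 0;
    }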
In the case of Arduino, the digitalWrite() call initiates something that causes voltage to appear on the pin and turns on the LED.
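Here is my current understanding, sketched as a minimal Arduino example. I picked digital pin 8 because, on an Uno, it maps to bit 0 of the AVR's PORTB register, which is exactly the 0000 0001 case from my question (the register detail is my assumption about what digitalWrite() boils down to, not something I have traced through the library source):

    const int LED_PIN = 8;            // on an Uno, digital pin 8 is PORTB bit 0

    void setup() {
      pinMode(LED_PIN, OUTPUT);       // configure the pin as an output
      digitalWrite(LED_PIN, HIGH);    // library call: the LED turns on
      // Roughly equivalent direct register write on an AVR:
      // PORTB |= 0b00000001;         // set the rightmost bit -> pin goes to 5 V
    }

    void loop() {}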
Is the Tanenbaum book going to fill in the blank?