Enjoy DIY Learning Modules designed for engineers:
- Short, self-paced learning anywhere, anytime
- Technologies, Applications & Trivia
See Current Essentials Offerings
What Do You Want to Learn Next? Share Your Suggestion by Replying below!
nelson64 Hi, the first thing I would do is scrap the Arduino's pitiful IDE and use Eclipse. I use just plain C with my Arduinos, but I still use the Arduino libraries for the most part. You can do this two ways:
The first is to write using the Arduino framework, i.e. setup() and loop() (which leaves a lot to be desired).
The second is to write using plain C (with C++ you run into trouble with scoping rules).
I normally keep one function per source file, together with its header file.
For example:
file count.h
extern int counter;     /* global counter; initialize it before you use it */
int count(int updown);
file count_P.h
#define MaxCount 8
#define MinCount 1
#define UP   1
#define DOWN 0
file count.c
#include "count.h"
#include "count_P.h"
int counter;            /* the global lives here */
int count(int updown) {
    switch (updown) {
    case UP:
        if (counter == MaxCount) counter = MinCount;   /* wrap around at the top */
        else counter++;
        break;
    case DOWN:
        if (counter == MinCount) counter = MaxCount;   /* wrap around at the bottom */
        else counter--;
        break;
    }
    return counter;
}
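If you use it with the Arduino framework, a minimal sketch might look something like this (purely an illustration, assuming count.c and its headers are built into the same project; if the sketch compiles as C++ you would also want extern "C" guards in the headers):
#include "count.h"
#include "count_P.h"
void setup() {
    counter = MinCount;             /* initialize the global before first use */
    Serial.begin(9600);
}
void loop() {
    Serial.println(count(UP));      /* prints 2, 3, ... 8, then wraps back to 1 */
    delay(500);
}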
~~ Cris
nelson64, try going to the web page: you will find tutorials, simple projects with diagrams, code, and videos of the final results. Maybe start by reproducing a simple project you like and then modify it, add features, etc. In this period, when we spend most of our time locked up at home because of the pandemic, a hobby like electronics is just what it takes to keep our brains active!
Element14 is an inexhaustible source of interesting and curious projects. Have a good time, nelson64
Carlo
I'm new. I've read a book about computer gates, buses, RAM, flip-flops, etc. I've also searched high and low on the internet to find out how/what causes a bit to get turned on (energized/5 V). I hope I am not violating any forum rules by asking.
How do "1" bits get energized/volts? Assume a program has defined a constant named ONE that has an initial value of 1 (0000 0001). Does the rightmost bit get energized (5 V) when the program gets loaded into memory? What component of the computer causes that to happen? The same question applies to how bits in op codes and operands get energized.
OK, where to start? What do you want to set to TRUE (or 1)?
~~Cris
I know a compiler takes source code and turns it into a binary object (ones and zeros). That object gets loaded into a computer's RAM. So what I am trying to understand is what part of the computer interprets this object code and turns on the binary "1" bits - in this case, the rightmost bit of 0000 0001.
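To make my example concrete, here is a small C snippet (just an illustration) that stores the constant ONE and prints the bit pattern the compiler produced for it:
#include <stdio.h>
int main(void)
{
    const unsigned char ONE = 1;            /* stored as the byte 0000 0001 */
    /* print the eight bits, most significant bit first */
    for (int bit = 7; bit >= 0; bit--)
        putchar(((ONE >> bit) & 1) ? '1' : '0');
    putchar('\n');                          /* prints: 00000001 */
    return 0;
}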
To understand what happens in a PC when it executes a program, what happens when data is read from memory, what happens when an instruction is fetched and executed, etc., it is necessary to study computer architecture (there are wonderful books, like the classic Tanenbaum) and the assembly language of a CPU.
Obviously, you also need to know digital electronics and programming. It takes a long time both to study and to "digest" the knowledge acquired.
I would advise you to use a different, less theoretical, and more experimental approach. Consider the boards, the sensors, and the actuators as simple blocks to assemble to accomplish a certain task. Look at simple projects that have already been made, read the component datasheets, try to understand what the circuit does and what the code does, try to make your own modifications to the circuit and code, and see what happens...
Gradually even the theory will seem simpler. Take one step after another.
The trick is always to have fun and always be curious.
It would be great to feature a whole series on MAKING techniques.
Engineers get very little training in using tools to make their designs, and lots of hobbyists get no formal training.
Some potential topics are:
How about a course on how to design for test?
This can make your prototype designs much easier to get working.
Thanks for the advice - I just ordered the Tanenbaum book, but I'm not confident that it's going to answer my question about turning on the "1" binary bit in my example. Fifty years ago I was an assembler language programmer on IBM mainframes. I recently took an interest in the hardware side. I can't remember how many times I got called in at 2 AM to debug a system dump that prevented a stream of sequential programs from running. So I used to know how to read hex like the back of my hand. In those days of virtual storage constraints we used each of the 8 bits in a byte as logical on/off switches.
- Lots of assembler application programming
- A little channel programming - DASD - seek
- Systems programmer, with some protect-key-zero code to analyze usage of the pageable link pack area and remove stuff to free up virtual storage
- Tried some Arduino sensor stuff - flex sensors, photo sensors, turning on an LED (pinMode / digitalWrite)
- Can create gates and test them (truth tables) on a breadboard using voltage, switches, resistors, transistors, and LEDs - I know you can also use CMOS chips like the CD4011 (quad NAND gate); see the short truth-table sketch after this list
- Know about buses, registers, and the ALU
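Since the CD4011 came up, here is a tiny C program (just an illustration) that prints the NAND truth table I check the breadboard against:
#include <stdio.h>
int main(void)
{
    puts("A B | A NAND B");
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("%d %d |    %d\n", a, b, !(a && b));   /* NAND = NOT(A AND B) */
    return 0;
}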
In the case of Arduino, the digitalWrite() call initiates something that causes voltage to flow and turn on the LED.
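For reference, this is roughly the kind of minimal sketch I mean (pin 13 is just the usual on-board LED; any output pin would do):
const int ledPin = 13;              /* on-board LED on many Arduino boards */
void setup() {
    pinMode(ledPin, OUTPUT);        /* configure the pin as an output */
}
void loop() {
    digitalWrite(ledPin, HIGH);     /* drive the pin high (5 V on an Uno) - LED on */
    delay(1000);
    digitalWrite(ledPin, LOW);      /* drive the pin low (0 V) - LED off */
    delay(1000);
}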
Is the Tanenbaum book going to fill in the blank?