Your first language should be Scratch. It is useless for anything professional, but it will teach you basic syntax and introduce the core ideas. Then you can move on to C or Java and create something useful with the same concepts you learned in Scratch.
I agree with all that. It probably applies to some application developers too: even in non-embedded application development, it is useful to know how assembler instructions work and to understand computer architecture at a reasonably low level, because when the software crashes and a core file is generated, that knowledge can help you decipher what went wrong. It basically becomes an extra tool in the troubleshooting process.
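To make that concrete, here's a minimal sketch (my own illustration, not anything from the post above): a deliberate null-pointer write that produces a core file, which you can then open in gdb and walk down to the faulting instruction.

/* Minimal sketch: crash on purpose to get a core file.
 * Enable core dumps first with `ulimit -c unlimited`, then:
 *   gdb ./crash core
 *   (gdb) bt            : the call chain at the moment of death
 *   (gdb) disassemble   : the faulting store, at the instruction level
 */
#include <string.h>

int main(void)
{
    char *p = 0;
    memset(p, 0, 16);   /* SIGSEGV: the store shows up in the disassembly */
    return 0;
}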
Fair enough. I have to admit my experience with programming is decidedly not closely related to EE. The same is true for most of the programmers I know, and those for whom it is not probably just don't make a fuss about compilers, so that would certainly explain away my anecdotal evidence. The lowest-level programming I have personally done is writing GPIO programs in C for the Raspberry Pi. Thanks for the info; you and johnbeetem have really opened my eyes when it comes to low-level coding.
I would like to note that I still don't think your average electrical engineer is ever going to need assembly, and hobbyists certainly won't. But it is true that assembly of one kind or another would help any embedded engineer, computer engineer, or anyone in a similar field.
The thing is, electrical engineers are not computer scientists, and most of them likely never want to be. They need a programming language that works on many different platforms, and they need to learn it in addition to other, non-computer-science things. Even embedded engineers are unlikely to use one specific platform their whole career, which means learning either a multi-platform language or multiple assembly languages.
The thing is, the EE is quite likely to be the one designing the hardware. He'll need a way to debug, validate, and stress-test his hardware design at a point where an OS, compilers, and libraries may not be available. It becomes very tricky to get the software to make the hardware repeatably do something that helps with your hardware debug when there's an OS and an optimising compiler in the way.
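To give a flavour of that kind of bare-metal poking (my own sketch, with a made-up base address, not anything from the post above): a walking-ones test over a memory-mapped region, where volatile stops the optimising compiler from eliding the very bus cycles you're trying to exercise.

/* Hedged sketch: walking-ones test over a memory-mapped region,
 * no OS or libraries assumed. The base address is hypothetical. */
#include <stdint.h>

#define TEST_BASE  ((volatile uint32_t *)0x20000000u)  /* hypothetical SRAM */
#define TEST_WORDS 1024u

int ram_walking_ones(void)
{
    for (uint32_t i = 0; i < TEST_WORDS; i++) {
        for (uint32_t bit = 0; bit < 32; bit++) {
            uint32_t pattern = 1u << bit;
            TEST_BASE[i] = pattern;       /* volatile forces the write cycle */
            if (TEST_BASE[i] != pattern)  /* read back: repeatable stimulus */
                return -1;                /* stuck or shorted data line */
        }
    }
    return 0;
}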
EEs may or may not like the fact, but in today's world they increasingly need to have a foot in both worlds. They're also far more likely to need to know ASM, as they're involved earlier... While I think you're right that the EE probably doesn't want to be a CS, the CS can't do a thing if the EE hasn't done his job and the hardware doesn't work or is unreliable.
The lines are blurred these days; it's becoming much more difficult for either a CS or an EE to function without knowing a lot about the other side.
As far as compilers go, pretty much every compiler out there has been tested, retested, tweaked, and fixed for years, if not decades. Unless you need to compile for an architecture that is brand spanking new, you probably won't run into compiler bugs. I won't claim that every compiler is entirely bug-free, but I know people who have been programming for longer than I have been alive who have never once had a problem caused by a compiler.
Follow the Linux kernel mailing list for a while and you'll come across many issues where compilers have done something dumb. The problems tend to get highlighted much more when you're writing at the OS level, and consequently you need to be very aware of what the hardware's capabilities and limitations are. This can come down to odd stuff like a compiler packing a structure differently such that it no longer fits in a single cache line, and you get an order of magnitude performance degradation in a critical path. There's nothing wrong with the compiler as such, and there are no errors in compilation; a CS working at a higher level may never notice.
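A small sketch of that structure-packing point (my own example, assuming a typical LP64 ABI and a 64-byte cache line, not taken from an actual kernel report): the same fields in a different order change the padding, and with it whether the hot data spans one cache line or two.

/* Hedged sketch: field order changes struct size via padding.
 * Assumes a typical LP64 ABI and a 64-byte cache line. */
#include <stdio.h>
#include <stdint.h>

struct hot_careless {       /* small fields first: padding creeps in */
    uint8_t  flag1;         /* 1 byte + 7 bytes padding */
    uint64_t counter;       /* 8 bytes */
    uint8_t  flag2;         /* 1 byte + 7 bytes padding */
    uint64_t values[6];     /* 48 bytes -> 72 total, spans two lines */
};

struct hot_ordered {        /* largest fields first: padding collapses */
    uint64_t counter;
    uint64_t values[6];
    uint8_t  flag1;
    uint8_t  flag2;         /* 58 bytes, padded out to 64: one line */
};

int main(void)
{
    printf("careless: %zu bytes\n", sizeof(struct hot_careless));
    printf("ordered:  %zu bytes\n", sizeof(struct hot_ordered));
    return 0;
}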
Maybe you are correct, but from my personal experience it's a little different. I learned assembly for the 8051 and 8085, but now I'm using completely different platforms like AVRs, MSPxxx, and ARM for my projects, and mostly I don't have to write anything in ASM. When I do have to, I can grab their datasheets and application notes and can usually figure out what to do without diving deep into their ASM code.
I definitely agree that learning ASM can give you better insight into how machines work. But given the context, I cannot recommend that anyone learn ASM as their first language. Also, when I was learning 8051 ASM, I was under the assumption that it was the only ASM and that I could use it universally, much like C. But when I started looking at the 8085, it was a little different; the AVR, a little more different; and the ARMs I'm working with now, a lot more different. Maybe this kind of misunderstanding can happen to other newbies too, so I just tried to help them avoid it.
I have to agree with you. If you want to be a computer scientist, learning assembly is a great step. I'm actually taking a class in x86 assembly next semester. Nothing tells you more about what the computer is doing than learning assembly, save, perhaps, learning to build your own CPU.
I can't imagine an electrical engineering program that doesn't teach some form of assembly language. Although tedious, I think if you are going into embedded systems you should learn it at some point. You may never use it again, but it will give you insight into microprocessors. In fact, I think you should decode assembly language into machine language so that you gain a deep understanding of what these microprocessors are actually doing. The more abstracted you are from the program, the less you understand what it is doing.

Edit:

Many 8051-based microcontrollers, of which new designs are still being created, have a watchdog timer. Its job is to reset the system if a certain amount of time passes without the timer being serviced, which allows a system that has locked up to reset itself. On these parts the watchdog is enabled by default at boot. If too many variables have to be initialized before the main function is reached, the startup code takes longer than the watchdog period and the system you are programming will reset indefinitely. The only way to shut the watchdog off before reaching main is to modify the assembly startup file (usually startup.a51) so that it disables the watchdog before the C runtime initializes.
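Here is a hedged C-side sketch of why the obvious fix (disable it first thing in main) doesn't work. The SFR name WDTCN, its address, and the 0xDE/0xAD disable sequence are hypothetical, loosely modelled on Silicon Labs C8051 parts; SDCC syntax assumed.

/* Hypothetical 8051 derivative; SDCC syntax. WDTCN and the
 * 0xDE/0xAD sequence are illustrative, not from a real datasheet. */
__sfr __at(0xFF) WDTCN;                  /* watchdog control SFR */

__xdata unsigned char big_buffer[2048];  /* cleared by startup.a51 */

void main(void)
{
    /* Too late: the startup code has already spent its time clearing
     * big_buffer with the watchdog still running, so the chip can
     * reset before execution ever reaches this line. Hence the need
     * to put the disable into startup.a51 itself. */
    WDTCN = 0xDE;
    WDTCN = 0xAD;

    while (1) {
        /* application loop */
    }
}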
I get the impression that if you want to do ASM, you go to EE rather than CS. When I was an undergrad it was considered important for a CS major to master assembly language as a fundamental part of the science. However, years later it seemed that CS no longer had faculty who were themselves masters of ASM, and the subject atrophied in CS and it was up to EEs to fill the gap.
Regarding ASM versus machine language: some ASMs have nice, clean machine-language codings and it can be beneficial to read octal/hex dumps. Then there's ARM...
And don't try to learn assembly. In my opinion, it's a waste of time. It will only be useful if you are writing device drivers or OS modules.
Chris Pilcher wrote:
This is especially true when you consider that pretty much every platform is going to have its own assembly language. Personally, I'd rather learn one language that works on three different platforms than three different languages.
The beauty of assembly language is that you are understanding what the computer is really doing. If you don't care how the computer does things, then you don't need ASM. However, IMO if you want to be a true computer scientist, you should be curious and fascinated by how computers really work. ASM is the difference between being there and watching over a TV camera.
Some problems are hard to understand and debug without knowing the ASM level. For example, C lets you use high-level language notations to write your programs, but they execute -- and fail -- at the machine language level. Sometimes the high-level model is not enough to understand the failure mode. This is particularly true with embedded systems, where you have less OS and thus less protection. Also, compilers have bugs sometimes. If you can't understand the ASM produced by the compiler, you cannot tell if a bug is in your program or the compiler.
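A small sketch of a failure mode that only makes sense at the ASM level (my own example, not from the comment above): signed overflow is undefined behaviour in C, so an optimising compiler may quietly delete a check that looks perfectly sensible in the source, and only the generated assembly reveals it.

/* Hedged sketch: the source says this detects wrap-around, but signed
 * overflow is undefined behaviour, so at -O2 the compiler may compile
 * the whole test down to "return 0". Compare `gcc -O0 -S` with
 * `gcc -O2 -S` to watch the check disappear. */
#include <stdio.h>
#include <limits.h>

int wraps_after_increment(int x)
{
    return x + 1 < x;   /* never true without undefined overflow */
}

int main(void)
{
    /* Often prints 0 at -O2, even though INT_MAX + 1 "obviously" wraps. */
    printf("%d\n", wraps_after_increment(INT_MAX));
    return 0;
}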
And then maybe someday you'd like to write your own compiler. That's pretty hard to do if you've never been one.
Note: I'm not recommending ASM as a first programming language. I personally think Pascal makes an excellent first programming language, as I said at the beginning of the comments.
This is especially true when you consider that pretty much every platform is going to have its own assembly language. Personally, I'd rather learn one language that works on three different platforms than three different languages.