Actually, I'm doing a home course to brush up on my assembly chops. I remember now why I didn't like it the first time around in college.
Some assembly languages are harder than others. The best one I've ever used is PDP-11 -- very clean, very orthogonal, simple instruction coding that allows you to read an octal dump and immediately recognize and decode common instructions. C was originally designed to produce excellent PDP-11 code, and you can see the PDP-11's influence on C, in particular stacks growing towards low memory (which is why most C compilers evaluate their arguments in right-to-left order, pushing each one as it's evaluated) and the auto-increment and auto-decrement expressions. A C statement like "*s++ = *t++;" translates into one single-word PDP-11 instruction.
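To make that concrete (the register assignments here are just illustrative -- the compiler picks them): with s in R0 and t in R1, the copy compiles to MOVB (R1)+,(R0)+, one 16-bit word. A runnable sketch:

    #include <stdio.h>

    int main(void)
    {
        char buf[6];
        const char *t = "hello";
        char *s = buf;
        /* On the PDP-11, with s in R0 and t in R1, each copy below is
           one instruction word:
               MOVB (R1)+,(R0)+
           move a byte from where R1 points to where R0 points,
           then post-increment both registers. */
        while ((*s++ = *t++) != '\0')
            ;
        printf("%s\n", buf);
        return 0;
    }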
ARM started out as a RISC machine (Acorn RISC Machine) and after a few versions became Advanced RISC Machine and later just ARM. ARMv7 is quite hard to understand: each version added new instructions by sticking them into gaps in the previous versions' instruction codings. If you like complex railroad timetables with myriad footnotes, take a look at the quick reference cards at arm.com. Then for even more fun, download the entire ARMv7-AR Architecture Reference Manual (the "ARM ARM") and try to figure out how many instruction formats there are. This doesn't impact performance, since the chip-level hardware doesn't care if instruction fields are scattered all over the place, but it's not a good first ASM, IMO.
x86 isn't much better. Again, not a good first ASM. And I've heard that PIC is pretty nasty.
PowerPC is quite regular and I recommend it. I've looked a little at the MSP430 and so far it looks like a PDP-11 combined with some RISC philosophy, so I'd definitely consider it. AVR isn't bad.
Since most people program in C, the complexities of the underlying ASM don't matter to most programmers. But those of us who write compilers and (dis)assemblers do see it and cringe and/or gag, depending on the ASM. Go write an assembler and/or disassembler for ARMv7-A, with both ARM and Thumb-2 codings. It's loads of fun! It's nothing like an 8080 assembler, believe me.
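Here's a taste -- a toy decoder of my own, handling just the classic A32 data-processing-immediate format (the names and the example word are mine; real ARM/Thumb-2 decoding needs dozens of these, which is exactly the pain):

    #include <stdint.h>
    #include <stdio.h>

    /* One A32 data-processing-immediate word:
       cond[31:28] 001 opcode[24:21] S[20] Rn[19:16] Rd[15:12] rot[11:8] imm8[7:0] */
    static void decode_dp_imm(uint32_t insn)
    {
        uint32_t cond   = (insn >> 28) & 0xF;
        uint32_t opcode = (insn >> 21) & 0xF;
        uint32_t s_bit  = (insn >> 20) & 1;
        uint32_t rn     = (insn >> 16) & 0xF;
        uint32_t rd     = (insn >> 12) & 0xF;
        uint32_t rot    = (insn >>  8) & 0xF;
        uint32_t imm8   = insn & 0xFF;
        /* The immediate is imm8 rotated right by 2*rot bits. */
        uint32_t imm = (imm8 >> (2 * rot)) | (imm8 << ((32 - 2 * rot) & 31));
        printf("cond=%X opcode=%X S=%u Rn=R%u Rd=R%u imm=0x%X\n",
               (unsigned)cond, (unsigned)opcode, (unsigned)s_bit,
               (unsigned)rn, (unsigned)rd, (unsigned)imm);
    }

    int main(void)
    {
        decode_dp_imm(0xE2810005);   /* ADD R0, R1, #5 */
        return 0;
    }

And that's one format out of dozens, before you even open the Thumb-2 chapters.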
I think EEs should learn an assembler language that runs on processors they are most likely to encounter when they get out into the work force, not for processors that are simple enough to be hand-assembled. Nobody ever does that these days. And I doubt many EEs are ever going to need to write an assembler or compiler.
I have about 20 languages listed on my resume (as a CS guy), including assembler for several different processor families, but if I were a new EE grad right now, having C and ARM+Thumb assembler would open the most doors.
I really can't understand the obsession with teaching Electronic/Electrical Engineers assembler and C as a first language -- or is the general interpretation of EE "Embedded Engineer"?
Perhaps we should all use Ada more to encourage precision!
Electronic/Electrical Engineers should be studying hardware first, so they should have a first programming language that assists that -- I vote for Python; Matlab is technically OK but has other issues (see rant above).
I think EEs should learn an assembler language that runs on processors they are most likely to encounter when they get out into the work force, not for processors that are simple enough to be hand-assembled.
I once saw a juggler perform before an audience of mostly children. He told them: "If you want to learn to juggle, don't start with eggs. Start by juggling silk handkerchiefs, because then you're juggling in slow motion."
This applies to any difficult subject: start with simple fundamentals, and then work your way up one simple extension at a time. If you want to teach your child to read, don't start with Moby-Dick. Start with Green Eggs and Ham.
When learning ASM, there are architecture-independent fundamentals that you need to know. IMO you're much better off learning those on a simple architecture so that you're not distracted by the complexities of something like ARM or x86. Once you've mastered those fundamentals and a few extensions, you can apply that same knowledge to any other architecture that you come across later on. If you start with a complex architecture you end up "trying to drink from a fire-hose" and you'll say "But Assembly's hard!" like Sagar.
Another challenge with ASM is that if you run it on a real processor it can be really hard to debug. People sometimes refer to C as "programming without seatbelts". Well, ASM is "programming without seats, lights, brakes, tires, or even a body". So you may be better off starting with a simulator, so that your program is running in a controlled environment with better fault handling and debugging capability.
x86 isn't much better. Again, not a good first ASM. And I've heard that PIC is pretty nasty.
PIC asm isn't so bad, but any PIC I've used is Harvard architecture, so if you've come from a von Neumann-style system it takes a while to adjust.
The problem with the question is that there's no real answer and I agree with Michael about that. Beyond that starting point though, it's pointless learning a language you'll never use.
For CS it's much simpler: C#, C++, C in no particular order. For webbies, JS, PHP, etc. An EE on the other hand might need PIC today, AVR tomorrow, ARM, x86, and on down the list depending on the project. C might seem like a good choice, but without all of the libraries its advantage is limited.
As an EE myself I've found none of the choices very useful, much more valuable is the general understanding of the principles that you can then apply to the problem at hand in any language.
Beyond that starting point though, it's pointless learning a language you'll never use.
Gee, the first programming language I ever used was FOCAL (formula calculator), an interpretive language that ran on the PDP-8. Nice, simple introduction to the principles of programming without the complexities of a serious language. Haven't used that one in over 40 years. So much more fun to program the PDP-8 in ASM.
I think it's important that the first programming language teach good programming practices and not create sloppy habits and woolly thinking that will be hard to undo later. However, I wouldn't go as far as Edsger Dijkstra:
It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.
Thing is, language fads come and go. What's considered good today won't be in 1/5/10 or 40 years' time. Today's good practices will become tomorrow's woolly thinking. It's simply inevitable that things move on.
The fundamental principles of programming languages haven't changed much since I started over 40 years ago. You still have data types, variables, conditional execution, iteration, linked data structures, recursion, and people arguing over dynamic versus static typing. Machine languages got more complex, but then went back to RISC for performance rather than cost -- how different is today's typical RISC load/store architecture from IBM 360? (Ignoring EX, of course.) Many languages have come and gone, but many others like Fortran and C keep getting upgraded. A lot of promising ideas like functional programming never get traction.
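For instance, this little fragment (my example) touches most of that list -- data types, a linked data structure, conditional execution, recursion -- and apart from the prototype syntax it would have looked much the same 40 years ago:

    #include <stddef.h>

    struct node { int value; struct node *next; };

    /* Recursively sum a linked list: a conditional, a linked
       structure, and recursion, same as it ever was. */
    int sum(const struct node *p)
    {
        return p == NULL ? 0 : p->value + sum(p->next);
    }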
My pet peeve? How come almost all languages are plain ASCII so they can be edited on a "glass TTY"? Why are they still essentially decks of cards? When John Backus et al. developed Fortran, they had to make a lot of compromises to transcribe mathematical notations into something compatible with an IBM 026 keypunch. Remember .LT. and .LE.? Well, you can still use those good old classic notations in HTML. End of rant.
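To spell the parallel out (the Fortran and HTML lines are just illustrations, shown in comments):

    #include <stdio.h>

    int main(void)
    {
        int a = 1, b = 2;
        /* Fortran 66, constrained by the 026 keypunch character set:
               IF (A .LT. B) GO TO 10
           HTML today, still naming the comparison instead of drawing it:
               a &lt; b
           C, which got to assume the symbol existed: */
        if (a < b)
            printf("a < b\n");
        return 0;
    }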
John, I absolutely agree, the fundamental principles haven't changed. C stands the test of time because, as you observed, it's really just portable asm.
The rest is just window dressing and those promising ideas get shouted down by a more vocal group who refuse to consider that their own ideas may be wrong.
So that's why I really believe that people should learn the fundamental principles and then apply them. It's all too easy to learn a language and then become trapped in its particular philosophy.
What you said earlier about fundamental principles is the important thing: that, together with how to apply them to create basic algorithms, is what people new to programming need. Really, the particular first language learned isn't important, as long as it's a language you can learn that underlying knowledge from. Once you have that knowledge you can pretty much pick up any language relatively easily; quite often it's just a case of applying a different syntax to the problem you're trying to solve.
What you say is probably true within procedural languages. There is a more fundamental difference in mindset between functional/logic languages (SML, Prolog, F#, OCaml, Lisp and so on) and procedural languages (FORTRAN, C, C++, Java, BASIC, etc). The first language I learned was FORTRAN (on a mainframe with punched cards!); I didn't come to functional languages (SML and Prolog) until much more recently, and it was not easy making the transition.
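For anyone who hasn't made that jump, here's the flavor of the gap in plain C (a sketch; the second form is roughly what you'd write, and the only thing you'd write, in SML or OCaml):

    /* Procedural mindset: loop and mutate an accumulator. */
    int sum_iter(const int *a, int n)
    {
        int s = 0;
        for (int i = 0; i < n; i++)
            s += a[i];
        return s;
    }

    /* Functional mindset: no assignment at all, just a base case
       and a recursive case. */
    int sum_rec(const int *a, int n)
    {
        return n == 0 ? 0 : a[0] + sum_rec(a + 1, n - 1);
    }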