I think EEs should learn an assembler language that runs on processors they are most likely to encounter when they get out into the work force, not one for processors that are simple enough to be hand-assembled. Nobody ever does that these days. And I doubt many EEs are ever going to need to write an assembler or compiler.
I have about 20 languages listed on my resume (as a CS guy), including assembler for several different processor families, but if I was a new EE grad right now, having C and ARM+Thumb assembler would open the most doors.
I really can't understand the obsession with teaching Electronic/Electrical Engineers assembler and C as a first language - or is the general interpretation of EE "Embedded Engineer"?
Perhaps we should all use Ada more to encourage precision!
Electronic/Electrical Engineers should be studying hardware first, so they should have a first programming language that assists that - I vote for Python; Matlab is technically OK but has other issues (see rant above).
I once saw a juggler perform before an audience of mostly children. He told them: "If you want to learn to juggle, don't start with eggs. Start by juggling silk handkerchiefs, because then you're juggling in slow motion."
This applies to any difficult subject: start with simple fundamentals, and then work your way up one simple extension at a time. If you want to teach your child to read, don't start with Moby Dick. Start with Green Eggs and Ham.
When learning ASM, there are architecture-independent fundamentals that you need to know. IMO you're much better off learning those on a simple architecture so that you're not distracted by the complexities of something like ARM or x86. Once you've mastered those fundamentals and a few extensions, you can apply that same knowledge to any other architecture that you come across later on. If you start with a complex architecture you end up "trying to drink from a fire hose" and you'll say "But Assembly's hard!" like Sagar.
Another challenge with ASM is that if you run it on a real processor it can be really hard to debug. People sometimes refer to C as "programming without seatbelts". Well, ASM is "programming without seats, lights, brakes, tires, or even a body". So you may be better off starting with a simulator, so that your program is running in a controlled environment with better fault handling and debugging capability.
x86 isn't much better. Again, not a good first ASM. And I've heard that PIC is pretty nasty.
PIC asm isn't so bad, but any PIC I've used is Harvard architecture so if you've come from a Von Neumann style system it takes a while to adjust.
The problem with the question is that there's no real answer and I agree with Michael about that. Beyond that starting point though, it's pointless learning a language you'll never use.
For CS it's much simpler: C#, C++, C in no particular order. For webbies, JS, PHP etc. An EE on the other hand might need PIC today, AVR tomorrow, ARM, x86 and on down the list depending on the project. C might seem like a good choice, but without all of the libraries its advantage is limited.
As an EE myself I've found none of the choices very useful, much more valuable is the general understanding of the principles that you can then apply to the problem at hand in any language.
Gee, the first programming language I ever used was FOCAL (formula calculator), an interpretive language that ran on the PDP-8. Nice, simple introduction to the principles of programming without the complexities of a serious language. Haven't used that one in over 40 years. So much more fun to program the PDP-8 in ASM.
I think it's important that the first programming language teach good programming practices and not create sloppy habits and woolly thinking that will be hard to undo later. However, I wouldn't go as far as Edsger Dijkstra:
"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
Thing is, language fads come and go. What's considered good today won't be in 1/5/10 or 40 years time. Today's good practices will become tomorrow's woolly thinking. It's simply inevitable that things move on.
The fundamental principles of programming languages haven't changed much since I started over 40 years ago. You still have data types, variables, conditional execution, iteration, linked data structures, recursion, and people arguing over dynamic versus static typing. Machine languages got more complex, but then went back to RISC for performance rather than cost -- how different is today's typical RISC load/store architecture from IBM 360? (Ignoring EX, of course.) Many languages have come and gone, but many others like Fortran and C keep getting upgraded. A lot of promising ideas like functional programming never get traction.
My pet peeve? How come almost all languages are plain ASCII so they can be edited on a "glass TTY"? Why are they still essentially decks of cards? When John Backus et al. developed Fortran, they had to make a lot of compromises to transcribe mathematical notations into something compatible with an IBM 026 keypunch. Remember .LT. and .LE.? Well, you can still use those good old classic notations in HTML. End of rant.
John, I absolutely agree, the fundamental principles haven't changed. C stands the test of time because as you observed it's really just portable asm.
The rest is just window dressing and those promising ideas get shouted down by a more vocal group who refuse to consider that their own ideas may be wrong.
So that's why I really believe that people should learn the fundamental principles then apply them. It's all too easy to learn a language and then become trapped in its particular philosophy.
What you said earlier about fundamental principles is the important thing: that, plus how to apply them and build basic algorithms, is what people new to programming need. The particular first language isn't what matters, as long as it's one you can learn that underlying knowledge from. Once you have that knowledge you can pick up pretty much any language relatively easily; quite often it's just a case of applying a different syntax to the problem you're trying to solve.
What you say is probably true within procedural languages. There is a more fundamental difference in mindset between functional/logic languages (SML, Prolog, F#, OCaml, Lisp and so on) and procedural languages (FORTRAN, C, C++, Java, BASIC etc). The first language I learned was FORTRAN (on a mainframe with punched cards!); I didn't come to functional languages (SML and Prolog) until much more recently, and it was not easy making the transition.