Introduction
On its surface, this is a simple question and chances are, if you’re already reading this, you’re smarter than the average electronically inclined bear and already know the answer. For those of you who are still getting a handle on the basics, let’s dive into the details.
What is a Microprocessor?
Like a brain to a body, a microprocessor is an integral part of any computer system. You can separate the two for sure, but they’re not all that useful on their own. A microprocessor is a central processing unit (CPU) contained entirely within an integrated circuit. At a given clock speed, it reads binary data from external memory into its registers and then executes instructions upon said data before presenting the result as its output.
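If you want a feel for what that fetch-and-execute cycle actually looks like, here's a minimal sketch of a toy CPU emulated in C. The opcodes, the 16 bytes of "memory," and the single accumulator register are all invented for illustration; they don't correspond to any real processor's instruction set.

```c
#include <stdint.h>
#include <stdio.h>

/* A toy CPU: one accumulator register, a program counter, and 16 bytes of
 * external "memory" holding both instructions and data. The opcodes are
 * made up purely for illustration. */
enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2 };

int main(void) {
    uint8_t memory[16] = {
        OP_LOAD, 14,          /* load the byte at address 14 into the accumulator */
        OP_ADD,  15,          /* add the byte at address 15 to the accumulator    */
        OP_HALT,              /* stop                                             */
        [14] = 40, [15] = 2   /* the data the program operates on                 */
    };
    uint8_t acc = 0;          /* accumulator register */
    uint8_t pc  = 0;          /* program counter      */

    for (;;) {                                 /* one pass per "clock" step */
        uint8_t opcode = memory[pc++];         /* fetch the instruction     */
        if (opcode == OP_HALT) break;
        uint8_t addr = memory[pc++];           /* fetch the operand address */
        switch (opcode) {                      /* decode and execute        */
            case OP_LOAD: acc  = memory[addr]; break;
            case OP_ADD:  acc += memory[addr]; break;
        }
    }
    printf("Result in accumulator: %u\n", acc);  /* prints 42 */
    return 0;
}
```

Strip away the emulation overhead and that loop is the whole job description: fetch from memory, execute, present the result.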
You have pedantic permission to frown at anyone who says they program microprocessors, because that’s nonsense. You may program a microcontroller, or write a program for a certain microprocessor, but that’s it. Microprocessors require external hardware to do meaningful work: at the very least a place to read data and instructions from (i.e. memory), and somewhere to send the results, such as a serial or parallel bus or back to memory.
Microprocessors vary wildly in size and complexity, and of the key separating factors, word length is one with which most people are somewhat familiar. When you talk about a 64- or 32-bit system, you’re referring to the word length of the processor; that is generally how many bits can be processed in one cycle. Intel genuinely ushered in a new era of computing with the introduction of the venerable 4004 processor in 1971, which had a whopping 4-bit word length. While that doesn’t sound like much, even today “smaller” processors like the 8-bit 6502 can and often do work with 16-bit values or wider, although at the expense of extra clock cycles to do so.
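To see why a narrow word length costs those extra cycles, here’s an illustrative C snippet that adds two 16-bit numbers the way an 8-bit ALU has to: one byte at a time, handing the carry from the low half to the high half. The specific values are arbitrary.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustration only: a 16-bit addition performed with 8-bit operations.
 * Each byte-wide add (and the carry hand-off between them) is roughly
 * where the extra clock cycles go on a processor with an 8-bit word. */
int main(void) {
    uint16_t a = 0x12F0, b = 0x0311;

    uint8_t lo    = (uint8_t)(a & 0xFF) + (uint8_t)(b & 0xFF);     /* low bytes first        */
    uint8_t carry = lo < (uint8_t)(a & 0xFF);                      /* did the low add wrap?  */
    uint8_t hi    = (uint8_t)(a >> 8) + (uint8_t)(b >> 8) + carry; /* high bytes plus carry  */

    uint16_t result = ((uint16_t)hi << 8) | lo;
    printf("0x%04X + 0x%04X = 0x%04X\n", a, b, result);  /* 0x12F0 + 0x0311 = 0x1601 */
    return 0;
}
```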
Nowadays, you likely won’t encounter a microprocessor outside of “building” a PC as you carefully drop it into a socket on a motherboard. However, variants of the 6502 and 8080 and other classic architectures abound, and the hobby of creating your own “personal home-computer” à la the ’70s and ’80s is alive and well. If you’re feeling like embarking on an epic computer engineering journey though, you might love James Newman’s MEGAPROCESSOR, which is a fully functional, 16-bit processor scratch-built from individual transistors.
(The Megaprocessor, by James Newman)
What is a Microcontroller?
Microcontroller units (MCUs), on the other hand, are a complete “computer on a chip” and contain a CPU as well as various types of ROM, RAM, and other peripherals, like an FPU (floating point unit) for faster mathematical operations or a UART (universal asynchronous receiver/transmitter) for handling serial data.
As they are nearly fully integrated computers, many MCUs can run with minimal, or no, external components; give ‘em a power source and they’ll happily chug along at a reduced spec, although for peak operation you’ll often need to add a few passives as well as an external oscillator to set the clock frequency. Either way, this makes development a breeze, since the component count is low and the chip is designed to be programmed for standalone functionality.
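As a rough example of how little you need, here’s a minimal “standalone” blinky sketch: one chip, one LED, one resistor, and a power source. It assumes an ATmega328-class AVR built with avr-libc, running off its factory-default internal oscillator; the pin choice is arbitrary.

```c
/* Minimal standalone microcontroller program: blink an LED on PB0 using
 * only the chip's internal RC oscillator, i.e. no external crystal.
 * Assumes an ATmega328-class AVR and avr-libc. */
#define F_CPU 1000000UL        /* factory default: 8 MHz internal RC, divided by 8 */
#include <avr/io.h>
#include <util/delay.h>

int main(void) {
    DDRB |= (1 << PB0);        /* configure PB0 as an output */
    for (;;) {
        PORTB ^= (1 << PB0);   /* toggle the LED pin         */
        _delay_ms(500);        /* busy-wait half a second    */
    }
}
```

No motherboard, no external memory, no bus: the same part you flash is the whole system.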
While the latest and greatest microcontrollers are increasingly fast, running at hundreds of megahertz with 32-bit architectures, they’re meant to fill a compact, cost-effective computing niche. While I would love to play with a 2GHz AVR with 8GB of RAM, it’s more power than I really need (but still want!). Much in the same way that we use bicycles and 18-wheelers for transportation, I wouldn’t want to pedal an engine block across the state, nor would I want to pick up groceries in a big rig. MCUs are great where they are, running microwaves, timing washing machines, and keeping our Casio watches ticking.
As an aside: most microcontrollers are of a reduced instruction set computer (RISC) architecture, while “desktop” Intel/AMD x86-based CPUs are often of complex instruction set computer (CISC) architecture. The nitty gritty of the differences between the two is beyond the scope of this article, but given that most mobile devices use ARM-based (RISC) system-on-a-chip (SoC) designs, understanding of, and development for, software that runs natively on RISCy hardware is more widespread than ever.
But what about SoCs?
If you reach into your pocket and pull out your shiny glass communicator, then you’ve probably heard something about its specs, which include the processor’s many features that you definitely need to spend an extra $300 to get. For the sake of marketing simplicity you often just hear the term “processor,” but in reality the Qualcomm Snapdragon 9999 in your phone is really an SoC that has a CPU, ROM, RAM, and other useful peripherals like radios or graphics processing units (GPUs). Hmm, sounds like a microcontroller. What gives? SoCs aren’t as specialized as either a microprocessor or microcontroller; quite the opposite. They are often the “kitchen-sink” edition of portable computing, attempting to stuff as many useful features as possible into one component. SoCs may contain a CPU or a microcontroller, or both. In fact, a common term like “motion co-processor” is just marketing speak for a dedicated microcontroller that handles the real-time, intensive tasks of reading an inertial measurement unit (IMU) and similar sensors.
Conclusion
I’ll be honest, prior to starting my own electrical engineering journey, these and many other terms were lost on me, as pop-culture depictions of technology are often inaccurate at best and aggressively wrong at worst. Hollywood seems to love using the term “microchip” or “microprocessor” for just about any small component with pins sticking out of it (bonus points for using a full dev board littered with LEDs and still calling it a microchip), but a microprocessor is a specific component and not a catch-all term. I can’t recall ever hearing the word “microcontroller” used on TV or in a movie though...
(By the way, did you know that if you hot glue a Raspberry Pi to a nuclear warhead and short pins 1 and 5 in morse code to sound out “SHUTDOWN” absolutely nothing happens, because that’s not how any of this works?)*
If you’d like to share an analogy with someone who is interested, I particularly like to explain electronics in terms of food: a microprocessor is like a chef; given organized ingredients, she can cook a set number of dishes and deliver consistent meals as an output, but she is only one piece of the equation. A microcontroller, however, is the whole restaurant, with the chef, waiters, hosts, cleaning crew, line cooks, etc. necessary for a customer to enjoy a meal.
*You obviously have to use “SUDO SHUTDOWN,” which I can neither confirm nor deny does the trick.