RoadTest: Microchip AVR-IoT WG Dev Board
Author: florinescu
Creation date:
Evaluation Type: Development Boards & Tools
Did you receive all parts the manufacturer stated would be included in the package?: True
What other parts do you consider comparable to this product?: Arduino MKR WiFi, ESP-32, Arduino Uno WiFi
What were the biggest problems encountered?: connection reliability
Detailed Review:
Hello! My name is Florin; I majored in Systems Engineering and currently work at an automotive company, and here is my review of the Microchip AVR-IoT WG Development Board. This review is more an analysis of the board and its capabilities than a walkthrough of a demo project using it. I try to answer the questions "How can it help me develop an IoT solution?" and "What differentiates it from similar products?".
My first concern with reviewing this board is power consumption and whether it would be useful for battery powered projects where it would last weeks without a charge.
I received a small box containing just the board. It is a bit disappointing that they didn't include a microUSB cable to connect the board to the PC, but since most likely everyone has one lying around, it's not that big of an issue and it helps bring costs down for Microchip. There is no manual inside which, again, is not necessary as everything you need can be found on the manufacturer's page.
The first thing you encounter when you connect the device to your PC is the demo app. The debugger enumerates as a USB mass storage device containing a few files with board info. Among them is a CLICK-ME.html file which opens a page on the avr-iot.com domain that walks you through configuring the device with your WiFi credentials and then shows the data it pushes to Google Cloud once the device is successfully connected.
With regards to the user who reported his PC destroyed by this board: the autorun.inf file only specifies the icon to display for the drive. Even if your device had been tampered with and a malicious .exe added to the storage, since Windows 7 the OS does NOT automatically execute whatever the autorun file points to.
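For reference, a drive-icon autorun.inf contains nothing more than a section like the following (the icon filename here is illustrative, not copied from the board):

```
[AutoRun]
; Explorer only reads this to pick the drive icon; nothing is executed
icon=board-icon.ico
```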
When connection is successful, the web page redirected to from CLICK-ME.html looks like this:
The data looks plausible, even though I don't have a second device to compare brightness or temperature against. It updates in real time, once every second. The measurement units aren't displayed: for the light sensor it's lux, and for temperature it's °C. Sorry, USA!
A big downside of the board: it gets a bit warm. When thinking of a development board with light and temperature sensors, the first thing that comes to mind is using it for ambient measurements. This however is not possible, since the temperature sensor warms up to the board's 33-35 °C. In some projects board temperature can be relevant, but if you want to monitor room temperature you need to connect a separate sensor, unless it's summer and over 35 degrees outside. Even with proper sleep mode to reduce current, the board was hovering around 26 °C, a few degrees above room temperature.
Microchip's device page contains plenty of documentation, the first being the User Guide. It shows you how to first use the board and how to set up your environment to be able to build for the device. I am not a particular fan of splitting the documentation in several files, as is the case here. The Technical Summary duplicates some of the information in the User Guide, but provides more in-depth information such as power generation for the board, some particularities regarding power supply and current sensing and what all the on-board LEDs do in the default project. There are also some documents like "Development Board Overview" and "Pocket Guide" which also show very basic information like board pinout. There is also a schematic, a must for any kind of modification. It's very nice that they also provide full datasheets for everything relevant on the board.
It can get a bit confusing with so many documents when you are looking for that one bit of info you can't remember where you read, but after going through all of them a few times you get used to where to find what.
The board is comparable in size and features to other embedded development boards like the Arduino MKR WiFi or the ESP-8266. It appears to be designed so you can cut off the debugger and power supply section, perhaps for an end product where no programming is necessary and power is supplied some other way.
Reasons why this board is preferable over other solutions:
Things that could be improved with the hardware design:
The board has a very nice power supply system, definitely designed with low power applications in mind. It uses a battery charge IC to provide either USB or battery voltage to the 3.3V buck regulator which powers the system (microcontroller, debugger and all other ICs). 3.3V, not 5V as is the case with the more popular development boards.
Highlights:
The light sensor is a phototransistor which, according to the development board technical summary, draws 10-50 uA between 20-100 lux. In full daylight you can expect over 1000 lux, which increases current consumption considerably.
The temperature sensor is a bit more sophisticated:
The ATMega4808, an 8bit microcontroller part of the megaAVR 0 series, is the brain of this board. Highlight of features compared to the ATMega328P (and other older AVRs) from the Arduino Uno:
Overall this controller feels really great to work with. It has many obvious improvements: some directly targeted at low power usage, some at simplifying programming for beginners, some as proof that they have learned from their experiences with older ATMegas and are bringing new stuff to the table. It is built around IoT. The Core Independent features (Event Controller and Configurable Custom Logic) allow for great power saving and move lots of not-so-trivial SW to the HW.
Doesn't take much to get used to the differences if coming from older ATMegas.
As previously mentioned, the board doesn't come with any kind of pin headers; you need to mount them yourself based on what you need. Go for straight female headers if you want to use MikroElektronika click boards.
The board's advertisement places a lot of emphasis on its mikroBUS pinout, compatible with MikroElektronika click boards. For those who have never heard of these, they are similar to Arduino shields in that a secondary board attaches to all of your board's pins, even if it doesn't use all of them. The idea behind click boards is plug & play: attach one to add a feature. For a board trying to be the IoT equivalent of the beginner-friendly Arduino, I am sure it is a most welcome feature. But I am not a fan of this design, since you cannot access pins not used by the click board, and I don't have any of these boards around anyway.
Also, if you put a click board over this one, it completely covers the ambient light sensor. A bit of a slip in the board design.
Anyway, what pins are available? As it turns out, not all of them. All microcontroller pins are exposed (including the I2C and SPI buses) except for some connected to other on-board modules.
The microcontroller pins not exposed on the board are:
The 5V supply to the mikroBUS header is not connected by default. To enable 5V on the header, solder a 0-ohm resistor (0603) or a solder blob over the footprint under the 5V pin. Worth mentioning: this pin is connected to the output of the battery charge circuit, so it carries 5 V only while powered from USB; otherwise the battery voltage is present here.
The fact that it includes battery management and a battery connector is amazing and uncommon for development boards. The connector is tiny and I always feel like I will rip it from the board when disconnecting the battery. I had to do a little research to find the right crimp terminals and connector housing, since Microchip only provides the connector part number in the BOM. Here are the manufacturer codes to spare other people some trouble: JST PHR-2 and SPH-002T-P0.5S.
The board has two straps that can be cut to measure current usage for the entire system (3V3 rail) excluding the debugger, and for the debugger separately. It would have been nicer to have a solder blob joining two pads from the factory, a removable 0-ohm resistor, or even a pin header with a jumper. A slight annoyance for me is that the two pads, once cut, don't have 0.1" spacing; 2 mm jumpers or 0805 SMD 0-ohm resistors fit instead to reconnect things neatly.
For a board marketed for IoT, obviously the biggest concern is power usage. Based on the datasheets for all of the devices, I have estimated the current draw as follows:
Component | Stand-by current (if applicable) | Typical on current | Max. on current |
---|---|---|---|
Battery charger | 28 uA (battery only) | 28 uA battery only; 180 uA USB only; 250 uA USB + battery charged; 2.5 mA USB + battery charging | 50 uA battery only; 300 uA USB only; 350 uA USB + battery charged; 3.8 mA USB + battery charging |
Buck converter | 20 uA | 20 uA | 32 uA |
Debugger | 0 A if disabled | 3.4 mA | 3.7 mA |
Microcontroller | 0.1 uA | 2.3 mA | 5 mA |
Temperature sensor | 0.1 uA | 200 uA | 400 uA |
Light sensor | 5 uA | 50 uA | 1 mA (full sunlight) |
Secure element | 2 uA if device is sent to sleep | 2 mA | 14 mA (processing) |
WiFi | 380 uA | 100 mA | 290 mA (while transmitting) |
Status LEDs | 0 A (no LED on) | 0.4 mA blue LED + 1 mA green LED | 2.4 mA (3 LEDs on) |
Battery charge LEDs | 0 A | 0 A no battery connected or battery charged; 1 mA USB + battery connected | 2 mA (charged for more than 6 h) |
Debugger power LED | 0 A | 1 mA | 1 mA |
Total | 104 uA | 111 mA battery only, USB only or USB + battery charged; 113 mA USB + battery charging | 315-320 mA |
To measure the actual current, I have done the following:
1) Mount a battery connector to my battery. The battery I used is a Cellevia LP573450 with 980mAh capacity and 5C rated discharge current. It has its own protection circuit, which disconnects the battery when it discharges past 3.3 V. I remember having a battery from MikroElektronika with this particular connector but I cannot find it.
I used regular pliers to crimp the terminals onto the wire and insulation, since every terminal type requires its own dedicated crimp tool. One thing I do in this case to make sure the terminal makes a good connection is to add a bit of solder where the terminal crimps the wire conductor. You need to be very quick with this, or the terminal heats up and melts the insulation.
You can see in the pictures what the connectors look like, how ugly I crimped them, and the board finally powered from the battery.
2) Next I cut the straps for the target and debugger supplies and soldered wires to them. The intention is to use alligator clips to connect my multimeters for current measurement, and to connect the wires to each other to restore the straps when measurement is not needed.
Here I also soldered pin headers to the board and mounted some screws so I wouldn't risk ripping the wires from moving the board around the desk.
3) Here is the final board with the alligator clips connected to the wires.
4) Current measurement running off battery only. Left multimeter is target current, right one is nEDBG current. All following current measurements are done with default app running.
5) Current measurement running off USB. First picture is before WiFi chip is initialized and communicating, second one is after WiFi is running
6) Battery fully charged
7) The 6-hour timer kicks in. For my 980 mAh battery, a 100 mA charging current is too slow.
Contrary to the charger datasheet, which says that once the timer expires you must disconnect and reconnect the battery, I simply reconnected the USB cable with the battery left as-is and it started a new charge cycle. Funny.
As we can see I was off by a lot in my estimates, especially regarding WiFi. To sum up my findings:
This is huge and the default application kills my 980mAh battery in around 10-11 hours.
Battery charge time was similar, taking around 10-11 hours for a full charge. This however can be improved by replacing the R104 resistor according to the charger datasheet.
Unfortunately my battery's protection circuit kicks in at 3.3 V open circuit, well before the charger's 3.1 V threshold for the Low Battery indication, though I have seen the low battery LED on occasion. Also, the battery sometimes charged to 4.25 V with the red LED still on, as the charger thought it wasn't done yet. This may also be caused by the protection circuit, which would draw more current than the charge-termination logic expects.
As you can see, the debugger, LEDs and WiFi play a major role in optimizing board power consumption.
The 4 status LEDs are connected to the microcontroller and can therefore be disabled in software to save power. The battery LEDs don't bother you during normal usage, but you can remove series resistor R102 so the low-battery LED doesn't draw power while that signal is active. 1 mA may not seem like much, but compared to the sleep current it is a big increase! You could send a low-battery warning via WiFi once instead of keeping an LED on continuously.
The debugger supply is separate and can be cut off by cutting the VCC nEDBG strap. The debugger LED turns off while only battery powered, but the debugger does not enter any low power state, so I recommend leaving its supply cut off, with a jumper there so you can easily switch between debugging and battery testing. It is very important to set all microcontroller pins connected to the debugger as output LOW, and not to connect USB while the debugger supply is disconnected; otherwise the debugger gets powered parasitically through its I/Os and will be destroyed.
The web page streaming your device data has a section to help you migrate the device to your own Google Cloud Platform account; it is essentially a GitHub repo which needs to be cloned into a folder in your console account. An honestly complicated process, but it's something you only need to do once. From then on all the data resides in my Google account and I can view it in the Firebase Console under the Database menu. The hashed strings are my project ID and device ID. The timestamp is an epoch timestamp (seconds since 1 January 1970).
Extending the data displayed to further fields is surprisingly easy. I just modified the sendToCloud() function with a new JSON field for human detection in hopes that it will show up in firebase and it did, updated in real time. Also the application linked to the private account (slightly different to the sandbox one) automatically started displaying a chart for the new field.
int len = sprintf(json, "{\"Light\":%d,\"Temp\":\"%d.%02d\",\"Human\":\"1\"}", light, rawTemperature / 100, abs(rawTemperature) % 100);
The demo project also shows you how to use a toggle on the website to send data TO the board and react to it. Unfortunately, once you switch over to a private account, the option to add toggles or fields to the web app is gone from the interface. There is not much help available on extending the web application, and a lot of digging is necessary here. The bottom of the page is, however, full of messages telling you to get in touch with Leverege, the web developer behind the interface, to have them design it for you. Not much help from them or Microchip for the casual hobbyist.
There are 3 methods for flashing/communicating with the device:
Writing C code for this controller is a bit different than for older ATMegas. The first and most striking difference is that the registers are not defined individually, but grouped by their associated peripheral via C structs. The base address of a peripheral's register block is the address of the struct, and all struct members follow as offsets from that address. This is done in the iom4808.h header, the controller-specific header pulled in when you include <avr/io.h>. The datasheet also refers to the peripherals in this manner: a neat trick to keep things organised. For instance, port D's registers on the ATMega328P are DDRD for direction, PORTD for output and PIND for input. On the ATMega4808 they are PORTD.DIR for direction, and PORTD.OUT and PORTD.IN for output and input, respectively.
There is another trick implemented in hardware regarding GPIOs: certain registers allow you to configure one pin without disturbing the others. If you write the value 0x02 to PORTB.DIR to set pin B1 as an output, you also automatically reconfigure every other pin in port B as an input. Instead, you can write 0x02 to PORTB.DIRSET to only set bit 1 in PORTB.DIR (B1 as an output), without touching the other bits. Similarly, writing 0x02 to PORTB.DIRCLR clears bit 1 in PORTB.DIR (B1 back to an input). There are similar registers called PORTx.DIRTGL, OUTSET, OUTCLR and OUTTGL which are pretty self-explanatory. I can see this being useful as a fail-safe for beginners, but good luck porting that code to older controllers!
This controller also introduces the notion of a virtual port. The basic port registers (not the strobe ones like DIRSET and DIRCLR), normally located in the extended I/O memory space, are also mapped into the bit-accessible low I/O space as VPORTx. This lets the compiler turn an 8-bit read-modify-write of PORTD.DIR into a single-bit access to VPORTD.DIR. It was a bit confusing at first when I saw that the code generated by the Atmel Start configurator used VPORTD but also included functions for PORTD. Again, using VPORT will decrease portability.
You can modify fuses from Atmel Studio and presumably from MPLAB as well.
To get used to programming the board, I thought I could work on the demo app to try to improve its power consumption as much as possible.
First I wrote a quick demo function to put the controller in sleep mode, which resulted in a 600 uA current draw. Not bad!
```c
#include <avr/io.h>
#include <avr/sleep.h>
#include <avr/interrupt.h>

void sleep()
{
    set_sleep_mode(SLEEP_MODE_PWR_DOWN);
    //Enable interrupts to be able to wake up again
    sei();
    //Go in sleep mode and clear sleep enable bit when woken up
    sleep_mode();
}

int main(void)
{
    ...
    while (1) {
        ...
        sleep();
    }
}
```
Then I switched to the demo application and started modifying it. First, only init the I2C and nothing else, and insert the sleep function call into while (1). I modified the sleep function to also disable the I2C; the goal is to later talk to the temperature sensor and secure element in order to disable them.
```c
void sleep()
{
    TWI0.MCTRLA &= ~TWI_ENABLE_bm;
    set_sleep_mode(SLEEP_MODE_PWR_DOWN);
    //Enable interrupts to be able to wake up again
    sei();
    //Go in sleep mode and clear sleep enable bit when woken up
    sleep_mode();
}
```
This should have just enabled the TWI, then disabled it and entered sleep. Why, then, is the consumption so much worse? Two things are happening here:
I also noticed that the current usage would randomly move from ~400 to 700+ uA. It turned out to be a cloudy day: the sun was out for 5 minutes, then behind clouds for the next 5. At this point the light sensor starts to become a big power drain.
I tried to put some black tape over it but it didn't do much. I wouldn't want to remove its series resistor, so I did the most sensible thing to do: cover the board with something big and thick.
Now I put the temperature sensor into shutdown mode by adapting the existing sensors_handling file with the following sleep function which will be called by my sleep routine. I also included a wakeup function for later.
```c
void SENSORS_sleep(void)
{
    uint16_t regValue = I2C_0_read2ByteRegister(MCP9809_ADDR, MCP9809_REG_CONFIG);
    I2C_0_write2ByteRegister(MCP9809_ADDR, MCP9809_REG_CONFIG, regValue | 0x0100);
}

void SENSORS_wakeup(void)
{
    uint16_t regValue = I2C_0_read2ByteRegister(MCP9809_ADDR, MCP9809_REG_CONFIG);
    I2C_0_write2ByteRegister(MCP9809_ADDR, MCP9809_REG_CONFIG, regValue & ~0x0100);
}
```
Now we are starting to see real results!
Next I thought: since the badly designed mcu_init() is there anyway, why not use it to disable all pin digital input buffers at reset, and later enable only the ones that are needed?
The results are amazing! The input buffer current draw obviously depends on what is connected to the pin and the pin state.
```c
void mcu_init(void)
{
    /* On AVR devices all peripherals are enabled from power-on reset; this
     * disables all peripherals to save power. A driver shall enable its
     * peripheral if used. */

    /* Disable input buffers for all pins */
    for (uint8_t i = 0; i < 8; i++) {
        *((uint8_t *)&PORTA + 0x10 + i) |= PORT_ISC_INPUT_DISABLE_gc;
    }
    //...
    //similarly for all ports
}
```
This photo shows the lowest possible current I have achieved with this board (WiFi disabled though):
Next, I set the ATECC608 to sleep. I found the library function for this, atcab_sleep(); I needed to uncomment the CryptoAuthLib init and add the sleep command to our sleep function. There was no difference in power consumption, which means the ATECC608 was sleeping anyway while not yet initialized.
Also no power difference when enabling CLI (command line interface) and its corresponding UART.
For WiFi, things get a little more complicated. You can either disable it outright, which means reconnecting every time it wakes up, or let it manage its sleep mode by itself while keeping the WiFi connection alive. I chose the automatic deep sleep mode, which must be configured once after the module is initialized, before connecting. I added the following to the reInit() function in cloud_service.c, while also re-enabling the scheduler and everything else. I also increased the data send interval to 10 seconds via MAIN_DATATASK_INTERVAL and set M2M_LISTEN_INTERVAL to 10 (the WiFi router can then queue up to 10 beacon intervals of data before actually sending it).
```c
static uint8_t reInit(void)
{
    ...
    //Configure WiFi sleep mode
    tstrM2mLsnInt strM2mLsnInt;
    strM2mLsnInt.u16LsnInt = M2M_LISTEN_INTERVAL;
    m2m_wifi_set_sleep_mode(M2M_PS_DEEP_AUTOMATIC, true);
    m2m_wifi_set_lsn_int(&strM2mLsnInt);
    ...
}
```
Finally, by integrating all of the above, I achieved a 0.7 mA sleep current while keeping WiFi connected. To measure the sleep current I forced the microcontroller to remain in sleep mode after communication with Google Cloud was established; otherwise my multimeter couldn't sample accurately because of the WiFi DTIM sync once every second. Occasionally the final sleep state still drew ~12 mA, which I think was caused by abruptly interrupting the WiFi module, but it was sporadic and, this being just a demonstration, I didn't look into it further.
My results with different transmission intervals can be seen in the following tables. They assume my 980 mAh battery.
No power save
average current | ~110 mA |
battery life | 9 h |
Sending data every second
State | Duration (ms) | Current (mA) | Charge (mA * ms) |
---|---|---|---|
sleep | 835 | 0.7 | 584.5 |
WiFi DTIM sync | 15 | 100 | 1500 |
actively sending data | 150 | 250 | 37500 |
total charge | 39584.5 mA * ms | ||
average current | ~40 mA | ||
battery life | 24.5 h |
Sending data every 10 seconds
State | Duration (ms) | Current (mA) | Charge (mA * ms) |
---|---|---|---|
sleep | 9835 | 0.7 | 6884.5 |
WiFi DTIM sync | 150 | 100 | 15000 |
actively sending data | 150 | 250 | 37500 |
total charge | 59384.5 mA * ms | ||
average current | ~5.9 mA | ||
battery life | 165 h (1 week) |
Sending data every minute
State | Duration (ms) | Current (mA) | Charge (mA * ms) |
---|---|---|---|
sleep | 59835 | 0.7 | 41884.5 |
WiFi DTIM sync | 900 | 100 | 90000 |
actively sending data | 150 | 250 | 37500 |
total charge | 169384.5 mA * ms | ||
average current | ~2.8 mA | ||
battery life | 348 h (2 weeks) |
As you can see, choosing your timing strategy and implementing proper sleep is very important. Just by implementing proper sleep modes, my project's battery life almost tripled, to about one day. By transmitting 10 times less often, I got roughly another 7x increase. Further raising the interval to once a minute, which is completely realistic for a light & temperature sensor, lets it last two weeks: all in all, an almost 15-fold increase over the one-second case. Receiving data shouldn't add much power, as it would happen in one of the DTIM sync windows.
I have generated a graph showing power draw based on send interval data from the previous tables.
There are, however, two tipping points. One is at a 25-second transmission interval, where maintaining the WiFi connection and syncing with the router costs as much power as sending a single packet. The other is at 55 seconds, where sleep mode starts to consume more than actually sending the data once; past this point you won't see much benefit from increasing the transmit interval further. The following chart shows battery life and power draw for intervals between 1 second and 24 hours.
Now what if you were to fully disable WiFi during sleep (via its enable pin), then reconnect to the router, send a message, and sleep again? The sleep current drops to 100 uA this way, so it must be good, right? As it turns out, it doesn't make sense for short send intervals, since connecting to the router takes 5-10 seconds. Only if you send really rarely, around once every 20 minutes, does it actually break even with keeping WiFi connected the whole time. Increasing the connect-and-send interval past that point starts to pay off quickly. It probably won't be possible to reach a month of battery life due to battery self-discharge, temperature and so on, so I cut the graph at the 1-hour mark. You can see that even when sending data once an hour, almost 90% of the total power goes just into connecting to the router.
Of course this is just theory and real-life values will differ, but the trend is clear: if you can get away with transmitting data and reacting to received data more rarely, do it. Around the once-every-20-minutes mark, it starts to make sense to turn off WiFi completely between transmissions. If I were to, say, monitor light and temperature in a field, minute-by-minute updates wouldn't even be that relevant. Throw in a small palm-sized solar panel and I may only need to change the battery once every few years, when it no longer holds a charge.
I tried to test the 10-second-interval setup to see the real-life behavior, but somewhere around the 12-hour mark I noticed the connection had dropped, and I am not sure how much earlier that happened. This random disconnect and lock-up occurred repeatedly while retesting. I suspect my software sometimes enters sleep mode while the WiFi module is supposed to be doing something. Still, after 8 hours the battery voltage usually dropped from 4.2 V to 4.0 V, and after 12 hours to 3.9 V, so it should have well over half its capacity left, somewhat better than my estimates (keeping in mind that battery capacity isn't linearly proportional to voltage).
I have come up with a basic project to test the board's overall functionality. It is an automated porch light which uses an HC-SR04 sensor to detect whether someone is present and, if so, turns on an LED, PWMed so the area doesn't get too bright. The device gets the current time from NTP and will not turn the light on if it is still day (between sunrise and sunset). There is a button to kill the lights completely. Everything is output to the cloud, where it could be used in whatever way. The idea behind this is to use as much of the core-independent functionality as possible.
Here is the code I have made for this, only showing the relevant parts that I have added:
The NTP part is done automatically by the WiFi stack, so all I need to do is extract the saved timestamp, set my geo location for sunrise&sunset calculations and get those values.
```c
void time_init()
{
    set_position(45.7489 * ONE_DEGREE, 21.2087 * ONE_DEGREE);
}

{
    time(&timestamp);
    sunrise = sun_rise(&timestamp);
    sunset = sun_set(&timestamp);
    timestamp += 2 * ONE_HOUR;
    sunrise += 2 * ONE_HOUR;
    sunset += 2 * ONE_HOUR;
    memcpy(&currentTime, localtime(&timestamp), sizeof(struct tm));
    memcpy(&sunriseTime, localtime(&sunrise), sizeof(struct tm));
    memcpy(&sunsetTime, localtime(&sunset), sizeof(struct tm));

    if (difftime(timestamp, sunrise) > 0 && difftime(sunset, timestamp) > 0) {
        //Day
        TCA0.SPLIT.HCMP1 = 0x00;
    } else {
        //Night
        if (humanPresent) {
            if (light < 500)
                TCA0.SINGLE.CMP0BUF = TCA0.SINGLE.PER / 100 * ((500 - light) / 5);
            else
                TCA0.SINGLE.CMP0BUF = 0;
        } else {
            TCA0.SINGLE.CMP0BUF = 0x00;
        }
    }
}
```
For the ultrasonic, I periodically pulse the trigger line and use TCB (Timer/Counter B) in pulse width input mode to measure the duration of the sensor's output pulse. Since TCB does not directly have an input pin, the pin state must be routed through the EVCTRL (Event Controller), also demonstrating this functionality.
```c
void ultrasonic_init()
{
    EVSYS.CHANNEL2 = EVSYS_GENERATOR_PORT0_PIN0_gc;
    EVSYS.USERTCB0 = EVSYS_CHANNEL_CHANNEL2_gc;
    PORTC.PIN0CTRL = PORT_ISC_INTDISABLE_gc;

    TCB0.INTCTRL = TCB_CAPT_bm;
    TCB0.CTRLA = TCB_RUNSTDBY_bm | TCB_CLKSEL_CLKDIV2_gc;
    TCB0.CTRLB = TCB_CNTMODE_PW_gc;
    TCB0.EVCTRL = TCB_CAPTEI_bm;
    TCB0.CTRLA |= TCB_ENABLE_bm;
}

void ultrasonic_trigger()
{
    PORTC.OUTSET = 1 << 1;
    _delay_us(10);
    PORTC.OUTCLR = 1 << 1;
}

ISR(TCB0_INT_vect)
{
    ultrasonicResult = TCB0.CCMP / 29;
}
```
The LED uses TCA for PWM output, and its duty cycle is adjusted once a second according to the light sensor. For the button I used the CCL to debounce it and produce an output that toggles once per button press. This button state is fed by the event system into the TCA to enable/disable counting: one press allows the PWM, the next blocks it. The only times the CPU has to run for this are the small workaround CCL interrupt and the duty cycle update.
```c
void bulb_init()
{
    //Bulb PWM pin
    PORTA.OUTCLR = 1 << 0;
    PORTA.DIRSET = 1 << 0;

    //Event 0: CCL LUT0 out -> TCA0 count enable
    EVSYS.CHANNEL0 = EVSYS_GENERATOR_CCL_LUT0_gc;
    EVSYS.USERTCA0 = EVSYS_CHANNEL_CHANNEL0_gc;

    TCA0.SINGLE.EVCTRL = TCA_SINGLE_EVACT_HIGHLVL_gc | TCA_SINGLE_CNTEI_bm;
    TCA0.SINGLE.CTRLB = TCA_SINGLE_CMP0EN_bm | TCA_SINGLE_WGMODE_SINGLESLOPE_gc;
    TCA0.SINGLE.CTRLC = 0 << TCA_SINGLE_CMP0OV_bp;
    TCA0.SINGLE.PER = 10000;
    TCA0.SINGLE.DBGCTRL = TCA_SINGLE_DBGRUN_bm;
    TCA0.SINGLE.CTRLA = TCA_SINGLE_CLKSEL_DIV1_gc | TCA_SINGLE_ENABLE_bm;
}

ISR(CCL_CCL_vect)
{
    //Lights were turned off from button
    //Set TCA0 output low (timer is stopped but output is stuck to whatever its last state)
    TCA0.SINGLE.CTRLESET = TCA_SINGLE_CMD_RESTART_gc;
    //CCL interrupt flag isn't cleared automatically
    CCL.INTFLAGS |= CCL_INT0_bm;
}

void button_init()
{
    //Light button pin
    PORTD.DIRCLR = 1 << 6;
    PORTD.PIN6CTRL = PORT_PULLUPEN_bm | PORT_ISC_INTDISABLE_gc;

    //Event 3: PD6 -> CCL LUT0A & LUT1A
    EVSYS.CHANNEL3 = EVSYS_GENERATOR_PORT1_PIN6_gc;
    EVSYS.USERCCLLUT0A = EVSYS_CHANNEL_CHANNEL3_gc;
    EVSYS.USERCCLLUT1A = EVSYS_CHANNEL_CHANNEL3_gc;

    //Configure CCL LUT 0 & 1 as inputs into JK
    CCL.SEQCTRL0 = CCL_SEQSEL0_JK_gc;

    //CCL LUT0 is fed into J
    CCL.LUT0CTRLB = CCL_INSEL1_MASK_gc | CCL_INSEL0_EVENTA_gc;
    CCL.LUT0CTRLC = CCL_INSEL2_MASK_gc;
    CCL.TRUTH0 = 0x01;
    CCL.LUT0CTRLA = CCL_EDGEDET_EN_gc | CCL_ENABLE_bm;

    //CCL LUT1 is fed into K
    CCL.LUT1CTRLB = CCL_INSEL1_EVENTB_gc | CCL_INSEL0_EVENTA_gc;
    CCL.LUT1CTRLC = CCL_INSEL2_MASK_gc;
    CCL.TRUTH1 = 0x0d;
    CCL.LUT1CTRLA = CCL_EDGEDET_EN_gc | CCL_ENABLE_bm;

    //Configure CCL JK output to interrupt on falling edge (lights off)
    CCL.INTCTRL0 = CCL_INTMODE0_FALLING_gc;
    CCL.CTRLA = CCL_ENABLE_bm;
}
```
This is what it looks like assembled and running (sorry no video):
And this is what it looks like on the website:
I have tried to highlight as many of the board's quirks as I could find. It has its flaws, obviously, but it is incredibly well designed for its target field. Development and prototyping are easy, and the board is a pleasure to work with.
It is an interesting product and a great step toward providing better IoT development possibilities to enthusiasts. The Google Cloud integration supports scaling your product so it is not just something for hobbyists.
I definitely recommend buying this board if you are interested in this field and I will probably buy a few more myself.
Thank you to element14 for giving me this great opportunity to further my insight and give something back to the community of enthusiasts!