Embedded and Microcontrollers: Embedded Forum
Forum Thread Details: 15 replies, 468 subscribers, 1386 views. Tags: tdd

TDD and Embedded Programming

mconners over 11 years ago

A while ago, during the "Help a STEM Educator Out" thread, the topic of TDD was discussed briefly. I noted that I was a proponent, but I felt the challenges of developing the tools and frameworks for embedded development would be daunting.

 

I was not aware at the time, but someone has addressed this.

 

https://www.renaissancesoftware.net/blog/

 

This is the blog of James Grenning, who wrote a book on the topic in 2011.

 

The Pragmatic Bookshelf | Test Driven Development for Embedded C

 

There are links on the web to free versions of the ebook, but I'm not sure whether they are legitimate copies, so I won't link them here.

 

I have skimmed the book and it appears to be very well written, with a foreword by Bob Martin, who is one of the chief proponents of TDD and agile.

 

I was fortunate enough to have been taught TDD by "Uncle Bob", as he is called, and another person mentioned in the book, Michael Feathers.

 

So to me, his endorsement carried a lot of weight.

 

One of the things I like about the book is that it emphasizes software development as a craft, and developers as craftsmen.

 

I plan to investigate adding these methods to my embedded development.

 

Mike
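
To make the discussion concrete, here is a minimal, framework-free sketch of what a host-run unit test for an embedded C module can look like. The circular-buffer module and the CHECK macro are illustrative inventions, not material from Grenning's book; a real project would more likely use a harness such as Unity or CppUTest.

/* test_circ_buf.c -- a minimal, framework-free unit-test sketch.
 * The circular buffer below is hypothetical; the point is that the
 * logic has no hardware dependencies, so the tests build and run
 * on the host rather than on the target. */
#include <stdio.h>
#include <stdlib.h>

/* --- code under test (normally in its own .c/.h pair) --- */
#define BUF_SIZE 4
typedef struct { int data[BUF_SIZE]; int head, tail, count; } CircBuf;

static void cb_init(CircBuf *b) { b->head = b->tail = b->count = 0; }

static int cb_put(CircBuf *b, int v) {
    if (b->count == BUF_SIZE) return -1;      /* full */
    b->data[b->head] = v;
    b->head = (b->head + 1) % BUF_SIZE;
    b->count++;
    return 0;
}

static int cb_get(CircBuf *b, int *v) {
    if (b->count == 0) return -1;             /* empty */
    *v = b->data[b->tail];
    b->tail = (b->tail + 1) % BUF_SIZE;
    b->count--;
    return 0;
}

/* --- the tests: in TDD each check is written before the code it drives --- */
#define CHECK(cond) do { if (!(cond)) { \
        fprintf(stderr, "FAIL %s:%d: %s\n", __FILE__, __LINE__, #cond); \
        exit(1); } } while (0)

int main(void) {
    CircBuf b;
    int v, i;

    cb_init(&b);
    CHECK(cb_get(&b, &v) == -1);              /* empty buffer reports empty */

    CHECK(cb_put(&b, 42) == 0);
    CHECK(cb_get(&b, &v) == 0 && v == 42);    /* FIFO order preserved */

    for (i = 0; i < BUF_SIZE; i++) CHECK(cb_put(&b, i) == 0);
    CHECK(cb_put(&b, 99) == -1);              /* full buffer rejects writes */

    puts("all tests passed");
    return 0;
}

Compile and run with any host compiler (for example, gcc test_circ_buf.c && ./a.out); the red-green-refactor cycle is then just a matter of adding one failing CHECK, making it pass, and cleaning up.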


Top Replies

  • morgaine over 11 years ago in reply to mconners +3
    Michael Conners wrote: in order for things to be testable, they have to be decoupled. and Andy Clark wrote: It makes you think more about what you are trying to achieve These are two slightly…
  • mconners over 11 years ago in reply to johnbeetem +1
    Hi John, I definitely think you missed something. Realize the design/development part of this is at the unit level. You still have system level integration tests and all the other things that are important…
  • morgaine over 11 years ago +1
    I suggest that this thread be moved out of Top Members and into some area where it's more directly on topic and is available to everyone, for example the Embedded Group. Morgaine.
  • johnbeetem over 11 years ago

    OK, TDD sounded interesting from the above comments so I took a look at Wikipedia.

     

    First impression: Yikes!

     

    Second impression: Sounds like a hacker's dream methodology -- write code that works for the test cases, and who gives a flying bit whether it works for anything else.  [Update: By "hacker", I mean someone who writes code by "hacking away" in a non-professional way until it seems to work.  IIRC this was the most common use back when I was a student, e.g., "he's not a computer scientist -- he's just a hacker."  I don't know if this meaning is still in use.  In any case, since "hacker" means so many different things nowadays it's IMO always necessary to define it.]

     

    Third impression: This reminds me of my assembly-language programming class back in Summer 1974.  We used Don Knuth's MIX assembly language.  Our last assignment was to write a subroutine to do a specified function.  The instructor had the main program that tested the subroutine, and you didn't know what it would do.  So you had to write a subroutine to do the specified function correctly (including all the special cases), do your own testing, but then hand the subroutine in as a deck of cards image which the instructor would run to get the final grade.  Concentrates the mind wonderfully, as they say.

     

    The third impression is like real life.  Your code goes out to users, who will discover all sorts of things you didn't test.  If you write code to make a handful of tests work instead of thinking through the complete problem domain, you can expect a disaster when you ship IMO.

     

    Fourth impression: Maybe I'm missing something?

     

    Fifth impression (update): Does Jive use this methodology?  That would explain a lot.

     

    JMO/YMMV

  • mconners over 11 years ago in reply to johnbeetem

    Hi John,

     

    I definitely think you missed something. Realize the design/development part of this is at the unit level. You still have system level integration tests and all the other things that are important to ship code with the fewest bugs possible.

     

    As I mentioned in the earlier thread, we adopted TDD about 8 (maybe 9 by now) years ago and we have had 5 releases over that period of time. The people who track these things tell me this project has the lowest defect rate of all previous releases. Our defects tend to be more along the lines of missed requirements or improperly specified requirements as opposed to broken software. Although I don't believe this is a valuable metric beyond indicating relative size, we are over 2 million lines of code. One of the interesting things is that with each new release, even with added functionality, our code size doesn't grow very much. We are able to reuse code that we have written over time, perhaps with a slight adaptation to allow for new functionality, without worry, because we have the tests to prove that it still works as it did when first written.

     

    One of the things that I have found during this period of development is that in order for things to be testable, they have to be decoupled. This allows our modules to be well defined and specific.

     

    I have heard many of the same comments about a hacker's paradise whenever people first hear about this methodology. I think I felt this way at first, but ultimately it takes quite a bit of discipline to work in this manner.

     

    It's not for everyone, and there is no magic bullet in software development, but this is a technique that has worked for me and my team. It gives us the confidence to make changes to the code: to make it clearer and more readable, to make adjustments for performance, and to make changes when new requirements are introduced. We have a large collection of tests that run in a few minutes, 17000+ test methods, and who knows how many assertions inside those methods.

     

    As you say, YMMV.

     

    Mike
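
The decoupling Michael describes is the part that usually worries embedded developers, since the code ultimately talks to hardware registers. One common way to get there (a generic sketch with hypothetical names, not necessarily how his team does it) is to inject the register write through a small seam, so the same driver logic can be exercised on the host with a fake:

/* led_driver example, single file for brevity. The production build
 * injects a function that writes the real LED register; the test
 * build injects a fake that just records the last value written. */
#include <stdint.h>
#include <assert.h>

typedef void (*led_write_fn)(uint16_t value);   /* the seam */

static led_write_fn led_write;                  /* injected dependency */
static uint16_t     led_state;

void led_driver_init(led_write_fn write) {
    led_write = write;
    led_state = 0;
    led_write(led_state);                       /* all LEDs off at start */
}

void led_driver_turn_on(unsigned led) {         /* led is 1..16 */
    led_state |= (uint16_t)(1u << (led - 1));
    led_write(led_state);
}

/* --- host-side test, with a fake standing in for the hardware --- */
static uint16_t fake_register;
static void fake_write(uint16_t value) { fake_register = value; }

int main(void) {
    led_driver_init(fake_write);
    assert(fake_register == 0x0000);            /* init turns everything off */

    led_driver_turn_on(1);
    assert(fake_register == 0x0001);            /* only bit 0 is set */

    return 0;
}

On the target, the same init call would be handed a function that writes the actual memory-mapped register, so the tested logic ships unchanged.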

  • COMPACT over 11 years ago in reply to mconners

    Hi All,

     

    IMHO,

     

    TDD and Agile are just descriptions of two SDLC methodologies (that are not necessarily pitted against each other).

    I had a look at the TDD Wiki article and had a laugh because I found it incomplete and quite corny. (e.g. Who tests the tests? - Does the test dictate the desired result?)

    I also had a look at an officially released excerpt of Test-Driven Development for Embedded C and found errors within it.

    The excerpt describes many of the principles that I have used since the 1970s that are still just as relevant right now!

    Many of the new methodologies such as TDD and Agile are not new at all and have their genesis in techniques from 1957 and before.

     

    People factors (including egos or a lack of appropriate skills) and business drivers (including $$$ and time to market) are often diametrically opposed to the methodology's principles, and as such can taint or corrupt SDLCs and reduce the quality of their output.  On the other hand, you can impose too many bureaucratic measures and stifle progress.

     

    Irrespective of what is used the bottom line is that one needs the appropriate communication and comprehension thereof, controls, skills, standards, enthusiasm and discipline in place to produce a reliable and resilient product.

     

    "It is far to easy to get stuck in such irrelevant fluff!" - Very Compact

     

    "None of my inventions came by accident. I see a worthwhile need to be met and I make trial after trial until it comes. What it boils down to is one per cent inspiration and ninety-nine per cent perspiration." - Thomas Edison

     

    "Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it."- Brian W. Kernighan and P.H Plauger

     

    "Short Cuts can only be found at the Butcher Shop." - Tommy Emmanuel


    Very Compact

  • mconners over 11 years ago in reply to COMPACT

    COMPACT wrote:

    Irrespective of what is used the bottom line is that one needs the appropriate communication and comprehension thereof, controls, skills, standards, enthusiasm and discipline in place to produce a reliable and resilient product.

    I agree, that is the most important takeaway from this. You can get bad code or a bad product from any methodology if the principles you mention are ignored.

     

    I also love the Kernighan quote.

     

    Mike

  • morgaine over 11 years ago in reply to mconners

    Michael Conners wrote:

    in order for things to be testable, they have to be decoupled.

    and Andy Clark wrote:

    It makes you think more about what you are trying to achieve

    These are two slightly different aspects of the same very important message:

     

    If you don't know where you want to go, you probably won't get there.  Or, in SoftEng speak, know your requirements.

     

    Moreover, you need to know your requirements at every stage and in every iteration of the development process (regardless of which methodology you are using officially or unofficially), and hand-waved requirements in the back of your mind are absolutely not adequate --- that approach will get you into a tangle very rapidly even on one-person projects.  Software is far too complex to be created by the seat of your pants unless it's very trivial indeed.

     

    Any form of TDD will help you to know where you want to go because it is documenting requirements (as executable tests), and as Michael mentioned, the need to test your code will drive you towards creating modular, cohesive, and strongly decoupled subunits.  This assumes of course that you embrace testing strongly and honestly and don't just pay it lip service --- it pays back value in proportion to the time invested in it.  What's more, and this is extremely important for almost every project under the sun, executable tests allow you to perform regression testing automatically.  This helps immensely not only during hour-to-hour development as updated modules are checked in, but even more so when it's a long-lived project and understanding how older parts work ends up being forgotten because old personnel have left or simply because of the mists of time.  The tests won't forget.

     

    Unfortunately there is a fly in the ointment.  Testing is limited by extent of test coverage and by intrinsic observability and testability of what you want to test.  Full test coverage of non-trivial software systems is generally not possible, because the state space of even quite small software systems tends to be vastly beyond the capability of current-day computers to examine and test fully in reasonable time.  As a result, developers have to pick and choose what to test carefully, and it's a delicate balance that is aided by experience but never fully understood.  Theory doesn't help much in this area, because "what's important" is a semantic issue that CompSci is rather powerless to address.  My suggestion:  add semantic information to your tests so that explanations appear in your regression fault logs.  Then when things fail during regression testing, some of the thinking in the designer's mind will appear in the logs, which helps a huge amount by persisting old understanding and triggering the memory of it.

     

    TDD *can* be an excuse for not thinking about and documenting your requirements and design, but it can also be a very effective means of capturing requirements and design details in executable code which can be invaluable.  As in so many areas, an expert craftsman will produce an expert job while a poor one will produce a mess, whatever tools they are given.  The blame for delivering a poor product can't fall on TDD (nor on any other methodology), only on people who use their tools poorly.

     

    Morgaine.
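
Morgaine's suggestion to carry semantic information into the regression logs can be done with nothing more than a check macro that takes a rationale string; the duty-cycle function and the messages below are made up purely for illustration:

/* A check macro that records *why* a test exists, so a regression
 * failure years later prints the designer's reasoning, not just a
 * file name and line number. */
#include <stdio.h>

static int failures;

#define CHECK_BECAUSE(cond, why) do { \
        if (!(cond)) { \
            fprintf(stderr, "FAIL %s:%d: %s\n  rationale: %s\n", \
                    __FILE__, __LINE__, #cond, why); \
            failures++; \
        } \
    } while (0)

/* hypothetical function under test */
static int clamp_duty_cycle(int percent) {
    if (percent < 0)  return 0;
    if (percent > 95) return 95;    /* assumed hardware limit */
    return percent;
}

int main(void) {
    CHECK_BECAUSE(clamp_duty_cycle(100) == 95,
                  "duty cycles above 95% would overheat the driver FET");
    CHECK_BECAUSE(clamp_duty_cycle(-5) == 0,
                  "negative requests must not reverse the motor");
    return failures ? 1 : 0;
}

When a later change breaks one of these checks, the fault log explains the original intent, which is exactly the "persisting old understanding" effect described above.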

  • DAB over 11 years ago in reply to morgaine

    Morgaine is correct.

     

    Until you vet all of your requirements and determine what you need to test and what values you need to measure, you are not ready to build anything.

    Requirements analysis and test planning are concurrent activities.  They must both be completed before you begin to design your system.

    You need the testable data to determine if you use hardware or software techniques for implementation.

    Your design establishes the needed performance for each of your HW and SW components.

     

    These issues are at the heart of TDD.  Once you know explicitly what you need to do, you have a much better chance of a successful project.

     

    Unless you really like to spend long hours testing, rewriting, rebuilding, testing, ...

     

    Never liked that approach myself; that is why I quickly learned structured design methods.  They do work.  They also free up your weekends for other things.

     

    DAB



  • morgaine over 11 years ago in reply to DAB

    DAB wrote:

     

    Requirements analysis and test planning are concurrent activities.

     

    That's a very good way of expressing it.

     

    I would add that they are interdependent activities, because a requirement that is not testable cannot be known to have been achieved, and it is also at risk of damage in subsequent development.  The two activities have a very intimate relationship.  Treating them as separate (for example, one party setting requirements without involving the test writers) is a recipe for failure.

     

    Morgaine.
