Project14 Spring Clean Projects 2026
Tags: Search and Rescue Robot, lidar, colour sensor, raspberry pi, robotic arm
Alex - Search and Rescue Robot

Samuel.
15 May 2026

Introduction: 

Search-and-rescue operations are among the most difficult yet important applications of robotics engineering. In disasters such as building collapses, industrial accidents, and nuclear incidents, the ability to quickly find survivors, assess hazardous environments, and deliver needed supplies without putting people in danger is paramount. Tele-operated ground robots meet this need directly, letting operators navigate dangerous, GPS-denied areas with limited situational awareness and under severe time pressure. These challenges are reflected in the mission scenario underpinning this project. Alex, the tele-operated rescue robot developed here, navigates its environment under remote control via a dual-operator architecture, locates items via floor-mounted colour markers, retrieves them with a robotic arm, and delivers them to another location. A 360° LIDAR sensor provides continuous spatial awareness to support real-time operator decision-making and post-mission mapping. Robots like the iRobot PackBot and Quince have been deployed in active conflict zones and nuclear disaster sites, respectively, performing precisely this kind of remote sensing, manipulation, and environmental mapping under conditions hostile to human entry. Alex is designed and evaluated against the same functional requirements that define these platforms.

System Architecture:

Alex comprises eleven devices: an Arduino Mega, a Raspberry Pi, DC motors, a colour sensor, a motor shield, arm servos, an E-stop button, a LiDAR unit, and three laptops for the teleoperators. The figure below shows the system architecture of Alex, including every component and its connections.

[Figure: Alex system architecture diagram]

Hardware Design:

[Figures: hardware design photos]

 

Firmware Design:

High-Level Algorithm:

  1. Hardware Configuration: Set up timers and pins so that the Arduino can properly control the robot.
  2. Initialisation: Synchronise hardware and establish safety protocols.
  3. Receive User Command: Listen for operator input over USART.
  4. Carry Out Command: Process the input, check whether the E-stop is active (if so, discard the packet), transmit data packets, and execute hardware movement.
  5. Loop: Repeat steps 3 and 4 until the objective is met.

[Figure: high-level firmware algorithm flowchart]

Further Breakdown

  1. Hardware Configuration

  • Communications (USART): Configured to a 9600 baud rate to ensure stable, low-error serial communication with the Raspberry Pi.
  • Safety (E-Stop): The designated E-Stop pin is configured to trigger a hardware interrupt on any logical change.
    • The Interrupt Service Routine (ISR) implements software debouncing to prevent multiple triggers from a single press.
    • The logic evaluates both the system state and the physical button state: 
      • if the E-stop is disabled and the button is depressed, it halts the system 
      • if enabled and released, it resumes operation.
  • Colour Sensor (TCS3200): The output frequency is scaled to 20% to keep the signal within the Nyquist-equivalent sampling limits of the microcontroller.
    • External Interrupt 1 (INT1) is initialised to increment a counter on every logical change. Over a 100 ms window, this yields discrete frequency values for the red, green, and blue colour channels.
  • Robotic Arm (Servos): The servo pins are initialised as outputs, and Timer 5 is configured for Output Compare Match.
    • Instead of relying on four separate timers, the Timer 5 ISR is programmed as a sequential state machine to "daisy-chain" control pulses.
    • The ISR toggles the respective PORTK pins and dynamically recalculates the next interrupt interval to deliver precise pulse widths. A final idle phase is calculated to maintain a stable 20ms refresh frame.
  • Locomotion (Motors): The L293D motor driver shield is configured utilising the standard AFMotor library.
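The Timer 5 "daisy-chain" scheme above can be sketched host-side. The real implementation is an AVR ISR; this Python sketch only models the schedule of compare intervals it steps through. The servo order, the 0–180° input range, and the 500–2500 µs pulse calibration are illustrative assumptions, not values taken from the firmware.

```python
# Host-side model of the sequential "daisy-chain" servo pulse schedule.
# Assumptions (not from the original firmware): 0-180 deg maps linearly
# onto a 500-2500 us pulse, and four servos share one timer.

FRAME_US = 20_000  # standard 20 ms servo refresh frame


def angle_to_pulse_us(angle_deg: float) -> float:
    """Map 0-180 degrees onto a 500-2500 us pulse width (assumed calibration)."""
    return 500 + (angle_deg / 180.0) * 2000


def build_schedule(angles_deg):
    """Return the interrupt intervals the ISR would step through: one pulse
    per servo in sequence, then a final idle interval that pads the frame
    back out to a stable 20 ms."""
    pulses = [angle_to_pulse_us(a) for a in angles_deg]
    idle = FRAME_US - sum(pulses)
    assert idle > 0, "combined pulses must fit inside one 20 ms frame"
    return pulses + [idle]


# Example: base/shoulder/elbow/gripper all at mid-travel (90 degrees)
schedule = build_schedule([90, 90, 90, 90])
print(schedule)       # four 1500.0 us pulses plus a 14000.0 us idle phase
print(sum(schedule))  # 20000: the refresh frame length never drifts
```

The idle phase is what keeps the refresh period constant: however the four pulse widths change, their sum plus the idle interval always equals 20 ms.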

2. Initialisation

  • Interrupt Activation: Global interrupts are initialized via the sei() command. Individual hardware components are subsequently activated by configuring their respective control registers. Specifically, the EIMSK register is modified to enable both the INT0 (E-Stop) and INT1 (Colour Sensor) external interrupts, while the TIMSK5 register is updated to activate the Timer 5 servo interrupts. Crucially, the hardware-level configuration of INT0 guarantees that the E-Stop mechanism will instantly preempt the main program loop under any condition.
  • Default Hardware States: The servos are initialized to safe, pre-calculated base angles to prevent erratic, unpredictable movements upon startup. The default motor speed is set to a duty cycle of 78.4% via the AFMotor library.
  • Sensor Start, GUI Update and Handshake will be discussed under Software Design.

3. Receive User Command

To prevent unpredictable hardware behavior, the firmware uses a strict, structured communication protocol to ensure data integrity between the Raspberry Pi and the Arduino. 

  • Packet Polling: The Arduino continuously polls the UART receive buffer, ignoring all incoming bytes until it detects the specific two-byte magic header (0xDE, 0xAD).
  • Payload Retrieval: Upon verifying the magic bytes, the Arduino reads the subsequent 101 bytes (comprising a 100-byte payload and a 1-byte checksum).
  • Data Verification: The Arduino computes a local checksum by performing a bitwise XOR across all payload bytes. If this computed value differs from the received checksum byte, the system assumes bit-corruption occurred during transmission and discards the entire packet.

4. Carry Out Command

The complete command processing workflow on the Raspberry Pi is detailed under Software Design below.

5. Loop

Repeat steps 3 and 4 until the objective is met.

Software Design

[Flowchart: high-level software algorithm for arm control]

[Flowchart: high-level software algorithm for the colour sensor]

1. Initialisation

Baud Rate Synchronisation: The Raspberry Pi explicitly locks the serial baud rate at 9600 to match the Arduino’s configuration, ensuring stable, error-free UART telemetry between the two boards.

Handshake Protocol: The Pi enters a TCP listening state on local port 65432, waiting for the secondary operator terminal to connect.

Sensor Start (LiDAR): The Pi initialises the LiDAR module via a USB/Serial interface. The SLAM algorithm takes control of the data stream, resetting the occupancy grid and localizing the robot's physical origin to (0,0).

GUI Update: The SLAM interface immediately renders the initial LiDAR scan to the operator's display, establishing situational awareness prior to any locomotion.
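The handshake step can be sketched as a minimal listener. The TCP port 65432 comes from the post; everything else (localhost address, the threaded demo, the one-line "READY" hello) is an illustrative assumption.

```python
# Minimal sketch of the Pi-side handshake: listen on TCP port 65432 and
# accept the secondary operator terminal. The READY message and threading
# are assumptions made for this self-contained demo.
import socket
import threading
import time


def operator_listener(host="127.0.0.1", port=65432):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)                 # block here until the operator connects
        conn, _addr = srv.accept()
        with conn:
            conn.sendall(b"READY\n")  # signal that the link is established


threading.Thread(target=operator_listener, daemon=True).start()

# Connect as the second operator terminal would (retry while the server binds).
for _ in range(50):
    try:
        sock = socket.create_connection(("127.0.0.1", 65432), timeout=2)
        break
    except OSError:
        time.sleep(0.1)

with sock:
    data = b""
    while not data.endswith(b"\n"):   # read until the full hello line arrives
        data += sock.recv(16)
print(data)  # b'READY\n'
```

Binding and listening before any locomotion starts mirrors the initialisation order described above: the command channel exists before the robot can move.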

2. Receive User Command (the full list of inputs is provided below)

  • E-Stop Safety Architecture (“e”): The system continuously monitors for software safety triggers. When the “e” command is sent, a two-step halt sequence occurs:
    • Pi (High-Level): Instantly updates its global state to block any new movement commands from being queued, while simultaneously transmitting a priority halt packet to the Arduino.
    • Arduino (Low-Level): Upon receipt, the Arduino immediately sets its internal E-Stop state to active, stopping all motor and servo PWM signals. It ignores all subsequent movement packets and sends an acknowledgment packet back to the Pi.
  • Chassis Locomotion (“w”, “a”, “s”, “d”): Directional keys are passed with a duration parameter.
    • Use: [key] [value]. For example, “w 5000” makes the robot move forward for 5000 milliseconds.
  • Velocity Control (“v”): Adjusts the global chassis speed using an 8-bit parameter
    • Use: v [value]. e.g. “v 255” makes the robot move at maximum speed.
    • The effective duty cycle percentage is calculated as (value ÷ 255) × 100% and implemented using the AFMotor library on the Arduino. Firmware-level limiters automatically cap values exceeding 255.
  • Arm Manipulation (“sh”, “b”, “el”, “gr”): Controls the shoulder, base, elbow, and gripper respectively. (This process will be elaborated in the next section)
    • Use: [joint] [angle]. e.g. “b 180” makes the base turn toward the 180° direction.
    • Software limiters restrict the inputs to valid physical ranges, preventing hardware strain and self-collision.
  • Colour Detection (“c”): Triggers the TCS3200 sensor to process an RGB frequency reading of the floor marker. (This process will be elaborated in the next section)
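The command grammar above can be sketched as a small dispatcher. The command letters (“e”, “w/a/s/d”, “v”, “sh/b/el/gr”, “c”) and the 255 speed cap come from the post; the handler structure, the assumed 0–180° servo limit, and the return values are illustrative.

```python
# Illustrative dispatcher for the operator command set listed above.
# Assumptions: a 0-180 degree servo range and the (kind, args) return shape.

MOVE_KEYS = {"w", "a", "s", "d"}
ARM_KEYS = {"sh", "b", "el", "gr"}


def parse_command(line: str):
    """Turn one operator line like 'w 5000' or 'v 255' into a (kind, args)
    tuple, applying the software limiters described above."""
    parts = line.split()
    cmd, args = parts[0], parts[1:]
    if cmd == "e":
        return ("estop", None)
    if cmd in MOVE_KEYS:
        return ("move", (cmd, int(args[0])))    # duration in milliseconds
    if cmd == "v":
        speed = max(0, min(255, int(args[0])))  # firmware caps values at 255
        duty_pct = round(speed / 255 * 100, 1)  # effective duty cycle
        return ("speed", (speed, duty_pct))
    if cmd in ARM_KEYS:
        angle = max(0, min(180, int(args[0])))  # assumed valid servo range
        return ("arm", (cmd, angle))
    if cmd == "c":
        return ("colour", None)
    return ("unknown", line)


print(parse_command("w 5000"))  # ('move', ('w', 5000))
print(parse_command("v 300"))   # capped: ('speed', (255, 100.0))
print(parse_command("b 200"))   # limited: ('arm', ('b', 180))
print(parse_command("v 200"))   # ('speed', (200, 78.4))
```

Note that “v 200” yields a 78.4% duty cycle, which matches the default motor speed stated in the Initialisation section.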

[Figure: full list of operator inputs]

 

Complete and functional project video (https://youtu.be/QglzhW2I7To)

The two most important lessons learned in this project:

  1. The Challenge of Concurrent Stream Management:
    During the integration phase, I realised that relying on sequential, blocking execution is entirely insufficient for a complex, dual-controller robotic system. My Raspberry Pi was tasked with managing high-level processing, which included handling asynchronous data such as LiDAR point clouds, highly restricted camera frames, and real-time commands from two separate operators simultaneously. Relying on blocking code created severe bottlenecks. For instance, a delay in one process, such as waiting to fetch one of my limited visual frames, would "freeze" the robot. Because the final mission has a strict 8-minute time limit and requires immediate reaction to the physical E-Stop, these freezes posed unacceptable safety and performance risks. This highlighted the absolute necessity of non-blocking, event-driven programming and prioritised hardware interrupts in real-time embedded systems.
  2. The Criticality of Pre-Implementation Architecture:
    Throughout this project, I learned that the cost and complexity of design changes increase exponentially as integration progresses. My initial ad-hoc software development led to fractured logic that was highly prone to failure and incredibly time-consuming to debug during my trial runs. Furthermore, assembling the hardware without prior spatial planning caused compounding mechanical issues. Because I had to precisely manipulate an 8×5×5 cm medpak without touching high walls, sensor visibility was paramount. I repeatedly had to disassemble the chassis to relocate the camera for better gripper visibility and to shift towering components that were obstructing the LiDAR's field of view. Ultimately, I learned that rigorous pre-implementation planning, mapping out both the software architecture and the physical 3D layout, is non-negotiable to avoid wasting critical testing time on structural rebuilds.
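The first lesson above, replacing blocking waits with non-blocking polling of worker threads, can be sketched in a few lines. The slow camera fetch, queue names, and timings are all illustrative; the point is that the main loop keeps servicing commands (and could react to an E-Stop) while a slow stream is still in flight.

```python
# Sketch of the non-blocking pattern the concurrency lesson points to:
# slow streams run in worker threads and post results to a queue, so the
# main loop never freezes. All names and timings here are illustrative.
import queue
import threading
import time

frames = queue.Queue()


def slow_camera_worker():
    time.sleep(0.5)          # pretend fetching a frame takes 500 ms
    frames.put("frame-001")


threading.Thread(target=slow_camera_worker, daemon=True).start()

handled = []
deadline = time.monotonic() + 1.0
while time.monotonic() < deadline:
    try:
        handled.append(("frame", frames.get_nowait()))  # non-blocking poll
    except queue.Empty:
        pass
    handled.append(("tick", None))  # command/E-Stop servicing keeps running
    time.sleep(0.05)

print(any(kind == "frame" for kind, _ in handled))          # frame arrived
print(sum(1 for kind, _ in handled if kind == "tick") > 5)  # loop never froze
```

With a blocking fetch, the loop would have stalled for the full 500 ms; here it keeps ticking every 50 ms and picks the frame up as soon as it lands in the queue.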

The two greatest mistakes:

  1. Physical Integration & LiDAR Field of View (FoV):
    One of my primary hardware oversights was the initial mounting position of my 360° LiDAR unit. I placed the sensor where chassis wiring and structural supports partially obstructed its laser sweep. Because a core mission objective was submitting an accurate, hand-drawn map of the unknown base layout, this oversight was critically damaging. The obstructions created blind spots in my SLAM mapping, causing the robot to misinterpret its distance from walls or fail to detect them entirely, risking heavy penalties for environmental collisions. Moving forward, I implemented a strict "clearance-first" design rule for the third layer of the chassis, ensuring the sensor’s optical path remained entirely unobstructed by objects.

  2. Low-Level Resource Contention (Timer Conflicts):
    I initially failed to maintain a comprehensive Timer Resource Map. When I began testing the locomotion system, I realised too late that the PWM signals driving my four independent DC motors via the L293D shield were competing for the same hardware timers used by the 4-DoF arm servos. This conflict rendered some motors completely unresponsive to operator commands. This mistake taught me a vital lesson: bare-metal programming requires a deep, uncompromising understanding of the microcontroller’s datasheet. Peripheral and timer allocation must be planned as strictly as the software logic itself to prevent hardware-level interrupt clashing.

Continuation of half-complete project:

The Hardware Trap and Loss of Momentum

Building a reliable physical foundation for a robotics platform is rarely as straightforward as writing code. The process of sourcing compatible equipment is inherently tedious; in robotics, a single mismatched component can stall the entire build. By the time the compatible hardware actually arrived, my initial excitement had faded. The project became a literal box of parts, sitting half-complete while I waited on shipping and logistics, sapping my momentum before the complex assembly even began.

The Intimidation of the System Architecture

The core of my project inertia stems directly from how daunting the integration phase is. Looking at the high-level system architecture of the final system, I realise I'm not just building one thing; I am building distinct sub-systems that must communicate flawlessly. I had to figure out how to make a Raspberry Pi, which handles the high-level software algorithms such as arm control and colour detection, communicate with an Arduino, which manages the firmware. Developing the communication protocol to handle the format of messages and responses between these microcontrollers is often the most complex and failure-prone part of a robotics project. It is incredibly common to delay starting because mapping out this protocol feels overwhelming.

 

  • kmikemoo 1 hour ago in reply to Samuel.

    Samuel. At least it's no longer in the box.

  • Samuel. 6 hours ago in reply to michaelkellett

    Hi Michael, thanks for taking the time to read through the post and share your feedback!

    To give you a bit more context, I was a college student when this project first started. It began as a highly ambitious passion project with a few friends, but as coursework piled up and we hit the hardware roadblocks I mentioned, we eventually had to set it aside.

    I recently picked the project back up on my own to see what I could salvage and learn from our past mistakes. When writing this post, I was actually piecing together our old group notes, post-mortems, and self-critiques to organize my thoughts. I missed editing a few of those sections to reflect my current solo perspective.

  • michaelkellett 10 hours ago

    Hello Samuel,

    This is an interesting post but it needs a bit more context. It looks as if it's a report on a group project that didn't quite work out. In many ways this is a better subject for a blog than one where everything went well.

    Could you explain who you are and the context in which the project was started (was it a college project or a competition, what were the other entries like etc).

    The last couple of paragraphs look as if they were part of a critique by an examiner - is that the case?

    MK
