# DIY Pick N Place Machine
I apologize for the late write-up/submission. Due to some unforeseen life circumstances, I missed the deadline. While I was unable to meet the deadline or finish this project, I hope to continue it and keep the community updated. With that said, here is the current status and details for the DIY Pick N Place Machine.
## GOAL
Build an affordable, small-scale pick and place machine that can be easily built and used by hobbyists at home.
## Design
### BOM
- **Axes and Motion**
  - Z-Axis: The Z-axis will be driven by two motors, each paired with an M8 lead screw. This design ensures stability and precise vertical movement for picking and placing components.
  - X and Y Axes: These axes will use single motors paired with GT2 belts and pulleys. This setup is reliable and widely used in the 3D printing world for smooth, controlled motion.
- **Part Handling**
  - The vacuum pump end replaces the traditional hot end. It will pick up components via suction and release them accurately onto the PCB.
- **Motor Control**
  - Z-Axis Motors: Controlled by the TMC5272-EVAL-KIT, which offers smooth and precise movement.
  - X and Y Axes: Driven by the Infineon BTN8982TA motor drivers, ensuring efficient and reliable operation.
- **Brains and Vision**
  - The machine will be powered by an Nvidia Jetson Nano, connected to a Pi Camera. The Jetson’s AI capabilities will handle:
    - Object Recognition and Detection: Identifying components to pick.
    - Alignment: Ensuring parts are placed accurately on the PCB.
### Frame
The frame came together through careful modification of the original Green Mamba design (you can find the base files here), with a focus on 3D-printed mounts for 2020 aluminum extrusion. These printed components allow for a robust, customizable frame that can be adjusted to meet the specific requirements of a pick and place machine.
**Axis Configuration**
The machine's frame is built around three primary axes:
- Y-axis:
  - 2 Guide Rails
  - GT2 Pulley
  - 1 Nema17 Motor
- X-axis:
  - 2 Guide Rails
  - GT2 Pulley
  - 1 Nema17 Motor
- Z-axis:
  - 2 Guide Rails
  - 2 Lead Screws
  - 2 Nema17 Motors
### Code For Motion
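This part of the write-up is still in progress. As a placeholder, here is a minimal sketch of one way the motion side could be driven from the Jetson, assuming the motion controller (the Arduino mentioned in the review below) accepts simple G-code-style move commands over USB serial. The port name, baud rate, and command format are assumptions for illustration, not the project's actual protocol.

```python
# Minimal sketch: stream G-code-style moves to the motion controller over
# serial. The port, baud rate, and command format are assumptions.
import serial
import time

def open_controller(port="/dev/ttyACM0", baud=115200):
    ser = serial.Serial(port, baud, timeout=2)
    time.sleep(2)  # give the board time to reset after the port opens
    return ser

def move_to(ser, x, y, z, feed=1200):
    """Send an absolute move and wait for the controller to acknowledge."""
    ser.write(f"G1 X{x:.2f} Y{y:.2f} Z{z:.2f} F{feed}\n".encode())
    return ser.readline().decode().strip()  # e.g. "ok" from the firmware

if __name__ == "__main__":
    ctrl = open_controller()
    print(move_to(ctrl, x=50, y=25, z=5))  # example pick position
```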
## Vision System
My system now includes:
- Dual-mode operation supporting both live camera feed and test images
- Robust component detection using SSD-MobileNet v2
- Template matching for placement verification
- Comprehensive error handling and validation
- Built-in support for testing and validation
The system is built around a Python class called PCBVisionSystem that handles all aspects of the detection pipeline. I've implemented several key improvements:
First, I added flexible input handling that supports both live camera feeds and test images. This makes it much easier to develop and validate the system using a set of reference PCBs before deploying it in production.
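As a rough illustration of how that dual-mode input could look (the class and method names below are simplified placeholders, not the production code), here is a sketch using the jetson_utils loaders:

```python
# Sketch of the dual-mode input handling on the Jetson Nano.
# Class and argument names are illustrative placeholders.
from jetson_utils import videoSource, loadImage

class PCBVisionSystem:
    def __init__(self, source="csi://0", test_image=None):
        # test_image: path to a reference PCB photo used during development;
        # when it is set, the live camera is never opened.
        self.test_image = test_image
        self.camera = None if test_image else videoSource(source)

    def capture(self):
        """Return the next frame, either from disk or from the live camera."""
        if self.test_image:
            return loadImage(self.test_image)  # CUDA image loaded from file
        return self.camera.Capture()           # CUDA image from the camera feed
```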
The component detection pipeline uses NVIDIA's optimized deep learning libraries through the Jetson inference API. I am using an SSD-MobileNet v2 model that's been trained to recognize various PCB components including header pins, SMD pads, and IC footprints.
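For reference, the detection call through the Jetson inference API has roughly this shape. The confidence threshold below is just an assumed starting value, and the built-in network name stands in for the custom-trained weights, which would be loaded with their own model and label paths:

```python
from jetson_inference import detectNet

# Load an SSD-MobileNet v2 detector; 0.5 is an assumed starting threshold.
net = detectNet("ssd-mobilenet-v2", threshold=0.5)

def detect_components(img):
    """Run one detection pass and return (label, confidence, center) tuples."""
    detections = net.Detect(img)
    return [(net.GetClassDesc(d.ClassID), d.Confidence, d.Center)
            for d in detections]
```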
For placement verification, I've implemented a template matching system that compares detected components against reference images. This helps ensure that components are not only detected but correctly oriented and placed.
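A minimal version of that check with OpenCV might look like the following; the 0.8 similarity threshold is an assumed value, and checking orientation would mean matching against rotated copies of the template as well:

```python
import cv2

def verify_placement(frame_gray, template_gray, threshold=0.8):
    """Compare a detected component region against a reference template.

    Returns (ok, score, top_left), where ok is True when the normalized
    cross-correlation score clears the (assumed) threshold.
    """
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_val >= threshold, max_val, max_loc
```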
## Review
Unfortunately, this is as far as I made it. I did get the motor controls and the vision system working, and I hope to soon get them integrated and do some basic testing. It will also require adjusting the Arduino code to handle the motor driving the vacuum pump, which will be attached to the X axis. While I know future reports on this project will not be counted, I hope the community and people interested in this project will stick around to see the final outcome. Again, my apologies for the incomplete project and late submission; sometimes life happens.