Having spent some time thinking about a possible activity for the Project14 Electronic Toys competition, I have decided to try to make some sort of augmented reality game. Augmented Reality, as far as this project is concerned, means taking a video stream, processing it in some way, and adding additional information on screen as an overlay. I thought about Virtual Reality but I think that is just too far out of my reach.
Although I have done some simple image processing on static images using MATLAB, I do not have any experience of using live video streams, nor am I good enough at programming to take an existing image processing package and use it to achieve augmented reality in the timescale available. Instead I am aiming to take a number of low cost, existing systems and modules and attempt to bodge them all together into some sort of working augmented reality system. The system will be designed to recognise specific objects within an environment. A small mobile robot carrying a video camera will move about the environment and transfer the augmented video stream to a head-mounted display. The orientation of the video camera will be synchronised to the headset orientation using some sort of 3D orientation sensor. The robot's movements will be controlled by the user, probably with a joystick or similar manual controller, and the aim will be to move through the environment, identifying the pre-selected targets, and then, if possible, having some method of 'shooting' at them, using the headset to aim. A bit like a video game, except with a 'real' mobile robot moving over a model landscape.
To make sure that this all seems at least remotely possible, I have connected a PixyCam to my laptop and run the supplied PixyMon programme. I have trained the PixyCam to recognise yellow objects, in this example yellow Lego men. I will not be able to stick a complete laptop on the headset, so I have used an LCD display connected via an HDMI cable. The trial LCD display is quite large, but a much smaller one, probably 3.5 inches or maybe 5 inches, looks likely to work just as well. Below is a video showing the trial system actually working (yes, I was surprised as well). It appears a bit blurry as the PixyCam doesn't have a very good lens and it is a video of a display, but it does seem to demonstrate that the concept might work.
There is a white line box drawn around each of the two yellow Lego men. It is a bit faint, but it is there, and a label is given to these, top left. The PixyCam also outputs the size and type of each object it has identified, using a serial communication port (either I2C or SPI - I cannot remember at present) which can be connected to an Arduino for further processing. It is possible to train the PixyCam to recognise several differently coloured objects (it actually recognises the colour and not the object shape), so some could be friendly and others enemies to shoot at. The headset will then be used to move the PixyCam so that an object to be shot at is within the centre of the image, and then either a light can be shone at the object, or possibly even something physical fired at it. At present I am not sure how to register a hit if using light - maybe a light sensor in the object. If I can get a gun that fires some sort of physical object, maybe rubber disks, then that could be used to knock over the object.
Well, it might all work, you never know, but it should be fun trying.
Dubbie