<?xml version="1.0" encoding="UTF-8" ?>
<?xml-stylesheet type="text/xsl" href="https://community.element14.com/cfs-file/__key/system/syndication/rss.xsl" media="screen"?><rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:slash="http://purl.org/rss/1.0/modules/slash/" xmlns:wfw="http://wellformedweb.org/CommentAPI/"><channel><title>Experimenting with Sensor Fusion</title><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/</link><description>In this competition, participants have the opportunity to experiment, test, or build a sensor fusion project with the AMD Xilinx SP701 Spartan-7 FPGA Kit.</description><dc:language>en-US</dc:language><generator>Telligent Community 12</generator><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/sensor-fusion-for-firefighters-summary-blog?CommentId=8c094e8b-35a8-491e-8dab-7695877290a1</link><pubDate>Tue, 21 Feb 2023 20:45:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:8c094e8b-35a8-491e-8dab-7695877290a1</guid><dc:creator>dang74</dc:creator><description>I got lazy as the Christmas holidays approached. I am finally awakening from my slumber, which has allowed me to finish this excellent series of blogs. The work that you did for this project is a great example of what can be accomplished with perseverance and the willingness to work long hours. 
Congratulations once again on the well-earned first-prize win in this challenge.</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=8c27b1c4-f6ec-4460-b70e-d21a9ea3a6bb</link><pubDate>Tue, 24 Jan 2023 19:27:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:8c27b1c4-f6ec-4460-b70e-d21a9ea3a6bb</guid><dc:creator>dougw</dc:creator><description>Congratulations winners! Well done - FPGA projects require lots of knowledge and significant work.</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=2f41ec2b-a8ff-47a8-b893-7fff99ab9c92</link><pubDate>Sat, 21 Jan 2023 00:52:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:2f41ec2b-a8ff-47a8-b893-7fff99ab9c92</guid><dc:creator>_david_</dc:creator><description>This was such an unreal experience! Thank you Element14 for organizing this event and for giving me such an awesome opportunity! @ javagoza , I was blown away by the scope of your project and how well executed it was; this was really well deserved! Keep up the work with the great blogs, it&amp;#39;s articles like yours that really helped me find my way in this field! And thank you everyone for all the kind comments, whether it was Randall, the judges, or members of the community. 
I was worried that this might not be well-received, so I found your feedback really uplifting, thank you!</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=694def0d-d308-4e46-872e-bfac89052648</link><pubDate>Fri, 20 Jan 2023 17:37:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:694def0d-d308-4e46-872e-bfac89052648</guid><dc:creator>redcharly</dc:creator><description>Great projects!!</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=22f1bf4b-1b07-49fe-b40e-0325dfd3cc6c</link><pubDate>Fri, 20 Jan 2023 15:38:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:22f1bf4b-1b07-49fe-b40e-0325dfd3cc6c</guid><dc:creator>amgalbu</dc:creator><description>Congratulations to the winners!</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=5833c1ad-311c-478f-98f0-f893ec1630d4</link><pubDate>Fri, 20 Jan 2023 10:42:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:5833c1ad-311c-478f-98f0-f893ec1630d4</guid><dc:creator>BigG</dc:creator><description>Congratulations to all participants who completed this challenge and to the winners. Very impressive projects. 
Well done.</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/sensor-fusion-for-firefighters-displaying-heads-up-video-on-the-live-feed?CommentId=9067ce24-6eb4-4e16-8241-73da2cda82cb</link><pubDate>Fri, 20 Jan 2023 10:34:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:9067ce24-6eb4-4e16-8241-73da2cda82cb</guid><dc:creator>javagoza</dc:creator><description>The hardware acceleration is awesome. In the video example I am incrementing/decrementing the two counters 1 out of every 10 times the main loop runs, and I am using a cheap HDMI/USB grabber to capture the image on the PC. With a high-end HDMI screen the appearance is much better.</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=b93cd0f2-0fd1-407b-9a03-1bc82b5574bf</link><pubDate>Fri, 20 Jan 2023 10:10:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:b93cd0f2-0fd1-407b-9a03-1bc82b5574bf</guid><dc:creator>javagoza</dc:creator><description>Honored to have been selected as the winner. I would like to express my gratitude to the judges, sponsors and all participants for their contributions and support. David did an excellent job. It is hard to imagine the complexity and time spent behind the work he has done. 
Congratulations to David on his project!</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=d320b218-75c0-4498-80b6-8cf3a4dcbc92</link><pubDate>Fri, 20 Jan 2023 08:55:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:d320b218-75c0-4498-80b6-8cf3a4dcbc92</guid><dc:creator>maxpowerr</dc:creator><description>Congratulations to all the winners!</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=a6aab9a5-1af4-4b0f-9f11-26147c5daa29</link><pubDate>Fri, 20 Jan 2023 04:18:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:a6aab9a5-1af4-4b0f-9f11-26147c5daa29</guid><dc:creator>dang74</dc:creator><description>Congratulations to the winners and participants. The scope and complexity of Javagoza&amp;#39;s project in particular is something to behold. A well-deserved top spot.</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=c140a96f-db9d-4fd6-8f76-e90cc54b24b9</link><pubDate>Fri, 20 Jan 2023 02:46:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:c140a96f-db9d-4fd6-8f76-e90cc54b24b9</guid><dc:creator>kmikemoo</dc:creator><description>Congratulations!! 
Great projects!</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=136462a8-022e-476e-8d6e-3d2561f81611</link><pubDate>Fri, 20 Jan 2023 00:47:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:136462a8-022e-476e-8d6e-3d2561f81611</guid><dc:creator>robogary</dc:creator><description>As always, incredibly well-done projects. Congrats.</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=31832fb8-4f9c-4489-afad-75fd9122db8f</link><pubDate>Thu, 19 Jan 2023 23:02:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:31832fb8-4f9c-4489-afad-75fd9122db8f</guid><dc:creator>shabaz</dc:creator><description>Congrats to the winners! They are great projects.</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion?CommentId=198ae51e-4620-4e21-a73e-67fa4541947c</link><pubDate>Thu, 19 Jan 2023 22:28:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:198ae51e-4620-4e21-a73e-67fa4541947c</guid><dc:creator>genebren</dc:creator><description>Congratulations to the winners! 
I was very impressed with both the Grand Prize and Runner-Up projects, but I would agree with the judges that javagoza did an amazing job and is a well-deserved Grand Prize Winner.</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/sensor-fusion-for-firefighters-displaying-heads-up-video-on-the-live-feed?CommentId=3d9705fd-571a-488e-bc5c-c348be470e41</link><pubDate>Thu, 19 Jan 2023 21:56:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:3d9705fd-571a-488e-bc5c-c348be470e41</guid><dc:creator>dougw</dc:creator><description>Nice job. The update rate makes it look very responsive.</description></item><item><title>Blog Post: Announcing the Winners of 'Experimenting with Sensor Fusion'</title><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/announcing-the-winners-of-experimenting-with-sensor-fusion</link><pubDate>Thu, 19 Jan 2023 21:38:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:05e2c6fd-cab0-4762-b88d-c9d63f607b7a</guid><dc:creator>rscasny</dc:creator><description>The Experimenting with Sensor Fusion Design Challenge, sponsored by AMD Xilinx and featuring the Spartan-7 SP701 FPGA Evaluation Kit, has officially concluded. We had 4 participants, including the Grand Prize Winner and Runner-up. Our judges have read each blog and tallied up the final scores, and element14 is ready to announce the winners. In this blog, I&amp;#39;ll review the program (for newcomers) and announce the winners with a summary and links to their work. A Quick Review: What is Sensor Fusion? Sensors are an extension of the five human senses. They allow us to perceive the world and often observe details to a degree that our human senses cannot. However, in some situations, they still fall short of the user requirements, regardless of how well they perform. 
For example, in an automobile, a LIDAR sensor can determine whether there is an obstacle ahead. But if you want to know the exact nature of the obstacle, you also need an on-board camera. Moreover, if you want to sense the motion state of this object, you&amp;#39;ll also need a millimeter-wave (mmWave) radar. When multiple pieces of information on the features of the object are integrated, a more complete and accurate picture can be derived for system operation. This method of integrating multiple sensors is called “sensor fusion.” By definition, sensor fusion is the use of computer technology to automatically analyze and synthesize information and data from multiple sensors or sources under certain criteria to conduct the information processing required for making decisions and estimations. Two common types of sensor fusion are image and motion sensor fusion, used in automotive surround view and navigation applications, respectively. Other uses could include (a) determining the orientation of a system in three-dimensional space, or (b) wearable activity trackers whose data is fused with data from heart rate monitors, temperature sensors, etc., as part of telehealth services or remote monitoring of patient conditions. What is the Experimenting with Sensor Fusion Design Challenge? element14&amp;#39;s Experimenting with Sensor Fusion is a hands-on competition for electronic engineers. The participants had the opportunity to receive a sensor fusion dev kit from our sponsor FREE of charge. They were challenged to conduct experiments and blog about what they learned. Their blogs would be judged for technical merit and creativity. The top two participants would receive some great prizes. Who are the Winners of the Experimenting with Sensor Fusion Design Challenge? Our four participants conducted experiments and produced 13 technical blogs. Our judges have made their decisions, so let&amp;#39;s meet the winners! 
Grand Prize Winner of the Experimenting with Sensor Fusion Challenge: javagoza javagoza is currently a back-end software developer for payment solutions in the payment card industry, specializing in PCI and EMV compliance. He had said on his application that the main reason he was interested in the Experimenting with Sensor Fusion program was to &amp;quot;learn more about FPGA design and image processing applications.&amp;quot; He experimented with building a prototype for a portable alert monitor that provides heads-up display information for firefighters. He built the system in stages, experimenting with new sensors and integrating them into the system. The sensors included the Pcam 5C camera, a 5MP fixed-focus color camera module; various HDMI displays; the Pmod NAV module, a 9-axis IMU plus barometer; the Pmod HYGRO module, a digital humidity and temperature sensor; the Pmod CMPS2 module, a 3-axis compass; the SparkFun Environmental Combo Breakout Module (CCS811 equivalent CO2 and total volatile organic compounds, and BME280 humidity, temperature, and barometric pressure); and finally the SparkFun IR Array Breakout: a 110-degree FOV MLX90640 32x24 thermopile sensor array. He also experimented with the different functionalities of the AMD Xilinx Vivado development environment, taking advantage of hardware acceleration on the Spartan-7 FPGA. The system combines a thermal array sensor, a real video image, a magnetometer compass, an IMU to detect head movement and orientation, and a time-of-flight (ToF) sensor to estimate the distance to nearby objects. With respect to the SP701 development board, javagoza said, &amp;quot;The SP701 board makes prototyping solutions based on the Spartan-7 FPGA really easy.&amp;quot; He designed the hardware, a heads-up display IP block, using high-level synthesis tools (Vitis HLS). He used different IP blocks provided by Xilinx within the Vivado Block Designer and built other software drivers with Vitis IDE. 
One of our judges described javagoza&amp;#39;s blogs as &amp;quot;an excellent set of posts.&amp;quot; You can read all of his blogs here. Runner Up Prize Winner of the Experimenting with Sensor Fusion Challenge: _david_ _david_ is a 2021 graduate with a background in computer engineering and a concentration in robotics. He&amp;#39;s worked with Xilinx FPGAs for about 2 years. He started out by interfacing MIPI CSI cameras in bare metal, then progressively switched over to embedded Linux where he learned how to use XRT to deploy accelerated computer vision applications for drones. His goal for this challenge was to implement some kind of sensor fusion application. Initially, he wanted to do experiments around Visual-Inertial Odometry (VIO), a type of sensor fusion which uses image sensors and IMU data to compute the position and orientation of an object. But due to time constraints, he chose instead only to take on part of this problem, namely the inertial odometry part. https://youtu.be/vpfd9Z3yzMQ He planned on collecting data from an IMU and accelerating the computation of an object&amp;#39;s pose. Once computed, he would generate a visualization of a set of unit vectors which would be fused with a live camera feed. In essence, this is an augmented reality (AR) application which would attempt to project the pose of an object in real time. He calls it a &amp;quot;drone pose&amp;quot; because drones are probably the best example of a rigid body robot that experiences linear and angular accelerations. At a high level, he wanted to be able to collect data from an IMU in order to calculate the pose of a rigid body and then represent it as a coordinate frame on a live camera feed. One of our judges said his blogs were &amp;quot;well-explained and he didn&amp;#39;t try to do an excessive amount, and his two blogs were clear. The bulk of the content was in the second blog, which actually explains a lot to people new to the technology. 
I&amp;#39;d rate his content very high.&amp;quot; You can read all of his blogs here. I&amp;#39;d like to thank all the element14 members who participated in this challenge: jwr50 - FPGA-based VSLAM for Indoor Navigation He explored Visual Simultaneous Localization and Mapping (VSLAM) for indoor spaces. VSLAM is a class of algorithms that combines image sequences with pose information to construct a map of a device’s surroundings and at the same time estimate the location within that map. This technique is well suited for indoor environments where GPS is unavailable and in the absence of other positioning markers, beacons, etc. You can read his blogs here. guillengap - Sensor Fusion Bird Detector He attempted to develop a bird detector prototype with the Spartan-7 SP701 FPGA Evaluation Kit. He was motivated to take on this challenge because there is a great diversity of migratory birds in his area, so he wanted to experiment with pigeons, ravens, swallows and hummingbirds, since they have different types of behaviors. You can read his blogs here. Last Word: A Big Thank You to Our Judges! We&amp;#39;d like to thank Top Members Don Bertke and Shabaz for judging the Experimenting with Sensor Fusion Challenge! 
Their input on the projects was invaluable to our final decisions.</description><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/mmwave">mmwave</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/experimenting%2bwith%2bsensor%2bfusion">experimenting with sensor fusion</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/AMD%2bXILINX">AMD XILINX</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/sp701">sp701</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/lidar">lidar</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/winners">winners</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/vslam">vslam</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/vivado">vivado</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/vitis%2bhls">vitis hls</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/Visual%2bSimultaneous%2bLocalization%2band%2bMapping">Visual Simultaneous Localization and Mapping</category></item><item><title>Wiki Page: Experimenting with Sensor Fusion</title><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/w/documents/27784/experimenting-with-sensor-fusion</link>
<pubDate>Wed, 18 Jan 2023 21:35:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:47d03dcc-6f4e-49a7-aadb-5b6074b6496f</guid><dc:creator>pchan</dc:creator><description>In this competition, participants will have an opportunity to experiment, test, or build a sensor fusion project with the AMD Xilinx SP701 Spartan-7 FPGA Kit. APPLICATION DEADLINE HAS BEEN EXTENDED TO OCTOBER 12th, 2022. Experimenting with Sensor Fusion About Competition | Blogging | Example Application | Resources | The Dates | The Prizes | The Kit | The Judges | Terms &amp;amp; Conditions | Summer of Sensors | The Challengers Sensors are an extension of the five human senses. They allow us to perceive the world and often observe details to a degree that our human senses cannot. However, in some situations, they still fall short of the user requirements, regardless of how well they perform. For example, in an automobile, a LIDAR sensor can determine whether there is an obstacle ahead. But if you want to know the exact nature of the obstacle, you also need an on-board camera. Moreover, if you want to sense the motion state of this object, you&amp;#39;ll also need a millimeter-wave (mmWave) radar. A sensor can only sense features of an object based on its individual capabilities. When multiple pieces of information on the features of the object are integrated, a more complete and accurate picture can be derived for system operation. This method of integrating multiple sensors is called “sensor fusion.” By definition, sensor fusion is the use of computer technology to automatically analyze and synthesize information and data from multiple sensors or sources under certain criteria to conduct the information processing required for making decisions and estimations. Two common types of sensor fusion are image and motion sensor fusion, used in automotive surround view and navigation applications, respectively. Sensor fusion is largely dependent on software. 
The development of efficient algorithms based on actual applications has become the top priority of sensor fusion development. In terms of algorithm optimization, the introduction of artificial intelligence has made an ongoing impact on sensor fusion development. Although software is critical for sensor fusion, hardware also plays a crucial role. Using a hardware evaluation kit by AMD Xilinx, this competition focuses on experimenting with sensor fusion. Participants will have an opportunity to experiment, test, or build a sensor fusion project. Before we talk about what the participants will receive in the Challenger&amp;#39;s kit, and the great prizes they are competing for, let&amp;#39;s talk about how to enter this competition, the timeline, and some FAQs. How do you enter the Experimenting with Sensor Fusion competition? Log onto the Community and go to the Experimenting with Sensor Fusion enrollment page. Complete all the required information on the form and click submit. Who is Eligible to Enroll in the Experimenting with Sensor Fusion Competition? Any element14 member can enroll in the Experimenting with Sensor Fusion competition; however, to receive one of 4 FREE kits, you need to submit an application entry form by the enrollment deadline, October 12, 2022. If you are not a Community member, please register here to join. Because of the cost of the kit, this challenge is limited to only challengers who have been selected to participate. Only challengers who have been selected are eligible to win prizes. What are the milestones of the competition? Enrollment Begins: August 31, 2022 Enrollment Ends: October 12, 2022 Applicants Selected: October 19, 2022 Challenge Begins: October 20, 2022 First Blog Due: November 11, 2022 Second Blog Due: January 6th, 2023 Challenge Ends: January 6th, 2023 Winners Announced: January 2023 Blogging Requirements: Only 2 Blogs! 
In order to successfully finish this competition, you are required to blog twice during the competition period. You can blog more than twice, if you wish. The due dates for the blogs are described below: Write Blog 1: Introduce yourself in the blog and explain what experiments you plan to perform. The due date for publishing the first blog on element14 is November 11, 2022. Write Blog 2: Write up the results of your experiments, using images, screen captures, videos, tables, charts, etc. Then tell us what you learned about sensor fusion. The due date for publishing the second blog on element14 is January 6th, 2023. Please also tag your blogs with &amp;#39;Experimenting with Sensor Fusion&amp;#39;. The Prizes There will be two big prizes awarded in this competition: a Grand Prize and a Runner Up prize. The Grand Prize iPad Mini + Apple Series 7 Watch Approximate value ($1,000) The Runner Up Apple Series 7 Watch Approximate value ($400) The Kit element14 is offering 4 kits FREE of charge. To be eligible to receive one of them, you must submit an application by the enrollment deadline (October 12, 2022). Sensor Fusion Kit* Buy Kit Spartan-7 SP701 FPGA Evaluation Kit* Buy Now The SP701 Evaluation Kit, equipped with the best-in-class performance-per-watt Spartan-7 FPGA, is built for designs requiring sensor fusion such as industrial networking, embedded vision, and automotive applications. Kit Contains: Evaluation Board XC7S100-2FGGA676C Power Cords Air Click Adapters Ethernet &amp;amp; Micro USB Cables Digilent Pcam 5C Imaging Module Buy Now Pcam 5C is an imaging module meant for use with FPGA development boards. The module is designed around the Omnivision OV5640 5 megapixel (MP) colour image sensor. This sensor includes various internal processing functions that can improve image quality including automatic white balance, automatic black level calibration and controls for adjusting saturation, hue, gamma and sharpness. 
Data is transferred over a dual-lane MIPI CSI-2 interface which provides enough data bandwidth to support common video streaming formats such as 1080p (at 30 frames per second) and 720p (at 60 frames per second). Digilent Pmod NAV 9-axis IMU Plus Barometer Buy Now The 410-326 from Digilent is a Pmod NAV 9-axis IMU plus barometer. The Digilent Pmod NAV uses the LSM9DS1 3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer, plus the LPS25HB digital barometer to provide users with 10-DOF functionality. The Pmod NAV provides a variety of orientation-related data allowing users to easily determine the exact position the module is in and where it is headed. *Note: the price of the evaluation kit is over $800. Anyone Can Apply to join the Experimenting with Sensor Fusion Competition Any element14 member can apply to be a participant to receive 1 of 4 FREE kits. You need to submit an application by the deadline, October 12, 2022. Resources/Technical Documentation Spartan-7 SP701 FPGA Evaluation Kit: Manufacturer Info Quick Start Guide User Guide Tutorials Digilent Pcam 5C Imaging Module: Datasheet Digilent Pmod NAV 9-axis IMU Plus Barometer: Datasheet Frequently Asked Questions (FAQs) What kind of experiments can the participants perform with the sensor fusion kit? Combining the data from a moisture sensor with a temperature sensor to calculate relative humidity Determining the orientation of a system in three-dimensional space Healthcare and medical electronics where body-worn systems which monitor the movement of limbs can be helpful for physiotherapy, to ensure exercises are performed correctly Wearable activity trackers where their data is fused with data from wearable heart rate monitors, temperature sensors, etc. as part of telehealth services or remote monitoring of patient conditions Can I still be eligible for the Grand Prize or Runner Up Prizes if I am not selected for one of the 4 kits? 
No, this competition is limited to only challengers who have been selected to participate. What do I need to do to win the Grand or Runner Up Prize? After the enrollment period is completed and the 4 FREE kits are shipped, you will have 8 weeks to complete your experiment, write the two required blogs, and share what you learned about sensor fusion. You will be judged by the quality of your final blog and what you have learned. Can I write more than two blogs? Yes. To finish the competition, you have to write and post a minimum of two blogs to the Experimenting with Sensor Fusion group. Sometimes the participants will write more than two blogs. You do not get extra points for writing more blogs, but writing more blogs will provide the judges with more information to help determine the best experimenters. Two blogs meet the basic requirements of participation in this activity. Tips on Writing Your Application If you want a chance to receive one of the 4 FREE kits, you will need to submit an application no later than October 12, 2022. The key to writing a winning application is to provide as much meaningful information about your proposed experiments as possible. The application entry form should be detailed enough to give a good idea of what you plan to do and how you plan to pull it off. But you don&amp;#39;t have to write a book! By answering each of the following questions in your application, you will provide enough information: (a) Describe your technical background. (b) Why are you interested in this competition? (c) What kind of experiment(s) do you plan to perform? (Be as specific as you can) (d) Have you participated in the element14 Community? If so, please provide some links to what you&amp;#39;ve done. If you are a new member, answer &amp;quot;New Member.&amp;quot; All interested element14 members must submit an application entry form before the end of enrollment on October 12, 2022. 
Here are some other suggestions for completing a winning application: Please complete all required information (contact information, etc.) Please use the email address that is associated with your element14 profile. Answer all of the application questions. Tell us why you want to be selected. Before deciding what you want to write, think about the following things: You are entering a competition. The most persuasive applications are the ones that attract the eye of the judges. A single-sentence application will never be selected. This competition is not a game of chance. Be as detailed as possible, but don&amp;#39;t write a book. Anyone Can Participate in the Experimenting with Sensor Fusion Competition Any element14 member can enroll in the &amp;#39;Experimenting with Sensor Fusion&amp;#39; competition. To receive one of the 4 FREE kits, you need to submit an application by the deadline, October 12, 2022. The Judges Top Members of the element14 Community will be our judges. They are: Top Member DAB Don spent 35 years in the aerospace industry working on many advanced projects. His range of experience covers nearly every scientific field and most ranges of the electromagnetic spectrum. He has a very broad interest and knowledge in science, with extensive experience in image, multispectral, and hyperspectral analysis. He has also worked on a wide range of embedded computer applications, including integrated search and rescue systems. Top Member shabaz Shabaz has studied Electronics Engineering followed by Law, and worked primarily in the fields of radio communications (military), telecommunications (infrastructure used by phone companies), data networks, and information technology. He was originally involved in hardware design, followed by software engineering and technical marketing. Thank you to our Judges for offering their time and service. 
General Questions For any general questions about the ‘Experimenting with Sensor Fusion’ competition, please post a comment on this page. To keep up-to-date with this competition, please bookmark it. Terms &amp;amp; Conditions community.element14.com/.../8244.SensorFusionTermsU.pdf</description><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/experimenting%2bwith%2bsensor%2bfusion">experimenting with sensor fusion</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/xilinx">xilinx</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/summer%2bof%2bsensors">summer of sensors</category><category domain="https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/tags/sensor%2bfusion">sensor fusion</category></item><item><title>Wiki: Documents</title><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/w/documents</link><pubDate>Wed, 18 Jan 2023 21:35:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:28ffa141-da33-4015-97d4-2b0fc95e9fa9</guid><dc:creator /><description /></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/experimenting-with-sensor-fusion-augmented-reality---drone-pose-blog-2?CommentId=a4a89943-f3d9-4c8d-9305-851ef81c0889</link><pubDate>Mon, 16 Jan 2023 14:53:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:a4a89943-f3d9-4c8d-9305-851ef81c0889</guid><dc:creator>shabaz</dc:creator><description>Sorry, I just saw your response now. Thanks for the detailed response. HLS does seem neat, and DFX for dynamic reuse sounds like an excellent accelerator for code too. 
I hope Xilinx can simplify it over time and put it in the hands of more software developers, since most will not have encountered this. Today, with typical development on a processor/microcontroller, one simply expects that some functions in code will be partially accelerated through the math libraries and the integrated accelerator peripherals such as NEON or DSP blocks. That approach is easy to understand and integrates well with the existing workflow, since you&amp;#39;re not doing any extra steps, just building your code as usual, but it doesn&amp;#39;t have anywhere near the potential performance gain that HLS/DFX could attain.</description></item><item><title /><link>https://community.element14.com/challenges-projects/design-challenges/experimenting-with-sensor-fusion/b/blog/posts/experimenting-with-sensor-fusion-augmented-reality---drone-pose-blog-2?CommentId=fbbda40e-e755-4b24-91ec-c856e01026ab</link><pubDate>Fri, 13 Jan 2023 23:31:00 GMT</pubDate><guid isPermaLink="false">93d5dcb4-84c2-446f-b2cb-99731719e767:fbbda40e-e755-4b24-91ec-c856e01026ab</guid><dc:creator>_david_</dc:creator><description>These are the three main locations I go to find the documentation I need: the Vitis Vision Library source code, the Vitis Vision Library documentation, and the Vitis HLS pragmas. The Vitis Vision Library has three different levels: L1, L2, and L3. This can be very confusing for people when they first start using the library. L1 is the approach that I used throughout the article. L1 allows you to generate a custom IP (.zip) that you can import into your Vivado design. This flow is more involved on the hardware side of things, but it is the easiest to understand if you are already familiar with the Vivado-to-Vitis workflow. L2 tries to abstract away as much of the hardware side of things as possible by taking advantage of something called &amp;quot;platforms&amp;quot; and xclbin files. 
Generally, someone creates a platform so that someone else can create a custom accelerator (.xclbin) that links up with it without having to open Vivado. Xilinx provides basic platforms for a lot of their boards, but if you don&amp;#39;t have one for yours, then you are out of luck and will have to create your own. You can do this in Vivado by selecting the option to make your block design a Vitis extensible platform. At some point you will have to invoke Vitis&amp;#39;s v++ tool to link the xclbin file with your pre-generated Vitis extensible platform. The Vitis IDE can do this transparently, but you can also do it directly using the command-line interface. L3 is very similar to L2. The only difference is that the L3 examples involve more than one accelerated function. These can be quite involved, and if I&amp;#39;m not mistaken, they all require a Linux image created in PetaLinux. L3 is a great starting point if there is an example design targeting your board, but be forewarned that it can be very challenging if you have a different board and are new to the workflow. If you&amp;#39;re like me, you want to start at the simplest example and slowly expand on it so that you don&amp;#39;t get overwhelmed. For that reason, if you really want to get started with the Vitis Vision Library, I would 1000% recommend starting with the axiconv example. This was the exact place I started and I would recommend it to anyone else. It is basically just the skeleton for an AXI4-Stream bus which will read in an image and output that image unchanged. Most of the examples use AXI4 memory-mapped buses, so definitely keep that in mind. For the most part, this just changes the interface definition of the top-level function. You can pretty much copy over the accelerated code from AXI4 memory-mapped to AXI4-Stream without any changes. Once you understand how the axiconv example works, then check the lut example. 
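To make the v++ step above concrete, here is a minimal command-line sketch of the compile-then-link flow. This is only illustrative: the platform file name (my_platform.xpfm), kernel name (axiconv), and source file are placeholders, not taken from the original post, and the exact options you need will depend on your Vitis version and board.

```shell
# Compile the HLS kernel source into a Xilinx object file (.xo).
# -t hw targets actual hardware; -k names the top-level kernel function.
v++ -c -t hw --platform ./my_platform.xpfm -k axiconv -o axiconv.xo axiconv.cpp

# Link the compiled kernel against the Vitis extensible platform,
# producing the .xclbin that pairs with that platform at runtime.
v++ -l -t hw --platform ./my_platform.xpfm -o axiconv.xclbin axiconv.xo
```

The Vitis IDE runs these same steps behind the scenes; invoking v++ yourself just makes the platform/xclbin relationship explicit.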
I believe this is the simplest example that actually does some form of image processing. If you pay close attention to the write() and read() function calls, you can see how to modify the images. It definitely isn&amp;#39;t easy, but looking at the examples is the best way I found to learn how to do this. Another point of emphasis is that you don&amp;#39;t need to install OpenCV to run the accelerated code, but I would highly recommend it so that you can run testbenches. The installation can be a bit of a headache, and I&amp;#39;d be glad to give tips if you need any, or to help with setting up library/include paths, etc.</description></item></channel></rss>