RoadTest Product Review of the AMD Xilinx Kria KV260 Vision AI Starter Kit -- by Steve K

Table of contents

RoadTest: AMD Xilinx Kria KV260 Vision AI Starter Kit

Author: skruglewicz

Creation date:

Evaluation Type: Development Boards & Tools

Did you receive all parts the manufacturer stated would be included in the package?: True

What other parts do you consider comparable to this product?: the “Lattice MACHXO3L Starter Kit” and “OrangeCrab Dev BD”

What were the biggest problems encountered?: No major problems

Detailed Review:

The AMD XILINX Kria™ KV260 Vision AI Starter Kit is a platform for developing advanced vision applications, equipped with a non-production version of the production K26 SOM. It comes with an evaluation carrier card optimized for vision applications, featuring multi-camera support and Raspberry Pi connectors. Developers of all types can get applications up and running in under an hour, with no FPGA experience needed. The kit is designed to simplify hardware and software development requirements, making it the fastest and easiest platform for application development, with the goal of volume deployment on Kria K26 SOMs. The kit includes the development platform, accessory pack, and a 13MP autofocus RGB camera module.

This is my review of the first starter kit in the AMD XILINX Kria portfolio, the KV260 Vision AI starter kit. This kit is advertised as “an out-of-the-box platform for advanced vision applications”. According to the marketing material, “it allows users to target the starter kit as soon as it’s out of the box, with NO installation or FPGA knowledge required”, a capability that I will try to prove in this review. This expands on the AMD Xilinx line of devices, starting with FPGAs and SoCs in 1984, then accelerator cards in 2018, Versal ACAPs in 2019, and now the release of System-on-Modules in 2021.

A little background on myself: I am a retired senior software engineer with a Bachelor of Science degree in Computer Science from Boston University. I graduated from BU in 1980 and worked as a software engineer until I retired in 2018. I have limited knowledge of FPGA design but have participated in RoadTests and Design Challenges sponsored by various vendors on element14, where I have acquired a tremendous amount of knowledge of embedded systems. I reviewed an FPGA starter kit by Lattice for the element14 Summer of FPGAs event in the summer and fall of 2021 and wrote a blog about it. Now, I have been given the opportunity to try an AMD XILINX product through a RoadTest.

At the heart of the kit is the AMD Kria K26 SOM (System-on-Module). The SOM integrates a custom-built AMD Xilinx Zynq UltraScale+ MPSoC. The SOM provides an embedded processing system with tightly integrated programmable logic and a rich set of configurable I/O capabilities. The SOM also carries DDR memory, nonvolatile storage devices, and a security module. An aluminum thermal heat spreader with a 12V fan is mounted on top of the SOM.

The SOM is attached to a carrier card for development and deployment purposes. This carrier card is built by AMD, but users can design their own carrier card for a specific application by following the guidelines for communicating with the SOM. The carrier card PCB contains the connectors needed to work with the SOM interfaces.

My package arrived on February 6th, 2023 at 1:30 PM, and now the fun begins. I am very excited to finally get to review an AMD product. I have heard so many good things about their product line in this community. I have watched several webinars on the AMD line of products and most recently, a two-day webinar on the Kria KV260 Vision AI Starter Kit.

image

I can’t wait to crack it open but first I need to prepare an unboxing video.

3 Goals of the RoadTest

There are 3 goals given on the landing page for this RoadTest that I have tried to cover in this review.

  1. Produce a 2- to 3-minute unboxing video.
  2. Test the out-of-box experience of the product.
  3. Build a project and show how you did it.

The next 3 sections describe how I met these goals.

GOAL#1 - Produce a 2 to 3 minute unboxing video

I actually went overtime on this. You don’t have to watch the whole thing, if you choose. It’s my first attempt at a video production for a RoadTest. I used the free version of KAPWING (kapwing.com), to edit my videos and I just scratched the surface of the capabilities of this excellent online media editor.

My Video

Here you go:

A Better Video by AMD Xilinx

More professional and to the point. It describes how quick and easy it is to launch the Smart Camera accelerated application, with NO FPGA experience or tools required.

GOAL#2 - Test the out-of-box experience of the product.

Getting Started

After unboxing the kit as described in the video above, I went to the small pamphlet included in the kit.

image

Using the highlighted link, https://www.xilinx.com/KV260-Start, or scanning the code on the pamphlet, brought me to the page https://www.xilinx.com/products/som/kria/kv260-vision-starter-kit/kv260-getting-started/getting-started.html

image

Now, I will describe my experience following the instructions on this page:

The first thing I did was to upgrade my Ubuntu version to 22.04 on my laptop, as I anticipated using it to test the kit. However, it turns out that I did not need to upgrade, because the Kria KV260 runs Ubuntu 22.04 from the SD card inserted in the kit. To access the OS, you can either use a serial terminal connection from your PC, or attach a mouse, keyboard, and a second HDMI monitor to take advantage of the GNOME Desktop and use the kit as a standalone Linux workstation, which is pretty cool!

The introduction flagged an important required boot firmware update. I noted these instructions but did not do the upgrade, which may have been the reason why I had some trouble with my Ubuntu test.

image

image

  • When do I perform this operation, and where do I get the upgrade? From the page described above?

UPDATE 01/21/2024 -- In answer to the above questions, I did some more research and created a blog post in the FPGA group to clear this up:

Update Kria Boot Firmware on SOM before installing Ubuntu version 22.04

In that post, I go through the steps for updating the boot firmware of your Kria SOM and describe how I updated the boot firmware on my KV260 starter kit. If you are unaware, Ubuntu 22.04 is NOT compatible with the boot firmware that comes installed on the KV260 SOM.
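
For reference, the update itself is done from the running Linux image with the xmutil utility. This is only a rough sketch from memory of the Kria wiki flow, and the firmware file name below is a placeholder, so follow the linked blog post and wiki for the exact steps:

    sudo xmutil bootfw_status                      # show the active boot image (A/B) and its version
    sudo xmutil bootfw_update -i <new-boot.bin>    # write the new firmware to the standby slot
    sudo reboot                                    # boot into the newly written firmware
    sudo xmutil bootfw_update -v                   # after a good boot, mark the new image as valid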

Ubuntu 22.04 + Ubuntu AI Demos

This demo offers the following:

  • Enable a desktop-like experience without the need to connect to a laptop/PC.
  • Access to a rich set of third-party software libraries in the Ubuntu community.
  • Run the latest of several accelerated applications from the Kria App Store to evaluate the KV260 running Ubuntu.

Now on to actually performing the demo described in the “Out-of-the-Box with the Kria KV260: Up and Running in Under an Hour” video presented above. From the getting-started page, I went to the Get Started with Ubuntu link and followed the steps outlined below.

image

Step 1. Setting up the SD Card Image (Ubuntu)
Success
Step 2. Connecting Everything (Ubuntu)

Success

Step 3. Booting your Starter Kit (Ubuntu)

Success, but by this point it had already been over an hour and I had not run the examples yet, so the claims in the marketing material did not hold for a novice user like me. However, I was able to use the kit as a standalone GNOME Desktop workstation by connecting a display, keyboard, and mouse, following the "Instructions for GNOME Desktop" section below.

Instructions for GNOME Desktop
  • To log in to the GNOME Desktop, you must connect a DisplayPort or HDMI monitor as well as a USB keyboard and mouse.
  • Power ON the Starter Kit by connecting the power supply to the AC plug. The power LEDs should illuminate, and after about 10-15 seconds, you should see console output on the connected display. After about a minute, the desktop login screen should appear.
    • Please note that the Starter Kit powers up immediately as you connect the AC plug to a wall. (There’s no ON/OFF switch on the board.)
  • Log in with the username ubuntu and your changed password.
  • Once logged in, you should see the default Ubuntu 22.04 LTS GNOME 42 desktop.
  • Open a terminal and verify Internet connectivity via “ping” or “DNS lookup”.
    • ping 8.8.8.8
    • If you can observe that packet transmit/receive worked and there is no packet loss with the above ping command, this means your Internet connectivity is working and active.
    • Please note: Without Internet connectivity, you will not be able to perform the ROS 2 Perception Node application steps or install the necessary tools & packages.
  • Set up the XILINX Development & Demonstration Environment for Ubuntu 22.04 LTS (the full command sequence is collected after this list)
    • install the xlnx-config snap that is required for system management:
      • sudo snap install xlnx-config --classic --channel=2.x
    • For more information on using the xlnx-config snap, please refer to the xlnx-config snap page.
  • Run the xlnx-config.sysinit command to install Kria specific PPAs which will also run apt update and apt upgrade - accept all defaults when prompted:
    • sudo xlnx-config.sysinit
    • For more detailed information regarding setting up the environment, please refer to: Getting Started with Ubuntu 22.04 LTS
  • Note that some Kria Apps must be launched from the command line over the USB-UART serial port rather than from the GNOME Desktop. Some applications, such as the Smart Camera application, will occupy the entire display output area while they are running, but you will regain access to the GNOME Desktop once the application exits.
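
For convenience, here are the setup commands from the list above collected into a single terminal session (these are exactly the commands given on the getting-started page; 8.8.8.8 is just Google's public DNS used as a ping target):

    ping -c 4 8.8.8.8                                         # confirm Internet connectivity
    sudo snap install xlnx-config --classic --channel=2.x     # install the Xilinx system-management snap
    sudo xlnx-config.sysinit                                  # add the Kria PPAs, then apt update/upgrade
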
Step 4. Launching the Smart Camera Application (Ubuntu)

The Kria SOM has official Ubuntu support with a certified Ubuntu image. Currently, there is one application (NLP-SmartVision) ported for out-of-box support in the Ubuntu image. The Kria SOM also has PYNQ support, bringing Python productivity to the embedded platform. The example requires hardware (a microphone) that did not come with the kit, so I could not complete it.

This has apparently changed since I started my review, as I was able to find a link to a camera application. The Quick Start page, Setting up the Board and Application deployment, outlines the process of setting up the board and deploying the application. The introduction section states, "This document shows how to set up the board and run the smartvision application. This guide and its prebuilt smartvision firmware are targeted for Ubuntu 22.04 and XILINX 2022.1 toolchain. The previous version of this application (targeted for Petalinux on XILINX 2021.1 toolchain) is still available online." I intended to try this new smartcam on Ubuntu, but unfortunately ran out of time.

Step 5. Next Steps and Additional Resources (Ubuntu)

By the time you get to this section, you have successfully completed the Ubuntu flow. I read through this section, reviewed the links, and noted pages of interest to come back to. I listed them here and then decided to try PetaLinux instead of Ubuntu.

For more information on developing with Ubuntu, refer to the Ubuntu wiki pages: Getting Started with Ubuntu and Tips and Tricks

To continue developing with Kria™, the following resources are recommended:

 

Linux OS Options

There are several Linux OS options available for Getting Started with the KV260.

Ubuntu 20.04 + Ubuntu AI Demos

  • Enable a desktop-like experience without the need of connecting to a laptop/PC. 
  • Access to a rich set of third-party software libraries in the Ubuntu community.
  • Run the NLP-SmartVision (more accelerated apps from the Kria App Store will be made compatible over time) demo to evaluate KV260 running an Ubuntu application.
  • Continue with Ubuntu 20.04

PetaLinux 2021.1 + Accelerated Applications

  • Obtain everything necessary to customize, build, and deploy Embedded Linux solutions on AMD processing systems.
  • Offers baseline for full customization via Yocto layers and corresponding PetaLinux Starter Kit BSP.
  • Aligned with the PetaLinux production SOM enablement in the K26 SOM BSP.
  • Run all the accelerated applications available in the Kria App Store (current instructions are specific to PetaLinux based apps) to evaluate KV260 running a PetaLinux application.
  • Continue with PetaLinux

GOAL#3 - Build a project and show how you did it.

Typically, when evaluating products for an element14 RoadTest, I experiment with the example programs available for the kit to gain familiarity with the toolchains. The examples help an intermediate embedded developer like me evaluate the usage of the product. A well-written example is worth its weight in gold. This section covers the example from the website that I chose to experiment with.

I was not able to get the examples working under Ubuntu so I tried to use PetaLinux instead.

image

Read through:

K26 Wiki - To use PetaLinux BSPs directly for application development and deployment rather than Ubuntu, the latest PetaLinux BSPs are available there.

Github.io page - Further technical documentation is available here.

PetaLinux 2021.1 + Accelerated Applications

image

  • Follow this page to use PetaLinux instead of Ubuntu, using another SD card that I had lying around.
  • Document my activity

Step 1. Setting up the SD Card Image (PetaLinux)

The Starter Kit has a primary and secondary boot device, isolating the boot firmware from the run-time OS and application. This allows you to focus on developing and updating your application code within the application image on the secondary boot device, without having to touch the boot firmware. The primary boot device is a QSPI memory located on the SOM, which is pre-programmed (pre-loaded QSPI image) at the factory.  The secondary boot device is a microSD card interface on the carrier card.

For setting up the microSD card, you’ll need to download the latest SD card image and then write it using an Image Flashing tool.

  1. Download the Kria™ KV260 Vision AI Starter Kit image and save it on your computer. The file I downloaded at the time was petalinux-sdimage-2021.1-update1.wic.xz.
  2. Use Balena Etcher to flash the image onto another SD card; I used a 32GB card that I had. (A command-line alternative is sketched below.)
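
If you prefer the command line to Balena Etcher, the same image can be written with dd on a Linux host. This is only a sketch: /dev/sdX is a placeholder for your SD card device, and writing to the wrong device will wipe it, so double-check with lsblk first.

    lsblk                                      # identify the SD card device (e.g. /dev/sdX)
    xzcat petalinux-sdimage-2021.1-update1.wic.xz | sudo dd of=/dev/sdX bs=4M status=progress conv=fsync
    sync                                       # make sure all data is flushed before removing the card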

Step 2. Connecting Everything (PetaLinux)

The following are the key connections for the AMD Kria™ KV260 Vision AI Starter Kit:

  1. Insert the microSD card containing the boot image in the microSD card slot (J11) on the Starter Kit
  2. Get your USB-A to micro-B cable (a.k.a. micro-USB cable), which supports data transfer.* Do not connect the USB-A end to your computer yet. Connect the micro-B end to J4 on the Starter Kit.
  3. Connect the IAS Camera Module to J7 (or USB camera module to U44 or U46)
  4. Connect to a monitor/display with the help of a DisplayPort/HDMI cable
  5. Grab the Power Supply and connect it to the DC Jack (J12) on the Starter Kit. Do not insert the other end to the AC plug yet.

Step 3. Booting your Starter Kit (PetaLinux)

Instructions for Windows

  1. Configure your terminal program (TeraTerm) with the settings shown below (a Linux serial-terminal alternative is sketched after this list)

Baud rate = 115200

Data bits = 8

Stop bits = 1

Flow control = None

Parity = None

  2. Power ON the Starter Kit by connecting the power supply to the AC plug. The power LEDs should illuminate, and a Linux UART response can be seen on the terminal program interface.
  3. You will see activity on the terminal and eventually you’ll see a login prompt.
  4. When the system boots to Linux and asks for a login username, enter the default username “petalinux” and set a new user password.
    1. image
  5. To enable the root user, use the command below, because “petalinux” has limited privileges. Enter the same password that you created for “petalinux” in the above step.
    1. sudo su -l root
    2. image
  6. Verify Internet connectivity via “ping” or “DNS lookup.”
    1. ping 8.8.8.8
    2. image
  7. If you can observe that packet transmit/receive worked and there is no packet loss with the above ping command, your Internet connectivity is working and active.
  8. Make sure your Ethernet cable is plugged into the kit; the instructions did not mention this. Without Internet connectivity, you will not be able to dynamically load the Smart Camera accelerated application package feed on the Vision AI Starter Kit.
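
As an aside, the same 115200-8-N-1 settings work from a Linux host if you would rather not use TeraTerm. A minimal sketch, assuming the KV260 console shows up as /dev/ttyUSB1 on your machine (the kit exposes several USB serial ports; check dmesg after plugging in the micro-USB cable):

    ls /dev/ttyUSB*                       # list the USB serial ports exposed by the Starter Kit
    sudo picocom -b 115200 /dev/ttyUSB1   # 115200 baud, 8 data bits, no parity, 1 stop bit, no flow control
    # or:  sudo screen /dev/ttyUSB1 115200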

 

QUESTION on PetaLinux startup

The following text comes up when booting the system, just before the login prompt. What does this do? Something about JupyterLab?

[I 2018-03-09 04:35:06.108 ServerApp] jupyterlab | extension was successfully linked.

[I 2018-03-09 04:35:06.320 LabApp] JupyterLab extension loaded from /usr/lib/python3.8/site-packages/jupyterlab

[I 2018-03-09 04:35:06.321 LabApp] JupyterLab application directory is /usr/share/jupyter/lab

[I 2018-03-09 04:35:06.343 ServerApp] jupyterlab | extension was successfully loaded.

[I 2018-03-09 04:35:06.344 ServerApp] Serving notebooks from local directory: /home/petalinux/notebooks

[I 2018-03-09 04:35:06.345 ServerApp] Jupyter Server 1.2.1 is running at:

[I 2018-03-09 04:35:06.345 ServerApp] http://192.168.1.7:8888/lab?token=004023867f176c7682afc4f125b411c9ac07cd9646bbc911

[I 2018-03-09 04:35:06.345 ServerApp]  or http://127.0.0.1:8888/lab?token=004023867f176c7682afc4f125b411c9ac07cd9646bbc911

[I 2018-03-09 04:35:06.345 ServerApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).

[C 2018-03-09 04:35:06.368 ServerApp]

To access the server, open this file in a browser:

        file:///home/petalinux/.local/share/jupyter/runtime/jpserver-992-open.html

    Or copy and paste one of these URLs:

        http://192.168.1.7:8888/lab?token=004023867f176c7682afc4f125b411c9ac07cd9646bbc911

[W 2023-02-27 16:02:07.002 LabApp] Could not determine jupyterlab build status without nodejs

The reason this comes up is that at this point not all of the software is loaded yet; try the first link after the camera app is running.

The following link is opened from the browser on the PC:

http://192.168.1.7:8888/lab?token=eadda69f8091d4d7d74589f3403f09b0f86858fc160be051

This one does not work from the PC, because 127.0.0.1 refers to the board itself rather than to the PC:

     or http://127.0.0.1:8888/lab?token=eadda69f8091d4d7d74589f3403f09b0f86858fc160be051 
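
If you need the token-bearing URL again later (rather than scrolling back through the boot log), you can ask the board directly from the PetaLinux console. A small sketch, assuming the jupyter command is on the petalinux user's PATH and the wired interface is eth0:

    ip addr show eth0     # note the board's IP address; use it in place of 127.0.0.1 from your PC
    jupyter server list   # print the running Jupyter server URLs, including the access token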

This completes Step 3 and I am seeing the prompt.

Step 4. Launching the Smart Camera accelerated application (PetaLinux)

image

The Kria™ KV260 Vision AI Starter Kit introduces the concept of accelerated applications. These are pre-built applications for AI and software developers, giving them a head start in developing their target end applications. You can quickly run one of the accelerated apps and even customize it with different AI models or sensors. Follow the steps on this page to try the Smart Camera accelerated app.

The KV260 Vision AI Starter Kit has the ability to load and run applications that configure the hardware to implement a variety of functions. The utility “xmutil” is used to load the various applications; only one accelerated application can be loaded at a given time. We will first load an application that programs the SOM to behave as a smart camera, with the ability to process a 1080p video stream from either a MIPI or USB camera and display it over HDMI or DisplayPort, or stream it over wired Ethernet. *You will need a viewer capable of displaying video streamed over Ethernet, for example ffplay or vlc (see the example below).
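
As an example of such a viewer, if you later run the application with its Ethernet (RTSP) output instead of DisplayPort, the stream could be opened on the PC roughly like this. The URL is purely illustrative; the real address and port are reported by the application when it starts in RTSP mode:

    ffplay rtsp://192.168.1.7:554/test    # hypothetical stream URL; substitute what the app reports
    # or:  vlc rtsp://192.168.1.7:554/test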

About the Smart Camera accelerated application:
An Ultra-HD camera pipeline with built-in machine learning for face/pedestrian detection. It supports capture from various video sources (MIPI camera, video files, and USB camera) and can output processed video over HDMI or DisplayPort, or as RTSP streaming of ROI-based encoded video. Most popular use cases: smart city applications (face and pedestrian detection, traffic management) and video analytics.

Launching the Smart Camera accelerated application:

You have already made the connections needed to launch this application in Step 3. Now you will:

Dynamically install the smart camera app package feeds on the running target

AMD provides package feeds in RPM format for users to dynamically load AMD accelerated applications on top of the running Linux starter image, using the dnf package manager command “dnf install”. AMD also provides the xmutil utility (“xmutil getpkgs”) to search the package feed and query the available accelerated application package groups for the Vision AI Starter Kit. Run the following commands in your PetaLinux command window:

  1. Run the command below to get the list of available application package groups.
    1. sudo xmutil getpkgs
    2. my output
    3. image

 

  2. Run the command below to install the Smart Camera accelerated application package group from the above listing. Press “y” when prompted and wait for approximately two minutes for 204 packages to install. (A quick verification sketch follows this list.)
    1. sudo dnf install packagegroup-kv260-smartcam.noarch
    2. my output
    3. image
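
To double-check that the install succeeded, you can query dnf afterwards; a small sketch (the exact package names in the group may vary):

    dnf list installed | grep -i smartcam    # the smartcam packages should appear in the list
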
Load Smart Camera accelerated application firmware

Prior to executing the newly installed application available at /opt/xilinx/bin, the FPGA firmware (PL bitstream + device tree overlay + PL drivers) needs to be loaded using xmutil utility commands. Follow the steps below to load the Smart Camera accelerated application firmware on the Vision AI Starter Kit.

  1. Run this command to list the existing application firmware available on the Vision AI Starter Kit.
    1. sudo xmutil listapps
    2. This is my output
    3. image
  2. Run the command to unload the default “kv260-dp” application firmware.
    1. sudo xmutil unloadapp
    2. image
  3. Run this command to load the Smart Camera accelerated application firmware.
    1. sudo xmutil loadapp kv260-smartcam
    2. image
Run the Smart Camera accelerated application.

Now comes the big moment… Point the AR1335 camera module at the user’s face and run the Smart Camera accelerated app.

  • To run the accelerated app with the IAS camera module, use the following command:
    • sudo kv260smartcam --mipi -W 1920 -H 1080 -r 30 --target dp
  • We have successfully launched the Smart Camera accelerated application! A bounding box should appear around the face of the user in the output video displayed on the DisplayPort/HDMI monitor.
  • IT WORKS GREAT..
  • You can also control the Smart Camera application from a Jupyter notebook. Once the Linux boot is complete, launch the Jupyter notebook by entering the IP address of the SOM into your browser. Step through the cells of the notebook to exercise the Smart Camera functionality of the Kria SOM.
  • For more details about the Smart Camera accelerated application and customization options, visit the GitHub page. (A note on returning to the default firmware follows this list.)
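
When you are finished experimenting, the board can be returned to its default state with the same xmutil commands used above; kv260-dp is the default firmware shown by “xmutil listapps”:

    sudo xmutil unloadapp            # unload the smartcam firmware
    sudo xmutil loadapp kv260-dp     # reload the default DisplayPort firmware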

If you would like to try the GitHub documentation for PetaLinux, use this link for the camera app. It uses the 2021.1 release: https://xilinx.github.io/kria-appsdocs/kv260/2021.1/build/html/docs/smartcamera/smartcamera_landing.html

 image

 

Quick Start: Setting up the Board and Application deployment

Summary & Conclusions

Website and Customer Support

https://www.xilinx.com/

You can register an account, and then you will be able to download software.

image

This site has forums, a knowledge base, blogs, and an advanced search.

image

Example Apps – Vision AI

There are example applications on the Xilinx GitHub docs page. There are 3 releases of the boot firmware: 2022.1, 2021.1, and 2020.2.

image

As you can see from the chart above, PetaLinux is supported on all 3 releases, but Ubuntu is only supported on 2, as seen below.

image

What I found, at the time of this writing, is that the navigation menu on the right-hand side is kind of tricky, in that the pages that are loaded have similar content but the supported firmware version number is not apparent from the page content. The only way you can actually tell which supported firmware version an example page is for is to look at the URL line; the version number is in the URL. Here the 2022 page is on the left and the 2021 page is on the right.

 image       

If you look closely under the navigation bar, between the logo and the search bar, you will see the version.

 image

There are other release context links at the bottom of the navigation bar that you can press to toggle between releases, but you cannot get back to the 2022 version context!

 image

The first time you navigate from the SOM landing page navigation button, you are placed in the version 2022 context, as shown below.

  image

This makes it very confusing when trying to follow an example for Ubuntu or PetaLinux. I just wanted to point this out because I got lost running the examples between Ubuntu and PetaLinux, and I needed to go back to the landing page before I realized what was going on with the context links. A big waste of time, I’d say.

Summary

I was able to meet all 3 goals that I set out to accomplish with this review.

  • I produced an unboxing video using a wonderful FREE online production tool called Kapwing (https://www.kapwing.com/).
  • I was able to “test the out-of-box experience of the product” using the guidance of the getting-started guides and videos available on the AMD/Xilinx web portal.
  • I did not “build” a project and show how I did it, but I was able to follow along with a Vision AI camera example running on the kit.
  • I was able to get the kit running on 2 different operating systems (Ubuntu and PetaLinux). I already knew a little about Ubuntu, but PetaLinux was new to me.
  • I was able to understand the SOM technology and its value in the rapid deployment of AI vision embedded systems.

Conclusions

My out-of-box experience of the product was very rewarding. I found it very easy to get the kit up and running and to perform some interesting Vision AI examples. IoT is wicked hard, and the concept of FPGAs can be really daunting to a beginner. Even though I have some previous knowledge of the concept, for me, using the SOM made it more transparent than other FPGA kits I have evaluated.

My Scoring Explained (1-5 star ratings)

I rated the product based on my beginner status with FPGA technology. I’m an independent maker, and evaluating embedded development kits is a hobby of mine. I do not have huge funds to invest in kits, and I’m fortunate to have been given the opportunity to evaluate embedded MCU evaluation kits from many companies.

Total Score  = 4.5

I never give a perfect score but this kit ranked very high compared to other kits that I have evaluated.

Product Performed to Expectations  5

The kit was a pleasure to work with. Quite frankly, it performed better than my initial expectations.

Specifications were sufficient to design with 5

The illustrations in the documentation made it really easy to design with. Even though I did not do any deep design work for this review, I was able to get the out-of-the-box experience working with only a couple of hitches.

Demo Software was of good quality 4

From what I saw while evaluating the application examples at the operational level, they were excellent. I deducted a point for the documentation pages’ navigation context links, which I found confusing.

Product was easy to use 4

The documentation made it easy to use, although at points it was a bit hard to navigate, with links that led to the wrong information or to the wrong supported version of the OS.

Support materials were available 5

There is an abundance of support materials, and sometimes duplicate information is presented in several different places.

The price to performance ratio was good 5

I think that the price of the kit is warranted, but it probably puts it out of reach for me and other independent makers. The price is worth it for someone using the SOM and designing carrier cards for it.

Possible Future Experiments

I have only scratched the surface of the capabilities of the kit, and I will keep coming back to this review to refresh my knowledge and continue to experiment with this Vision AI starter kit. Some of the things that I did not have time for, but will try to come back to, are:

  • Experiment with the other examples available using either Ubuntu or PetaLinux.
  • Deep dive into Certified Ubuntu for Xilinx Devices at https://xilinx-wiki.atlassian.net/wiki/spaces/A/pages/1413611532/Canonical+Ubuntu#Certified-Ubuntu-for-XILINK-Devices
  • Understand Xilinx toolchains
    • Vitis Platforms, Vivado Board Support Packages
  • Evaluate the examples in more depth by analyzing the code and seeing HOW IT WORKS
  • Use the kit as a GNOME-based Ubuntu embedded development platform for my other IoT development.
  • Use the kit to learn and experiment with PYNQ (an install sketch follows this list).
    • image
    •  The Kria-PYNQ GitHub link can be found 
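
As a note for that future PYNQ experiment, the Kria-PYNQ repository described an install along these lines at the time of writing; treat this as a sketch and check the repository README for the current steps:

    git clone https://github.com/Xilinx/Kria-PYNQ.git
    cd Kria-PYNQ
    sudo bash install.sh -b KV260    # install PYNQ and JupyterLab for the KV260 board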

 

Resources

Roadtest supplied links:
• Product Brief
• Datasheet
• User Guide
• Getting Started Guide
• Github Applications
• Kria SOM Github
• Avnet Page Brief
• Getting Started Video

Vision-based Applications with Kria Online Workshop

This was an extensive 2-day workshop on the Kria KV260 that I attended before receiving the kit. It is very detailed and takes a deep dive into using the kit and the examples.

Session 1: https://attendee.gotowebinar.com/recording/5123003334544520619

Session 2: https://attendee.gotowebinar.com/recording/8740415281647336878


  • Thanks so much for your kind remarks on my review. This is my first experience with a Xilinx product, and I must say I'm very impressed with this generation of their evaluation kits. It was so easy to use. I have been following the evolution of the Xilinx product line, and it always seemed to be wicked hard to develop on. They hit it out of the park with this Kria line. They have managed to open up their SoC technology to hobbyists like myself.

    Thanks Again 

    Steve K

  • To answer your question, the branding only appears on the free version. The paid version apparently does not contain the branding and offers more advanced features. The free version is pretty robust and useful for the time being. It is an excellent video editor; I use it often and am still learning new things with it.

  • Kapwing looks interesting...does their branding always appear in the finished video?

  • Nice review.

    Loved how you detailed all your steps, even your mistakes! I'll take some notes from your review!

    I really appreciate the Appendix A!

    I've seen that you've gone with the 2021 version of PetaLinux and have been able to use the demonstration applications. I've been meddling with the latest PetaLinux version (a blog post coming soon) and there are no applications there yet. I just love being on the bleeding edge!

    Even for Ubuntu it was a struggle, but I got there.

    I'm with you regarding PYNQ. I think it is a nice way for them to go.

    Thank you for your review! Very helpful. Glad you liked the KV260!

  • This was a really fun board to play with! I was able to meet all 3 goals that I set out to accomplish with this review (see the Summary section above).