Avnet UltraZed-EV Starter Kit - Review


RoadTest: Avnet UltraZed-EV Starter Kit

Author: dimiterk

Creation date:

Evaluation Type: Development Boards & Tools

Did you receive all parts the manufacturer stated would be included in the package?: True

What other parts do you consider comparable to this product?: Comparable Xilinx official offerings are the ZCU104 and ZCU106.

What were the biggest problems encountered?: a) Issues with the DisplayPort active adapter, which were traced to the display. b) The PMOD DVP camera from Avnet has been EOLed. c) Differences from the ZCU106 TRD due to hardware differences make porting HDMI TX/RX designs challenging, as you have to go on an archeological-level search (no kidding) for device tree fragments.

Detailed Review:

Introduction

Back in June I received an UltraZed-EV Starter Kit courtesy of Avnet and element14. The kit contains a powerful SOM populated with a Zynq UltraScale+ 7EV-series device.

 

The plan for this road-test is as follows:

 

I. First, we are going to take a generic look at all the hardware peripherals on board the Embedded Vision Carrier Card, hereafter referred to as the UZ_EVCC.

II. Second, we shall port the PYNQ framework to the UltraZed by taking a time-saving shortcut. In practice this took twice as long as the same process on a Zynq-7000 series part, due to the state of the released BSPs.

III. Third, we shall implement a generic image processing pipeline using the xfOpenCV framework and PYNQ.

IV. If time permits, a camera-to-DisplayPort pipeline will be demonstrated at a future point, leveraging the HLS IP cores.

 

This review assumes that the reader already has a VirtualBox VM with Vivado 2019.1 or 2020.1 (Vitis), as well as the respective Petalinux version, installed. For these designs I used 2019.1, as 2020.1 literally needs its own hard drive.

 

The unit comes with an Avnet-provided image on the eMMC flash. The image is pretty old, as it is based on the 2017.4 Vivado release.

There are a couple of example designs for testing the throughput of the SATA and PCIe interfaces.

It's interesting to note that, compared to the Ultra96, the device trees in the released UltraZed images and BSPs are lacking in certain areas such as DisplayPort and HDMI RX/TX.

The SOM comes with a fan that attaches with an adhesive pad. The fan is quite noisy, though.

 

First some unboxing and setup photos.

image

Hardware review.

The Zynq UltraScale+ EV series devices are equipped with a hardened H.265/H.264 video codec unit (VCU). There are only three EV models: the ZU4EV (192K logic cells), ZU5EV (256K) and ZU7EV (504K).

The UltraZed-EV SOM that was received with the carrier card uses the 7EV, specifically the XCZU7EV-FBVB900 Zynq UltraScale+ MPSoC.

What makes this device special compared to other UltraScale+ parts, like the one on the Ultra96 board, is that apart from the VCU it also contains UltraRAM. Think of it as a beefed-up version of BRAM, which works great for CNN-based applications like the DPU.

 

image

The Embedded Vision Carrier Card brings out all of the SOM peripherals:

 

a) DisplayPort

 

The DisplayPort on the UltraZed uses a single lane, GTR_TX3. This differs slightly from the hardware implementation on the Xilinx reference boards, which use dual lanes.

There are two DisplayPort-style connectors on the board, so the user has to make sure to connect to the right one; the other one carries the LVDS display signals.

Implementing a bare metal application for the single lane implementation requires some modifications to the firmware:

https://www.xilinx.com/support/answers/71416.html

 

The nice thing about DisplayPort is that it can support live video generated from the PL side. This has to be enabled in the configuration options.

A simple pipeline was implemented where the camera input is overlaid on top of a TPG input.

Testing shows that the camera output works, but the DP SDK driver was not configured properly due to the UZ peculiarities compared to the ZCU104.

 

 

image

 

 

As I did not have a DisplayPort monitor at hand, I ended up using an active DisplayPort-to-HDMI adapter. This is still a work in progress.

 

 

b) HDMI TX / RX

 

The Xilinx HDMI IP needs a license. One cannot reuse the free IP cores from the Zynq-7000 series (the TMDS module from Digilent), as these rely on OSERDES2 primitives and the HP bank pins, which operate at 1.8V, don't have the proper electrical characteristics to drive TMDS directly.

So the solution is to use a retimer and equalizer IC. These are controlled by the I2C bus.

The HDMI IP core interfaces with a Video PHY which in turn connects to the retimer IC. The retimer performs the level shifting to TMDS signaling and connects to the HDMI connector.

A similar path is used for the receiver albeit using a different receiver chipset. The good news is that the Video PHY is available for free.

As of August 2020, there are no freely available Zynq UltraScale+ HDMI IP cores. This can be rectified by modifying the Zynq-7000 TMDS-to-DVI IP core with a gearbox and an OSERDES3 primitive; however, this is not exactly trivial, as it needs proper simulation.

 

 

 

image

c) SDI

 

Two 3G-SDI TX lanes and one RX lane are included on the board.

I found a Lorex MPX camera in my parts bin, thinking it was 3G-SDI compatible, but it turned out to use a competing standard, so this interface was not tested.

 

d) SFP Interfaces

 

There are two optical SFP+ connectors on the carrier board. Each of them uses two GTH lanes. This peripheral was not tested on this roadtest.

 

e) USB to UART

 

The carrier card contains a Silicon Labs USB-to-UART bridge. This enumerates as COM12 and COM13 on my laptop; I used COM13 (PS UART0) for serial communication.

 

f) LVDS Display Panel

 

There was no Display panel with the kit.

Checking the reference design and schematics shows that it is compatible with:

https://www.avnet.com/shop/us/products/avnet-engineering-services/aes-ali3-ampire10-g-3074457345635221593/

 

This display uses a DisplayPort-style cable; however, the display panel has a serial LVDS interface. This means that the 24-bit parallel video interface must be serialized by a factor of 7:1 before being transmitted on four LVDS differential pairs (4 pairs × 7 bits per pixel clock = 28 bits, enough for the 24 data bits plus control/sync). Referring to the schematics shows that a fifth LVDS differential pair is used to transmit the pixel clock.

 

On the Avnet GitHub repo you'll find the ALI3 IP, but it will not work, since the IP uses OSERDES2 primitives, which do not exist on the Zynq UltraScale+ series.

Again, one has to take the same approach of using a gearbox and replacing the OSERDES2 with OSERDES3 primitives. As this peripheral was not part of the kit, I was not able to test it.

 

 

g) Power supplies

The Infineon PMICs are I2C programmable. As shown below, not all of the Infineon regulator outputs are used; more on this later.

Here's a radical and outlandish idea: why not tie that fourth, unused supply to the VADJ rail of the FMC so that it can also be programmed for 1.2V operation?

 

 

image

 

 

An Infineon USB to I2C cable is needed to change the voltage levels. This was not part of the kit.

An I2C header on the EVCC allows the supplies to be adjusted. The same can be done from the Zynq side, if you are adventurous; it is advised to use kernel modules if one does not want to risk frying a $1k SOM.

image

 

h) Clocking

 

The Ultrazed Board has a 300 MHz differential clock supplied by a DSC1103CI5 clock generator with an LVDS output.

One programmable clock chip generates all the clocking for the HDMI, 3G-SDI, loopback and PCIe interfaces. A second programmable clock chip generates the differential reference clocks for DisplayPort, SATA, USB and PCIe.

 

 

i) FMC connector

The FMC connector is an HPC slot type. It is connected to an HP bank, so all voltages are fixed at 1.8V. I am just wondering why the designer chose a fixed voltage for VADJ, particularly when channel D of the IRPS5401MTRPBF PMIC (U22, address 0x1A) sits there unused.

 

image

 

j) SATA interface and PCIE connector

 

The board does not have a WIFI module so the PCIe can be used to add WIFI functionality. This peripheral was not tested on this roadtest.

 

The EVCC contains a SATA connector which allows adding external storage easily. Avnet provides a reference design for measuring the throughput to an external HD.

 

 

 

m) ZYNQ Ultrascale+

This is the star of the show. Literally, a shiny chip. It contains a VCU, UltraRAM, a lot of pins with new-generation PHYs (native MIPI CSI support), plenty of BRAM/DSP resources, four ARM Cortex-A53 cores, two real-time Cortex-R5 cores, and a PMU.

 

The VCU is capable of encoding up to 8 video streams in real time. The design below is the simplest one can use with a Linux-based setup such as PYNQ: it takes an uncompressed video stream and outputs an H.264-compressed stream, which is highly beneficial for network transmission bandwidth.

 

image
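Once such a design is booted under Linux, the encoder is typically exercised through GStreamer. The snippet below is a minimal sketch, assuming the image ships the GStreamer OMX plugins for the VCU (as the Xilinx VCU TRDs do); the element names, resolution and output path are illustrative and not part of the Avnet BSP.

import subprocess

# Push a synthetic 1080p stream through the VCU H.264 encoder and write a .ts file.
pipeline = (
    "videotestsrc num-buffers=300 "
    "! video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 "
    "! omxh264enc ! h264parse ! mpegtsmux "
    "! filesink location=/home/xilinx/vcu_test.ts"
)
subprocess.run(["gst-launch-1.0"] + pipeline.split(), check=True)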

As a final note on the EVCC: it has a thickness of 63 mil. It would have been sturdier if the PCB thickness were increased to 94 mil, although this would require some impedance-matching changes.

When unplugging an FMC connector you have to be careful not to flex the PCB.

 

 

Porting PYNQ to Avnet Ultrazed-EV

 

The initial attempt to port PYNQ to the UltraZed went through more than a couple of hoops.

 

There are two reference BSPs from Avnet. After downloading them, the plan was to port PYNQ to the board, since PYNQ allows easy control of the IP blocks.

 

The first one is fairly old: uz7ev_evcc_sd_oob_2017_4.bsp.

The second one, uz7ev_evcc_2019_1.bsp, is more recent, so I tried to use that one.

 

Opening the BSP as a zip archive, you'll note that it contains the exported Petalinux project and hardware design. However, there are issues with it:

 

a) First, the device tree does not contain any of the peripherals on the EVCC apart from Ethernet. This results in a Linux image that does not recognize the board peripherals.

 

b) Second, when opening the Vivado hardware design, the HPM1 FPD port is not connected, so you'll get a prompt to run block automation.

 

The easiest way to build PYNQ is to take a good BSP and couple it with the prebuilt aarch64 rootfs (since we are using a Zynq UltraScale+). With this BSP, that won't work.

Digging into the device tree shows that there is a conditional inclusion of the xlnk driver:

 

#ifdef CONFIG_SDX

#include "xlnk.dtsi"

#endif

 

I am not sure where CONFIG_SDX is defined, though. Anyway, I decided to start from scratch in order to demonstrate how to port PYNQ to a completely new board.

First things first: we need a simple hardware design for the specific platform.

 

 

Hello World program

You may want to download the platform board definition files from the Avnet GitHub repo first.

 

As a first step, I implemented a Hello World application just to make sure the board was working as expected. Two serial ports (COM12 and COM13) are enumerated on my station; the BSP uses COM13 for printing to the terminal.

Below you can see the serial port enumeration and the terminal output.

image

 

 

Porting PYNQ to Ultrazed

 

Note that the exact same procedure can be used on other UltraScale+ boards. The only differences are with respect to the device tree fragments.

 

 

1. Hardware design

The first step is to create the hardware design, which is built using Vivado as shown above. After the bitstream is compiled, the design needs to be exported. This creates an SDK folder containing the HDF file, which Petalinux uses to gather information about the hardware system.

The hardware design step can be done on a Windows or Linux workstation. The Petalinux tools, however, run only on Linux, so the project folder needs to be copied to the Ubuntu virtual machine.

Petalinux will be used to generate the boot partition. To deploy a custom kernel, only the resulting boot files are needed:

  1. BOOT.BIN, which contains the FSBL, the boot bitstream, and U-Boot.
  2. image.ub, which contains the Linux kernel image and the device tree (system.dtb)

 

 

2. Petalinux Project

To create a Petalinux project, the HDF file located in the SDK folder needs to be imported. The UltraZed is a Zynq UltraScale+ device, hence the Petalinux command is as follows:

petalinux-create --type project --template zynqMP --name UZ_EVCC

 

This will create a directory named UZ_EVCC inside the project folder. Change into this directory and import the hardware description file (*.hdf):

cd PetaUZ_EVCC

petalinux-config --get-hw-description  /home/User/Doc/UZ_EVCC_GPIO_LED/UZ_EVCC_GPIO_LED.sdk/

 

3. Customizing the Kernel

Open a terminal in the Petalinux project folder that we created in the previous step and issue:

petalinux-config

This step allows us to customize the system configuration, as well as enable Ethernet and DisplayPort support.

 

petalinux-config -c kernel

The PYNQ OS is an Ubuntu rootfs that runs on a customized Petalinux kernel. On the UltraZed, the SD card on the EV carrier card uses the SDIO1 controller on the PS side; SDIO0 is used to communicate with the eMMC on the SOM.

Therefore, the most important part of the process is to change the boot mode to boot from SD card.

The main steps are outlined below:

  1. Enable the FPGA manager.
  2. Change the primary SD/SDIO to SD1 under Subsystem AUTO Hardware Settings.
  3. Change the root file system type to SD card.
  4. Change the device node of the SD card to /dev/mmcblk1p2.
  5. Disable tftpboot.
  6. The serial port is ps_uart0 at 115200 baud.
  7. The primary Ethernet is psu_ethernet3.
  8. Change the U-Boot env partition settings to the SD card.
  9. Change the system DTB location to the SD card.
  10. Change the Linux rootfs to the SD card:
    Image Packaging Configuration -> Root filesystem type -> SD card
    DTG Settings -> Kernel Bootargs -> uncheck "generate boot args automatically" to set bootargs manually

image

image

Modify the kernel as follows:

 

Kernel Bootargs→generate boot args automatically (OFF)
• for Zynq MPSoC: Kernel Bootargs→ user set kernel bootargs (earlycon clk_ignore_unused quiet)

• for Zynq MPSoC: Device Drivers→ Generic Driver Options → Size in Mega Bytes(1024)

 

Enable staging drivers:
• Device Drivers → Staging drivers (ON)
Enable APF management driver:
• Device Drivers → Staging drivers → Xilinx APF Accelerator driver (ON)
Enable APF DMA driver:
• Device Drivers → Staging drivers → Xilinx APF Accelerator driver → Xilinx APF DMA engines support (ON)

 


 

Device tree modifications.

After creating a PetaLinux project using the Avnet BSP, you should also be able to locate the DTSI file under project-spec/meta-user/recipes-bsp/device-tree/files.

As stated before, the Petalinux BSP provided by Avnet will not work with the PYNQ rootfs; the main issue is that the device tree does not have the proper entries.

 

Add the xlnk device tree fragment by editing:

project-spec/meta-user/recipes-bsp/device-tree/files/system-user.dtsi

 

image

I tested PYNQ by running one of the GPIO PL examples. Checking under /dev showed that the xlnk device driver was not present.

xlnk is the kernel driver that helps with memory management, accelerator control, and data movement. When the Petalinux kernel was first configured, the xlnk fragment was not included in the device tree. This may be the reason why the default Avnet BSP does not work with the PYNQ scripts, although I have not verified this.
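For reference, the GPIO PL smoke test was along these lines. This is a minimal sketch: the bitstream name and the axi_gpio_0 instance are placeholders for whatever the hardware design actually contains.

from pynq import Overlay
from pynq.lib import AxiGPIO

# Load the (placeholder) GPIO/LED bitstream and grab the AXI GPIO output channel.
overlay = Overlay("/home/xilinx/uz_evcc_gpio_led.bit")
leds = AxiGPIO(overlay.ip_dict["axi_gpio_0"]).channel1
leds.setlength(4)
leds.setdirection("out")

# Walk a single lit LED across the four outputs, then clear them.
for value in (0x1, 0x2, 0x4, 0x8, 0x0):
    leds.write(value, 0xF)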

 

After recompiling the kernel, a fully functioning PYNQ rootfs with a custom kernel was obtained.

 

The device tree fragment added for the APF (xlnk) driver is:

/ {
    xlnk {
        compatible = "xlnx,xlnk-1.0";
    };
};

 

 

Also modify the kernel parameters as follows:

Device Drivers->Generic Driver Options->Size in Mega Bytes(1024)

Device Drivers->Staging drivers (ON)->Xilinx APF Accelerator driver (ON)->Xilinx APF DMA engines support (ON)

The kernel configuration can also be edited directly by modifying the following file

project-spec/meta-user/recipes-kernel/linux/linux-xlnx/kernel.cfg

and appending the parameters shown below:

 

CONFIG_CMA_SIZE_MBYTES=1024

CONFIG_STAGING=y

CONFIG_XILINX_APF=y

CONFIG_XILINX_DMA_APF=y

 

 

 

 

4. Kernel image

Since we will use a prebuilt root filesystem, there is no need to customize the rootfs using Petalinux. Once we are done configuring the kernel, issue:

petalinux-build

 

This will take around 50 minutes to an hour, depending on the host machine.

Once the process is done, the next step is to package the boot image:

petalinux-package --boot --format BIN --fsbl zynqmp_fsbl.elf --u-boot u-boot.elf --pmufw pmufw.elf --fpga --force

image

petalinux-package --boot --force --fsbl images/linux/zynqmp_fsbl.elf --fpga images/linux/*.bit --u-boot

Once the boot image is generated, copy the following files:

  1. BOOT.BIN
  2. system.dtb
  3. image.ub

 

 

Once this is done, we have pretty much finished the process.

Burn the Ultra96 PYNQ v2.5 image to an SD card. Once complete, copy BOOT.BIN, system.dtb and image.ub onto the first (FAT) partition and boot the UltraZed from the SD card.

 

 

U-Boot may not be set to use the correct rootfs partition, so once the device boots, interrupt the process to get to the U-Boot console and check:

 

printenv

setenv bootargs 'earlycon clk_ignore_unused root=/dev/mmcblk1p2 rw rootwait'

 

Then save the U-Boot environment variables to the primary FAT partition and issue a reset:

saveenv

reset

 

 

5. Enable Ethernet functionality

Ethernet is not enabled by default with the base PYNQ rootfs.

The first step is to assign a static IP so that one can connect from a Windows host:


sudo ifconfig eth0 192.168.137.3 netmask 255.255.255.0 up

 

The next step is to modify the ethernet adapter with the proper static IP address and fix the name resolution.

Edit the file /etc/network/interfaces.d/eth0

 

and copy the following:

auto eth0
iface eth0 inet dhcp

auto eth0:1
iface eth0:1 inet static
address 192.168.137.3
netmask 255.255.255.0

 

To enable Internet access, share the host's WiFi adapter with the Ethernet adapter and issue:

sudo systemctl disable systemd-resolved.service

sudo systemctl stop systemd-resolved.service

After disabling and stopping the service, remove the symlink from /etc/resolv.conf to /run/systemd/resolve/stub-resolv.conf:

sudo rm /etc/resolv.conf

 

Now, we can add a manually created resolv.conf in /etc/

 

sudo nano /etc/resolv.conf

 

and add the preferred DNS server there.

 

 

In this case I used:

 

nameserver 208.67.222.222

 

This is temporary, though, so it has to be redone after every reboot.

 

7. Updating PYNQ

Once we have verified that there is Internet connectivity, update PYNQ:

sudo pip3 install --upgrade --upgrade-strategy only-if-needed pynq
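A quick sanity check after the upgrade is to allocate a physically contiguous buffer from a Jupyter cell, since this exercises the xlnk/CMA path that the kernel configuration above enables (a minimal sketch):

import numpy as np
from pynq import allocate

# Contiguous buffer allocated through the xlnk/CMA path.
buf = allocate(shape=(1024,), dtype=np.uint32)
buf[:] = np.arange(1024, dtype=np.uint32)
print(hex(buf.physical_address))   # physical address visible to PL masters
buf.freebuffer()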

 

 

 

image

 

 

Checking the I2C bus shows the presence of the HDMI retimer modules and the Infineon PMBus PMICs.

image
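A quick way to reproduce this scan from a notebook or terminal is i2cdetect, assuming i2c-tools is present on the image; the bus number depends on how the PS I2C controllers (and any mux) enumerate:

import subprocess

# Scan I2C bus 0; the retimer and PMIC addresses should show up in the map.
print(subprocess.check_output(["i2cdetect", "-y", "-r", "0"]).decode())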

 

 

 

Camera pipeline

Building AI-based apps requires an image source and, even better, an image sink.

An HDMI camera can be used here, as the EVCC includes an HDMI RX port and a USB 3.0 port. This requires using either the non-free HDMI IP core or rolling one's own; the core interfaces with the HDMI Video PHY.

This approach was abandoned due to lack of time. To implement it, one has to modify the device tree with the proper fragments for the retimer and clock sources.

 

Initially, I thought of using a DVP PMOD camera. As luck would have it, the module from Avnet has been discontinued:

https://www.newark.com/avnet/aes-pmod-tdm114-g/pmod-camera-kit-xilinx-development/dp/69AC5619

 

So I kludged together a home-brewed version using an OV7670 camera connected to the HD bank, which operates at 3.3V. I should note that this can easily be upgraded to a higher-resolution sensor such as the OV2640.

 

The main idea that was proposed was to use dual cameras to implement a stereo jig.

Well, the FMC hardware errata and the fact that the FMC HP bank itself operates at 1.8V imply that DVP cameras that work at 2.5V won't work.

 

So one possible workaround is to use two VDMAs in write mode and feed the RTSP images from the PS side.

 

Checking the DVP cameras' datasheets shows that the bus can actually operate at 1.8V; it's just that the modules are powered at 2.5V. So the idea is to modify the LDO in order to test the cameras. I will leave this as a cliffhanger for a future post.

 

 

Image processing app using Vivado HLS

Using Vivado HLS and the xfOpenCV framework, an IP accelerator that works with PYNQ was implemented.

First, a USB camera was tested.

image
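Grabbing frames from the USB camera in Jupyter is straightforward with OpenCV; a minimal sketch, assuming the camera enumerates as /dev/video0 on the PYNQ image:

import cv2

# Open the USB camera (/dev/video0) and grab a single VGA frame.
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
ok, frame = cap.read()
cap.release()
if ok:
    cv2.imwrite("usb_frame.jpg", frame)   # frame is a BGR numpy array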

 

 

 

 

To develop the HLS IP, the proper platform has to be selected.

image

 

 

image

 

 

 

A section of the CV2PYNQ package (courtesy of Wolfgang Brückner) was ported, and a Vivado design using xfOpenCV filter2D-generated IP blocks was implemented.

As you can see, for VGA images you can get around 4300 frames per second.
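The accelerator is driven from PYNQ through an AXI DMA. The sketch below shows the idea with placeholder overlay and IP instance names; the filter2D control registers (image size, kernel coefficients, ap_start) are assumed to be configured separately over MMIO and are not shown.

import numpy as np
from pynq import Overlay, allocate

overlay = Overlay("/home/xilinx/filter2d.bit")   # placeholder bitstream name
dma = overlay.axi_dma_0                          # AXI DMA feeding the HLS core

# One grayscale VGA frame in, one filtered frame out.
rows, cols = 480, 640
in_buf = allocate(shape=(rows, cols), dtype=np.uint8)
out_buf = allocate(shape=(rows, cols), dtype=np.uint8)
in_buf[:] = np.random.randint(0, 255, (rows, cols), dtype=np.uint8)

dma.sendchannel.transfer(in_buf)
dma.recvchannel.transfer(out_buf)
dma.sendchannel.wait()
dma.recvchannel.wait()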

 

It should be noted that the Vivado HLS xfOpenCV framework (2018 and 2019 releases) has a number of bugs. Vivado HLS 2019 is not backward compatible with 2018.3, so you may get obscure implementation failures that refer to black boxes for each IP.

 

After testing the image processing IP, an ISP pipeline was implemented using a VDMA and PYNQ.

image

Testing from the PYNQ Jupyter notebook shows that it can transfer the camera frames. However, the camera needs to be configured prior to using the VDMA. There is an issue with the UOI on the current design that prevents the VDMA from working correctly, so this is still in the troubleshooting stage.
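For completeness, pulling frames out of the VDMA from PYNQ looks roughly like the sketch below; the bitstream and VDMA instance names are placeholders, and the camera registers still need to be configured over I2C before this produces valid frames.

from pynq import Overlay
from pynq.lib.video import VideoMode

overlay = Overlay("/home/xilinx/isp.bit")       # placeholder bitstream name
vdma = overlay.axi_vdma_0                       # VDMA instance in the design

# Configure the read channel (camera stream into PS memory) for 640x480, 24 bpp.
vdma.readchannel.mode = VideoMode(640, 480, 24)
vdma.readchannel.start()
frame = vdma.readchannel.readframe()            # numpy array holding the frame
vdma.readchannel.stop()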

 

 

AI Application

Next, a couple of AI IP cores and frameworks were explored.

 

A)

The BNN package for PYNQ is quite popular, as it supports efficient neural networks on RTSP streams and live video. However, installing the BNN package from pip fails.

The board, however, should be compatible, as the Ultra96 is also based on the UltraScale+ architecture.

So one workaround is to edit the package manually.

image

 

B)

The next option is to use Vitis (2020.1) and follow the tutorials provided by Avnet, as shown here:

 

https://www.hackster.io/AlbertaBeef/vitis-ai-1-1-flow-for-avnet-vitis-platforms-part-1-007b0e

https://www.hackster.io/AlbertaBeef/vitis-ai-1-1-flow-for-avnet-vitis-platforms-part-2-f18be4

 

This will require installing Vitis and at least 500 GB of space for a virtual machine.

 

C)

PYNQ released a DPU overlay which is, in theory, compatible with the UltraZed since it can run on the Ultra96. Again, the Python package needs to be edited manually in order to avoid failures for unsupported boards.

 

D)

The open-source NVDLA IP core looks promising, until you realize that you have to complete another Ph.D. in order to get it working.

 

Issues encountered:

 

As in all engineering projects, Murphy's law strikes whenever it feels like it.

I had some issues with active DisplayPort adapters and am still troubleshooting an SDK DisplayPort application.

The main hardware issue encountered was the errata on the EVCC for the FMC connector.

The addition of an HDMI TX and RX reference design would have been pretty sweet. I am trying to rectify this by releasing a preliminary Vivado design of the HDMI TX based on the ZCU104 TRD. This is still WIP, and the license for the HDMI cores expires in two months.

 

 

Ultrazed PYNQ repo:

https://github.com/Q-point/UltrazedPYNQ

 

 

Verdict

This is a pretty fun UltraScale+ development board at a very competitive price point.

 

 

The Good:

  1. The UltraZed SOM is pretty cool. I bet Ultron from the Avengers was powered by these </joke>
  2. The board is quite flexible and full of peripherals.
  3. The addition of a VCU capable of encoding/decoding 8 streams makes this board perfect for AI / machine vision applications.
  4. One can design one's own carrier card based on the open schematics. A big plus, if I might add, compared to other companies.
  5. The SOM is quite compact.

 

 

The Bad

  1. The Rev 1 carrier board loses 2 points for the FMC carrier card incompatibility. This has been addressed with a new revision. For some reason, the blue buttons on the SOM are not populated on the board I received.
  2. Increasing the carrier board thickness from 63 mil to 94 mil (~2.4 mm) would mitigate any EVCC flexing issues with FMC cards. Also, the fan is quite noisy.
  3. Why not use the newer USB Type-C connector?
