Digilent Zybo Z7 + Pcam 5C - Review

Table of contents

RoadTest: Digilent Zybo Z7 + Pcam 5C

Author: Fred27

Creation date:

Evaluation Type: Development Boards & Tools

Did you receive all parts the manufacturer stated would be included in the package?: True

What other parts do you consider comparable to this product?: There is quite a range of Zynq development boards, all with their own focus. With HDMI in, HDMI out and a MIPI camera interface, the Zybo Z7 seems particularly suited to vision projects.

What were the biggest problems encountered?: Versioning of Vivado. Samples and workshops each require one of many specific versions of Vivado. Each version is tens of GB in size, and upgrading projects and IP between versions is not easy.

Detailed Review:

What to expect from this road test

I'm going to start this road test by telling you a little bit about me. That's not because I think you're interested, but so that you know what to expect from this road test.


I'm a professional software developer and I'm happy with most languages that begin with c - mainly C#, C and C++. (Not COBOL though. I don't do COBOL.) C is relevant because a lot of what I intend to do with this particular Zynq device revolves around taking C/C++ and converting it so it runs as hardware rather than software. That conversion will be new to me, but the C bit won't.


I also have some basic FPGA experience. I've played around a bit with a few Xilinx devices. I'm fairly amateur at writing Verilog, but I can get it to run on a Spartan-7 FPGA or its smaller CPLD cousins like the CoolRunner. I bought a MiniZed and followed alongside the trainees who enrolled in Path to Programmable, so I have some experience playing around with the combination of microcontroller and programmable logic that is the Zynq.


I don't have any experience of High Level Synthesis (HLS). This is where Xilinx transforms C/C++ into a Hardware Description Language that will run blisteringly fast in hardware as opposed to just fairly fast as software. I also have no experience in video processing, so whilst I'll be delving into this area, it will be to find out how easy it is to get to grips with rather than pushing it to the limit.


Essentially, what I'm saying is that this road test will be from the point of view of a relative newcomer to programmable logic. I hope that this means that for much of the E14 community it's easier to relate to. If you're after an expert analysis of video processing on the Zybo then I'm sure some of the other road testers with more HLS experience may cover what you're after. However, if you want the groundwork and to see if the Zybo Z7 might be for you, then please read on!


So, what are we road testing?

What is Zynq?

SBCs are everywhere these days. The Raspberry Pi is probably the best known. It's great for a lot of things and it has sold millions of units to people who want a small computer that can also interact with hardware, reacting to switches and sensors, lighting LEDs, etc. A Raspberry Pi will usually be running Linux. Once again this is great for a lot of things, but not for anything real-time. Doing fast I/O isn't easy, and doing predictably fast I/O is harder still. Just when you least expect it, Linux may swap your process out, start reclaiming memory, or who knows what. Even something fairly simple like PWM to drive a servo can be a problem without external peripherals.


I've recently been having a lot of fun with the BeagleBone - specifically the 2 Programmable Real-Time Units (PRUs) that sit alongside the main ARM core, which will once again probably be running Linux. These PRUs are basically 32-bit microcontrollers and they will happily handle timing-critical tasks. If you like, you can think of it as a Raspberry Pi with a couple of Arduinos strapped to the side. And all of this in one little Sitara chip. Very handy.


Where am I going with this? Well, if you thought having a couple of microcontrollers alongside your ARM core was useful, how would you like a fairly hefty FPGA there instead? You would? Great. Welcome to the Zynq!


The essence of the Zynq is an ARM Cortex-A9 core for the Processing System (PS) side of things. There's one in the S series, or two in the rest of the range. The ARM core(s) are combined with an FPGA for the Programmable Logic (PL) side of things. This is more or less an Artix-7 for most of the range, or a Kintex-7 for high-end Zynqs. These FPGAs are powerful enough that you could even use them to create more microcontrollers if the BeagleBone's PRUs were to your liking. If not, well, the world of programmable logic is your oyster. Hang on to the PS and PL acronyms by the way. The Zynq documentation uses them a lot.


What is the Zybo Z7?

The new Zybo Z7 is a reasonably recent development board with the Zynq at its core. It's an update to the original Zybo and it comes in two Zynq flavours. The one I'll be road testing here is based around the Zynq XC7Z010. There is also a higher spec version available with a Zynq XC7Z020. Apart from the beefier processor, this version also has one more RGB LED and the 6th unpopulated Pmod connector is, um... populated. The processor is significant, but these other differences seem so trivial you have to wonder why they bothered.


The interesting specs are as follows:

The Zynq processor contains a dual-core ARM Cortex-A9 processor, alongside FPGA fabric consisting of 17,600 lookup tables, 35,200 flip-flops and 270KB of block RAM. It's capable of Gigabit Ethernet, USB 2.0, SPI, UART, CAN and I2C. Of course, the joy of FPGAs is that if it can't manage whatever protocol you want out of the box, then you can soon persuade it to.


Connector-wise, you should find most of what you need. Video is covered by HDMI in and out, and a Raspberry Pi style MIPI CSI-2 camera interface. On the audio side there is line in, microphone in and headphone out. There's also an Ethernet jack and USB host. I/O is broken out as five 12-pin Pmod connectors. There are, of course, a few buttons, switches and LEDs (single colour and RGB).


The board has plenty of on-board flash (16MB of Quad-SPI), RAM (1GB DDR3L) and an SD card slot. It can be jumpered to load the FPGA bitstream from either the flash or SD card on start-up. Alternatively it can be loaded via JTAG - either onboard or external. It's difficult to see why you'd use an external JTAG programmer when it has an onboard one, but I tested it with an external Digilent HS2 and the toolchain programmed the Zynq with no more fuss than the onboard one.


The options for powering the Zybo (wall wart, USB or battery) and how it loads the bitstream (SD, flash or JTAG) are selected via clearly marked jumpers. One of the issues I had with the physically smaller MiniZed was that its jumpers were not easily visible or selectable. It seems that the designer of the Zybo had a bit more physical space to play with, and for a development board this is a good thing.


To sum it up, if you want to do Zynq development then the Zybo Z7 pretty much has it covered. There are other Zynq boards that might suit your particular use case better. Digilent have a range including the Arty Z7, ZedBoard, Pynq Z1 and Cora Z7, which vary in physical size, connectivity and processor power (single-core 7Z007S up to dual-core 7Z020). The Pynq is unusual in that it's intended to run a Python-based environment, but Pynq can be built for other Zynq boards. There is also a wide range of boards from Zedboard - namely the PicoZed, MicroZed, MiniZed, and the already mentioned ZedBoard, which I assume is a collaboration. There's also the Zynq UltraScale+ MPSoC based UltraZed and Ultra96, but they're in a different league.


Anything else in the box?

Nothing other than the bare board is supplied when you buy the Zybo Z7. To be honest, I like that. I really don't need yet another micro-USB cable to join the growing pile. The Zybo has some very useful standoffs that hold the board nicely off the bench, and I'd definitely pick these over a USB cable.


The Pcam 5C

This road test also included the separately purchased Pcam 5C camera. This is a nice little camera module which reminds me of the Raspberry Pi ones. It's 5 megapixel and comes complete with the ribbon cable to attach it to the board. It uses the same connector as the Pi camera, so the Zybo is physically and electrically compatible with the 8 megapixel Pi cameras. I haven't seen anyone using the Pi camera with a Zybo though, and I suspect it would involve a reasonable bit of work writing initialization code and perhaps some IP to get it working. The OmniVision OV5640 sensor supports auto-focus lenses, but the module itself is supplied with a fixed-focus (but adjustable, if that makes any sense) M12 lens. Interestingly, there is an unpopulated IC on the back, and the schematic refers to this being populated for the liquid lens version only. There is currently no liquid lens version available, although I spotted this page asking if there is any interest in it. The example SDK project used in this road test also has code for controlling the liquid lens, so prototypes definitely exist. I'm sure if you were so inclined it would be possible to convert the Pcam to a liquid lens.


Basically, the Pcam 5C is ideal for getting familiar with video processing, and could be swapped for something more suited to a particular application if that was what you needed. For this road test it's perfect and, best of all, should work out of the box with any sample code.


Switch it on

The Zybo can boot from one of two sources - the SD card or the Quad SPI onboard flash. (It can also wait to be programmed over JTAG.) Out of the box, the SPI flash has some demo code which has some simple button / LED stuff going on, but also outputs a changing pattern over HDMI. All you need to do is shift jumper JP5 to the middle position (QSPI), apply power and switch on.


My goals for this road test

As I've mentioned, I'm familiar with Xilinx programmable logic but still a relative newbie. My goal is to follow along with a workshop intended to get you started with image processing on the Zybo. This ZYBO Z7 & Pcam 5C HLS Video Workshop was originally a classroom workshop that Digilent decided to publish for anyone to follow. I decided to see if the Zybo was easy enough for someone like me to get to grips with. If I can follow the course and then feel comfortable enough to create my own simple image-based application then I'll call it a success.


The Zybo Z7 & Pcam 5C HLS video workshop

Measuring performance

The workshop starts off very gently, with a few short chapters on what an FPGA is, and then moves on to an explanation of how parallelism and pipelining speed up program execution. If you need to multiply a and b, it states, that would take a processor one instruction cycle. If you then want to add c, that will be another. There is a latency of two instruction cycles to calculate a*b+c. If this sort of calculation is performed on the 1920 x 1280 = 2,457,600 pixels of a video frame, then you're looking at twice this - 4,915,200 instruction cycles to process the whole frame. However, whilst adding the c for the first pixel, the massively parallel FPGA can be multiplying the a and b for the second one. There is an initiation interval of only one instruction cycle. If you can pipeline this then the whole frame takes only n*1+1 cycles rather than 2n. In this example that's 2,457,601 rather than 4,915,200 - quite a performance improvement!
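The arithmetic above can be checked with a few lines of plain C++. This is my own sketch; the frame size, latency and initiation interval are the values from the workshop text:

```cpp
#include <cstdint>

// Cycles for n operations when each result takes 'latency' cycles
// and nothing overlaps (a plain sequential processor).
uint64_t sequential_cycles(uint64_t n, uint64_t latency) {
    return n * latency;
}

// Cycles when the operations are pipelined: the first result appears
// after 'latency' cycles, then one more result every 'ii' cycles.
uint64_t pipelined_cycles(uint64_t n, uint64_t latency, uint64_t ii) {
    return latency + (n - 1) * ii;
}

// For the frame in the text (n = 1920 * 1280 = 2,457,600, latency = 2,
// ii = 1) these give 4,915,200 and 2,457,601 cycles respectively.
```

Calling `sequential_cycles(2457600, 2)` and `pipelined_cycles(2457600, 2, 1)` reproduces the two figures quoted above.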


{gallery} Installing Vivado


Installing Vivado: Watch that case-sensitive user ID!


Installing Vivado: Forgotten password page


Installing Vivado: The free WebPACK edition should be all that's needed


Installing Vivado: 21.25GB without support for high end devices


Installing Vivado: After putting high end devices back it's 40GB!

The Vivado toolchain

From here, the guide goes on to describe the process of using Vivado HLS to take C/C++ and turn it into an IP module containing either Verilog or VHDL. The first step is to install Vivado and Vivado HLS. As it turns out, this is quite a sizeable step! I'd previously had Vivado 2018.2 installed when getting to grips with the MiniZed - another excellent Zynq based board but with less of a video focus. However, Vivado 2019.1 had recently been released and I thought it seemed like a good idea to try this out.

Would you like a new HDD with that?

Vivado is without doubt an impressive piece of software and, as we've come to expect, there is a totally free version available for hobbyists and those who aren't using it in a professional capacity. It is, however, huge. I don't mean a few hundred MB. I don't mean a GB or so. Not even a DVD's worth. The full installer download comes in at a whopping 21GB, and warns that it will take up 40GB of disk space once installed! There is a web installer that only downloads what you need, so I plumped for this and unchecked support for the higher end devices I'm unlikely to ever use - the Zynq UltraScale and Kintex. This, the web installer told me, reduced it to a 7.48GB download and a final disk space usage of 21.25GB. 38 minutes of downloading and 23 minutes of installing later and I was up and running. This unchecking of high end device support turned out to be a false economy and it cost me another 2.5GB and 14 minutes to put it back. (See later for why. Suffice it to say, just leave the defaults alone.)


The installation also wasn't the smoothest of processes. The installer autopopulated my username as Fred27 when it should have been fred27. It took me a while fumbling over passwords to realise it was the case-sensitive username not the password that was tripping me up. Even the "forgot password" link gave me trouble by revealing nothing but a 404 error.


In the end Vivado 2019.1 needed 25.9GB to install. Bear in mind that I've also got 17.1GB of its predecessor, ISE 14.7, to support older devices and CPLDs like the CoolRunner. As you'll see later, I ended up putting Vivado 2018.2 back on. Another 2 hours and 24.21GB for that one! Does anyone else think that over 67GB of software is a little excessive to work with a few Xilinx devices?


It might be possible to trim this installation down a bit, but it's still going to be big. Whilst poking around I even found a 1.8GB folder that appeared to contain information on the latest Versal devices. They're only just starting to be available as samples, so almost nobody will have one, and they certainly aren't supported by my free WebPACK installation.


Yet more version trouble

As if two versions of Vivado weren't enough, most of the demo software on Digilent's Zybo Z7 page requires Vivado 2016.4 - despite the board being announced late in 2017. The first couple of examples that jumped out were the Zybo Z7 Pcam 5C Demo and the Zybo Z7 HDMI Demo. They sounded perfect for a quick start on the Z7, but when I tried them out they certainly didn't port neatly over to either of the later versions. Some of the IP upgraded fine, but other blocks had been customized and there seemed to be no simple upgrade path. Of course, a Vivado expert would probably laugh and say that upgrading is all part of the understanding, but don't forget I'm doing this from a beginner's perspective. I decided to skip these examples in favour of continuing with the video workshop.


An intro to Vivado HLS

After the first couple of chapters introducing the Zynq and Zybo, we're on to finding out a little about Vivado HLS, which is Xilinx's High Level Synthesis tool. I won't parrot out that content here, because to be honest the guide does a pretty good job. I will summarise it for you though. I came across only one stumbling block: the example project defaults to the Kintex devices that I'd deselected on my initial installation. Try as I might, I couldn't seem to change it to a part my installation supported, like the Zynq on the Zybo. With hindsight I now know that the device settings are under the solution folder within the project folder. (As a Visual Studio developer by day, I'm used to the project being in the solution!) There was much swearing, editing of tcl files and finding them mysteriously reset before I discovered this. This is the error message that hinted that removing Kintex support was about to waste a lot of my time:



First, you open up a built-in example that is installed with Vivado. It's based on a function that takes a floating point number and does some nifty splitting of the floating point number storage into its component parts (the mantissa, exponent and sign) and calculates a * 2^b. It's a fairly simple C function that you can imagine could synthesise well into hardware.
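To give a rough idea of the technique (this is my own plain-C++ sketch, not Xilinx's example code), the float's bit fields can be pulled apart with a little bit manipulation, and a * 2^b computed by bumping the exponent field directly:

```cpp
#include <cstdint>
#include <cstring>

// Sketch of splitting an IEEE-754 single into its component parts.
// Illustrative only; not the code from the built-in Vivado HLS example.
struct FloatParts {
    uint32_t sign;      // 1 bit
    uint32_t exponent;  // 8 bits, biased by 127
    uint32_t mantissa;  // 23 bits
};

FloatParts split_float(float a) {
    uint32_t bits;
    std::memcpy(&bits, &a, sizeof bits);   // safe type-punning
    return { bits >> 31, (bits >> 23) & 0xFFu, bits & 0x7FFFFFu };
}

// a * 2^b computed by adding b to the exponent field, so no floating
// point multiplier is needed. Valid for normal numbers only (no
// overflow, infinity or denormal handling).
float mul_pow2(float a, int b) {
    uint32_t bits;
    std::memcpy(&bits, &a, sizeof bits);
    bits += static_cast<uint32_t>(b) << 23;
    std::memcpy(&a, &bits, sizeof a);
    return a;
}
```

You can see why this kind of function maps well to hardware: it's nothing but bit slicing and an integer add.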


C Simulation

You're taken through the steps of creating not just the function, but a test bench that runs it with 16 random numbers and checks the result is correct. It's simple enough to run and debug the C code using the "C simulation" feature in Vivado HLS. For anyone who's used to C, nothing should seem strange yet.
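To give a feel for the shape of such a testbench, here's a plain C++ sketch that checks a stand-in version of the function against std::ldexp as a golden reference. The workshop's actual testbench will differ in detail:

```cpp
#include <cmath>
#include <cstdint>
#include <cstring>
#include <random>

// Stand-in for the function under test: a * 2^b done by adjusting the
// exponent field (illustrative only; normal numbers, no overflow).
float dut_mul_pow2(float a, int b) {
    uint32_t bits;
    std::memcpy(&bits, &a, sizeof bits);
    bits += static_cast<uint32_t>(b) << 23;
    std::memcpy(&a, &bits, sizeof a);
    return a;
}

// Testbench in the style the workshop describes: 16 random inputs,
// each compared against a trusted software reference.
int run_testbench() {
    std::mt19937 rng(42);   // fixed seed keeps simulation runs repeatable
    std::uniform_real_distribution<float> dist(1.0f, 100.0f);
    int failures = 0;
    for (int i = 0; i < 16; ++i) {
        float a = dist(rng);
        int b = static_cast<int>(rng() % 8);   // small shift avoids overflow
        if (dut_mul_pow2(a, b) != std::ldexp(a, b))
            ++failures;
    }
    return failures;   // 0 means every result matched the reference
}
```

The same pattern - random stimulus plus a golden C model - carries over directly to C simulation in Vivado HLS.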


C Synthesis

Next we get onto Vivado HLS's raison d'être - we synthesize this C function into HDL. Clicking the "C Synthesis" button on the toolbar performs the magic and outputs the function in both Verilog and VHDL. You also get a view of the resources used and the latency and interval values (both 2) as described in the intro.


This later version of Vivado produced output that differed a little in style from the images in the workshop, which isn't too surprising. The output was essentially the same though. What I did notice was that the Kintex version gave a latency and interval of 1 as opposed to the Zynq's 2. Obviously the Kintex is a more powerful device, but I wasn't expecting to see such a change in a vanilla function like this.


Here you can see an analysis of what a step in the synthesized output represents in the C code.


Everything up to this point was just getting the reader familiar with the environment, and I noticed that so far I hadn't needed to plug the board in. Everything so far was code or simulation. If you're thinking about getting a Zynq board and haven't jumped yet, you can follow along to this point just fine. However, now it's time to get serious about video processing...


Creating a pass-through video pipeline

Creating the starter project

The video pipeline part of the workshop starts with an existing Vivado (not Vivado HLS) project. From a folder in the downloaded file you are instructed to run create_project.tcl. TCL is a scripting language for the IDE, so rather than being an already set up project, it contains instructions for creating the project in your IDE and folder structure. Unfortunately, running this gave me the first hint that deciding to use the latest version wasn't the best idea. The project appeared to create OK, but there wasn't much there. I couldn't see any top-level block diagram, but there were no obvious errors. Checking the TCL Console tab showed why: system_2019.1.tcl was indeed missing.


No problem. I just copied the 2018.2.tcl file and changed every mention of 2018.2 inside it to 2019.1. Re-running create_project.tcl was much better this time. I now had a block design showing the IP modules that make up this skeleton project.
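If you hit the same missing-file error, the copy-and-replace can be done in one sed command. The first line below just fabricates a one-line stand-in file so the snippet runs anywhere; in the real project, system_2018.2.tcl already exists with the project's full contents:

```shell
# Stand-in for the real project script, so this snippet is self-contained.
printf 'set scripts_vivado_version 2018.2\n' > system_2018.2.tcl

# Recreate the missing version-specific script, swapping every
# occurrence of 2018.2 for 2019.1.
sed 's/2018\.2/2019.1/g' system_2018.2.tcl > system_2019.1.tcl

cat system_2019.1.tcl
```

After this, re-running create_project.tcl picks up the newly created file.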


Once again, I won't just parrot out what the guide tells you. You're far better getting it from the horse's mouth. It explains quite nicely how the different IP blocks generate clocks, process MIPI camera data into an AXI Stream of pixel data and how this pixel data is then formatted for HDMI output. This project has everything you need to stream from the camera to a monitor.


There is also a "control block" block design which contains "processing_system_7_0" - i.e. our microcontroller cores. This core is responsible for all the things that microcontrollers are good at. It does things like communicating with the OV5640 controller over I2C, initializing it and setting resolution, etc. The guide doesn't really go into the process of writing the code to run on the Zynq's ARM cores, but here's a preview of what's going on. Here you can see that I'm stepping through the code and the OV5640 has just been initialized. If you're following the guide then I recommend you do this. It's interesting to see that immediately after setting up the device, data is streamed through the FPGA fabric and the camera image appears on the HDMI output. It doesn't matter that the code is at a breakpoint. That programmable logic "runs" without any intervention from the processor. It's this sort of thing that really brings the power of the Zynq home. Note the comment just above the breakpoint too. There's code to support the mysterious liquid lens!


Getting back to the guide, after explaining what the IP is all about you're prompted to fire up the SDK. Here's where I hit another little bump. I was assured that the SDK would contain 3 projects:

pcam_vdma_hdmi: This is the C code that will be running on one of the ARM cores. The other core is unused for this project, as we're really focusing on the PL side of video processing.
pcam_vdma_hdmi_bsp: This is the board support package - a collection of drivers for the peripherals we're going to be using. For most microcontroller development, the peripherals you have are fixed and determined by the device manufacturer. Not with the Zynq!
system_wrapper_hw_platform_0: The hardware platform is the hardware you've created. When starting this project it just contains IP written by Xilinx, but your own HDL will go here. Inside you will find the bitstream that defines the PL side of the Zynq.


However, all I could see was the HW platform. The other two projects were missing. At this point, I must admit, I fiddled around looking for another incompatibility with Vivado 2019.1. I reinstalled 2018.2. I wasted a lot of time. I'm embarrassed to say that all it took to solve the problem in the end was to go File / Import / Existing Projects into Workspace and add in the missing projects. Obviously the project structure wasn't initialized properly, but I'd assumed a far more complex problem than there really was. Oh well. What's another 2 hours and 24GB between friends? If you're following along and getting "out of date hardware" warnings, I can also recommend regenerating the bitstream in Vivado and then exporting the hardware - File / Export / Export Hardware, then choose the hw_handoff folder and confirm overwrite.


At this point we have our starter project running, and whatever the camera sees is swiftly pipelined out to the HDMI TX port to be shown on a monitor.


Checking that the hardware synthesises

When we initially fire up the SDK we use pre-synthesised hardware, and we're presented with a warning about it being out of date. This doesn't matter for our first run, but just to make sure things are working as they should, let's go back to Vivado and resynthesize the bitstream. A few seemingly innocuous warning messages pop up, but they are a little more serious than they appear. A little while later (OK - a long time later) Vivado comes to a grinding halt and is unable to work with the out-of-date IP.


Feeling confident after my earlier Vivado-upgrading skills, I fumbled around for a while trying to rectify things. Maybe I'd have got there on my own. Maybe. Probably not. Luckily one of my fellow road testers got there before me and found the IP change that meant it wouldn't compile properly. I'll let him explain it properly, but it seems that removing an unused line buffer was what was needed. Thanks, Fabio!


From Vivado to Vivado HLS

Now that we have the project up and running it's time to create a module that does something with the video that we're now streaming through our FPGA fabric. We switch into Vivado HLS and start writing some C++ code that will be synthesized into hardware. The guide takes us through the process of creating new C++, header and testbench files, but to be honest they're already in the folder, so it's easiest to add existing files rather than create new ones.


First we run C simulation. This step is somewhat glossed over. Whilst the guide doesn't mention it, the test bitmap that is supplied to the function is processed and output by the testbench. Then we move on to synthesis. This is really what it's all about - synthesising a function that performs edge detection to run in hardware. The provided edge_detect.cpp is synthesized and we get a report telling us, amongst other things, that edge detection on our test file will take about 4.6 million clock cycles.
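For readers who haven't met it before, the per-pixel computation being synthesised looks something like this plain-C++ Sobel sketch. This is my own illustration of the technique; the workshop's edge_detect.cpp is written against the HLS video library and differs in detail:

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

// Sobel-style edge detection on a grayscale image stored row-major.
// Border pixels are left at zero; interior pixels get a clamped
// |Gx| + |Gy| gradient magnitude.
std::vector<uint8_t> sobel(const std::vector<uint8_t>& img, int w, int h) {
    std::vector<uint8_t> out(img.size(), 0);
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            // Neighbour lookup relative to the current pixel.
            auto p = [&](int dx, int dy) {
                return static_cast<int>(img[(y + dy) * w + (x + dx)]);
            };
            // Horizontal and vertical Sobel kernels.
            int gx = -p(-1,-1) - 2*p(-1,0) - p(-1,1)
                     + p( 1,-1) + 2*p( 1,0) + p( 1,1);
            int gy = -p(-1,-1) - 2*p(0,-1) - p(1,-1)
                     + p(-1, 1) + 2*p(0, 1) + p(1, 1);
            int mag = std::abs(gx) + std::abs(gy);   // cheap |G| estimate
            out[y * w + x] = static_cast<uint8_t>(mag > 255 ? 255 : mag);
        }
    }
    return out;
}
```

Each output pixel needs a 3x3 window of input pixels, which is exactly why HLS designs keep a couple of line buffers in block RAM when this runs against a pixel stream.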



So, can we do anything to improve this? Sure we can. There are various directives that can be applied to the function to influence how HLS generates the logic. Whilst the input and return types of our function are marked as a stream, we can suggest that an AXI-Stream is used. Running the synthesis again significantly reduces both the latency and interval values to about 0.9 million clock cycles - about a 5-fold improvement.
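The directives themselves are just pragmas in the C++ source. The snippet below shows the general shape; the function name, port names and trivial body are mine rather than the workshop's, though the `#pragma HLS INTERFACE axis` and `#pragma HLS PIPELINE` forms are standard Vivado HLS directives. Outside Vivado HLS, unknown pragmas are simply ignored, so the function still runs as ordinary C++ during simulation:

```cpp
#include <cstdint>

// Illustrative HLS-style function: the pragmas ask Vivado HLS to give
// the ports an AXI-Stream interface and to pipeline the loop with an
// initiation interval of 1. A normal compiler ignores these pragmas.
void halve_pixels(const uint8_t in[8], uint8_t out[8]) {
#pragma HLS INTERFACE axis port=in
#pragma HLS INTERFACE axis port=out
#pragma HLS PIPELINE II=1
    for (int i = 0; i < 8; ++i) {
        out[i] = static_cast<uint8_t>(in[i] / 2);  // trivial per-pixel op
    }
}
```

The appeal of this approach is that the same source serves both as the simulation model and as the input to synthesis; only the pragmas change how the hardware is generated.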


Exporting our image processing function

Now that we're happy with the performance of our function, we can export it as RTL. Our own little function becomes IP that we can use in Vivado in just the same way as the Xilinx IP. I went off-piste and tried exporting both the optimised and non-optimised versions of the IP. Before optimisation, the input and output streams are not in the AXI-Stream format needed to slot in between the DMA block and the video out block. After adding the required directives as instructed, it fits perfectly.


After synthesizing the bit stream and exporting the new hardware, we could once again fire up the SDK. This time the video that was streamed from the camera to the HDMI output appeared after being processed just as the fox test image was. Success!




Well, my intention for this road test was to run through the Zybo Z7 and Pcam 5C Video Workshop and to see how a programmable logic beginner got on. I understand that it's the hardware that's being road tested, but to be honest hardware is nothing without the tools and support that go with it. Having said that, the Zybo is a fairly advanced device and video processing isn't the easiest of subjects to tackle. It would be ridiculous for me to think that you could go from nothing to creating a Tesla Autopilot with a development board, a course worksheet and a few free weekends.



I have to say I'm very impressed with the hardware. The Zybo Z7 is well spec'd and easy to work with compared to other Zynq options. There are no ridiculously tiny switches like the MiniZed. It has Ethernet - which I personally find preferable to WiFi. There's HDMI in and out, plus a camera connector, which is obviously ideal for any video work. I suspect it will be my go-to FPGA / Zynq board over the Arty S7 (just FPGA) or MiniZed. You might be tempted by the Pynq-Z2 if you want to play with Python. Whilst the Zybo isn't supported by a pre-built Pynq SD card image, Adam Taylor's excellent MicroZed Chronicles has a guide to building one here so even that should be covered. MicroZed Chronicles is essential reading for anyone using Zynq.



Vivado is a bloaty beast. There, I've said it. It's wonderfully impressive but needs to lose a few pounds. I understand that it does a lot but I'm sure it could be a 10th of the size it is. The 1.8GB folder for unreleased Versal devices kind of proves that. The examples I could find seemed good, but were let down heavily by the problems porting them from one of many old versions to the current one. This made them pretty much unusable by the sort of newbies that would benefit the most from them. Synthesizing is also slow, but I can well believe that enough is going on to justify this. I think it is comparable with other manufacturers' offerings.


The video workshop I followed did help a lot. It was clear and easy to understand. The only bits missing (if I were to judge it standalone) were the bits where it said "discuss what you find". There is obviously some input from an instructor missing here. That's about the only place you can see it's a classroom course that has been kindly put online. Whilst I felt comfortable following it and I have the basics, I don't feel that I could now create my own video project from scratch. To be honest, it's unreasonable to expect to. The Zynq is not a simple device. For instance, a friend was complaining about how competitors are timed at his canoe club's races. I know that the Zybo/Pcam could be set up to recognise a canoe with a number board on it passing the finish line. Would I be able to do it myself? Not within any reasonable timescale.


In summary then, if you want to do any FPGA video processing, the Zybo is an excellent choice. If you want a general Zynq board, I'd say it's a pretty good choice too, but it's worth seeing if another board has a particular feature you need - like Arduino-style headers, Bluetooth or WiFi. Even if you only want to play with the FPGA side of things, that's not a problem - the ARM core can just be ignored until you want it. The Zybo is great. Thanks to E14 and Digilent for letting me road test it. I'm looking forward to continuing down the Zynq path with the Zybo Z7. It's a long road, but one well worth taking.

  • Have you ever tried to tie the Zybo in with MATLAB? It helps in resolving software development issues faced in the beastly Vivado suite. MATLAB provides a better learning paradigm, especially for Xilinx devices, whether it be logic simulation or High Level Synthesis, similar to Altera's (Intel) Quartus II.

    Key point to remember: XILINX always had the best silicon, but Altera possesses the better software.

  • I have to agree with you fred27. The software allows the hardware to function the way you need it to. Without the software, all you have is a door stopper.

  • Nice review of the Xilinx Z7 family member, your road-test met every aspect needed to understand the hardware closely. Thumbs Up.

  • Well done! I have only just realised you already published your review, that was quick!


    Nice walk through of the board and the Video Workshop. It certainly makes a great introduction for those who are thinking of starting their journey into the Zynq world, which can be a scary thought!


    Also, many thanks for the mention. For the sake of completeness (or pedantry perhaps), I will explain what the problem was with the workshop demo's original files. As many users of Vivado will have experienced, porting a project created with an older version of the tool to a newer one can be problematic. The biggest problem, which incidentally was also causing the error reported for the Video Workshop demo, happens when the design contains IPs which have undergone some major change (i.e. to the interface signals), which ends up breaking the synthesis of the project on hardware.


    In particular, the workshop demo uses the Digilent IP MIPI_CSI_2_RX to interface the board with the Pcam 5C, which in turn leverages the AXI4-Stream Data FIFO IP from Xilinx to manage the data. The version of the Xilinx IP in use at the time the demo was created exposed the 'axis_data_count' signal on its interface; this signal was removed from more recent versions of the IP, hence causing a 'not found' error when trying to synthesise the design using Vivado 2019.1.


    To solve the problem, I had to change the VHDL file in the MIPI_CSI_2_RX IP implementation, commenting out all the occurrences of the offending signal (axis_data_count).



  • Thanks for the feedback.


    I agree that the combination of ARM and FPGA is what makes the Zynq stand out, and that's why it interests me. I decided to focus on the video side of things for this road test and that's where the workshop I was following took me. The code running on the ARM core is baremetal code on one core only, and once the camera is configured it doesn't really do anything. The workshop doesn't mention it at all. I'm definitely going to continue with the Zynq and make use of PetaLinux, just not within the scope of this road test.


    I'm looking forward to reading your road test too. I hope we'll all tackle different aspects of the Zybo as there are so many!

  • Nice review.  The Vivado tool is quite extensive and powerful and can be a bit overwhelming even for the seasoned FPGA developer.  It was interesting when Xilinx decided to switch from Xilinx ISE to Vivado with no legacy support, so depending on the FPGA used, one could end up having to use either ISE or Vivado causing more tool management for companies.


    Your review seemed to only focus on the FPGA side of the Zynq processor or the pass-through capabilities, so it would have been nice to see some examples using the ARM Cortex-A9 side as well with something like PetaLinux.

  • Good road test report.

    The video response was impressive.


  • You're right - Vivado and the Zynq are huge and not particularly beginner friendly. It's not like jumping into a little bit of microcontroller programming. I'm definitely going to continue with it though. Zynq is a really powerful tool that I want to feel comfortable wielding, but it will take a while to do so without risk of injury!


    I'm sure I'll be posting more on the Zynq in the future. Retracing my steps through the Path to Programmable is probably a good next step. I'm really looking forward to seeing how the other road testers tackle things too. I'm sure some will be very different.

  • Interesting review. It all looks a bit overwhelming (Vivado software, getting to grips with video processing etc.). You did well to come out the other side unscathed after only a few weeks to get to grips with it all. I suspect that this is one of those RoadTests, where you could have done with a few more weeks to let it all sink in.