Hello
Below is a 256x256 image of the output from my modified P1300C MicroZed Vision Kit reference design. My modifications included removing the TPG and HDMI references and changing the ARM code to ROI the sensor output to 256x256. Basically I am left with the image processing pipeline from the Avnet onsemi camera IP to the VDMA IP, and I then send the frames out over Ethernet from the ARM program.

My design works at the 27 MHz Python PLL input clock frequency, driven by the 108 MHz clock divider in the reference design. When I increase the Python PLL frequency to 40 MHz by increasing the clock divider to 160 MHz, I see light vertical lines every 16 columns and a much brighter vertical line at around the sixth column (see the included image; I purposely snapped an image of a dark background so that the lines would be more accentuated). I am thinking these artifacts are a result of the timing relationship between the LVDS output clock and the 5 LVDS data channels. Since I am getting frames at all, it would appear that the sync channel is working.

I thought that the constraint on the Python LVDS output serial clock would need to change, but it is already set at 270 MHz, which is well above the 200 MHz I expect when I give the Python a PLL frequency of 40 MHz. I have also changed the manual tap value to see if controlling the IODELAY into the ISERDES block improves the timing. It didn't seem to help or hurt. Has anyone successfully increased the PLL clock frequency for this vision kit so that the camera can be operated at higher frame rates?
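To be concrete about the constraint I'm referring to, it's the clock constraint on the LVDS clock input in the design's XDC, which I believe looks roughly like the sketch below (the port and clock names here are placeholders, not necessarily the exact names used in the reference design):

    # Existing constraint, as I understand it: ~270 MHz serial clock (3.703 ns period)
    create_clock -period 3.703 -name cam_lvds_clk [get_ports cam_clk_p]

    # For comparison, the ~200 MHz I expect at the 40 MHz PLL input would only need:
    # create_clock -period 5.000 -name cam_lvds_clk [get_ports cam_clk_p]

Since 3.703 ns is already tighter than the 5 ns a 200 MHz serial clock would require, I left the constraint alone and experimented with the IODELAY taps instead.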