Adjust Camera in Hyperion - Color - Help required

  • OK, I was able to "straighten" the images all right, BUT the performance is really subpar ^^


    1) The per-frame "remap" operation (supposedly the fastest transform method) adds significant lag, ~100-300 ms per 720p frame on a Pi 3B+. With all 4 cores loaded to 100%, I can only reach 5-10 fps tops. And this is with an OpenCV build optimized for the Pi CPU.

    2) "remap" operation necessarily has to interpolate pixels, which decreases the overall image quality, especially towards the edges, which we care most about.

    3) A fisheye camera with a >180-degree angle apparently results in "cropping" of the image. By playing around with the "balance" parameter, I'm able to "enlarge" the FoV, but this causes more interpolation towards the edges.


    Overall, the results are maybe not too bad, but the performance leaves this solution impractical IMO. Here are a couple of representative images. I'll post next about another potential solution.

  • Just another update: I'm working on a new calibration procedure that will hopefully avoid the need to remap the video at capture time. It will auto-generate the LED location list and even upload it to Hyperion. So this should provide a good way for all camera users to auto-set up their LED layout, and avoid the load on the system at capture time.

  • Another update - my auto-calibration script is nearing completion. Early results are encouraging, but we'll see once I receive the real strip. There are some small imperfections and I'm still polishing it, but this should hopefully provide a "good enough" way to do warped-trapezoid calibration for any camera, as a simple command-line tool with minimal to no input and no runtime overhead.

  • Alright, I was finally able to generate the LED coordinates using the full distortion+perspective transform for the camera! This approach is much more accurate than what you saw above, where I tried to measure the approximate locations from screen capture of individual LED areas drawn on the screen.


    So the new calibration procedure will consist of:

    1) a one-time run of the calibrate_camera.py script, which displays a checkerboard pattern on the screen. The user just needs to move the camera around, pointing at the pattern from various angles. The script then estimates the camera matrix and distortion coefficients and stores the results in a JSON file for future reuse.

    2) a run of the calibrate_screen.py script, which takes an image of a blank screen, then applies forward and reverse distortion and perspective transforms to determine the screen crop area and the distorted coordinates of the LED areas, and finally uploads them to Hyperion.


    Script (1) takes several minutes, but only needs to run once for a given camera+lens combination.

    Script (2) needs to be re-run every time the camera moves, but it only takes a second, so not a big deal.

    Both scripts would need to be run in graphical mode, so you might have to enable it on the Pi.

    But other than that, this doesn't require any additional processing power from the Pi, since the generated LED list is used by Hyperion directly without any extra video processing.


    The early results look quite good!

    Going to massage the scripts for general use, and test on a real strip when it arrives next week :-)

  • dinvlad From the next Hyperion release onwards, the HwLedCount of an LED device will be authoritative. Therefore, please take care not to configure more LEDs in the layout than are configured for the given LED controller.

    You can test against the current Master Branch…


    Edit: Since you read the config anyway, the HwLedCount is part of it… ;)

  • Lord-Grey thanks, I forgot about it! So should I update `hardwareLedCount` simultaneously with setting the LED layout?


    EDIT: Just did that :) I'm updating both that and max LED count - not sure about the difference.

  • But other than that, this doesn't require any additional processing power from the Pi, since the generated LED list is used by Hyperion directly without any extra video processing.

    Very smart!


    You could avoid graphical mode by asking the user to switch to the right image (like the web UI's color calibration does in Hyperion).

    For a lot of people, showing an image via a Chromecast, Kodi, etc. would be easier.


    Looking forward to the real results!

  • So should I update `hardwareLedCount` simultaneously with setting the LED layout?

    Please do not!

    There were multiple problems with the previous approach, where the number of LEDs from the layout was used even when it exceeded the physical number of LEDs. Therefore, the HwLedCount is now authoritative again. In the coming UI, the user is forced to set the HwLedCount, and the UI ensures that the layout will not exceed that number (and yes, the API should check that too and return an error going forward).

    Until then, please ask the user to make the HwLedCount match the physical number of LEDs. The layout you create can then be <= HwLedCount.
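    On the script side, a minimal guard along those lines could look like this (the function name and shape are mine, not part of the Hyperion API; `hw_led_count` would be read back from the existing config):

```python
def check_layout(layout, hw_led_count):
    """Refuse to upload a layout that exceeds the physical LED count.

    `layout` is the list of per-LED areas about to be sent to Hyperion;
    `hw_led_count` is the HwLedCount read from the existing config.
    """
    if len(layout) > hw_led_count:
        raise ValueError(
            f"layout defines {len(layout)} LEDs, but the device is "
            f"configured for only {hw_led_count}")
    return layout
```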

    I ask for your support here, so that we do not get inconsistent configs into the system via a "backdoor".

  • Lord-Grey ah, I see, understood. I'll check for that then!


    David Hequet that's an interesting idea! If this proves useful to folks, we can perhaps think about developing this into a plugin for Hyperion - it only needs the OpenCV library, which is easily usable from C++ (I can draft something later if there's enough interest).


    Another slight improvement we could consider is adding support for non-rectangular LED areas. This would provide more accurate color matching than the current "rectangles aligned with the x-y axes only" model. Let me know what you think :)

  • I've received the strip (WS2815), but believe it or not, I just can't get it working from the Pi for whatever strange reason (electronics, huh). It even works with a stock SP107E controller, but won't light up when I connect it to GPIO 18 (with or without a level shifter..). Not even with the Adafruit NeoPixel library. So I'll have to ponder that until I can finally test the whole setup :/


    EDIT: Solved it! As explained here RE: Anyone having success with WS2815?. Finally, I'll be testing the new setup tomorrow!

  • Hello Folks!

    dinvlad I tried your script, the exact process I followed is:

    - install Raspbian with desktop (https://downloads.raspberrypi.…-raspios-buster-armhf.zip)

    - install hyperion ng

    - turned off platform capture in Hyperion (maybe this should be in the README) to be able to use the 4K TV for your script (as far as I know, Hyperion platform capture and 4K output don't work together)

    - installed a dependency manually because of this error (I can't remember which one it was):

    Python
    ~/HyperVision $ python3 calibrate_camera.py
    Traceback (most recent call last):
      File "calibrate_camera.py", line 8, in <module>
        import cv2
      File "/home/pi/.local/lib/python3.7/site-packages/cv2/__init__.py", line 5, in <module>
        from .cv2 import *
    ImportError: libcblas.so.3: cannot open shared object file: No such file or directory

    - then I was able to run the camera calibration

    - then tried the screen calibration, and I get this error:

    Code
    ~/HyperVision $ python3 calibrate_screen.py 
    Traceback (most recent call last):
      File "calibrate_screen.py", line 379, in <module>
        main()
      File "calibrate_screen.py", line 357, in main
        crop, new_k_inv, p_t = get_screen_transforms(blank, cam, args.cam_alpha)
      File "calibrate_screen.py", line 196, in get_screen_transforms
        cam_alpha,
    SystemError: new style getargs format but argument is not a tuple

    I'm not a Python magician; I tried a few things and failed, so could you help me out?

    Python version: 3.7.3

    OpenCV version: 4.5.1


    The last answer here may be related to this issue, but I'm just guessing: https://stackoverflow.com/ques…tuple-when-using/43656642

  • stewe93 thanks for trying it out! This looks like an OpenCV library problem. Did you install it using `pip3 install opencv-python` or some other method? Tbh mine was cross-compiled on another host, so even pip3 install might not be sufficient.


    Digging deeper into the error, looks like it complains about BLAS library - could you try the following?

    Code
    sudo apt-get install libblas3

    Btw, I just pushed a fix for the final calibration script - you might have to re-pull.


    For the second error, it looks like the problem is in the `cv2.getOptimalNewCameraMatrix()` function, and you're using the pinhole camera model (is that the case?). Tbh I haven't tested this model yet - was half-expecting it to fail ;) I'll try to take a look with a C920 this weekend.


    Btw, could you post your `params.json` file generated by the first script (maybe via a gist)? That would help with troubleshooting.


    I'll also update README with your and other clarifications..

    (In other news, I've been too busy to mount my strip to the TV - will probably get to it this weekend)

  • Hi - for those running a Logitech C270 webcam, could you please confirm:


    1. Whether the C270 is capable of 60 FPS

    2. Whether the content in the video demos is 30 or 60 Hz?


    When I tried a C270, I could not get rid of the rolling lines. Colours were good, but any tweaking would be catastrophic for the output.

  • dinvlad I restarted the whole process to properly document every step of it, in case anyone else is interested. I'm using a Raspberry Pi 4:

    • Installed Raspberry Pi OS with desktop (https://downloads.raspberrypi.…-raspios-buster-armhf.zip)
    • Enabled SSH and configured Wifi by wpa_supplicant file on boot partition
    • After first boot, applied updates with sudo apt update && sudo apt upgrade
    • Installed Hyperion NG with the deb package (wget https://github.com/hyperion-project/hyperion.ng/releases/download/2.0.0-alpha.9/Hyperion-2.0.0-alpha.9-Linux-armv7l.deb && sudo apt install ./Hyperion-2.0.0-alpha.9-Linux-armv7l.deb)
    • After installation, the 4K output to the TV is gone, but it's "normal" in this case; you should turn off Platform Capture under Capturing Hardware in the Hyperion web interface (http://YourIp:8090/#conf_grabber)
    • sudo reboot via ssh
    • Checked out Your repo (git clone https://github.com/dinvlad/HyperVision.git)
    • Your suggested package was already installed, but I found the one that fixed it: sudo apt install libatlas-base-dev
    • Installed the requirements "by hand" (pip3 install requests opencv-python screeninfo)
    • Calibrate camera runs without problem after all of the above
    • params.json
    • python3 calibrate_screen.py in my case needed some fixes; I figured out the following:
    • At this point I'm stuck with the following error:
    • A picture of the environment



    Python version: 3.7.3

    OpenCV version: 4.5.1



    Could you please help me out with this? I also tried the fisheye calibration but that failed as well for me.

  • @stewe93 interesting, thanks a lot for sharing the details! I'll take a look on the weekend. Sounds like maybe something is up with the camera resolution (and it looks like I have to swap the dimensions for pinhole). Could you confirm your cam supports 1280x720 (they all should, but just in case)?


    Also, for the 2nd error, did you make sure to turn off all the lights (as the README says)? :) That would definitely explain it, as the script needs to find the 4 corners of the screen by flashing a blank white full-screen background; if there's external lighting, this process fails.

  • 1. Whether the C270 is capable of 60 FPS

    2. Confirm the content in the video demos is either 30 or 60 Hz?

    1. I'm using the C270 at a resolution of 320x240 due to the limited power of the Raspberry Pi Zero. The pixel format in my case is 'YUYV'. Using --list-formats-ext in the terminal shows that this pixel format and resolution go up to 30 fps. Scrolling through the other lines, I don't see any fps above 30, so I don't think the C270 is capable of 60 fps.

    2. I tested using the Philips LightWave video with Stats for Nerds enabled on YouTube, and it shows the video runs at 24 fps. Then I used another video at 60 fps, and in both cases I don't have problems with rolling lines.
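    For anyone wanting to run the same check from (1) on their own camera, the full command is (assuming the webcam is /dev/video0):

```shell
# List every pixel format / resolution / frame-interval combination the
# camera advertises; 60 fps support would show up as "(60.000 fps)".
v4l2-ctl --device /dev/video0 --list-formats-ext
```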
