SDR & HDR 1080p/4k capable setup with Hyperion-NG for Media Center

  • Manipulating saturation can make some HDR pictures beautiful, but the average result is much worse. I started with that approach. The main problem is the messed-up gamma, and that is what the LUT HDR patch fixes.


    I haven't found a test screen (true HDR "in SDR") that breaks with the LUT correction, and I can provide plenty of samples where increased saturation causes totally messed-up, unrealistic results. Skin tone is the best example: you'd better like orange, because you get a lot of orange everywhere. Then if you start playing with hue to reduce the reddishness, yellow starts to look greenish; it really leads nowhere. The input scale that needs fixing is not linear.


    Anyway, here is the first LUT-HDR test release to play with: https://github.com/awawa-dev/HyperHDR/releases
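    To make the LUT idea concrete, here is a minimal sketch (my own illustration, not HyperHDR's actual code) of what applying a full 256³ RGB lookup table looks like. The toy table below only undoes a gamma shift; the real table comes from the LUT generator mentioned later in the thread.

```python
import numpy as np

def build_gamma_lut(gamma=1.5):
    # Toy 3D LUT indexed by (r, g, b); each entry is the corrected RGB triple.
    # The real HyperHDR table encodes the full HDR->SDR correction instead.
    ramp = np.clip((np.arange(256) / 255.0) ** (1.0 / gamma) * 255.0,
                   0, 255).astype(np.uint8)
    r, g, b = np.meshgrid(ramp, ramp, ramp, indexing="ij")
    return np.stack([r, g, b], axis=-1)  # shape (256, 256, 256, 3)

def apply_lut(frame, lut):
    # frame: (H, W, 3) uint8 -> corrected frame, one table gather per pixel
    return lut[frame[..., 0], frame[..., 1], frame[..., 2]]
```

    A full 256³ table at 3 bytes per entry is about 48 MB, which is why it is generated once and kept on disk rather than rebuilt per frame.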

  • As I said, your own screenshots look basically the same; do you see any meaningful difference in the LED colors?
    There is also the issue of performance. I have gaming in mind, and that needs at least 60 fps processing.
    Why bother buying better SPI LEDs when it will run at less than 30 fps? The MT optimization works too, but only if you have the cores, and not everyone does.
    It's great work nonetheless, I'm just looking for more answers.


    Is the decimation done first? I don't remember now, but is the decimation resolution-independent in Hyperion, i.e. do 4K and 1080p inputs produce the same output resolution? Why would someone use 800x600 as input when you can go as low as 192x108? Then we could maintain a high frame rate: first decimate as far as possible and only then do the processing. That should give flawless 60+ fps without frame skips.
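    The decimate-first idea can be sketched as plain pixel skipping (my own illustration; as far as I know Hyperion's sizeDecimation subsamples pixels in a similar way):

```python
import numpy as np

def decimate(frame, factor):
    # Size decimation as plain pixel skipping (no interpolation):
    # keep every factor-th pixel in both dimensions.
    return frame[::factor, ::factor]
```

    With factor 10, a 1920x1080 frame shrinks to 192x108 before any further processing touches it, so the expensive per-pixel work runs on ~1% of the original pixels.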


    And why did you say that the input scale is not linear? Do the grabbers cut off the colors non-linearly? If so, how? I read your PR about it, where you explain that the chrominance (color) is well preserved and that the gamma is the issue. Gamma is about shadows and highlights (some would simply say brightness, or the white/black relation), so why would it change colors, and so much, if the colors live in the chrominance?
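    To see why distorted luma shifts colors even when chroma is untouched, here is a toy numeric sketch (my own, using the standard BT.601 full-range equations): scaling Y' while keeping Cb/Cr fixed changes the reconstructed R:G:B ratios.

```python
def rgb_to_ycbcr(r, g, b):
    # BT.601 full-range forward conversion
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    # BT.601 full-range inverse conversion
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return r, g, b

# A skin-tone-ish pixel
y, cb, cr = rgb_to_ycbcr(220, 160, 120)
# Decode with a wrongly scaled luma (simulating a broken transfer curve):
# chroma is untouched, yet the R:G ratio grows, i.e. the pixel turns more orange.
r2, g2, b2 = ycbcr_to_rgb(0.8 * y, cb, cr)
```

    In this example the R:G ratio moves from 1.375 to roughly 1.48, which lines up with the "orange everywhere" skin-tone complaint earlier in the thread.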


    Still, one thing to remember: the fast HDR project works with true HDR input data, not broken SDR.

  • FPS is a side effect. On an RPi3, 20-30% of frames are broken/incomplete and we need more, but latency is more important:
    - before: 140 ms, and everything had to be computed on a single core, so almost everything else had to wait for resources.
    - after: 30 ms, and the work can run on a different core. Both timings depend on the JPEG frame size/content.
    High single-core usage makes Hyperion pause processing, which shows up as visible lag between the TV and the LEDs. We can't always blame the grabber or a too-slow CPU, especially when size decimation (which shrinks the frame) improves the situation for users, as is often reported, and it is possible to improve the code's performance.
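    The before/after numbers come down to moving decode work off the capture path. A minimal structural sketch (my own, in Python only to show the shape of it; the real code is C++, where worker threads actually use extra cores, unlike Python threads under the GIL):

```python
import threading
import queue

def start_decoder(process, maxsize=2):
    # Capture thread drops frames into a small bounded queue; a worker
    # decodes them so capture never blocks on the expensive decode step.
    # `process` stands in for the real JPEG decode + LED mapping.
    frames = queue.Queue(maxsize=maxsize)
    results = []

    def worker():
        while True:
            frame = frames.get()
            if frame is None:      # sentinel: shut the worker down
                break
            results.append(process(frame))

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return frames, results, t
```

    The small `maxsize` matters: when decode falls behind, the capture side drops frames instead of building up latency, which is exactly the TV-to-LED lag being discussed.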


    640x480 is, in my opinion, useless for MJPEG. The JPEG artifacts are too noticeable and affect the LEDs. We could process the JPEG better frame by frame, but again, that would also increase the load. And you could not even get 30 fps at 640x480 before; that is the lower limit of that Full HD grabber. If I remember correctly, a raw JPEG frame from the grabber at 800x600 is only about 14 kB (depending on the scene): that is how much detail you lose at lower resolutions.

  • Hmm, I think there is some misconception here: do we get the Full HD frame and resize it inside Hyperion, or is the grabber configured to output a lower resolution? What is the default MJPEG grabber quality, and does it look good? I don't know how the grabber does it internally. Also, is there any interpolation in the Hyperion decimator? I was assuming the grabber quality is quite good. Hyperion doesn't output more colors than there are LEDs, so in theory we could use very small images, but they need to be resized with some filtering.

  • The frame can be resized inside Hyperion when cropping or size decimation is set (and maybe black border detection, I didn't check). It's an additional cost, but the smaller resulting frame can compensate for it.


    But that's not the case here. Without the changes there isn't enough power on certain setups to process a raw frame, so the size can be set on the grabber and it's the grabber's job to resize it.
    The frame size is small and the quality is poor, at least at 640x480; 800x600 is better, but it's the minimum quality I can accept.
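    On the "resized with some filtering" point above, a box-filter decimation sketch (my own illustration, not Hyperion's actual decimator): averaging each block instead of skipping pixels also smooths out MJPEG noise before the LED colors are sampled.

```python
import numpy as np

def decimate_avg(frame, factor):
    # Box-filter decimation: average each factor x factor block.
    # Compared with pixel skipping, every source pixel contributes,
    # so single-pixel JPEG artifacts are averaged away.
    h = frame.shape[0] // factor * factor
    w = frame.shape[1] // factor * factor
    blocks = frame[:h, :w].reshape(h // factor, factor, w // factor, factor, -1)
    return blocks.mean(axis=(1, 3)).astype(frame.dtype)
```

    The cost is one pass over the full-resolution frame, which is the trade-off the posts above are weighing against CPU budget.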


    -------------------------------------------------------------------------------------


    EDIT:
    Since there were some reports of conflicts between certain systems and the natively built installers, the installers are now built using the standard Hyperion Docker images plus the libturbojpeg0 package. There are also a few new changes and code optimizations.
    For now I think I'm done; if some bugs appear in my fork I'll come back to the subject. I'm happy with the new features, and for some time now I have been using them with every movie (HDR, or SDR forced to BT.2020 by the codec).


    https://github.com/awawa-dev/HyperHDR/releases

  • As an experiment I ran it on a Pi Zero.
    In fact, I'm quite surprised by how well it performs.
    As you can see, you can easily get below an 80 ms lag.
    That's better timing than I had with Alpha 7 on an RPi3:
    RPi3, Alpha 7, 800x600, size decimation 1 => 140 ms lag
    Pi Zero, my fork, 800x600, size decimation 1 => 80 ms lag
    I wrote some optimizations mainly for multithreading, but there was one issue, critical for me, in the video buffer copy procedure for MJPEG that also affects single-threaded use, so I changed that too.
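    As a toy illustration of the kind of saving a cheaper buffer-copy procedure gives (my own example, not the fork's actual MJPEG code): slicing a bytes object copies the data, while a memoryview slice references the same buffer without copying.

```python
data = bytes(8 * 1024)           # stand-in for a received MJPEG frame buffer
view = memoryview(data)[16:]     # skip a header with zero-copy slicing
copy = data[16:]                 # this line allocates a full second buffer
```

    At 60 fps even a single avoidable full-frame copy per frame adds up, on both single- and multi-threaded paths.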


  • I started digging into it again and found this:
    https://gathering.tweakers.net/forum/list_messages/1986718
    Another surprise: it seems there is one more benefit of USB 3.0: YUY2 encoding, which provides much better quality and performance. Unfortunately it's disabled when the device is connected to USB 2.0, and MJPEG is enabled instead. That's why I didn't see it on the RPi3, but I tested it on Win10 and it was missing there too. I've connected the ezcap to Windows now and YUY2 is enabled! I could swear I didn't see it in Windows before the firmware upgrade from the ezcap page either.




    Performance and quality on Win10 with that light encoding are great: 1080p/60 fps with almost no frames lost. My next setup is an RPi4 :)
    And a LUT for fast YUY2/HDR translation is incoming... no more need to convert YUY2->RGB with live equations before Hyperion processes it :)
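    For reference, a minimal sketch of unpacking YUY2 with the "live equations" mentioned above (my own illustration using BT.601 full-range math; the fork replaces this per-pixel arithmetic with a table lookup):

```python
import numpy as np

def yuy2_to_rgb(buf, width, height):
    # YUY2 packs two pixels into 4 bytes: Y0 U Y1 V, with U/V shared
    # by the pixel pair (4:2:2 chroma subsampling).
    raw = np.frombuffer(buf, np.uint8).reshape(height, width // 2, 4)
    y = raw[:, :, 0::2].reshape(height, width).astype(np.float32)
    u = np.repeat(raw[:, :, 1], 2, axis=1).astype(np.float32)
    v = np.repeat(raw[:, :, 3], 2, axis=1).astype(np.float32)
    # BT.601 full-range YCbCr -> RGB
    r = np.clip(y + 1.402 * (v - 128), 0, 255)
    g = np.clip(y - 0.344136 * (u - 128) - 0.714136 * (v - 128), 0, 255)
    b = np.clip(y + 1.772 * (u - 128), 0, 255)
    return np.stack([r, g, b], axis=-1).astype(np.uint8)
```

    At 2 bytes per pixel there is no decompression step at all, which is where the quality and performance win over MJPEG comes from.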

  • Hi guys,
    I'm not new to DIY Ambilight. I used a Lightberry HD for about 2 years, then upgraded to 4K with their HDMI 5 box. Anyway, after some months the HDMI 5 box stopped working, and now I've updated my system thanks to you. I own a Raspberry Pi 4, and next week the ezcap 269 will finally be in my hands.
    After reading this post I'm a bit confused: what should I do to get correct HDR on my system? Is there a definitive guide for a real noob (I only bought the Raspberry for Ambilight)?
    Thanks a lot for your patience.

  • @Tyler983 Since you have an RPi4, you probably want to wait a little, as I'll finish the optimization for YUY2 encoding, which is enabled when the Ezcap 269 is connected to the RPi4's USB 3.0 port. You can of course try my latest build to test YUY2, but the performance for that encoding isn't great: I took the procedure from Hyperion.NG 2.0.0.7A...in fact it's now worse than the MJPEG decoding from my fork, but I can only test with a camera that supports that encoding. With direct LUT YUV->RGB translation I improved the performance by nearly 50% without tone mapping and 75% with tone mapping in first tests at 1280x720. We will see how it works when my new RPi4 arrives from China...for now it's only speculation.
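    A sketch of the direct YUV->RGB LUT idea (my own hypothetical version: the table below bakes plain BT.601 conversion into a lookup with a configurable number of levels per channel to keep the sketch light; the real table would use 256 levels with tone mapping folded in):

```python
import numpy as np

def build_yuv_to_rgb_lut(levels=256):
    # Precompute YCbCr -> RGB for every quantized (y, u, v) input once;
    # decoding then becomes a single table gather per pixel instead of
    # three multiply-adds per channel.
    idx = np.linspace(0, 255, levels, dtype=np.float32)
    y, u, v = np.meshgrid(idx, idx, idx, indexing="ij")
    r = np.clip(y + 1.402 * (v - 128), 0, 255)
    g = np.clip(y - 0.344136 * (u - 128) - 0.714136 * (v - 128), 0, 255)
    b = np.clip(y + 1.772 * (u - 128), 0, 255)
    return np.stack([r, g, b], axis=-1).astype(np.uint8)
```

    Because the table is built once, any extra per-pixel work (tone mapping, gamma correction) can be folded into the same entries at zero runtime cost, which is where the reported speedups come from.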


    The penalty of MJPEG is JPEG artifacts that can be visible on the LEDs, especially in dark scenes. Even when the movie is paused but the grabber is still running, there is a little noise. That's why YUY2 is better, hence the status I explained at the start. A higher resolution can improve the quality of an MJPEG stream.


    For now the old way is recommended. You need a working LED system connected to the RPi and an HDMI-to-USB signal converter, e.g. the Ezcap 269 (that part is not the subject of this topic):


    1. I recommend connecting the Ezcap 269 to a USB 2.0 port of the RPi3/RPi4. This will force MJPEG encoding.


    2. Install Hyperion from my fork.


    3. Generate the LUT table from the LUT generator page (link on the grabber configuration page) and upload it to the Hyperion configuration folder.
    Typically /home/pi/.hyperion/.....


    4. Restart the service or the RPi. Then enable HDR tone mapping in the grabber properties. For better performance try Border mode (mainly for LEDs), or full screen to preview the result.


    5. Check the result in the live feed (upper right corner) and the debug log (System->Log).

  • Thanks for the answer. For the moment I think I'll wait for the YUY2 optimization, I'm in no hurry... What I didn't understand, being an amateur, is how to install Hyperion from your fork? I actually installed HyperBian on my Pi4.
    And since I'm not using a PC connected to the ezcap, only HDMI video sources (PlayStation, pay-per-view, etc.), how do I generate a LUT with my settings?
    Thanks again

  • how do I install Hyperion from your fork? I actually installed HyperBian on my Pi4.


    Well, it can be the hard way if you don't use Linux... I don't have HyperBian but Raspbian, so the first thing you need to do is uninstall Hyperion.NG and install my fork (link in the signature; there are compiled packages in the releases).


    how do I generate a LUT with my settings?


    The default settings should be sufficient if you experience bleak colors (examples are on my fork's page); just click "save LUT".


    Thanks for the answer. For the moment I think I'll wait for the YUY2 optimization, I'm in no hurry...


    In fact, I think I've finished it. Performance for YUV/YUY2/YUYV/UYVY increased 3x, and MJPEG 10x. There is of course support for multithreading, but it is mostly crucial for MJPEG decoding on the RPi.


    So it can be used even without HDR support, just for better performance and much lower lag.


    As I'm offline next week, more testing is needed, and I'm still waiting for my RPi4, I will prepare the release later; or you can compile that version from the master branch sources (keep in mind that the LUT format has changed in that version and the LUT needs to be regenerated).

  • Thanks Awawa, I think I need an intensive Linux course, because your words are incomprehensible to me. My fault. Anyway, as I said, I'll wait for YUY2. Today I received the ezcap 269; I'm at the office, but when I get home I'll test it. Thanks a lot for your precious support.
