Posts by gigahrebic

    For some reason my post was deleted? I'll try to recap again.


    Okay, I will try to explain from the beginning what it is I'm trying to achieve.

    From one video signal, I'm trying to control two instances of LEDs, just as your picture described earlier.


    1) Instance 1 - Regular Hyperion Ambilight - Takes the input signal from my HDMI device and sends one copy to the TV so it can display, and another copy to the USB grabber so that the RPI3B can produce LED values for the LEDs attached to the back of the TV. This works perfectly and has been working fine for the past 4 years.


    2) Instance 2 - New Code - Takes the input signal like in 1), but instead of zoning around the perimeter of the signal, it's cropped to a specific zone depending on which media is loaded up. For example, in a fighting game, it's zoned around the health bar. I then use my code to determine its color based on image recognition. I tested this code with screenshot examples and a spare RPI3A + spare APA102s I had lying around, but have not tried it with a live video signal yet. In practice, I will use my Arduino UNO and 13 m of WS2811s to control this Instance.


    My immediate problem is this: I cannot access or forward the video stream to my second RPI to test my code on a live video signal. I know the first RPI and Hyperion see and use this:

    but I've no idea how to get my second RPI (or perhaps first) to see it without having to open Hyperion's web configuration.


    With your python program you can listen to the captured Image Stream, do your image evaluation and then trigger (color-) effects.

    Note: Make use of a Web-Socket communication in this case.


    I tried to make sense of this but was not able to in these past few weeks.

    If I understood correctly, it's a matter of:

    1) opening a socket communication on my python program and

    2) using the following code to finally get access to the video stream for the image evaluation

    Code
    {      "command":"ledcolors",      "subcommand":"imagestream-start"
    }
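For what it's worth, the key point with that command is that Hyperion already runs the JSON server, so a Python program should *connect* to it as a client rather than bind its own server; trying to bind to 19444/19445 is exactly what raises Errno 98. A minimal client sketch, assuming a hypothetical address 192.168.1.31 for the Hyperion RPI and the default JSON port 19444 (the exact shape of the reply messages may differ; print them first and adapt):

```python
import json
import socket

HYPERION = ("192.168.1.31", 19444)  # hypothetical address; 19444 is Hyperion's default JSON port

def encode_request(command, subcommand):
    """Serialize one Hyperion JSON request as a newline-terminated line."""
    return (json.dumps({"command": command, "subcommand": subcommand}) + "\n").encode()

def pop_messages(buffer):
    """Split a receive buffer into complete newline-delimited JSON messages."""
    messages = []
    while b"\n" in buffer:
        line, buffer = buffer.split(b"\n", 1)
        if line.strip():
            messages.append(json.loads(line))
    return messages, buffer

def stream_images():
    # Connect as a *client* -- Hyperion owns the port; we never bind to it.
    with socket.create_connection(HYPERION) as sock:
        sock.sendall(encode_request("ledcolors", "imagestream-start"))
        buffer = b""
        while True:
            buffer += sock.recv(4096)
            messages, buffer = pop_messages(buffer)
            for msg in messages:
                # "ledcolors-imagestream-update" replies carry a base64-encoded
                # image payload; decode it and run the zone analysis on it.
                print(msg.get("command"))
```

With something like this running on the second RPI, the first RPI needs no extra server code at all: Hyperion itself forwards the captured frames.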


    but I'm still stuck on even getting the socket communication established. I tried using the sample code above to set my original RPI as a server and my second RPI as a client so that with the second RPI I can access the video stream and use it to test the code before moving everything over to the WS2811s but have had no luck.


    I used ports 19444 and 19445 but I get "OSError: [Errno 98] Address already in use", which I assume is because Hyperion already has them designated. My client.py code for accessing the port doesn't return or display anything other than the debugging text I wrote inside the loops. So I used the default port 10050 provided in the example above, but it returned the same error as well.


    Quote

    Are you already running Hyperion off the same video input in parallel?

    Yes, I thought that was the goal here? To use the same video input in parallel for both Instances.


    PS - I originally had cv2.VideoCapture(0) and it returned the exact same error.


    Does that help describe the problem?

    Alright, it's been a few weeks and I have not had much luck. I spent a long time trying to wrap my head around the TCP socket approach but am unable to simply find the video stream source. I thought it'd be as simple as having my second RPI read the video stream-->execute code, but I can't even get that step working.


    This is all I managed to do, based on this article: https://medium.com/nerd-for-te…-with-python-6bc24e522f19


    This is what I did for my code:

    "server side" = original RPI running current Ambilight for TV Edge, Instance 1


    "client side" = second RPI to run Instance 2


    and this is the error message, after confirming a socket connection between the two RPIs:

    Code
    Connection from: ('192.168.1.31', 55162)
    [ WARN:0] global /tmp/pip-wheel-2c57qphc/opencv-python_86774b87799240fbaa4c11c089d08cc3/opencv/modules/videoio/src/cap_v4l.cpp (802) requestBuffers VIDEOIO(V4L2:/dev/video0): failed VIDIOC_REQBUFS: errno=16 (Device or resource busy)
    [ WARN:0] global /tmp/pip-wheel-2c57qphc/opencv-python_86774b87799240fbaa4c11c089d08cc3/opencv/modules/videoio/src/cap_v4l.cpp (802) requestBuffers VIDEOIO(V4L2:/dev/video0): failed VIDIOC_REQBUFS: errno=16 (Device or resource busy)
    [ WARN:0] global /tmp/pip-wheel-2c57qphc/opencv-python_86774b87799240fbaa4c11c089d08cc3/opencv/modules/videoio/src/cap_v4l.cpp (802) requestBuffers VIDEOIO(V4L2:/dev/video0): failed VIDIOC_REQBUFS: errno=16 (Device or resource busy)
    [ WARN:0] global /tmp/pip-wheel-2c57qphc/opencv-python_86774b87799240fbaa4c11c089d08cc3/opencv/modules/videoio/src/cap_v4l.cpp (802) requestBuffers VIDEOIO(V4L2:/dev/video0): failed VIDIOC_REQBUFS: errno=16 (Device or resource busy)
    [ WARN:0] global /tmp/pip-wheel-2c57qphc/opencv-python_86774b87799240fbaa4c11c089d08cc3/opencv/modules/videoio/src/cap_v4l.cpp (777) requestBuffers VIDEOIO(V4L2:/dev/video0): Insufficient buffer memory
    [ WARN:0] global /tmp/pip-wheel-2c57qphc/opencv-python_86774b87799240fbaa4c11c089d08cc3/opencv/modules/videoio/src/cap_v4l.cpp (890) open VIDEOIO(V4L2:/dev/video0): can't open camera by index


    Am I thinking about this too hard, or missing something very obvious? I thought I could just open a specific port like 19445 and get the video stream directly from there? Any advice?
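For what it's worth, the errno=16 lines in that log point at the capture device, not the socket: a V4L2 device like /dev/video0 normally allows only one reader, and Hyperion already holds it, so the server script's cv2.VideoCapture(0) fails. If you keep the article's server/client layout, the machine that runs Hyperion cannot open the grabber a second time; the server would have to get frames from Hyperion's API and forward them instead. A receiving-end sketch, under the assumption that the server sends length-prefixed JPEG frames (address and port are hypothetical):

```python
import socket
import struct

SERVER = ("192.168.1.31", 10050)  # hypothetical address/port of the sending RPI

def recv_exact(sock, n):
    """Read exactly n bytes (a single recv may return a short read)."""
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        data += chunk
    return data

def receive_frames(handle_jpeg):
    """Connect to the forwarding server and pass each JPEG payload to handle_jpeg."""
    with socket.create_connection(SERVER) as sock:
        while True:
            # Each frame is prefixed with its length as a big-endian uint32.
            (length,) = struct.unpack("!I", recv_exact(sock, 4))
            handle_jpeg(recv_exact(sock, length))
            # cv2.imdecode(np.frombuffer(payload, np.uint8), cv2.IMREAD_COLOR)
            # would turn a payload into an image for the zone analysis.
```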

    Hi Lord-Grey, thanks for replying.


    Quote

    From the above it was not clear if the effect is sent to the same LED device or a different one.

    In Hyperion-NG you can define multiple LED (i.e. Output) Instances. So you can decide to which instance you send an effect from your python program.

    Apologies, I'll explain the situation more clearly.

    I want TWO LED systems running from ONE input signal:

    • 1) 1 LED system will function like a normal Ambilight where it's reading zones around the border of the display signal and mirroring LEDs to extend that display from behind the TV. This is already fully functional from before.
    • 2) Another LED system will be used as background for my entire room. I want this to only read a specific zone. My code takes this specific zone and outputs colors based on the analysis. For example, health bar in a videogame. This is my brand new venture.

    The way my code for 2) works right now is that it outputs an RGB value based on whatever image it processed. However, I have only tested this with premade screenshots, not a live video feed. My biggest obstacles are as follows:

    • 2A) I need my code to access the same video feed that 1) is seeing. I have the older ambilight setup with the USB grabber + HDMI2AV converter. I will strongly consider upgrading to an HDMI capture card.
    • 2B) I need to learn how to "rezone" the areas for analysis. i.e. instead of looking at the border, I want to crop a portion of the video feed and only use that cropped feed.
    • 2C) I then need to assign colors to the 2) LED system based on the analysis of 2B)'s cropped video feed.
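To make 2B) and 2C) concrete: the cropping and the color decision can stay independent of how the frame arrives. A pure-Python sketch (the zone coordinates are made up; a frame here is just rows of (R, G, B) tuples):

```python
# hypothetical zone: x, y, width, height of the health bar within the frame
HEALTH_BAR = (10, 2, 40, 4)

def crop(frame, zone):
    """Return only the pixels inside the zone (2B: 'rezoning' the feed)."""
    x, y, w, h = zone
    return [row[x:x + w] for row in frame[y:y + h]]

def average_color(region):
    """Average the cropped pixels into one (R, G, B) value (2C)."""
    pixels = [p for row in region for p in row]
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) // n for i in range(3))
```

The same two functions work whether the frame comes from a grabber, a socket, or a decoded JSON-API image.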
    Quote

    One option to solve it could be leveraging Hyperion's JSON API.

    With your python program you can listen to the captured Image Stream, do your image evaluation and then trigger (color-) effects.

    Note: Make use of a Web-Socket communication in this case.

    I think your proposed solution takes care of all my big obstacles for now? I have some coding experience but have not worked with APIs before, so this'll be a great learning opportunity. I only have 2 final concerns, though they are hardware-related, so feel free to refer me to the hardware forum.
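As a sketch of that API usage (hedging: the request shapes below are from my reading of the Hyperion-NG JSON API, and the address, instance number, and origin string are assumptions): select the target LED instance, then set a color, both over one connection, since instance selection applies per connection.

```python
import json
import socket

HYPERION = ("192.168.1.31", 19444)  # hypothetical address; 19444 is the default JSON port

def build_payloads(rgb, instance=1, priority=50):
    """The two requests: select the target LED instance, then set a solid color."""
    return [
        {"command": "instance", "subcommand": "switchTo", "instance": instance},
        {"command": "color", "color": list(rgb), "priority": priority,
         "origin": "healthbar-script"},  # 'origin' just labels the source
    ]

def set_room_color(rgb):
    # Both requests go over ONE connection: instance selection is per-connection.
    with socket.create_connection(HYPERION) as sock:
        stream = sock.makefile("rw")
        for payload in build_payloads(rgb):
            stream.write(json.dumps(payload) + "\n")
            stream.flush()
            print(json.loads(stream.readline()))  # each reply reports "success"
```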

    • Will I be able to get away with one RPI3B handling all the processing, or should I use another RPI in parallel?
    • My LEDs for 2) are WS2811s, so I think I'll need to use an Arduino; my current 1) Ambilight uses APA102s, so I can get away with just one RPI. I believe this is because the APA102s have a dedicated clock signal input; could you confirm?

    Thank you

    Hello all,


    I first discovered this forum years ago when I built my first ever Ambilight in 2017. I just recently noticed there's a new version of Hyperion that runs in the browser now instead of the Java applet, so I've been adjusting a little to that change. Here is my issue, though:


    I'd like to run a secondary ambilight simultaneously along the first. This second ambilight (Ambilight X) will only look at a subsection of the screen capture and output according to a preset code I've determined (i.e. if the health bar averages red, glow red; if it averages green, glow green). I wrote my code in Python, but I think Hyperion is written in C++? Is it feasible for me to marry the two together, or will I have to either convert my Python code to C++ or write my own Ambilight software?


    Also, my original Ambilight 1.0 setup is with APA102s and an RPI. For my Ambilight X, I have WS2811s now, so I think I'll also need to use an Arduino due to the lack of a clock input on these 3-pin LEDs?


    TLDR:


    1) Ambilight on TV to extend display

    2) Ambilight on room to only color with specific "event points" on screen
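The "event point" rule in 2) (health bar averages red, glow red; averages green, glow green) can be a simple nearest-color lookup over the averaged zone color. A sketch; the preset table is hypothetical:

```python
# hypothetical preset "event point" colors for the room strip
PRESETS = {"red": (255, 0, 0), "green": (0, 255, 0), "yellow": (255, 255, 0)}

def nearest_preset(rgb):
    """Pick the preset whose color is closest (squared distance) to the average."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(PRESETS, key=lambda name: dist(PRESETS[name], rgb))
```

Keeping the decision in Python like this sidesteps the C++ question entirely: Hyperion keeps doing 1), and a small external script does 2).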

    I got 2 APA102 rolls but one was faulty. I am going to order 2 rolls of SK9822 and hope they work with Hyperion.


    Slightly off-topic, but do you (or anyone else) know if the lower PWM frequency makes an impact on the ambilight as a whole, significant or otherwise? I just recently got a 4K TV myself and I'm looking to retrofit my HDTV ambilight onto it, but I was planning on getting new LEDs. I previously used APA102s and am strongly considering the SK9822s.

    Hey Akriss, thanks for your feedback here as well! Would you happen to have a photo? I'm curious how the aluminum frame fares.
    Back when my ambilight was working, I did some testing with my TV mount collapsed (LEDs ~6 in/15 cm away) vs. my TV mount fully extended (LEDs 30 in/75 cm away) and didn't notice anything game-breaking.

    They should be. To clarify, they're both from one product I bought off Amazon. It was 1 (one) 5 m strip, where about 3.6 m was used for the TV and 1.4 m was left lying around. I did at one point drive the 3.6 m with reversed voltage, but all LEDs on that strip would still light up every now and then regardless.

    I have a wall-arm-mounted TV, so I think it'd be best for me to fix my LEDs to a foam ring sitting a few inches away from the wall. This way the intensity of the lights remains constant whenever I pull the TV outwards or angle it around the room (I'll post photos later). I currently have them temporarily fixed to the back of my TV but will move to a more permanent solution once I work out some more kinks.


    My question is: does anyone know the optimum spacing I should leave between the LEDs and the wall?


    And should I have them face the wall perpendicularly, or at an angle like 30, 45, or 60 degrees?

    I believe my power supply was the issue. Let me rephrase that: it was an issue. I have Hyperion running perfectly, like it did the first time, on my spare strip of LEDs. I tried the following:
    * Hyperion running on spare strip, without the extended wires, with the level shifter -- works perfectly
    * Hyperion running on spare strip, with the extended wires, with level shifter -- works perfectly
    * Hyperion running on spare strip, with the extended wires, without level shifter -- works perfectly
    * Hyperion running on previous strip, with the extended wires, without level shifter -- FAIL
    * Hyperion running on previous strip, with the extended wires, with level shifter -- FAIL
    * Hyperion running on spare strip, with the extended wires, with level shifter -- FAIL


    It's very possible that for the last case I soldered incorrectly; I will check again when I get home from work tonight. I'm at a loss for the previous strip, though. Do I just trim out LEDs until it works? Cut it into 4 equal pieces and check each length? Can I run some sort of continuity test?


    If anyone could offer any input, please let me know.

    You're right, they're different. After ordering I noticed I started seeing more of the 125s on this forum but I got the 08s because of this post: https://hyperion-project.org/t…ter-based-on-74hct08.512/


    And I seem to understand the logic, if someone can confirm or correct me. I believe the 74HCT08 uses its AND gates (with both inputs tied to the 3.3V data signal) as buffers that re-drive the signal at the chip's VCC (5V). I think the 74HCT125 does the same, but with tri-state buffers instead of AND gates.


    When I had my Ambilight working the first time, I wasn't even using a level shifter but I also did not have any extended wires whatsoever. After redoing everything, I introduced a level shifter into the mix thinking I'd improve the voltage level since lengthening the wires could affect voltages ever so slightly (I think, someone please correct me if otherwise).

    Hello all,


    I've been following this forum for a while and have been working on this Ambilight project for several weeks; I am finally resorting to creating my own topic here. I'm using an RPI 3B+ and LibreELEC 8.2.4 with about 100 APA102 LEDs. I also have some 74HCT08 chips for a level shifter.


    Please skip this background if you're only interested in helping my immediate problem:


    Background:
    [INDENT]I spent my first weekend pulling all-nighters and finally found it working randomly when I came home from a day out.
    The lights worked perfectly for a week. Then, while sending a new config file when trying to color-correct for my blue painted wall, the software stopped working entirely. I believe all the available memory was somehow used up, and Hyperion said it could not even store screenshots anymore.


    I decided to do a fresh install of everything since I needed to permanently fix everything anyway, and in doing so I extended the power supply by several feet (2 m) and added about a foot (1 m) to the data/clock wires. Then I encountered this strange behavior: the first LED would rainbow-swirl, sometimes skipping the blue color, the second LED would flash, and none of the other LEDs would light. Thinking the flashing LED was bad, I snipped it out and soldered the 1st to the 3rd, but then the 3rd LED also flashed haphazardly. I then decided to cut the 1st. I called a friend over to run some tests and ended up shorting the RPI 3B somehow when we were testing with the unused strip of LEDs.


    Now I have resoldered everything to this unused strip of LEDs, with a new RPI 3B+ (which randomly keeps dropping WiFi, forcing me to manually reconnect via the KODI menu), and I cannot even get Hyperion to run.
    ***
    In between this and the last sentence, I went to bed, woke up, and formally soldered my level shifter onto a perf board with some jumper-wire connectors. I currently have Hyperion running smoothly, allegedly.
    [/INDENT]
    /End background.


    When I kill Hyperion, I can take screenshots and they are accurate, and when I try to control a color, I don't receive any errors. Currently all of my LEDs are lit, but only when I jostle the wire connections a bit. I am not able to change any colors.
    What's truly frustrating to me is that I keep bouncing between thinking it's a hardware issue vs. a software issue; I'm unable to narrow it down. Scratch that, what's truly infuriating is that it was working before!
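One way to break the hardware-vs-software loop is to drive the strip directly with Hyperion stopped, e.g. via the spidev module: if a script like the sketch below lights the whole strip red, the wiring and strip are fine and the problem is on the software side. This assumes the strip sits on SPI0 (/dev/spidev0.0) and roughly 100 LEDs; adjust to your setup.

```python
NUM_LEDS = 100  # adjust to your strip

def apa102_frame(rgb, brightness=31):
    """One APA102 LED frame: header with 5-bit brightness, then blue, green, red."""
    r, g, b = rgb
    return [0xE0 | brightness, b, g, r]

def show(color):
    # imported here so the frame helper stays testable without SPI hardware
    import spidev
    spi = spidev.SpiDev()
    spi.open(0, 0)                            # bus 0, device 0 -> /dev/spidev0.0
    spi.max_speed_hz = 4000000
    data = [0x00] * 4                         # start frame
    data += apa102_frame(color) * NUM_LEDS    # one frame per LED
    data += [0xFF] * (NUM_LEDS // 16 + 1)     # end frame / extra clock pulses
    spi.xfer2(data)
    spi.close()

# show((255, 0, 0))  # uncomment to test: whole strip red if the hardware is good
```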


    Can someone provide any input? I'll attach my config and SSH output below:






    The screenshot came out properly; I tried to change the colors to RED, then BLUE, and then GREEN, but the LEDs remain white. Trying to initiate a rainbow swirl from my phone produces no results either.