Posts by Michael Rochelle

    davieboy Nah, wasn't a joke. My LEDs have come. I've re-created my light bars on the corners using WLED. I'm having an issue with WLED which may be related to the problem I have with my Ubiquiti wireless access point. What you see above is my fork of Hyperion. I paused development for a bit because I wanted to get a lot of the latest features into my fork, and I have several things to fix as well.


    I've been really busy with work lately, but I plan on getting back to finishing up the Audio Grabber feature.

    Kratos84


    I don't quite remember the pins and I have my RPi4B in an enclosure.

    Here are my hardware settings in Hyperion:


    The 5V+ and ground of my LEDs are wired directly to the power supply.

    I believe the Raspberry Pi's GPIO isn't 5V tolerant, so I used a logic level shifter between the RPi's I/O and the LEDs' data and clock pins.

    One side of the logic level shifter is connected to the RPi's 3.3V and GND and to the SPI clock and data pins. The other side is connected to the 5V power supply's + and GND and to the respective LED clock and data pins.

    If you're not using a logic level shifter, that could be your problem.

    I finally got the controllers for my corner lights.

    I used Richelieu Corner Profile tracks with Opal lenses.

    I used HD107S 4-pin LEDs. I've tried other LEDs, but nothing was as bright, as rich in color, or as responsive as these. They're the same ones I used behind my TV.

    For my controllers I used the StanleyProjects LedBoxV2, which is an awesome controller. It comes with a 3-pin connection, but the board is also stubbed out for 4-pin LEDs. I soldered on a 4-pin connection and it came out awesome. I installed the latest version of WLED on it and followed the schematic to configure it for the 4-pin strip.

    HwXsPkA.jpg


    wP6Fc2k.jpg


    With Hyperion:

    External Content www.youtube.com

    External Content www.youtube.com

    Here's my progress so far. This is taking a little time because I do it in my off time. I also had to study the source code a bit to learn how it's structured before coding. It works great on Ubuntu on a PC and on the Raspberry Pi.


    I need to make the audio capture automatically disable itself if it doesn't find the ALSA lib. For Windows, I need to port it to use WASAPI instead of DirectSound; DirectSound was the very first implementation. I don't have a Mac, so I haven't been able to develop the Mac version. I may create a Hackintosh on my laptop to do it.


    Here is the visualizer running on 2 instances: one connected to a pin on my RPi4, and the one on the floor is an ESPixelStick.


    External Content www.youtube.com


    Here's what the configuration looks like right now. I want to make the visualizer pluggable. It's designed to be pluggable already, but for time's sake I hard-coded the VU meter.
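    The hard-coded VU meter boils down to this idea. A minimal sketch, not the actual implementation; function names and the green-bar/red-tip color scheme are my own illustration:

    ```python
    import math

    def vu_level(samples, num_leds):
        """Map a chunk of PCM samples (floats in [-1, 1]) to a number of lit LEDs.

        RMS of the chunk gives a rough loudness; scale it linearly onto the strip.
        """
        if not samples:
            return 0
        rms = math.sqrt(sum(s * s for s in samples) / len(samples))
        return min(num_leds, int(rms * num_leds))

    def vu_colors(samples, num_leds):
        """Return an (r, g, b) tuple per LED: green bar with a red tip, rest off."""
        lit = vu_level(samples, num_leds)
        colors = []
        for i in range(num_leds):
            if i < lit:
                # Top 20% of the strip turns red, like a classic VU meter peak zone.
                colors.append((255, 0, 0) if i >= num_leds * 0.8 else (0, 255, 0))
            else:
                colors.append((0, 0, 0))
        return colors
    ```

    Running this per audio chunk and pushing the resulting color list to the strip gives the bouncing-bar effect in the videos.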


    Here is the audio hardware enumerated:


    Update on the audio capture logic. I've successfully created the Linux version using ALSA. I'm going to update the build scripts to make it optional. I've run into some issues with Windows DirectSound and should probably recode the Windows side to use WASAPI; I'll do that in the future. I'm going to compile it on the RPi soon and test it there.


    I used Ubuntu in VMware to write the Linux side, and did the DirectSound part on my PC.

    Why did you move from the RPi Zero to the RPi4? I would like to use a Zero if it's feasible.


    Originally I didn't have any problems with the Zero, but I'm implementing many things and didn't want to run into performance issues. I'm now doing 4K HDR via HDMI, I'll have 3 instances of LEDs (2x ESPixelSticks plus the one behind my TV), I'm creating logic to process audio and generate visualizations, and I'm compiling Hyperion on the RPi itself.


    I had an RPi4 laying around and decided to use it for the processing power.

    Sorry, my fault! I was talking about the "visualization addons" which are implemented in Kodi. They already visualize audio, so it should be easy to "grab" that data, right?


    Ahh ok. I haven't used Kodi myself, but I believe there is a way to integrate Kodi where Kodi can control the LEDs.


    With my current configuration, I have a little USB sound dongle connected to the pre-amp left and right outputs of my theater receiver. This allows me to run this visualizer with any audio source (even via eARC from my TV). The proof of concept above is my TV running its built-in Spotify app. Theoretically it should also be able to capture PCM audio via HDMI.

    Wouldn't it be possible to use a "visualisation addon" for analysis and use that data as the "audio source"?


    I'm not sure what you mean by "visualization addon". Currently the application has APIs that you can hit to control the LEDs. Technically, you could create a visualization application and then control the LEDs via this API.
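    To make the API route concrete: Hyperion exposes a JSON server (TCP port 19444 by default), so an external program can push a color with a small message like the one below. This is a sketch; check the field names against the JSON API docs for your Hyperion version before relying on them:

    ```python
    import json
    import socket

    def make_color_command(rgb, priority=50, duration_ms=0):
        """Build a Hyperion JSON-API 'color' command as a newline-terminated string."""
        cmd = {"command": "color", "color": list(rgb), "priority": priority}
        if duration_ms:
            cmd["duration"] = duration_ms
        return json.dumps(cmd) + "\n"

    def send_command(message, host="127.0.0.1", port=19444):
        """Push the command to Hyperion's JSON server (default TCP port 19444)."""
        with socket.create_connection((host, port), timeout=2) as sock:
            sock.sendall(message.encode("utf-8"))

    # Example: a message that would set the whole strip red at priority 50.
    msg = make_color_command((255, 0, 0))
    ```

    A visualizer process could compute colors from audio and call `send_command` each frame, though the latency of going through the network API is something to measure.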


    What I'm doing is creating an Audio Capture feature, like the existing Screen Capture, Camera Capture, and Video Capture features.


    I've currently completed the Windows implementation. Now I'm working on the Linux implementation using the ALSA libs. I don't have a Mac, so I probably won't be able to do the Mac implementation. Maybe I can use a Hackintosh on my laptop to boot macOS, but I'm not sure the audio device will work.


    I'm also stubbing it out so we can use Python to process the audio data and emit the LED configuration. In the future this will allow custom audio visualization plugins.
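    The plugin idea could look something like this: the host hands each audio chunk to a named plugin and gets LED colors back. A hypothetical interface sketch; none of these names exist in the actual code:

    ```python
    def average_visualizer(samples, num_leds):
        """A trivial plugin: overall level -> uniform brightness on all LEDs."""
        level = sum(abs(s) for s in samples) / len(samples) if samples else 0.0
        v = min(255, int(level * 255))
        return [(v, v, v)] * num_leds

    # Registry the host would consult; real plugins would be discovered/loaded.
    PLUGINS = {"average": average_visualizer}

    def run_visualizer(name, samples, num_leds):
        """Dispatch an audio chunk to the selected plugin and get LED colors back."""
        return PLUGINS[name](samples, num_leds)
    ```

    The contract is just "PCM chunk in, one (r, g, b) per LED out", which keeps plugins independent of the capture backend (ALSA, DirectSound, WASAPI).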

    I have a similar issue with my setup. My HDMI grabber advertises up to 4K. My splitter can downscale, has similar EDID DIP switches, and advertises 4K HDR.


    In the fine print, my splitter says it cannot downscale 4K60 4:2:2 content.


    My next investigation is to see if my Xbox is outputting 4:2:2 when HDR10 is enabled. If that's the case, then it's sending video to my capture card that the card doesn't support.


    In order to view 4K content in Netflix on my Xbox, it has to link at 4K HDR10. At plain 4K it only plays the content in HD.


    From some of my research, the Xbox should be able to output 4K60 HDR10 at 4:2:0, which my splitter says it supports. I'll probably check the configuration later.
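    A back-of-envelope calculation shows why the chroma format matters: each subsampling mode carries a different average number of samples per pixel (4:4:4 → 3, 4:2:2 → 2, 4:2:0 → 1.5), so at 4K60 10-bit the raw payload differs by several Gbit/s. This ignores blanking intervals and link encoding overhead, so real HDMI bandwidth is higher:

    ```python
    def video_payload_gbps(width, height, fps, bit_depth, samples_per_pixel):
        """Rough active-video payload in Gbit/s, ignoring blanking and encoding overhead."""
        return width * height * fps * bit_depth * samples_per_pixel / 1e9

    # Average luma+chroma samples per pixel for each subsampling mode.
    SAMPLES = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

    for fmt, spp in SAMPLES.items():
        gbps = video_payload_gbps(3840, 2160, 60, 10, spp)
        print(f"4K60 10-bit {fmt}: ~{gbps:.2f} Gbit/s")
    ```

    Roughly 14.9 vs 10.0 vs 7.5 Gbit/s, which is consistent with a splitter that can downscale 4K60 4:2:0 but not 4:2:2.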

    It would be nice to have a programmable interface to push the LED state.


    For example, I'm installing LEDs in the room using an ESPixelStick. It would be nice to have them sync with Hyperion.


    I'd like the main LEDs behind my TV to operate simultaneously with others.


    My ideal setup would be two strips going up the corners of the wall: the right strip takes the right audio output of my TV and the left strip takes the left. This configuration should work well with movies and visualizers.
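    Driving the two strips independently mostly comes down to splitting the interleaved stereo PCM stream into its channels. A minimal sketch of that step, assuming samples arrive as [L0, R0, L1, R1, ...]:

    ```python
    def split_stereo(interleaved):
        """Split interleaved stereo PCM [L0, R0, L1, R1, ...] into (left, right)."""
        return list(interleaved[0::2]), list(interleaved[1::2])

    def channel_levels(interleaved):
        """Mean absolute level per channel: one value for each corner strip."""
        left, right = split_stereo(interleaved)

        def level(ch):
            return sum(abs(s) for s in ch) / len(ch) if ch else 0.0

        return level(left), level(right)
    ```

    Each strip's visualizer would then run on its own channel's level instead of the mixed signal.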


    This could be something that lets us write Python scripts with access to the left, right, top, bottom, and raw image data.
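    Exposing the edge regions to a script could be as simple as slicing them out of the captured frame. A sketch under the assumption that the frame is a flat row-major list of (r, g, b) pixels; the function name and return shape are my own invention:

    ```python
    def edge_regions(frame, width, height, border=1):
        """Split a flat RGB frame (list of (r, g, b), row-major) into its four edges.

        'border' is how many rows/columns deep each edge region is.
        """
        top = [frame[y * width + x] for y in range(border) for x in range(width)]
        bottom = [frame[y * width + x]
                  for y in range(height - border, height) for x in range(width)]
        left = [frame[y * width + x] for y in range(height) for x in range(border)]
        right = [frame[y * width + x]
                 for y in range(height) for x in range(width - border, width)]
        return {"left": left, "right": right, "top": top, "bottom": bottom}
    ```

    A script would then average or animate each region independently, e.g. mapping `left` onto the left corner strip.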

    I'm currently working on this, using ALSA on Linux to capture audio from a selected audio device.


    I'm currently having issues with the configuration showing up in Hyperion. I submitted a pull request to fix schema loading, but I think there's a deeper issue, since the config doesn't show up even after the schema loads.


    The other thing I want to do is explore using FFT instead of averaging the raw PCM data. I need to be able to install Python modules into Hyperion, or at least modify Hyperion to find Python modules installed in the OS.
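    The FFT idea is to get per-frequency-band energies instead of one average level. As an illustration only, here is a naive single-bin DFT in pure Python; a real implementation would use a proper FFT (e.g. numpy.fft) over the whole spectrum:

    ```python
    import cmath
    import math

    def band_energies(samples, sample_rate, bands):
        """Magnitude of the DFT bin nearest each requested frequency (Hz).

        Naive O(n) per band; fine for a handful of bands like bass/mid/treble.
        """
        n = len(samples)
        energies = []
        for freq in bands:
            k = round(freq * n / sample_rate)  # bin index nearest this frequency
            acc = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                      for t in range(n))
            energies.append(abs(acc) / n)
        return energies
    ```

    A 100 Hz sine shows up strongly in the 100 Hz band and near zero elsewhere, which is exactly the separation a spectrum-style visualizer needs.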


    Lastly, I'm currently getting audio data from a mic and a USB audio interface. I want to explore grabbing it from the HDMI input; on my first go-round it appeared the data may not be PCM. After I get everything else together, I'll reinvestigate.


    External Content www.youtube.com

    Here's my setup:


    External Content www.youtube.com


    Samsung 82" Q900R + Hyperion + 4K HDR10
    Denon X6500H
    332 LEDs (~19 ft) of HD107S, 60 LEDs/m: https://www.aliexpress.com/ite…042311.0.0.17324c4dqEa6x5
    Raspberry Pi 4B
    KDR 4K Video Capture: https://www.amazon.com/gp/prod…tle_o09_s00?ie=UTF8&psc=1


    RobotDyn Logic Level Converter: https://www.amazon.com/gp/prod…tle_o00_s00?ie=UTF8&psc=1
    Aclorol 5V 40A Power Supply: https://www.amazon.com/gp/prod…tle_o03_s00?ie=UTF8&psc=1


    Used this power box, which enables me to power it on/off via a 12V trigger from my receiver:
    IoTRelay: https://www.amazon.com/gp/prod…tle_o03_s01?ie=UTF8&psc=1


    Used this USB connector to power the Raspberry Pi from the LED power supply: https://www.amazon.com/gp/prod…tle_o03_s00?ie=UTF8&psc=1


    Here is a VUMeter Effect that I started developing:

    External Content www.youtube.com


    Started with the Raspberry Pi Zero W, then switched to the Raspberry Pi 4B. Also switched the USB RCA capture adapter to a 4K HDMI one.



    I'm creating a custom effect and used a few Python modules. It appears that the Hyperion that comes with HyperBian for Raspberry Pi isn't using the Python interpreter built into the OS.


    I've installed the modules using pip and pip3, and Hyperion still complains that the module is missing. Is there any way to install custom modules here, or can I configure it to use the Python that's installed in the OS?
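    One workaround worth trying from inside the effect script itself: point the embedded interpreter's `sys.path` at the OS site-packages directories where pip installed the modules. The paths below are guesses for a typical Raspberry Pi OS setup and must be adjusted to your system; I haven't confirmed this works with Hyperion's embedded interpreter:

    ```python
    import sys

    # Where pip/pip3 typically install modules on Raspberry Pi OS (assumed paths,
    # adjust to match your Python version and user).
    CANDIDATE_PATHS = [
        "/usr/lib/python3/dist-packages",
        "/home/pi/.local/lib/python3.7/site-packages",
    ]

    for p in CANDIDATE_PATHS:
        if p not in sys.path:
            sys.path.append(p)

    # After this, an OS-installed module may be importable from the effect script,
    # provided the embedded interpreter is ABI-compatible with the OS Python.
    ```

    Note that compiled extension modules (numpy etc.) will only load if the embedded interpreter matches the Python version they were built for.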


    Thanks.