[Support] DareDoes - Snapcast



I installed today's update (both your Snapcast and Mopidy3 containers). Now it no longer works fully: the SnapWeb UI says "The resource '/' was not found."

 

What did you change? What do I need to change in my config?

 

In the meantime I would love to go back to the previous version of both containers, but you seem to have removed them. There is only "latest" and no other tags... frustrating.

 

I suggest you give every version of your containers a unique tag and point "latest" at the most recent one, so that people can go back to an older version if they have problems with the latest container.

Edited by murkus

@murkus I didn't even know anyone actually used my stuff! Mostly published for myself. 

I spent last night trying to fix AirPlay in Snapcast while bumping it to version 0.27, and stopped around 2 AM. While doing this I found that parts of the Snapcast config have changed without documentation. Unfortunately, building shairport with the right options meant increasing the image size from 30 MB to around 400 MB. Still working on slimming that down.

I'll definitely keep the tag suggestion in mind. Unfortunately, there's not much of a way to go back to 0.26.

Notably, where the config previously said `source = ...`, it should now be `stream = ...`; otherwise Snapcast can't detect any streams and falls back to a default config.
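For example, a stream entry in `snapserver.conf` under 0.27 would look something like this (the pipe path and name here are illustrative, not from the actual sample config):

```ini
[stream]
# 0.27 syntax: "stream =" replaces the old "source =" key
stream = pipe:///tmp/snapfifo?name=Home&sampleformat=44100:16:2
```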

 

 

Now, on Mopidy: that update is significant and good. But I realized during my update that I've left SO MANY THINGS unexplained.

Did you know that the Mopidy app has the ability to dynamically create multiple instances AND automatically add and remove each instance as a stream source in Snapcast over HTTP? I need to clean up my documentation. Also, I can't get the local library scan in Iris to work for the life of me, so I re-added a supervisor command that triggers the local scan and tied it to a cron job that fires at 4:30 AM every day.
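That nightly trigger is roughly a crontab entry like the following (the supervisor program name `local-scan` is an assumption, not necessarily what the image uses):

```
# run the Mopidy/Iris local library scan every day at 4:30 AM
30 4 * * * supervisorctl start local-scan
```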

Anyways, you have my focus, and I want to help. I'll work today on fixing the resource issue; my guess is that the config doesn't like the path given for the homepage.

Here's the new sample config for snapcast I'm including. Try using this and updating it with your specific configuration information.

I've also attached a servers.json file that lives in the config area for Mopidy. The templates for that can be found in the docker image under `/home/templates`, but supervisorctl will try to use the `mopidy.conf` and `servers.json` found in the config folder first when setting itself up.

Attached: snapserver.conf, servers.json

Edited by daredoes

I did not have `source =` in the config file. Thanks for the example file, it is more self-explanatory. I will use that and edit it to include my old config settings.

 

Also edited the servers.json and put it in the config path on appdata. It seems that it is read when restarting the server.

 

I still get tons of these:

[Notice] (handleAccept) ControlServer::NewConnection: <ip>

[Error] (cleanup) Removing 1 inactive session(s), active sessions: 2

 

I also clicked on Settings > Scan Library and it displays a pop-up saying "Scanning local library" with a spinner. I don't know whether it is effectively scanning the library, though.

 

Edited by murkus

you wrote: "Did you know that the Mopidy app has the ability to dynamically create multiple instances AND automatically add and remove that instance as a stream source in snapcast over HTTP?"

 

That sounds great! I would like to understand how I may use that.

 


@murkus 

Give both a force update. I've made some good changes, but I think Airplay is still broken for snapcast at the moment. Were you even using it though? Doesn't affect you if not. 

 

The Mopidy3 instance is HIGHLY CUSTOMIZED. It runs Supervisord (port 9001) and cron in an Ubuntu image that does some python magic. Let me describe. 

For the servers.json, you'll want to place it in `appdata/Mopidy3` alongside `mopidy.conf`.

There is a line in `mopidy.conf` which is VERY IMPORTANT, and it's:

output = audioresample ! audioconvert ! audio/x-raw,rate=44100,channels=2,format=S16LE ! filesink location=/tmp/snapfifo

 

When booted up, the Mopidy server creates a copy of this configuration and makes some updates to it, including replacing `/tmp/snapfifo` with `/tmp/snapfifo{A_REALLY_LONG_UNIQUE_HASH}`.

This works together with `servers.json` to let one Docker image create multiple instances of Mopidy3 that can rely on the same cache/data/etc.
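Sketched in Python, that rewrite step might look like the following. The hashing scheme here is an assumption for illustration; the actual image may derive the unique suffix differently:

```python
import hashlib


def instance_config(base_conf: str, name: str) -> str:
    """Return a copy of mopidy.conf text with a unique fifo path for one instance."""
    # Hypothetical: derive a deterministic per-instance suffix from the
    # instance name, so "Home" and "Ambience" each get their own fifo.
    suffix = hashlib.sha256(name.encode()).hexdigest()
    return base_conf.replace("/tmp/snapfifo", f"/tmp/snapfifo{suffix}")


base = "output = audioresample ! audioconvert ! filesink location=/tmp/snapfifo"
print(instance_config(base, "Home"))
```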

In the attached code block, which is my scrubbed `servers.json`, I create two instances of Mopidy3: "Home" and "Ambience". I also provide the information to reach my Snapcast server (well, not actually mine). This is used at boot and automatically put into the Iris configuration.

This data is used to create a supervisord configuration for each desired instance of Mopidy3. So in this case it creates two programs, one for Home and one for Ambience.

Those programs run a Python script before starting up that reaches out to the Snapcast server to clear any stream with the same name, and then adds this stream as a pipe pointing to `pipe:///data/snapfifo{hash}?name={name}&sampleformat={sample_format}&send_to_muted=false&controlscript=meta_mopidy.py`. Important note: you want Mopidy3 and Snapcast running on the same Unraid instance with a shared tmp folder. For the extra instance, manually edit the Docker config for Mopidy to expose the new MPD and HTTP ports over TCP.

{
    "servers": {
        "Home": {
            "mpd": 6600,
            "http": 6680
        },
        "Ambience": {
            "mpd": 6601,
            "http": 6681
        }
    },
    "snapcast": {
        "host": "snapcast.example.com",
        "port": 443,
        "use_ssl": true,
        "enabled": true
    }
}
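That add-stream step maps onto Snapcast's JSON-RPC API (`Stream.AddStream` with a `streamUri` parameter). Here's a minimal sketch of building such a request body; the fifo hash and instance name are placeholders, and the extra URI parameters from my actual script are omitted for brevity:

```python
import json


def add_stream_request(name: str, fifo_hash: str,
                       sample_format: str = "44100:16:2") -> str:
    """Build the body of a Snapcast Stream.AddStream JSON-RPC request."""
    uri = (f"pipe:///tmp/snapfifo{fifo_hash}"
           f"?name={name}&sampleformat={sample_format}")
    return json.dumps({
        "id": 1,
        "jsonrpc": "2.0",
        "method": "Stream.AddStream",
        "params": {"streamUri": uri},
    })


# This body would be POSTed (or sent over the control socket) to the
# Snapcast server; removal uses Stream.RemoveStream with the stream id.
print(add_stream_request("Home", "abc123"))
```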

 

All of this code is public at github.com/daredoes/docker-snapcast or github.com/daredoes/docker-mopidy3

Edited by daredoes
6 minutes ago, daredoes said:

@murkus Give both a force update. I've made some good changes, but I think Airplay is still broken for snapcast at the moment. Were you even using it though? Doesn't affect you if not. [...]


OK OK OK, this explains why shit doesn't work well...

 

I did actually change the name of the fifo and created four of them in the mopidy config manually. Of course everything is jumbled up now.

 

 

 


Yeah this is being developed for me first and foremost. I love making it more accessible for others though. 

If you need a 1:1 troubleshooting session just lmk. Probably won't take more than 15 minutes to solve. 

I've been considering doing live coding on Twitch or something so others can learn and ask questions as I go. Let me know if that's of any interest as well (just for my own research).

16 minutes ago, daredoes said:

Yeah this is being developed for me first and foremost. I love making it more accessible for others though. [...]

 

I see, thanks for the offer. I'll try to fix it on my own, as I don't use Twitch and don't intend to join it. But if it helps, I would meet you on Matrix or Discord.

 

The Library scan button seems to work. I found a corresponding log in the tmp folder where the fifos are.

Edited by murkus

I was taught in my first job that public chat is always better than private chat, so the Discord server aims to provide more information to others over time. Hope that's alright; if not, send me a PM with your Discord username and I'll send you a friend request (applies to all reading this).


"The Library scan button seems to work. I found a corresponding log in the tmp folder where the fifos are."

The library scan button activates, but for me it throws an error in the supervisor program-specific logs about not having access to system.sh.

