[SUPPORT] Stash - CorneliousJD Repo


Recommended Posts

Stash is a Go app which organizes and serves your NSFW and adult media content.

GitHub homepage: https://github.com/stashapp/stash

 

I did not create Stash, I simply created the template for unRAID.

I honestly first saw this as an enhanced tile on Heimdall and wondered what it was. I was actually shocked to see a template didn't yet exist, so more as an exercise in setting up containers I fired this up, and now here we are... 

 

Report bugs and request new features via Github here: https://github.com/stashapp/stash/issues

 

I'll do my best to assist with any Unraid-specific issues here via this thread.

Edited by CorneliousJD
Link to comment
2 minutes ago, eds said:

Thanks for this app. Any way to get the docker to use nvidia for transcode/thumbnail generator purposes?

I don't think that would be possible at this time, at least. The app still seems to be in its infancy, with a small team putting regular work into it, but I'm not involved with development; I just set up the container/template for the community to use.

 

Sorry I can't be of more help on that one!

Link to comment
2 minutes ago, CorneliousJD said:

I don't think that would be possible at this time, at least. The app still seems to be in its infancy, with a small team putting regular work into it, but I'm not involved with development; I just set up the container/template for the community to use.

 

Sorry I can't be of more help on that one!

Understood.  Thanks anyway.

Link to comment
  • 1 month later...

I can't seem to get the scrapers working. Just get an error

Quote

Error: GraphQL error: No matches found in

With the following from the logs:

Quote

2020/06/02 11:47:32 "POST http://10.20.0.10:6969/graphql HTTP/1.1" from 10.20.0.113:53770 - 200 59B in 286.177191ms
2020/06/02 11:47:33 "POST http://10.20.0.10:6969/graphql HTTP/1.1" from 10.20.0.113:53770 - 200 264B in 180.321124ms
2020/06/02 11:47:35 "POST http://10.20.0.10:6969/graphql HTTP/1.1" from 10.20.0.113:53770 - 200 198B in 180.246982ms
2020/06/02 11:47:38 "POST http://10.20.0.10:6969/graphql HTTP/1.1" from 10.20.0.113:53770 - 200 108B in 786.517878ms
2020/06/02 11:47:52 "GET http://10.20.0.10:6969/graphql HTTP/1.1" from 10.20.0.113:53697 - 000 0B in 7m6.325971483s

 

Is the scraper included in this docker?

Link to comment
2 hours ago, Noego said:

I can't seem to get the scrapers working. Just get an error

With the following from the logs:

 

Is the scraper included in this docker?

The scraper should be functional; you'll likely want to take this up with the devs on their GitHub here: https://github.com/stashapp/stash

 

Or you can try upgrading to the beta release to see if it's been fixed/improved there, as that's where actual development is happening.

You can edit the container to pull 

stashapp/stash:development-x86_64

But note that if you upgrade, you can't easily downgrade, because the dev build changes the database. You'd need to stick with the beta/dev build until the main release is updated to use that newer database schema.
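If you do switch tags, it's worth backing up the database first so a rollback is at least possible. A rough sketch from the Unraid terminal (the paths are assumptions based on the default appdata location for this template; adjust to your setup):

# Stop the container and back up the SQLite database before pulling the dev tag.
# The database path is an assumption based on the default appdata location.
docker stop stash
cp /mnt/user/appdata/stash/stash-go.sqlite /mnt/user/appdata/stash/stash-go.sqlite.bak
docker pull stashapp/stash:development-x86_64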

Link to comment
  • 2 months later...
6 hours ago, redeuxx said:

Any way to bundle the community scrapers in the Docker? ...

 

https://github.com/stashapp/CommunityScrapers

 

Thanks for your work!

Automatically? No, not unless the Stash devs add it to the docker themselves, but manually, sure!

 

Quote

Any scraper file has to be stored in the ~/.stash/scrapers ( ~/.stash is where the config and database file are located) directory. If the scrapers directory is not there it needs to be created.

 

So that's /mnt/user/appdata/stash/scrapers -- just follow the instructions on the CommunityScrapers page to download the files, put them in the correct place, and you should be able to use them with no problem.
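As a concrete example, a rough sketch of pulling one scraper file into that folder from the Unraid terminal (the file name below is just a placeholder; browse the CommunityScrapers repo for the scrapers you actually want):

# Create the scrapers folder if it doesn't exist, then download a scraper YAML into it.
# "SomeSite.yml" is a placeholder file name, not a real scraper.
mkdir -p /mnt/user/appdata/stash/scrapers
curl -fL -o /mnt/user/appdata/stash/scrapers/SomeSite.yml \
  https://raw.githubusercontent.com/stashapp/CommunityScrapers/master/scrapers/SomeSite.yml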

  • Thanks 1
Link to comment
  • 4 months later...
9 hours ago, rhodo said:

Thanks for this app. The Docker currently takes one Unraid share and maps this to "/data/"

 

Is there any way to add additional shares? When I attempt to add folders through the Stash UI it does not see any of the Unraid shares.

 

Thanks.

For sure! 

 

The idea, though, is that you would point /data at a "Stash" share and put all your media into it, but if you need access to more shares, you can add mappings like this.

This would be /pics inside the container:

[screenshot: an additional host path mapped to /pics inside the container]
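In docker run terms (just an illustration of the same idea -- on Unraid you'd add this through the template's path settings, and the container-side targets here are assumptions based on the official image's defaults):

# Hypothetical extra mapping: a second Unraid share exposed as /pics inside the container.
docker run -d --name stash \
  -p 9999:9999 \
  -v /mnt/user/appdata/stash:/root/.stash \
  -v /mnt/user/Stash:/data \
  -v /mnt/user/Pictures:/pics \
  stashapp/stash:latest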

Link to comment
7 minutes ago, Xenu said:

Does anyone have an idiot's guide on how to get this up and running, please? I've loaded a list of scrapers from https://github.com/stashapp/CommunityScrapers, but what now? The config is asking for an endpoint and API!?

I haven't used this with scrapers personally but I would assume endpoint and API are for the actual site you're scraping from? 

 

You might want to ask on the Stash GitHub issue tracker for more info/help, as this thread is more for Unraid-specific issues. 

I think the issue you're running into isn't Unraid-specific; it's more of a usage question about the app itself, so I can't really be of more help directly since I don't use it that way. 

 

Hope this helps!

Link to comment
  • 4 weeks later...
21 minutes ago, rmmprz said:

Could you integrate Python? Python3 is needed for some of the CommunityScrapers to run within the Container.

 

This is actually using the official docker container from https://hub.docker.com/r/stashapp/stash/ -- you'll want to ask the Stash devs to add Python3 to the container so you can make use of those community scrapers; they would have to do it on their end. If they do add it to the official container, the unRAID container should pick it up automatically the next time you check for and apply updates. 
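In the meantime, one possible workaround (a sketch only, assuming the official image is Alpine-based, which could change between releases) is to build a thin local image on top of it and point the template's Repository field at that instead:

# Layer Python 3 on top of the official image; assumes an Alpine base (apk).
docker build -t stash-python3 - <<'EOF'
FROM stashapp/stash:latest
RUN apk add --no-cache python3 py3-pip
EOF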

Link to comment

Has anyone had any luck moving a Stash setup from Win 10 to a docker?

 

I copied all of the Stash data from my Windows storage to unRAID, but I'm having trouble mapping the folders (config, metadata, cache, generated) to the docker. Stash is able to find the data files but can't seem to find/apply the previous database info (studios, performers, scene data, etc.). I suspect the problem is either that I have the wrong file paths or that the SQLite file needs to be edited to reflect the new data file locations... But I'm a total noob with this stuff, so any help is greatly appreciated!

Link to comment
25 minutes ago, shattered said:

I copied all of the Stash data from my Windows storage to unRAID but I'm having trouble mapping the folders (config, metadata, cache, generated) to the docker. 

Did you, in fact, create the mappings to the folders, and then put the data into those folders? I have never migrated from Windows, but this is where I would start troubleshooting. Your docker config should basically look something like this:

 

[screenshots: example container path mappings for the config, metadata, cache, generated, and data folders]
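For reference, the same idea expressed as docker run flags (the host paths are assumptions based on the default appdata location, and on Unraid you'd set these through the template UI rather than running this by hand):

# Example path mappings for a migrated setup; container-side targets follow the
# official image's documented layout, and the host-side paths are assumptions.
docker run -d --name stash \
  -p 9999:9999 \
  -v /mnt/user/appdata/stash/config:/root/.stash \
  -v /mnt/user/appdata/stash/metadata:/metadata \
  -v /mnt/user/appdata/stash/cache:/cache \
  -v /mnt/user/appdata/stash/generated:/generated \
  -v /mnt/user/Stash:/data \
  stashapp/stash:latest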

 

I don't remember if these paths had to be created manually or if they were included in the template. 

 

It is entirely possible that a direct migration won't work if the database is filled with Windows paths and now it's on Unraid... in that scenario, the best you might hope for is that when you create a new library on Unraid, the existing metadata, etc., gets reused. 

 

This is another one that might be better addressed to the Stash devs/community.

 

Off topic, should we all share our 'media' with each other? ;D hah.

 

 

Link to comment

I figured it out! It is possible to migrate from Windows 10 to a docker container, but it takes a couple of steps; I'll try to explain what I did. 

 

1. BEFORE shutting down your Windows setup, back up the Stash data files (the folders .stash, cache, generated, and metadata). The most important of these is the stash-go.sqlite file within .stash; the config is less important because you have to create a new config for the docker anyway to point at the new file paths.

2. Install the Stash docker container. I kept the default folder locations within the appdata share so I knew where to move stuff; the data path should be something like /mnt/user/share/file/path/

3. Make Stash scan the new library; all of your 'media' should appear, but without your precious metadata.

4. Stop the Stash docker container.

5. Delete *most* of the generated folders in the appdata folder and copy your original data over. You must keep the config file and folder names that the docker generated or it won't work. On Windows the config file was stored in the .stash folder, but in the docker it's in a folder named config.

6. Start the Stash docker container. Now your metadata should be attached to all the right files (performers attached to scenes, studios attached to scenes, etc.), BUT the files will be unplayable because Stash is still looking for them at their old Windows locations.

7. Make Stash scan the library again and it should update the file locations while keeping your metadata in place.

 

I hope this helps the next person migrating their system! Also, I didn't try it out, but it might work: in Settings > Tasks there's an option to export and import JSON data, which might do the trick.
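To make steps 4-6 concrete, roughly what the copy looks like from the Unraid terminal (the paths below are examples only -- they assume the default appdata location and a hypothetical /mnt/user/backup/stash-windows folder holding the Windows backup; adjust both to your setup):

# Stop the container before touching the database.
docker stop stash
# Copy the old Windows database and generated content over the fresh install,
# keeping the config file/folder names the docker created.
cp /mnt/user/backup/stash-windows/.stash/stash-go.sqlite /mnt/user/appdata/stash/config/
cp -r /mnt/user/backup/stash-windows/metadata/. /mnt/user/appdata/stash/metadata/
cp -r /mnt/user/backup/stash-windows/generated/. /mnt/user/appdata/stash/generated/
# Start it back up, then rescan the library so the file paths get updated.
docker start stash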

Link to comment
  • 1 month later...

I'm trying to scrape one of the sites, but I'm getting an "x509: certificate signed by unknown authority" error. I found a suggestion in the Discord to download the .pem file from the site and install it into the CA store. Is there a way to install this cert in the docker container? I've tried using scp to put it in /etc/ssl/certs/ and then running "update-ca-certificates" in the terminal, but that didn't work. Is there a different location for certs within the container?

 

 

Update: In the dev version, there's an option to ignore insecure certificates, which fixes this issue.
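For anyone stuck on the stable build, one generic approach that might work (a sketch only -- it assumes the image ships update-ca-certificates and uses the usual /usr/local/share/ca-certificates location, which I haven't verified against the Stash image, and the change won't survive the container being recreated or updated):

# Copy the site's .pem into the container's local CA folder and rebuild the CA bundle.
# Paths and tooling are assumptions about the base image, not verified for Stash.
docker cp site-cert.pem stash:/usr/local/share/ca-certificates/site-cert.crt
docker exec stash update-ca-certificates
docker restart stash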

Edited by GoldStig23
Link to comment
12 minutes ago, GoldStig23 said:

I'm trying to scrape one of the sites, but I'm getting an "x509: certificate signed by unknown authority" error. ... Is there a different location for certs within the container?

I'm not sure about this, but this is their official docker container, so they should be able to assist more on Discord or via a GitHub issue. Sorry I can't be of more help!

Link to comment
  • 1 month later...
17 hours ago, RxLord said:

Having an issue where the community scrapers are not showing up in the web GUI. I have moved the .yml files to the \\TOWER\cache\appdata\stash\scrapers directory. Am I doing something wrong?

 

Sorry, I'm not using custom scrapers, but is there a config page where you have to tell it where the scrapers live? You'd need to make sure it matches what the container expects, so the container path and your appdata scrapers path line up.

 

Maybe someone else with Scrapers enabled can chime in here?

Link to comment
17 hours ago, RxLord said:

Having an issue where the community scrapers are not showing up in the web GUI. I have moved the .yml files to the \\TOWER\cache\appdata\stash\scrapers directory. Am I doing something wrong?

Sounds like it's correct. If you go to Settings > Scrapers and click the reload scrapers button, does that work? If not, you could always restart the docker and see if that helps.
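One quick sanity check is to confirm the files are actually visible from inside the container (the container name and internal path below are assumptions based on the default template and image; adjust if yours differ):

# List the scrapers folder from inside the container to confirm the mapping lines up.
docker exec stash ls -la /root/.stash/scrapers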

Link to comment
  • 3 weeks later...
On 5/18/2021 at 5:21 AM, tvd1 said:

I keep getting JSON errors when I try to 'scrape' using some of the prebuilt .yml files. I'm not seeing anyone talking about this in the general help; is this an Unraid-specific issue?

Yeah, I'm having the same issue - I've gone so far as to add the community scrapers as well, and can't get anything to run a successful query, unfortunately.

Link to comment
  • 4 weeks later...

Gents/Ladies

 

With regard to the scraper issues - from the Stash GitHub:

 

Quote

"Any scraper file has to be stored in the ~/.stash/scrapers ( ~/.stash is where the config and database file are located) directory. If the scrapers directory is not there it needs to be created."



Try putting a scrapers directory inside the config directory, then stick the scraper .yml files in there. yvw
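On the Unraid side that works out to something like the following (paths assume the default appdata location for this template; adjust to match yours):

# Create a scrapers folder next to the config/database files and drop the YAMLs in.
# /path/to/downloaded is wherever you saved the community scraper files.
mkdir -p /mnt/user/appdata/stash/config/scrapers
cp /path/to/downloaded/*.yml /mnt/user/appdata/stash/config/scrapers/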

Link to comment
