[Support] ich777 - Application Dockers


ich777

Recommended Posts

I have 27,937 photos in PhotoPrism, with a total size of:

 

[screenshot: total library size]

 

My appdata/photoprism/cache/... directory has the following stats:

 

[screenshot: appdata/photoprism/cache directory stats]

 

Why on earth is just the thumbnail cache taking so much storage? It's eating 60GB of my SSD cache drive.

 

On top of this, PhotoPrism eats 95% of my 4-core/8-thread CPU when indexing, and there is no progress bar or ETA displayed during the indexing process.

 

I haven't changed any settings or anything; I just ran the container and set my library location. Surely there is something wrong with needing 185,400 thumbnail cache files spread across 8,500 folders.

10 minutes ago, plantsandbinary said:

Why on earth is just the thumbnail cache taking so much storage? It's eating 60GB of my SSD cache drive.

This is just the default behavior for PhotoPrism; you can change the quality and set a fixed size for thumbnails with these variables in the Docker template (Source):

+----------------------------------------+
| KEY                            | VALUE |
+----------------------------------------+
| PHOTOPRISM_THUMB_UNCACHED      | true  |
| PHOTOPRISM_THUMB_SIZE          | 720   |
| PHOTOPRISM_THUMB_SIZE_UNCACHED | 720   |
| PHOTOPRISM_JPEG_SIZE           | 720   |
| PHOTOPRISM_JPEG_QUALITY        | 80    |
+----------------------------------------+

 

Of course you can set the thumbnail sizes however you like, and even lower PHOTOPRISM_JPEG_QUALITY further, but I would not recommend going below about 70.
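For reference, here is a sketch of the same variables passed on a plain `docker run` command line. The image name and host paths are placeholders from my side; on Unraid you would set these as Variables in the Docker template instead of running this by hand:

```shell
# Hypothetical example: image name and host paths are placeholders;
# on Unraid, add these as Variables in the Docker template instead.
docker run -d --name photoprism \
  -e PHOTOPRISM_THUMB_UNCACHED=true \
  -e PHOTOPRISM_THUMB_SIZE=720 \
  -e PHOTOPRISM_THUMB_SIZE_UNCACHED=720 \
  -e PHOTOPRISM_JPEG_SIZE=720 \
  -e PHOTOPRISM_JPEG_QUALITY=80 \
  -v /mnt/user/appdata/photoprism:/photoprism/storage \
  -v /mnt/user/Pictures:/photoprism/originals \
  photoprism/photoprism
```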

 

10 minutes ago, plantsandbinary said:

On top of this, photoprism eats 95% of my 4 core 8 thread CPU when doing indexing and there is no progress bar or ETA to display during the indexing process.

For this I would recommend that you create an issue over on GitHub as a feature request: Click

 

 

10 minutes ago, plantsandbinary said:

I haven't set any different settings or anything. Just ran the container and set my library location. Surely there is something wrong with needing 185,400 thumbnail cache files spread through 8,500 folders.

No, because PhotoPrism creates thumbnails in several different sizes for every picture in your library; it's basically the same when you run Nextcloud...

Nextcloud, for example, creates IIRC 4 or 5 different-sized thumbnails for each saved picture.


G'day @ich777, thank you for the luckyBackup Docker, it's amazing; it's all set up, good to go, and doing backups.

However, I am having trouble with the automatic scheduling; it worked once, then stopped. For the life of me, I cannot get it to run automatically every week. I tried scheduling it 5 minutes in the future so I could watch it back up, but it has not run. I have attached pics of my scheduling. Many thanks in advance for helping. [screenshot: schedule settings]

7 minutes ago, aymanibousi said:

I tried scheduling it 5 minutes in the future so I could watch it back up, but it has not run. I have attached pics of my scheduling. Many thanks in advance for helping.

You have to enable "Console Mode" in the schedule screen, like the description of the container says, and to actually see it in the show backups screen you have to restart the container. The reason behind this is that luckyBackup was designed as a desktop application and was never meant to be running when a schedule is executed.

 

Hope this helps and explains it a little better.

32 minutes ago, ich777 said:

You have to enable "Console Mode" in the schedule screen, like the description of the container says, and to actually see it in the show backups screen you have to restart the container. The reason behind this is that luckyBackup was designed as a desktop application and was never meant to be running when a schedule is executed.

 

Hope this helps and explains it a little better.

Hi there

I have ticked Console Mode and cron'ed it, then restarted the Docker container. Unfortunately it's still not running; I waited 5 minutes and still nothing from the schedule. Any ideas, please?

2 hours ago, aymanibousi said:

Still not running it unfortunately, waited 5 minutes and still nothing with the scheduling.  Any ideas please?

Are you sure that you checked Console Mode, clicked OK, and then clicked "CRON IT" again?

 

Please send over the cron schedules.

 

You have to restart the container after the cron job has run to see it in luckyBackup.
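If you want to double-check that the schedule actually landed in the container's crontab, you can look at it from a terminal. This is just a sketch; the container name `luckyBackup` is an assumption, so use whatever name your container actually has:

```shell
# From the Unraid host: open a shell inside the container
# (container name "luckyBackup" is an assumption)
docker exec -it luckyBackup /bin/bash

# Inside the container: list the installed cron entries;
# your luckyBackup schedule should show up here after you cron it
crontab -l
```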

1 hour ago, Goldmaster said:

Is there a way to make sure the $RECYCLE.BIN and System Volume Information folders are not deleted when backing up to an external NTFS drive in luckyBackup?

Do you mean on the destination?

I don't think that NTFS is capable of containing folders/files starting with $, but I'm not too sure.

 

Backing up to NTFS through luckyBackup is always a bit more complicated, because it is simply not a Linux filesystem and most of the file attributes are lost when copying over to NTFS.

3 minutes ago, ich777 said:

Are you sure that you checked Console Mode, clicked OK, and then clicked "CRON IT" again?

 

Please send over the cron schedules.

 

You have to restart the container after the cron job run to see it in luckyBackup.

Hi

Yep, I checked Console Mode, then clicked cron (cron'ed it twice), refreshed, and then restarted the Docker container. It's still not running. Here are the screenshots of the schedule (sorry, I assume those are the cron schedules?).

[screenshot: cron schedule settings]

6 minutes ago, aymanibousi said:

Hi, yep, I ran the job once at 11:00; it finished at 11:24, with the schedule set for 11:30. Still nothing, and it's 11:45 now.

Thank you for your reply

Can you open up a container terminal, enter:

date

and check if the date matches your local date/time?
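If the container time turns out to be off, it is usually a timezone issue. As a sketch (using the common Docker `TZ` convention, which I have not verified against this specific template, and an example timezone value), you could pass your timezone into the container as a variable:

```shell
# Hypothetical: add a TZ variable to the container template so the
# container clock matches the host; the value shown is only an example
-e TZ=Europe/Vienna
```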

1 hour ago, feins said:

I've installed Firefox and would like to know how to copy or move the downloaded files out of the container?

You have to add an additional path to your container, pointing wherever you like, for example:

[screenshot: additional path mapping in the Docker template]

 

In this case you would download files in the Firefox container to /mnt/downloads, and you can access them through the Downloads folder on the host. Of course the Downloads share has to exist on your host (you can change the host path to whatever path you like).
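The same mapping, sketched as a plain `docker run` flag. The image name and the host share path are placeholders here; on Unraid you would add this as a Path entry in the Docker template instead:

```shell
# Hypothetical example: image name and host path are placeholders;
# on Unraid, add this as a Path in the Docker template instead.
docker run -d --name firefox \
  -v /mnt/user/Downloads:/mnt/downloads \
  some/firefox-image
```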

 

Hope that helps.

 

1 hour ago, Goldmaster said:

Yes, the idea is that certain folders, such as $RECYCLE.BIN and System Volume Information, are not deleted on the destination.

I think it's possible that this won't work when you copy to an NTFS drive...

Have you tried copying the exact same files to an ext4 or XFS drive yet?
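Since luckyBackup is a front end for rsync, the exclude behavior can be sketched on the command line (paths here are throwaway examples): with `--delete` plus `--exclude`, the excluded Windows system folders are neither copied from the source nor deleted on the destination.

```shell
# Throwaway example tree with Windows system folders on the source
mkdir -p /tmp/lb_src/'$RECYCLE.BIN' /tmp/lb_src/'System Volume Information' /tmp/lb_src/photos
touch /tmp/lb_src/photos/holiday.jpg
# A pre-existing system folder on the destination drive
mkdir -p /tmp/lb_dst/'$RECYCLE.BIN'

# Excluded folders are skipped on transfer AND protected from --delete
rsync -a --delete \
  --exclude='$RECYCLE.BIN' \
  --exclude='System Volume Information' \
  /tmp/lb_src/ /tmp/lb_dst/
```

After this run, photos/ is copied over while the destination's $RECYCLE.BIN is left alone (rsync would only remove it with `--delete-excluded`). In luckyBackup you would put those two names into the task's exclude list rather than typing the command yourself.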

1 hour ago, aymanibousi said:

Yep, I opened up the Unraid terminal and a container terminal; both match exactly.

This is really strange; I've never had issues with cron jobs in luckyBackup. I will set up a new cron job that runs in an hour and test it again.

 

Do you run the container as root or not?

36 minutes ago, ich777 said:

You have to add an additional path to your container, pointing wherever you like, for example:

[screenshot: additional path mapping in the Docker template]

 

In this case you would download files in the Firefox container to /mnt/downloads, and you can access them through the Downloads folder on the host. Of course the Downloads share has to exist on your host (you can change the host path to whatever path you like).

 

Hope that helps.

 

Thanks for the guide ich777.

22 minutes ago, Goldmaster said:

No, the reason being that I want to be able to read the drive on Windows, and exFAT and NTFS are the only filesystems capable of doing so.

Please look at Paragon; they have software that lets you read a Linux filesystem on Windows.

 

EDIT: here

6 hours ago, ich777 said:

This is really strange; I've never had issues with cron jobs in luckyBackup. I will set up a new cron job that runs in an hour and test it again.

 

Do you run the container as root or not?

Hi Ich,

I am happy to give you access via anydesk to my laptop so you can have a look at the docker :)

I have set the root user variable to true and retested; it's still not running the schedule. Please PM me, as I am more than happy to give access.

Many thanks :)

