[Support] ich777 - Application Dockers

Thanks for the info, but now I'm stuck on “nano”, which is not installed; even an “apt-get -y install nano” doesn't help:
root@0069af0c1c04:/# apt-get -y install nano
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
E: Unable to locate package nano


If only I hadn't done an update; the old instructions had always worked so far...

9 hours ago, new_unraid_user said:

“apt-get -y install nano” doesn't help

Because you have to run apt update first, and after that you can issue apt install nano.
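
For example, from the container console (assuming a Debian/Ubuntu based image, which matches the apt output above):

apt update
apt install -y nano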

 

9 hours ago, new_unraid_user said:

If only I hadn't done an update, the old instructions always worked so far....

But the old instructions are not correct: you always have to use the path /luckybackup/.ssh.

 

If you use /root/.ssh it will always break your config on a container update, but with /luckybackup/.ssh it won't, and this is the correct way to set it up.
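
A quick way to check that the keys actually live in the persistent location is to list the directory from the container console:

ls -la /luckybackup/.ssh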

 

BTW, you also don't need to generate new SSH keys, because the container generates its own keys on the first start of the container, so to speak they are unique.

 

The linked tutorial in my previous post should always work and has helped a lot of people.


Hey there, hope you don't mind me bugging you about this, but I've been experiencing embarrassingly slow speeds with the prefill Docker. I have lancache and lancache-dns active, and I also have Gigabit internet, which my unRAID server seems to confirm via speedtest. The problem is that the prefill Docker is getting 200 megabits per second at best and 30 at worst, averaging around 130. I do have some games, like Half-Life, which for some reason touched 600 megabits per second, but I'm unsure what the cause is.

 

Do you have some insights perhaps? If you need any additional info please let me know.

13 minutes ago, TrollingJoker said:

Do you have some insights perhaps? If you need any additional info please let me know.

Do you run the cache on an SSD or HDD?

Please note that if you want to hit high speeds you have to use something like an NVMe SSD, or at least a SATA SSD with plenty of space on it.

If you are running it on an HDD, or even worse on the Array, your speeds will be very slow...

 

I can hit download speeds of about 230 MB/s on my Linux machine, so to speak roughly 2.3 Gbit/s, and I run the cache on a SATA SSD (which almost fully saturates my 2.5 Gbit/s connection).

 

You also have to take into consideration that the chunk size you've configured in your LANCache itself matters, although for most users the default is fine; because of those smaller chunks, HDDs are usually really slow compared to an SSD/NVMe.
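
For reference, in the lancachenet/monolithic container the slice (chunk) size is set through an environment variable; a minimal sketch of a run command, with example paths (1m should be the default slice size):

docker run -d --name lancache -p 80:80 -p 443:443 -v /mnt/cache/lancache:/data/cache -e CACHE_SLICE_SIZE=1m lancachenet/monolithic:latest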

It also depends on how the game developers have uploaded their game to Steam.

Valve games usually have really high download speeds, whereas some games from other publishers are really slow.

 

However, this is more of an issue with your LANCache than with the container itself.

8 minutes ago, ich777 said:

Do you run the cache on an SSD or HDD?

Aside from the lancache settings you mentioned, I think I'm running the cache on my HDD array. I was doing this because I wanted my largest storage to hold it, and I assumed I couldn't set it up so that it downloads to the SSD and promptly transfers to the HDD afterwards. Is there a way of caching via SSD but redirecting it for storage on the HDD, or should I generally just use an SSD with lancache?

24 minutes ago, TrollingJoker said:

Aside from the lancache settings you mentioned, I think I'm running the cache on my HDD array.

And that is exactly the issue.

 

An HDD is not able to keep up with those chunks (because most of the time they are scattered across different places on the HDD), and the read head simply can't move fast enough to gather the data quickly enough. Go to your Dashboard while something is downloading through your lancache and you will see a lot of I/O wait because of that (it shows up as high CPU usage, but it's actually not high CPU usage).
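
If you want to see this for yourself, watch the disk statistics from a host console while a download runs (iostat comes from the sysstat package, so this assumes it is available on your system):

iostat -x 2

A cache disk that sits at high utilization with long wait times is the bottleneck, no matter what the CPU graph suggests.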

 

24 minutes ago, TrollingJoker said:

Is there a way of caching via SSD but redirecting it for storage on the HDD, or should I generally just use an SSD with lancache?

As said above, the writes are not strictly the issue, it is more the reads of the chunks.

 

You should use an SSD or NVMe drive; the lancache team also recommends always using a storage device that is as fast as possible.

I use a Transcend SATA SSD230S 4TB for my lancache and it is working well enough for me.

 

As a side note, on my Intel 12th gen system I have to set the process priority for Steam to high to hit almost the same speeds, because the data runs through Windows Defender and there seems to be an issue with the scheduler on Windows on my machine; but since I use Linux most of the time, I can usually saturate my 2.5 Gbit/s connection.

 

Again, this is not an issue with the prefill container, this is a lancache issue, and basically the wrong support thread; however, I hope this helps and explains why you are experiencing slow download speeds.

23 minutes ago, ich777 said:

Again, this is not an issue with the prefill container, this is a lancache issue...

Yeah, no problem, and thank you for looking into it! I was expecting this to be more user error than anything else, and since I was experiencing it through prefill I hoped you would have some insights. Thank you very much and I'll look into increasing my knowledge around this :).


@Ctek I've pushed an update to the container; please update the container itself, and you have to restart it once after the container update so that it pulls the latest update.

 

I've identified the issue; something must have changed in one of the recent updates from Electrum.

 

However, thank you for the report.

 

On 6/7/2024 at 6:30 AM, ich777 said:

Because you have to run apt update first, and after that you can issue apt install nano.

Hello,

Thank you for your help, but now I'm facing the next problem. I was able to install “nano”, but after entering “nano ~/.ssh/authorized_keys” the following message immediately appears in red: “Directory ‘/root/.ssh’ does not exist”.
If I ignore this and simply insert the key and then try to save, the following appears: “Error writing /root/.ssh/authorized_keys: No such file or directory”.

I solved this with “mkdir -p /root/.ssh” and then “chmod 700 /root/.ssh”
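
For completeness, the file itself is usually locked down too, since sshd can refuse a group- or world-writable authorized_keys depending on its StrictModes setting (general SSH practice, nothing specific to this container):

touch /root/.ssh/authorized_keys
chmod 600 /root/.ssh/authorized_keys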

Now I'm stuck again in luckyBackup. I enter the IP address as the remote computer, but how do I select the SSH key file? I'm not offered “/luckybackup/.ssh”, only the directory that I just created with the authorized key. I even tried entering the path by hand: “/luckybackup/.ssh/rsa_....”. After a simulated start, it asks for a password again...

My computer, which I only ever synchronize in one direction, is 192.168.188.3 and it goes to 192.168.178.3. I think I have to find another solution; there must be something simple that just lets you select the folders you want to synchronize and then press start. It doesn't even have to work automatically, I'm just old-school.

18 minutes ago, new_unraid_user said:

Thank you for your help

You are definitely not following this tutorial:

 

Why are you still using this directory when I have already said twice that this is wrong:

18 minutes ago, new_unraid_user said:

/root/.ssh

 

The /root directory is only used on the remote Unraid server; there is definitely something wrong if you get a message that the directory /root/.ssh doesn't exist on the server that you want to sync to.

 

Please describe your setup in detail. luckyBackup is 100% working, I use it to back up my whole server over SSH, and if you use the directory /luckybackup/.ssh (in the container) everything works and will keep working even when you update the container.


So, here is the list:

 

192.168.188.3 is the server that mainly sends (source)
192.168.178.3 is the server that mainly receives (backup)

 

On the backup server I open the console and enter:

 

cat /luckybackup/.ssh/ssh_host_rsa_key.pub

 

That gives me the key that I have to insert on the source server; at least that's what I understand from your instructions.

Before that, I have to execute the following on the source server:

 

apt update
apt install nano

 

Then I execute the following command on the source server in the console of luckyBackup:

 

nano ~/.ssh/authorized_keys

 

...and then I get the error that the directory does not exist.
That must have been the error: I opened the console of the luckyBackup container, NOT the console of the system itself (just tested...).
I was simply preconditioned by the other instructions, where the console was always opened from the luckyBackup Docker. With your instructions I assumed the console should also be opened from luckyBackup, but yes, it only says “open console”...
But when I run the command in the “main console” of Unraid, a file is actually displayed, including the keys that are already stored there. I assume that was the error.

20 minutes ago, whiteout said:

Is there a way to get reports from luckyBackup by mail, e.g. whether the backup run during the night was successful, incl. an overview of the files which were backed up?

You can set up notifications for luckyBackup as mentioned in the post here:

 

1 hour ago, new_unraid_user said:

Then I execute the following command on the source server in the console of luckyBackup:

Oh, now I get it, you are trying to use two instances of luckyBackup and that's where the error happens, correct?

 

Usually I set it up like this:

Backup Server (Destination) <- with luckyBackup installed

Main Server (Source) <- without luckyBackup installed

 

On the Backup Server you have to run the cat command from inside the container (cat /luckybackup/.ssh/ssh_host_rsa_key.pub) to get your public key. On the Main Server you then have to place the key in ~/.ssh/authorized_keys (which is equal to /root/.ssh/authorized_keys) <- but do keep in mind that you have to do this from the Unraid console (or, if it's not an Unraid system, from the system console), NOT from a container console on the Main Server.
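
As a rough sketch (the key string below is just a placeholder, yours will differ):

# on the Backup Server, inside the luckyBackup container console:
cat /luckybackup/.ssh/ssh_host_rsa_key.pub

# on the Main Server, from the Unraid console:
mkdir -p /root/.ssh
echo 'ssh-rsa AAAA...placeholder...' >> /root/.ssh/authorized_keys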

 

On the Backup server in luckyBackup you have to select your private key /luckybackup/.ssh/ssh_host_rsa_key in the GUI.

After that the sync should work perfectly fine.
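
If you want to verify the connection beforehand, a quick test from the luckyBackup container console on the Backup Server looks like this (192.168.188.3 is the example source address used earlier in this thread; you may have to confirm the host fingerprint on the first connection):

ssh -i /luckybackup/.ssh/ssh_host_rsa_key root@192.168.188.3 'echo OK'

If it prints OK without asking for a password, the key setup is correct.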

 

When set up this way, you pull everything from the Main Server onto the Backup Server <- I think that's the better way to do it as long as the Backup Server is located in your home, since the Backup Server then has access to your Main Server and pulls the new data in.

The procedure from above is almost the same except that luckyBackup is installed on the Main Server and you have to put your public key on the Backup Server <- from an Unraid console.

 

You can of course also do it the other way around, so that you push the backups from your Main Server to the Backup Server, but I would only recommend doing that when your Backup Server is not located on site.

19 minutes ago, ich777 said:

Oh, now I get it, you are trying to use two instances of luckyBackup and that's where the error happens, correct?

Thanks for the tip :-), yes, I installed luckyBackup twice, on the source and the backup server; as I said, I had followed the official instructions on the Unraid page so far: https://unraid.net/blog/unraid-server-backups-with-luckybackup . Unfortunately, it no longer works since the last Docker update of luckyBackup. The fact that I then had to run the procedure with the keys again was a bit annoying, yes, but at least you didn't have to do it every day...

OK, I have now entered the key from the backup server (about 50 km away, 192.168.178.3) on the source server (here on site, 192.168.188.3), by opening the main console on the source server. There were already keys in that file; I added the one from the backup server and saved it.

After that, I opened luckyBackup on the source server and continued, which was the mistake. So I have to do it the other way around, i.e. copy the key from the source server into the file on the backup server, and then your instructions would be correct for me again, right? I would then delete luckyBackup on the backup server.

Phew... quite a struggle... I'm curious 🙂

10 minutes ago, new_unraid_user said:

I installed luckyBackup twice, on the source and the backup server; as I said, I had followed the official instructions on the Unraid page so far: https://unraid.net/blog/unraid-server-backups-with-luckybackup .

But it isn't mentioned there that you have to install it on both machines, nor does it say you have to place the key in the container.

 

10 minutes ago, new_unraid_user said:

The fact that I then had to run the procedure with the keys again was a bit annoying, yes, but at least you didn't have to do it every day...

This won't happen next time if you use /luckybackup/.ssh instead of /root/.ssh inside the luckyBackup container.

 

10 minutes ago, new_unraid_user said:

and then your instructions would be correct for me again, right?

Yes, the procedure is always the same: the public key from luckyBackup must be placed on the other host, on Unraid itself, in /root/.ssh/authorized_keys.


Does anyone know why this I/O error is popping up? I'm just trying to back up some files from my array to an external HDD connected via Unassigned Devices. It still copies everything, but I'm not sure what this means. The source is mapped as read-only, while the external HDD is set as R/W slave, and I'm using Mirror A -> B (incremental). I read a previous comment about the user nobody, but I'm not sure what it means. How do I make this error go away?

 

[screenshot of the I/O error]

 

 

[screenshots: destination and source task settings]

18 minutes ago, reezzeer said:

I'm just trying to back up some files from my array to an external HDD connected via Unassigned Devices.

Is this an NTFS-formatted drive? If yes, make sure to check the checkbox in the Advanced Settings that it is a Windows destination.

2 hours ago, ich777 said:

Is this an NTFS-formatted drive? If yes, make sure to check the checkbox in the Advanced Settings that it is a Windows destination.

It is NTFS formatted. Is this the option you mentioned - “Preserve DOS attributes (MS Windows/DOS only)”? Do you know why it's grayed out?

[screenshot: the grayed-out “Preserve DOS attributes” option]

7 hours ago, reezzeer said:

Do you know why it's grayed out?

That's the wrong spot; go into the settings of the Task, there you have to enable the Advanced View and choose that it's an NTFS filesystem:

[screenshot: Task settings, Advanced View, filesystem option (German UI)]

 

EDIT: Sorry, just noticed the screenshot is in German, but I think you should get the idea of where to edit it. :)
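
As a side note, under the hood luckyBackup just builds an rsync command line, and a Windows/NTFS destination essentially means relaxing the ownership/permission handling and allowing for NTFS's coarser timestamps; roughly along these lines, with example paths and illustrative flags (not necessarily the exact ones the GUI generates):

rsync -rtvh --modify-window=1 /mnt/user/share/ /mnt/disks/external/backup/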
