johnlabod

  1. Sorry, I did do that; I just accidentally left it out of my original post. I have edited it. My workaround right now is to add up the size of every file and collect the filenames into an array. Once the accumulated size is over 2 GB, I create a tar archive from the array, something like this: tar -cf archive.tar file1 file2 file3 This works. I took a look at tar's source and it seems to be a problem with blocking. If you run tar -rf archive.tar file2 within an shfs filesystem and then list out the contents of the
  2. Hello all, I just recently got a license for Unraid, and I was trying to write a small plugin to back up my server to an online storage bucket. To accomplish this, I decided to split the files I want to back up into separate tar archives and send them one at a time; this was to avoid creating one large tar file of all my files and taking up a lot of space on my system. Anyway, I have been running into this problem: I create an archive in /mnt/user/tempdir, but every file that I append to this archive will not be added. To recreate this issue, do this:
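The batching workaround described in the first post above (accumulate file sizes, then create each archive with a single tar -cf instead of appending with tar -rf) could be sketched roughly like this. The function name, the size limit, and the directory layout are all illustrative, not from the original posts, and for simplicity the sketch assumes filenames without spaces:

```shell
# Workaround sketch: instead of appending to one archive with "tar -rf"
# (the operation that misbehaves on the shfs user share), accumulate
# file names until a size threshold is reached, then write each batch
# with a single "tar -cf" call.
# Usage: batch_tars SRC_DIR OUT_DIR LIMIT_BYTES
batch_tars() {
    src=$1; out=$2; limit=$3
    batch=""          # filenames accumulated for the current archive
    size=0            # accumulated size of the current batch, in bytes
    n=0               # archive counter
    for f in "$src"/*; do
        [ -f "$f" ] || continue
        fsize=$(stat -c %s "$f")
        # Flush the current batch before it would exceed the limit.
        if [ -n "$batch" ] && [ $((size + fsize)) -gt "$limit" ]; then
            n=$((n + 1))
            tar -cf "$out/archive_$n.tar" $batch   # intentional word-splitting
            batch=""; size=0
        fi
        batch="$batch $f"
        size=$((size + fsize))
    done
    # Flush whatever remains as the final archive.
    if [ -n "$batch" ]; then
        n=$((n + 1))
        tar -cf "$out/archive_$n.tar" $batch
    fi
}
```

In the posts the limit would be 2 GB (2147483648 bytes); the key point of the design is that every archive is produced by one tar -cf invocation, so tar never has to re-open and extend an archive in place on shfs.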