Posts posted by knex666

  1. 21 hours ago, ImSkully said:

    This OpenProject template is no longer working out of the box and using an outdated image of the application, a new template for OpenProject using the latest supported stable release is now available in community applications with a much simpler setup and one-click installation.

     

    Hey @ImSkully 

     

    that's a kind of behaviour I absolutely don't like or understand.
    This template works, and since OpenProject does not offer "latest" versioning, everyone is aware that you have to update the major version manually. Nice that you built your own template using exactly the same container...
    You could also contribute to that project instead of claiming that it is not working and is "outdated".

     

    Cheers

  2. On 8/27/2023 at 1:29 PM, Kilrah said:

    You can change the repo source to my kilrah/nextcloud-ffmpeg container that basically has that (plus VAAPI drivers) installed before publishing, the rest of the template is compatible since it's the same source container.


    How do you ensure that it pulls the latest official image if you build your own?

     

    That would be great to know to merge these projects

     

  3. 15 hours ago, Goldmaster said:

    I'm now getting this as well. I have tried the Nextcloud cron docker, as it says cron is recommended. But how the heck do I give the write access it's looking for?

    Please read my answer. I don't know that cron docker, and I recommend using User Scripts instead.

  4. 3 hours ago, 2000gtacoma said:

    Is there a way to install ffmpeg?

     

    Edit: Disregard found this command.

    docker exec -u 0 Nextcloud /bin/sh -c "apt update && apt install -y ffmpeg"

     

    Yes, that works, but you have to run it every time the container is recreated - that's a big disadvantage of the official docker image.
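
     

    If you don't want to re-run that by hand, one option (a sketch, not part of this template) is a User Scripts entry that reinstalls ffmpeg whenever it runs, e.g. at array start. The container name Nextcloud is an assumption; adjust it to yours.

    ```shell
    #!/bin/bash
    # Reinstall ffmpeg inside the running Nextcloud container.
    # NAME is an assumption - change it to match your container.
    NAME=Nextcloud
    if docker ps --format '{{.Names}}' 2>/dev/null | grep -qx "$NAME"; then
        docker exec -u 0 "$NAME" /bin/sh -c "apt update && apt install -y ffmpeg"
    else
        echo "Container $NAME is not running, skipping ffmpeg install"
    fi
    ```

    Scheduling it "at first array start only" in the User Scripts plugin means ffmpeg comes back automatically after every container update.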

  5. On 8/24/2023 at 4:55 AM, mktlb0303 said:

    Hi all,

    first, great job knex666 on the integration and help with the support!

     

    I have an issue with cron being updated.

    Here is my setup.

    1. Your docker installed using the " ExtraParams: --user 99:100 --sysctl net.ipv4.ip_unprivileged_port_start=0 PostArgs: && docker exec -u 0 NAME_OF_THIS_CONTAINER /bin/sh -c 'echo "umask 000" >> /etc/apache2/envvars' " you suggested.

    2. I'm using mariadb

    3. I also installed Nextcloud-cronjob docker to help with this.

     

    The errors I'm getting now other than just having the usual "Last background job execution ran XX days ago. Something seems wrong" is the following:

     

    "It was not possible to execute the cron job via CLI. The following technical errors have appeared:

    Your data directory is invalid. Ensure there is a file called ".ocdata" in the root of the data directory.

    Your data directory is not writable. Permissions can usually be fixed by giving the web server write access to the root directory. See https://docs.nextcloud.com/server/27/go.php?to=admin-dir_permissions."

     

    I'm not sure what to do next.

    Thanks for your help.

     

    Hi,

     

    thanks for the kind words!

    I don't know the Nextcloud-cronjob docker and I don't know whether it works, because of permissions - does anyone have experience with it?

    I am running a user script every hour; that works best for me:

     

    #!/bin/bash
    echo "START SCANNING FOR NEXTCLOUD FILES&FOLDERS"
    # Rescan all users' files and trigger Nextcloud's background jobs,
    # both running as the www-data user inside the container
    docker exec --user www-data Nextcloud /var/www/html/occ files:scan --all &
    docker exec --user www-data Nextcloud php -f /var/www/html/cron.php &

     

    Cheers

  6. 1 hour ago, jargo said:

    docker exec Nextcloud /var/www/html/occ preview:generate-all -vvv


    OCI runtime exec failed: exec failed: unable to start container process: exec: "/var/www/html/occ": permission denied: unknown

     

    How do I fix this permissions issue?  Permission for the 'occ' file is set to 'nobody'

    Try running it as the owning user with -u www-data.
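
     

    For example, a sketch of the full command, assuming the container is named Nextcloud as in the quoted command (the running-check is my addition):

    ```shell
    #!/bin/bash
    NAME=Nextcloud   # container name as in the quoted command
    # Run occ as www-data, the user that owns the Nextcloud files,
    # instead of the container's default user
    if docker ps --format '{{.Names}}' 2>/dev/null | grep -qx "$NAME"; then
        docker exec -u www-data "$NAME" /var/www/html/occ preview:generate-all -vvv
    fi
    ```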

  7. 2 hours ago, Cessquill said:

    If you just mean running the Nextcloud's cron tasks periodically, I have the following in the User Scripts plugin set to run every 5 minutes

     

    #!/bin/bash
    #docker exec -u www-data Nextcloud php -f /var/www/html/cron.php
    docker exec Nextcloud php -f /var/www/html/cron.php
    exit 0

     

    Disclaimer: this was set a fair while ago, and aside from updating the docker I haven't been diving too deep into Nextcloud, just using it.

    Please use the line without the # before docker (the one with -u www-data), otherwise it will not run ;)
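
     

    In other words, a minimal version of that script with the www-data line active (container name as in the quote; the running-check is my addition, not part of the original script):

    ```shell
    #!/bin/bash
    NAME=Nextcloud
    # Run Nextcloud's background jobs (cron.php) as the www-data user,
    # the line that was commented out in the quoted script
    if docker ps --format '{{.Names}}' 2>/dev/null | grep -qx "$NAME"; then
        docker exec -u www-data "$NAME" php -f /var/www/html/cron.php
    fi
    ```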

  8. 25 minutes ago, hopstah said:

     

    As I've deleted the docker image since this all happened, I can't show exactly what I did, but I did not use the external storage plugin - I hadn't even gotten that far in the setup process. That's what I'll do once I get my data back. Thanks for the tip.


    OK, I think you tried to mount your share to user/files, and that folder gets recreated by Nextcloud as an empty directory.

     

    Give it a try: mount your share somewhere as read-only and, in External Storage, choose "Local" and pick that folder.

  9. 57 minutes ago, hopstah said:

     

    Thank you. I'm aware of that and that's what I did. What I don't understand is why that resulted in the data in those directories being wiped. Again - not trying to be a jerk, just not understanding why the folders that I mapped for Nextcloud to use got their contents deleted.


    I can only repeat my question: where did you map the folder to?

     

    Did you try to mount it into a user folder, or did you use the External Storage plugin as described in post #1 here?

     

    cheers

  10. 7 hours ago, hopstah said:

     

    How would you recommend making my existing data accessible through Nextcloud at their existing locations? That's ultimately what I'd like is read/write access to my existing data.


    Please read about volume mounting in Docker: a folder on your host system can be mapped to any location inside your container.

  11. 45 minutes ago, hopstah said:

    I attempted to install Nextcloud yesterday evening and I set my User files path to be my existing data directory, and I created additional paths for other directories I wanted to access through Nextcloud.

     

    Unfortunately something went terribly wrong and all of the contents of those directories were deleted when I started up the docker for the first time. I am baffled as to why this would have happened and would like to not repeat the same thing in the future. I'm hoping to be able to cobble together my data again from various backup sources, but this is a pretty big screw up on my part, even though I'm not sure what I did wrong.


    Oh damn!

     

    hope you can recover your data.

     

    Please mount your share to a non-existing folder like /share.

     

    where did you mount the folders to?

     

    Maybe mount that folder as read-only next time.

     

    cheers.
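
     

    As a sketch of what that mapping looks like on the docker command line (paths and image name are examples only, not this template's actual settings):

    ```shell
    #!/bin/bash
    HOST_DIR=/mnt/user/Media   # example host share - adjust to yours
    DEST=/share                # non-existing folder inside the container
    # -v HOST:DEST:ro maps the host folder into the container read-only,
    # so the container can never delete anything in it
    if command -v docker >/dev/null; then
        docker run -d --name Nextcloud -v "$HOST_DIR:$DEST:ro" nextcloud
    fi
    ```

    Inside Nextcloud you would then add /share via the External Storage app as a "Local" storage.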

  12. 27 minutes ago, mrtrilby said:

    Hi.

     

    You write: "Please make sure you got the volume mounting correct. Please note you can mount any share to, for example, /mnt/Share and mount it in Nextcloud with the "External Storage" app."

     

    I've enabled the External Storage app, but when I go to set it up (following Spaceinvader One's YT video, which I've followed in the past with a different docker image), I get this error message: ""smbclient" is not installed. Mounting of "SMB/CIFS", "SMB/CIFS using OC login" is not possible. Please ask your system administrator to install it."

     

    What can be done to fix this?

     

    He is using a different image. Don't use SMB; use "Local" and then choose the mounted folder.

  13. Hey @SidM,

     

    remember that you are using Docker, and Docker is cool.

     

    So, a simpler alternative:
     

    Create a share for each user and mount the shares under

    /var/www/html/data/USERNAME

    Each share then needs to contain the folders

    cache/  files/  files_trashbin/  files_versions/  uploads/
     

    Cheers
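
     

    As a sketch, such a per-user mapping could look like this (the username and share path are made up for illustration; the image name is an assumption too):

    ```shell
    #!/bin/bash
    NC_USER=alice   # illustrative Nextcloud username
    # Mount one host share per user into Nextcloud's data directory.
    # Inside the share you then need the folders:
    # cache/ files/ files_trashbin/ files_versions/ uploads/
    if command -v docker >/dev/null; then
        docker run -d --name Nextcloud \
            -v "/mnt/user/nc-$NC_USER:/var/www/html/data/$NC_USER" \
            nextcloud
    fi
    ```

    Repeat the -v line once per user, each pointing at that user's own share.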

  14. Hey folks!

     

    so I had to swap a disk for the second time this year, and I was wondering why my HDDs don't live longer than 3 years.
    I tried to preclear the same disk just to give it a try, but after 2 min of pre-read it failed - so I figured, OK, that's trash.

    After buying a new one and preclearing it, everything is fine now, but I ask myself why this disk failed so early.

    I put the disk in my Windows machine and used HDDScan to read all sectors of the disk - there was no bad sector, only one with a > 500 ms read, and that's not a disk failure. The SMART state is good...

     

    Why does Unraid say that this disk is in an error state?

    thank you!

     

     

    WD3TB_nf_Screenshot 2022-12-18 153636.png

    WD3TBsmart.pdf WD3TBres.pdf

  15. Hey everyone,

     

    I have changed the docker image to a slim Python image, since the previous one took up a lot of space.

     

    IMPORTANT:

    Please leave the "post arguments" field blank, as the latest docker image includes a runnable start command.

     

    Please update your config file in order to make autoscan work again.

    Folders are now given with their full path.

    Folder and filename are now separated with a ';'.

     

    "/Archiv/Test1/TESTARC;Filename": ["Keyword1"]

     

    Example: https://github.com/maschhoff/prpdf/blob/main/config/config.json

  16. Hi folks,

     

    so I updated to 6.11 today and I have a problem installing the plugin again.

    After installing it manually and restarting, I tried to select the tbsos-DVB driver, but it failed.

    Any idea why, and how I can fix that?

     

    Cheers

     

    Quote

    --Please wait, downloading tbsos-DVB-Driver-Package------------

    ---------------Can't download tbsos-DVB-Driver-Package---

     
