[SUPPORT] - stable-diffusion Advanced



9 hours ago, Holaf said:

You can just edit the template and change the WEBUI_VERSION.
Interfaces are stored in different folders and work alongside each other.
You can even run multiple containers pointing to the same local folder at the same time.

 

Let's say in the GUI Docker tab I click on the stable-diffusion docker and click Edit to change the existing template. If I just change the WEBUI_VERSION and install Kohya in addition to my existing ComfyUI installation (done via the container), I assume this means I have both ComfyUI and Kohya in the same container. When the container starts, which UI starts running? And if I don't change the WebUI port, which UI will I see when I open the default GUI address in the browser?

 

I might have misunderstood what you said. It seems like it's better to run multiple containers, with each one controlling the startup of a particular GUI. (I assume they can share the same /config folder.) Otherwise I don't know how I can choose whether to start Kohya or ComfyUI when they're both in the same container.
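(For illustration, a rough sketch of what two containers sharing one appdata folder could look like if run by hand instead of through the Unraid template; the image name, the WEBUI_VERSION values and the ports below are placeholders and assumptions, not values confirmed in this thread:)

# Container 1: ComfyUI, web UI on host port 9000
docker run -d --name sd-comfyui \
  -e WEBUI_VERSION=<comfyui-version> \
  -p 9000:9000 \
  -v /mnt/user/appdata/stable-diffusion:/config \
  <stable-diffusion-image>

# Container 2: Kohya, same /config folder, different host port
docker run -d --name sd-kohya \
  -e WEBUI_VERSION=<kohya-version> \
  -p 9070:9070 \
  -v /mnt/user/appdata/stable-diffusion:/config \
  <stable-diffusion-image>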

 

I would have tried just editing the template, but I have spent some time getting ComfyUI running the same way I had it before in my VM, so I'm a little scared of borking the whole thing.

 

 

Link to comment
4 hours ago, Holaf said:

@sonofdbn If you want to run multiple UIs at the same time you have to run multiple containers.
But those containers can point at the same folder on Unraid (e.g. /mnt/user/appdata/stable-diffusion).

 

So now I've installed Kohya in a separate container, but using the same folder /mnt/user/appdata/stable-diffusion for /config. As expected, I ended up with a 70-kohya sub-folder there, along with my existing 05-comfy-ui folder.

 

On initial launch it seemed OK; the interface (assigned to port 9070; ComfyUI is on 9000) came up fine, but I didn't run anything. But when I restarted after adding some paths to the template for the outputs and model folders, the log went into a loop and the web UI never came up.

 

I thought it might be a problem with ComfyUI running at the same time, so I stopped ComfyUI and then restarted Kohya, but there was again an endless loop of messages in the log file. I couldn't find the log saved anywhere, but I grabbed what I could from it after I stopped the container. That's in the attachment.
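(In case it helps anyone else capture the log the same way: docker logs still works on a stopped container as long as it hasn't been removed, so something like the line below, with your own container name, saves the whole output to a file.)

# Dump the full container log to a file (container name is just an example)
docker logs stable-diffusion-kohya > Kohya_log.txt 2>&1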

 

I saw in the log (but keep in mind I'm not a Python person) that "library" keeps installing and uninstalling, and I don't know whether the hyphen makes a difference.

 

The log also says there's no venv folder, but I do have a venv folder under 70-kohya.
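(One way to double-check what the launcher actually sees, with the venv path assumed from the requirements path shown in the log:)

# List the venv folder from inside the running container
docker exec -it stable-diffusion-kohya ls -la /config/70-kohya/kohya_ss/venv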

 

However, ComfyUI still runs OK. I'll try installing FaceFusion or maybe A1111 in another container and see how that goes.

Kohya_log.txt

Link to comment

I tried a few more things: I installed FaceFusion in a separate container, but using the same /config folder. It worked fine, with ComfyUI running as well.

 

I tried reinstalling Kohya again, separate container, same /config, but got the same endless loop.

 

I installed Kohya again, but this time in a different /config folder (/mnt/user/appdata/kohya instead of the /mnt/user/appdata/stable-diffusion used by the others). Installation went smoothly, and the Torch part of the log looked OK after the NVIDIA toolkit was detected:

 

23:32:46-955597 INFO     Version: v22.5.0                                       
                                                                                
23:32:46-959892 INFO     nVidia toolkit detected                                
23:32:50-012332 INFO     Torch 2.1.2+cu121                                      
23:32:50-016632 INFO     Torch backend: nVidia CUDA 12.1 cuDNN 8902             
23:32:50-026714 INFO     Torch detected GPU: NVIDIA GeForce RTX 4070 VRAM 12009 
                         Arch (8, 9) Cores 46                                   
23:32:50-027610 INFO     Verifying modules installation status from             
                         /config/70-kohya/kohya_ss/requirements_linux.txt...    
23:32:50-028561 INFO     Installing package: xformers==0.0.21                   
                         bitsandbytes==0.41.1    

 

The GUI came up fine (I didn't run anything). Then I shut down the container and restarted. This time I was back in the endless loop, with the same "Could not load torch" error as before, and no GUI. A partial log is attached.

Kohya_log_2.txt

Link to comment

Hey Holaf,

 

I found an interesting issue with the Automatic1111 implementation that may extend to others. From what I can tell, the container creates a symlink from the chosen WebUI's folder to the folder in the base directory. This works for checkpoint models, but doesn't appear to be working for LoRAs. See the attached image showing that I have files in the /config/models/lora folder and have confirmed the functionality of the symlink at /config/02-sd-webui/webui/models/Lora. Automatic1111, however, says they're not there, even though it IS able to see my checkpoints.
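(A quick way to inspect the link and its target from a console inside the container, using the paths mentioned above, would be something like:)

# Show the symlink itself and where it resolves to
ls -la /config/02-sd-webui/webui/models/Lora
readlink -f /config/02-sd-webui/webui/models/Lora
# List what A1111 should be seeing through the link
ls /config/models/lora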

 

My apologies if this is an issue with Automatic1111 and not your image. I just find it odd that it'd be able to load one set, but not the other. And the link has to work well enough for the Civit.ai WebHelper Extension to be able to download and save the files through the symlink to the correct folder.

 

image.png

Link to comment

Hi everyone,

 

I'm having an issue installing stable-diffusion. Hopefully someone can point me in the right direction? I install it easily enough, but it gets stuck during the post-install script where it is trying to download conda-forge. Here is an excerpt from the log:

The following NEW packages will be INSTALLED:

  _libgcc_mutex      conda-forge/linux-64::_libgcc_mutex-0.1-conda_forge 
  _openmp_mutex      conda-forge/linux-64::_openmp_mutex-4.5-2_gnu 
  bzip2              conda-forge/linux-64::bzip2-1.0.8-hd590300_5 
  c-ares             conda-forge/linux-64::c-ares-1.26.0-hd590300_0 
  ca-certificates    conda-forge/linux-64::ca-certificates-2023.11.17-hbcca054_0 
  curl               conda-forge/linux-64::curl-8.5.0-hca28451_0 
  gettext            conda-forge/linux-64::gettext-0.21.1-h27087fc_0 
  git                conda-forge/linux-64::git-2.43.0-pl5321h7bc287a_0 
  keyutils           conda-forge/linux-64::keyutils-1.6.1-h166bdaf_0 
  krb5               conda-forge/linux-64::krb5-1.21.2-h659d440_0 
  ld_impl_linux-64   conda-forge/linux-64::ld_impl_linux-64-2.40-h41732ed_0 
  libcurl            conda-forge/linux-64::libcurl-8.5.0-hca28451_0 
  libedit            conda-forge/linux-64::libedit-3.1.20191231-he28a2e2_2 
  libev              conda-forge/linux-64::libev-4.33-hd590300_2 
  libexpat           conda-forge/linux-64::libexpat-2.5.0-hcb278e6_1 
  libffi             conda-forge/linux-64::libffi-3.4.2-h7f98852_5 
  libgcc-ng          conda-forge/linux-64::libgcc-ng-13.2.0-h807b86a_4 
  libgomp            conda-forge/linux-64::libgomp-13.2.0-h807b86a_4 
  libiconv           conda-forge/linux-64::libiconv-1.17-hd590300_2 
  libnghttp2         conda-forge/linux-64::libnghttp2-1.58.0-h47da74e_1 
  libnsl             conda-forge/linux-64::libnsl-2.0.1-hd590300_0 
  libsqlite          conda-forge/linux-64::libsqlite-3.44.2-h2797004_0 
  libssh2            conda-forge/linux-64::libssh2-1.11.0-h0841786_0 
  libstdcxx-ng       conda-forge/linux-64::libstdcxx-ng-13.2.0-h7e041cc_4 
  libuuid            conda-forge/linux-64::libuuid-2.38.1-h0b41bf4_0 
  libxcrypt          conda-forge/linux-64::libxcrypt-4.4.36-hd590300_1 
  libzlib            conda-forge/linux-64::libzlib-1.2.13-hd590300_5 
  ncurses            conda-forge/linux-64::ncurses-6.4-h59595ed_2 
  openssl            conda-forge/linux-64::openssl-3.2.0-hd590300_1 
  pcre2              conda-forge/linux-64::pcre2-10.42-hcad00b1_0 
  perl               conda-forge/linux-64::perl-5.32.1-7_hd590300_perl5 
  pip                conda-forge/noarch::pip-23.3.2-pyhd8ed1ab_0 
  python             conda-forge/linux-64::python-3.10.13-hd12c33a_1_cpython 
  readline           conda-forge/linux-64::readline-8.2-h8228510_1 
  setuptools         conda-forge/noarch::setuptools-69.0.3-pyhd8ed1ab_0 
  tk                 conda-forge/linux-64::tk-8.6.13-noxft_h4845f30_101 
  tzdata             conda-forge/noarch::tzdata-2023d-h0c530f3_0 
  wheel              conda-forge/noarch::wheel-0.42.0-pyhd8ed1ab_0 
  xz                 conda-forge/linux-64::xz-5.2.6-h166bdaf_0 
  zstd               conda-forge/linux-64::zstd-1.5.5-hfc55251_0 



Downloading and Extracting Packages: ...working... done
Preparing transaction: ...working... done
Verifying transaction: ...working... done
ERROR conda.core.link:_execute(945): An error occurred while installing package 'conda-forge::ca-certificates-2023.11.17-hbcca054_0'.
Executing transaction: ...working... done
Rolling back transaction: ...working... done

[Errno 95] Operation not supported: 'cacert.pem' -> '/config/03-invokeai/env/ssl/cert.pem'
()

 

Then it just keeps retrying and it loops over and over again.
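(For anyone hitting the same thing: Errno 95 is "Operation not supported", which usually means the filesystem under /config refused the link operation conda attempts when it sets up env/ssl/cert.pem. A quick, rough way to test whether that appdata path accepts symlinks at all, with the path taken from the log above:)

# Try creating and removing a throwaway symlink where conda failed
cd /mnt/user/appdata/stable-diffusion/03-invokeai
ln -s /etc/hostname symlink-test && ls -la symlink-test && rm symlink-test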

Link to comment
1 hour ago, Holaf said:

@mrguymiah Strange, it's working fine for me 😵💫
(screenshot attached)

 

When you go into /config/02-sd-webui/webui/models/Lora do you see your files?

Yes. When I hit "Enter" on the symlink, it does show the contents of /config/models/lora. And, as you can see in my image (reposted for convenience), the symlink reports as functional in Midnight Commander and has the appropriate capitalization. Could it be an issue that the folder the symlink points to has a lowercase "l"?

 

(screenshot reposted)

 

Link to comment

@GradwellZA Perhaps they changed something in the installer 🤔
I'll take a look.

@mrguymiah
What matters is the name of the link; the name of the folder it points to doesn't change anything.
Once the interface is launched, you can try deleting the Lora symlink and replacing it with your folder (with an uppercase L).
Then click refresh to check whether it sees the LoRAs 🤷‍♂️
(At the next launch of the container, the folder will be moved back to the common folder.)
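(Roughly what that swap could look like from a console inside the container, as a sketch only; the container name is an assumption and the paths are the ones from earlier in the thread:)

docker exec -it sd-webui bash
# Remove only the symlink, not the folder it points to
rm /config/02-sd-webui/webui/models/Lora
# Put the real folder in its place, with the uppercase L that A1111 expects
mv /config/models/lora /config/02-sd-webui/webui/models/Lora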

Link to comment
15 minutes ago, Holaf said:

@GradwellZA Perhaps they changed something in the installer 🤔
I'll take a look.

@mrguymiah
What matters is the name of the link; the name of the folder it points to doesn't change anything.
Once the interface is launched, you can try deleting the Lora symlink and replacing it with your folder (with an uppercase L).
Then click refresh to check whether it sees the LoRAs 🤷‍♂️
(At the next launch of the container, the folder will be moved back to the common folder.)

I'll give that a shot and get back to you.

Link to comment
12 hours ago, Holaf said:

@GradwellZA Perhaps they changed something in the installer 🤔
I'll take a look.

@mrguymiah
What matters is the name of the link; the name of the folder it points to doesn't change anything.
Once the interface is launched, you can try deleting the Lora symlink and replacing it with your folder (with an uppercase L).
Then click refresh to check whether it sees the LoRAs 🤷‍♂️
(At the next launch of the container, the folder will be moved back to the common folder.)

Perfect, thanks so much, that would be amazing. I tried redoing the installer last night, and this time I selected 02 - Automatic1111 instead of 03 - InvokeAI WebUI. That worked perfectly, so it looks like it's just the 03 - InvokeAI WebUI install that's causing an issue. If you could figure it out and point me in the right direction, that would be absolutely amazing!

Link to comment
13 hours ago, Holaf said:

@GradwellZA Perhaps they changed something in the installer 🤔
I'll take a look.

@mrguymiah
What matters is the name of the link; the name of the folder it points to doesn't change anything.
Once the interface is launched, you can try deleting the Lora symlink and replacing it with your folder (with an uppercase L).
Then click refresh to check whether it sees the LoRAs 🤷‍♂️
(At the next launch of the container, the folder will be moved back to the common folder.)

 

This is also something that comes up during the loop cycle: 

 

** INVOKEAI INSTALLATION SUCCESSFUL **
If you installed manually from source or with 'pip install': activate the virtual environment
then run one of the following commands to start InvokeAI.

Web UI:
   invokeai-web

Command-line client:
   invokeai

If you installed using an installation script, run:
  /config/03-invokeai/invokeai/invoke.sh

Add the '--help' argument to see all of the command-line switches available for use.

[2024-01-28 01:08:16,911]::[InvokeAI]::INFO --> Patchmatch initialized
/home/abc/miniconda3/lib/python3.11/site-packages/torchvision/transforms/functional_tensor.py:5: UserWarning: The torchvision.transforms.functional_tensor module is deprecated in 0.15 and will be **removed in 0.17**. Please don't rely on it. You probably just need to use APIs in torchvision.transforms.functional or in torchvision.transforms.v2.functional.
  warnings.warn(
[2024-01-28 01:08:21,568]::[uvicorn.error]::INFO --> Started server process [1055]
[2024-01-28 01:08:21,568]::[uvicorn.error]::INFO --> Waiting for application startup.
[2024-01-28 01:08:21,569]::[InvokeAI]::INFO --> InvokeAI version 3.6.2
[2024-01-28 01:08:21,569]::[InvokeAI]::INFO --> Root directory = /config/03-invokeai/invokeai
[2024-01-28 01:08:21,573]::[InvokeAI]::INFO --> Initializing database at /config/03-invokeai/invokeai/databases/invokeai.db
[2024-01-28 01:08:26,606]::[InvokeAI]::ERROR --> Problem creating migrations table: database is locked
[2024-01-28 01:08:26,609]::[uvicorn.error]::ERROR --> Traceback (most recent call last):
  File "/home/abc/miniconda3/lib/python3.11/site-packages/invokeai/app/services/shared/sqlite_migrator/sqlite_migrator_impl.py", line 101, in _create_migrations_table
    cursor.execute(
sqlite3.OperationalError: database is locked

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/abc/miniconda3/lib/python3.11/site-packages/starlette/routing.py", line 705, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/home/abc/miniconda3/lib/python3.11/site-packages/starlette/routing.py", line 584, in __aenter__
    await self._router.startup()
  File "/home/abc/miniconda3/lib/python3.11/site-packages/starlette/routing.py", line 682, in startup
    await handler()
  File "/home/abc/miniconda3/lib/python3.11/site-packages/invokeai/app/api_app.py", line 105, in startup_event
    ApiDependencies.initialize(config=app_config, event_handler_id=event_handler_id, logger=logger)
  File "/home/abc/miniconda3/lib/python3.11/site-packages/invokeai/app/api/dependencies.py", line 73, in initialize
    db = init_db(config=config, logger=logger, image_files=image_files)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/abc/miniconda3/lib/python3.11/site-packages/invokeai/app/services/shared/sqlite/sqlite_util.py", line 34, in init_db
    migrator.run_migrations()
  File "/home/abc/miniconda3/lib/python3.11/site-packages/invokeai/app/services/shared/sqlite_migrator/sqlite_migrator_impl.py", line 47, in run_migrations
    self._create_migrations_table(cursor=cursor)
  File "/home/abc/miniconda3/lib/python3.11/site-packages/invokeai/app/services/shared/sqlite_migrator/sqlite_migrator_impl.py", line 116, in _create_migrations_table
    raise MigrationError(msg) from e
invokeai.app.services.shared.sqlite_migrator.sqlite_migrator_common.MigrationError: Problem creating migrations table: database is locked

[2024-01-28 01:08:26,609]::[uvicorn.error]::ERROR --> Application startup failed. Exiting.
Unknown args: ['--web']
Unknown args: ['--web']
Unknown args: ['--web']
Unknown args: ['--web']

App is starting!
Channels:
 - defaults
Platform: linux-64
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done

# All requested packages already installed.

Channels:
 - conda-forge
 - defaults
Platform: linux-64
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done

## Package Plan ##

  environment location: /config/03-invokeai/env

  added / updated specs:
    - git
    - python=3.10


The following NEW packages will be INSTALLED:

  _libgcc_mutex      conda-forge/linux-64::_libgcc_mutex-0.1-conda_forge 
  _openmp_mutex      conda-forge/linux-64::_openmp_mutex-4.5-2_gnu 
  bzip2              conda-forge/linux-64::bzip2-1.0.8-hd590300_5 
  c-ares             conda-forge/linux-64::c-ares-1.26.0-hd590300_0 
  ca-certificates    conda-forge/linux-64::ca-certificates-2023.11.17-hbcca054_0 
  curl               conda-forge/linux-64::curl-8.5.0-hca28451_0 
  gettext            conda-forge/linux-64::gettext-0.21.1-h27087fc_0 
  git                conda-forge/linux-64::git-2.43.0-pl5321h7bc287a_0 
  keyutils           conda-forge/linux-64::keyutils-1.6.1-h166bdaf_0 
  krb5               conda-forge/linux-64::krb5-1.21.2-h659d440_0 
  ld_impl_linux-64   conda-forge/linux-64::ld_impl_linux-64-2.40-h41732ed_0 
  libcurl            conda-forge/linux-64::libcurl-8.5.0-hca28451_0 
  libedit            conda-forge/linux-64::libedit-3.1.20191231-he28a2e2_2 
  libev              conda-forge/linux-64::libev-4.33-hd590300_2 
  libexpat           conda-forge/linux-64::libexpat-2.5.0-hcb278e6_1 
  libffi             conda-forge/linux-64::libffi-3.4.2-h7f98852_5 
  libgcc-ng          conda-forge/linux-64::libgcc-ng-13.2.0-h807b86a_4 
  libgomp            conda-forge/linux-64::libgomp-13.2.0-h807b86a_4 
  libiconv           conda-forge/linux-64::libiconv-1.17-hd590300_2 
  libnghttp2         conda-forge/linux-64::libnghttp2-1.58.0-h47da74e_1 
  libnsl             conda-forge/linux-64::libnsl-2.0.1-hd590300_0 
  libsqlite          conda-forge/linux-64::libsqlite-3.44.2-h2797004_0 
  libssh2            conda-forge/linux-64::libssh2-1.11.0-h0841786_0 
  libstdcxx-ng       conda-forge/linux-64::libstdcxx-ng-13.2.0-h7e041cc_4 
  libuuid            conda-forge/linux-64::libuuid-2.38.1-h0b41bf4_0 
  libxcrypt          conda-forge/linux-64::libxcrypt-4.4.36-hd590300_1 
  libzlib            conda-forge/linux-64::libzlib-1.2.13-hd590300_5 
  ncurses            conda-forge/linux-64::ncurses-6.4-h59595ed_2 
  openssl            conda-forge/linux-64::openssl-3.2.0-hd590300_1 
  pcre2              conda-forge/linux-64::pcre2-10.42-hcad00b1_0 
  perl               conda-forge/linux-64::perl-5.32.1-7_hd590300_perl5 
  pip                conda-forge/noarch::pip-23.3.2-pyhd8ed1ab_0 
  python             conda-forge/linux-64::python-3.10.13-hd12c33a_1_cpython 
  readline           conda-forge/linux-64::readline-8.2-h8228510_1 
  setuptools         conda-forge/noarch::setuptools-69.0.3-pyhd8ed1ab_0 
  tk                 conda-forge/linux-64::tk-8.6.13-noxft_h4845f30_101 
  tzdata             conda-forge/noarch::tzdata-2023d-h0c530f3_0 
  wheel              conda-forge/noarch::wheel-0.42.0-pyhd8ed1ab_0 
  xz                 conda-forge/linux-64::xz-5.2.6-h166bdaf_0 
  zstd               conda-forge/linux-64::zstd-1.5.5-hfc55251_0 



Downloading and Extracting Packages: ...working... done
Preparing transaction: ...working... done
Verifying transaction: ...working... done
ERROR conda.core.link:_execute(945): An error occurred while installing package 'conda-forge::ca-certificates-2023.11.17-hbcca054_0'.
Executing transaction: ...working... done
Rolling back transaction: ...working... done

[Errno 95] Operation not supported: 'cacert.pem' -> '/config/03-invokeai/env/ssl/cert.pem'
()
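(A side note on the "database is locked" part of that log: SQLite raises that error when another process still has invokeai.db open, which would fit the installer relaunching InvokeAI while a previous instance is still running. A rough way to check from the Unraid console, assuming the container name and that the tools exist in the image:)

# See whether more than one InvokeAI process is running inside the container
docker exec stable-diffusion ps aux | grep -i invokeai
# If lsof is present in the image, see what is holding the database open
docker exec stable-diffusion lsof /config/03-invokeai/invokeai/databases/invokeai.db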

 

Link to comment
On 1/24/2024 at 4:15 AM, Holaf said:

I'll have a look at it ...
I don't use Kohya, so I admit I didn't do a lot of tests with it 😅

 

I'd be interested to know if anyone else has Kohya running successfully in the docker.

 

I ended up going back to running everything natively in my Win11 VM, largely because I don't know enough to go into the docker and do all the fiddling around that the various packages always seem to require.

 

But while I can natively run Automatic1111 and ComfyUI, native (Windows) Kohya constantly gives me problems. I've only been able to run it successfully under Pinokio.

Link to comment
On 1/21/2024 at 5:19 PM, Holaf said:

The best way to do this is via a reverse-proxy.
https://nginxproxymanager.com/ is a good one, easy to use, and you can find a lot of help on the internet
 

 

Thanks for the suggestion.

I've been trying to configure it through an nginx reverse proxy, but it's proving difficult. I'm using a Cloudflare tunnel, so unfortunately I no longer have access to port forwarding.

I did have nginx set up and working fine before the tunnel.

I've just got open public access to Invoke AI right now.  🤦🏻‍♂️

Link to comment
On 1/29/2024 at 10:07 AM, FlorianXL said:

Please specify exactly how I point my embeddings on my array to the A1111 container. I got it working with LoRAs, but not with my embeddings.

I had the same issue; it looks like it's a bug in Auto1111.
I reinstalled it from scratch and the embeddings were back.

Embeddings should be stored in the models/embeddings folder (e.g. /mnt/user/appdata/stable-diffusion/models/embeddings).

If you don't want to reinstall, you can try adding this option to the parameters.txt file:
--embeddings-dir "/config/models/embeddings"
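(For example, appending it from the Unraid console could look like the line below; whether parameters.txt sits directly in the 02-sd-webui folder, and whether each flag goes on its own line, are assumptions, so check how your install formats that file first.)

# Append the flag to the A1111 parameters file (path and format are assumptions)
echo '--embeddings-dir "/config/models/embeddings"' >> /mnt/user/appdata/stable-diffusion/02-sd-webui/parameters.txt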
 

Link to comment
