Infotrend Inc.

  1. Sorry for the delay, I forgot to enable notifications for this page. Unfortunately, CTPO is a partial rewrite of the underlying solution, done to enable the creation of a Dockerfile that end users wanting to rebuild the container can use independently. The other change is that /iti replaces the mount that used to be /dmc, but the two should be mostly compatible with one another. Did you happen to check the logs from the Unraid dropdown for the container?
  2. Glad the environment variable issue got resolved. As for the networking issue, I am unsure how to help, unfortunately.
  3. Please check the post below for the original FAQ; new topics will be added here as needed.
     Q: How do I change the default Lab password?
     A: The Jupyter user's home directory is mounted outside of the container. As such, if you use the "Setup a Password" option on the Log In page, the hashed password is stored in the mounted "home/.jupyter/jupyter_server_config.json" file and will survive container restarts and upgrades.
     Q: How do I reset the Lab password to the default?
     A: In the "appdata" directory, find the existing "home/.jupyter/jupyter_server_config.json" file and delete the "IdentityProvider" block (a sketch of this edit follows below). After you restart the container, the token reverts to the default value (iti).
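     A minimal sketch of that reset, assuming the config file lives under your appdata share at a path like the one below (adjust it to your setup); editing the JSON by hand and deleting the "IdentityProvider" block works just as well. Back the file up first.

        # Hedged sketch of the reset described above: drop the "IdentityProvider" block
        # from jupyter_server_config.json so the container falls back to the default token.
        # The path is an assumption -- adjust it to wherever your appdata share lives.
        import json

        config_path = "/mnt/user/appdata/jupyter-ctpo/home/.jupyter/jupyter_server_config.json"

        with open(config_path) as f:
            config = json.load(f)

        config.pop("IdentityProvider", None)  # remove the stored hashed password, if present

        with open(config_path, "w") as f:
            json.dump(config, f, indent=2)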
  4. Unraid support thread for Jupyter-CTPO and Jupyter-TPO (short FAQ in the next message).
     Changelog:
     - 20240421: Release with support for CUDA 12.3.2, TensorFlow 2.16.1, PyTorch 2.2.2 and OpenCV 4.9.0
     - 20231120: Initial release, with support for CUDA 11.8.0, TensorFlow 2.12.0, PyTorch 2.0.1 and OpenCV 4.7.0
     Project details: https://github.com/Infotrend-Inc/CTPO/
     For Jupyter-CTPO: if you have multiple GPUs with some allocated to VMs, make sure to change --gpus all (see below). The default password for the notebook is iti. The system runs as the jupyter user (which has sudo privileges), and /iti is where you can place your weights and other files to support your development.
     Jupyter-TPO (> 5GB download): Unraid-compatible Jupyter Lab (Python kernel) container with CPU-ready TensorFlow, PyTorch, OpenCV, etc.
     Jupyter-CTPO (> 19GB download): Unraid-compatible Jupyter Lab (Python kernel) container with GPU-optimized TensorFlow, PyTorch, OpenCV, etc. This GPU-bound container requires the Nvidia driver to be installed on your Unraid server with support for Docker, and the driver needs to support the version of CUDA used by this container. The template adds --gpus all to the way the Docker container is started so it gets access to the GPU(s). The Unraid Nvidia plugin is available in the Community Apps store.
     If you have multiple GPUs in your system with some allocated to VMs, make sure to replace --gpus all with --runtime=nvidia and follow the steps below to set the NVIDIA_DRIVER_CAPABILITIES and NVIDIA_VISIBLE_DEVICES variables so the container only gets access to the selected GPUs. A quick visibility check you can run from a notebook is sketched below.
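     A minimal sketch, not part of the template, to confirm from a notebook inside the container that the GPUs you exposed (via --gpus all or NVIDIA_VISIBLE_DEVICES) are actually visible; it only relies on the TensorFlow and PyTorch packages already bundled in Jupyter-CTPO.

        # Hedged sketch: quick GPU-visibility check from inside a Jupyter-CTPO notebook.
        import torch
        import tensorflow as tf

        # PyTorch's view of the GPUs the container was given
        print("PyTorch CUDA available:", torch.cuda.is_available())
        print("PyTorch device count:  ", torch.cuda.device_count())

        # TensorFlow's view of the same devices
        print("TensorFlow GPUs:", tf.config.list_physical_devices("GPU"))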
  5. First, thank you for the note on the spaces; I should have fixed this in the latest branch (soon to be released as v0.9.1).
     To answer your question: yes, the tool has a limited list of supported models (it displays a small help text for each model based on that knowledge). I am adding a UI error (in the latest branch) that will tell you which models from your list are not supported.
     In particular, for gpt-3.5, the older models ("0301", "0613") are "Legacy" models per https://platform.openai.com/docs/models/gpt-3-5, so we did not include them in the release.
     The current list of supported models is in the ".env.example" file (see first post), which currently contains:
     gpt-3.5-turbo
     gpt-3.5-turbo-16k
     gpt-3.5-turbo-1106
     gpt-4
     gpt-4-32k
     gpt-4-1106-preview
     As for the "instruct" models, their model page describes them as offering "Similar capabilities as text-davinci-003 but compatible with legacy Completions endpoint and not Chat Completions". Knowing that "text-davinci-003" is a legacy API at this point, they were not added either. We have also removed that piece of code (legacy completion) in favor of the chat one (see the sketch below).
     I hope this helps answer your questions.
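     For reference, a minimal sketch of the Chat Completions call that replaces the legacy Completions endpoint, assuming the openai 1.x Python package mentioned in the changelog; the model name and prompt are only illustrative, and this is not the tool's actual code.

        # Hedged sketch (not the tool's code): Chat Completions with the openai 1.x package.
        # Assumes OPENAI_API_KEY is set in the environment; model and prompt are illustrative.
        from openai import OpenAI

        client = OpenAI()  # reads OPENAI_API_KEY from the environment
        response = client.chat.completions.create(
            model="gpt-3.5-turbo-1106",
            messages=[{"role": "user", "content": "Say hello"}],
        )
        print(response.choices[0].message.content)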
  6. That usually means the script encountered an error before the Streamlit UI was able to boot; this is usually related to the provided environment variables, all of which need to have a value. From the Docker tab, I would suggest clicking on the container's icon and checking the logs for errors after you start it.
     Environment variable check errors were not being passed to the UI; I will see about adding UI errors when possible.
     I have deployed the UI on two different Unraid servers for testing without encountering this issue. The tool works directly with reverse proxies that communicate with the IP:PORT of the Unraid installation; this has been tested using Nginx Proxy Manager as well as Cloudflare tunnels.
  7. Unraid-ready WebUI (Streamlit-based) for ChatGPT's and Dall-E's APIs (requires an OpenAI API key).
     Latest version: v0.9.1 (20231120)
     The tool's purpose is to let a company install a self-hosted WebUI to access the capabilities of OpenAI's ChatGPT and DallE, and share access to those capabilities while consolidating billing through a single OpenAI API key.
     Please see https://github.com/Infotrend-Inc/OpenAI_WebUI/blob/main/.env.example for details of the possible values for the environment variables. Even if a feature is not used, its environment variable should be set; a simple pre-flight check is sketched below.
     This container is built from the source available at https://github.com/Infotrend-Inc/OpenAI_WebUI
     The container requires a set of environment variables to work, but will inform the end user if an expected value is missing.
     Changelog:
     - v0.9.1 (20231120): Print Streamlit errors in case of problems with environment variables + addition of gpt-3.5-turbo-1106 to the list of supported models (added in openai Python package 1.3.0) + added optional `OAIWUI_USERNAME` environment variable
     - v0.9.0 (20231108): Initial release -- incorporating modifications brought by the latest OpenAI Python package (tested against 1.2.0)
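     A minimal sketch, not part of the container, of the kind of pre-flight check described above: every expected environment variable must be present and non-empty before the UI starts. The variable names below are placeholders; the authoritative list is in the project's .env.example file.

        # Hedged sketch (not the container's code): verify that every expected environment
        # variable is present and non-empty, mirroring the rule that variables must be set
        # even for unused features. The names below are placeholders -- replace them with
        # the variables listed in the project's .env.example file.
        import os
        import sys

        EXPECTED_VARS = ["PLACEHOLDER_VAR_1", "PLACEHOLDER_VAR_2"]

        missing = [name for name in EXPECTED_VARS if not os.environ.get(name, "").strip()]
        if missing:
            sys.exit("Missing or empty environment variables: " + ", ".join(missing))
        print("All expected environment variables are set.")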