[Support] infotrend/OpenAI_WebUI


Recommended Posts

Unraid-ready WebUI (Streamlit-based) for OpenAI's ChatGPT and DALL-E APIs (requires an OpenAI API key).

 

Latest version: v0.9.1 (20231120)

 

The tool's purpose is to let a company install a self-hosted WebUI for OpenAI's ChatGPT and DALL-E, share access to those capabilities among its users, and consolidate billing through a single OpenAI API key.

 

Please see https://github.com/Infotrend-Inc/OpenAI_WebUI/blob/main/.env.example for details on the possible values for each environment variable. Even if a feature is not used, its environment variable should still be set.
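
For illustration only, here is a minimal Python sketch of the kind of check this implies (not the container's actual code; it uses only the two variable names that appear later in this thread):

```python
import os
import sys

# Two of the variables discussed in this thread; the full list (including the
# OpenAI API key variable) lives in .env.example.
expected = ["OAIWUI_SAVEDIR", "OAIWUI_GPT_MODELS"]

missing = [name for name in expected if not os.environ.get(name, "").strip()]
if missing:
    # Every expected variable needs a value, even for features you do not use.
    sys.exit("Missing or empty environment variable(s): " + ", ".join(missing))
```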

 

This container is built from the source available at https://github.com/Infotrend-Inc/OpenAI_WebUI

 

The container requires a set of environment variables to work, but will inform the end user if an expected value is missing.

 

Changelog:

- v0.9.1 (20231120): Display Streamlit errors when environment variables are invalid + addition of gpt-3.5-turbo-1106 to the list of supported models (added in openai Python package 1.3.0) + added the optional `OAIWUI_USERNAME` environment variable
- v0.9.0 (20231108): Initial release -- incorporating modifications brought by the latest OpenAI Python package (tested against 1.2.0)

Edited by Infotrend Inc.
added Changelog

Hey there, I've spent a while trying to get this working, but I'm not having much success. Maybe you can advise, @Infotrend Inc.?

 

The container boots fine, but when I go to the web interface, I get a blank dark blue screen. There's a "..." menu on the top right, and I can go to Settings there. There's "Made with Streamlit" down on the bottom left, and there's a gradient from red to yellow across the top.

 

I've tried a couple of different browsers and computers, and I get the same result. Also, when I refresh, a box that says "please wait" pops up for just a moment. I've got no errors in the logs.

 

 

Also, as a minor side note: any chance you can allow reverse proxies, or add a variable for the URL?

 

Edited by kharntiitar
add additional question
19 hours ago, kharntiitar said:

The container boots fine, but when I go to the web interface, I get a blank dark blue screen. There's a "..." menu on the top right, and I can go to Settings there. There's "Made with Streamlit" down on the bottom left, and there's a gradient from red to yellow across the top.

 

I've tried a couple of different browsers and computers, and I get the same result. Also, when I refresh, a box that says "please wait" pops up for just a moment. I've got no errors in the logs.
 


That usually means the script encountered an error before the Streamlit UI was able to boot. Usually this is related to the provided environment variables: all of them need to have a value. I would suggest clicking the container's icon on the Docker tab and checking the logs for errors after you start it. Environment variable check errors were not being passed to the UI; I will see about adding UI errors when possible.
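
For illustration, a minimal sketch (using Streamlit's standard st.error / st.stop calls, not this tool's actual code) of how such a check could be surfaced in the UI instead of leaving a blank page:

```python
import os
import streamlit as st

savedir = os.environ.get("OAIWUI_SAVEDIR", "").strip()
if not savedir:
    # Report the problem in the web UI rather than failing silently before it boots.
    st.error("OAIWUI_SAVEDIR is not set; check the container's environment variables.")
    st.stop()
```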

I have deployed the UI on two different Unraid servers for testing without encountering this issue.

 

19 hours ago, kharntiitar said:

Also, as a minor side note: any chance you can allow reverse proxies, or add a variable for the URL?

 

The tool works directly with reverse proxies that communicate with the IP:PORT of the Unraid installation.
This has been tested using Nginx Proxy Manager as well as Cloudflare tunnels.

Edited by Infotrend Inc.

Is there a limit to "OAIWUI_GPT_MODELS"? I have the models below, but adding them to the field doesn't let me get past the "Enter User" screen. I checked the logs but found nothing. Thanks!

gpt-3.5-turbo,gpt-3.5-turbo-0301,gpt-3.5-turbo-0613,gpt-3.5-turbo-1106,gpt-3.5-turbo-16k,gpt-3.5-turbo-16k-0613,gpt-3.5-turbo-instruct,gpt-3.5-turbo-instruct-0914,gpt-4

Side note: it does NOT like it when there are spaces after the commas.
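
One plausible explanation for the space sensitivity (an assumption about the parsing, not the app's confirmed code) is a plain comma split without trimming:

```python
raw = "gpt-3.5-turbo, gpt-4"                   # a space after the comma
models = raw.split(",")                        # -> ['gpt-3.5-turbo', ' gpt-4']
print("gpt-4" in models)                       # False: the leading space defeats an exact match

models = [m.strip() for m in raw.split(",")]   # trimming each entry fixes it
print("gpt-4" in models)                       # True
```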

Edited by Dmitriyus
On 11/15/2023 at 6:10 PM, Dmitriyus said:

Is there a limit to "OAIWUI_GPT_MODELS"? I have the models below, but adding them to the field doesn't let me get past the "Enter User" screen. I checked the logs but found nothing. Thanks!

gpt-3.5-turbo,gpt-3.5-turbo-0301,gpt-3.5-turbo-0613,gpt-3.5-turbo-1106,gpt-3.5-turbo-16k,gpt-3.5-turbo-16k-0613,gpt-3.5-turbo-instruct,gpt-3.5-turbo-instruct-0914,gpt-4

Side note: it does NOT like it when there are spaces after the commas.

 

First, thank you for the note on the spaces; this should be fixed in the latest branch (soon to be released as v0.9.1).

 

To answer your question: yes, the tool supports a limited list of models (it displays a short help text for each model based on that knowledge).

I am adding a UI error (in the latest branch) that will tell you which models from your list are not supported.

 

In particular, for gpt-3.5, the older models ("0301", "0613") are "Legacy" models per https://platform.openai.com/docs/models/gpt-3-5, so we did not include them in the release.

 

The current list of supported models is in the ".env.example" file (see first post), which currently contains: gpt-3.5-turbo, gpt-3.5-turbo-16k, gpt-3.5-turbo-1106, gpt-4, gpt-4-32k, and gpt-4-1106-preview.

 

As for the "instruct" models, per their model page "Similar capabilities as text-davinci-003 but compatible with legacy Completions endpoint and not Chat Completions" knowing that "text-davinci-003" is a legacy API at this point, it also was not added. We have also removed that piece of code (legacy completion) in favor of the chat one.

 

I hope this helps answer your questions.

Edited by Infotrend Inc.
  • 3 weeks later...

Hey @Infotrend Inc., thanks for getting back to me so fast; sorry I failed to achieve the same.

 

Quote

That usually means the script encountered an error before the Streamlit UI was able to boot. Usually this is related to the provided environment variables: all of them need to have a value.

 

You were 100% correct about it being an environment variable. In my haste to get it set up, I had set the value of "OAIWUI_SAVEDIR" to the path of a host folder rather than, correctly, to "/iti". I have no idea how I missed this, but stepping away for a couple of weeks helped me work it out :P
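
In case it helps anyone else: the value has to be a path as seen inside the container (presumably the Unraid template maps a host folder onto "/iti"), which a quick sanity check can confirm (illustrative only):

```python
import os

savedir = os.environ.get("OAIWUI_SAVEDIR", "")
# "/iti" exists inside the container; a host folder path generally does not,
# so a check like this would have caught my mistake.
print(savedir, "->", os.path.isdir(savedir))
```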

 

 

Quote

The tool works directly with reverse proxies that communicate with the IP:PORT of the Unraid installation.
This has been tested using Nginx Proxy Manager as well as Cloudflare tunnels.

 

I think this is what is causing my ongoing issue: I use a MACVLAN network for my Unraid containers, so the container's IP isn't the same as my Unraid install's. Either that, or I've misconfigured something.

Edited by kharntiitar
Edited for clarity and succinctness.
On 12/5/2023 at 5:57 PM, kharntiitar said:

You were 100% correct about it being an environment variable. In my haste to get it set up, I had set the value of "OAIWUI_SAVEDIR" to the path of a host folder rather than, correctly, to "/iti". I have no idea how I missed this, but stepping away for a couple of weeks helped me work it out :P

 

Glad the environment variable issue got resolved.

As for the networking issue, I am uncertain how to help, unfortunately.

Edited by Infotrend Inc.
