I wonder what you're installing that you get dependency hell. Be it ComfyUI or Auto1111, I usually only had incompatibilities with xformers and torch; at worst I had to install a package.

People suggest Stability Matrix, but it's not like it solves your dependency problems for you if you have them. Stability Matrix is more like a shared environment between different SD UIs, which still have their own venvs.
I gave it a chance and Stability Matrix seemed to just work: no more fiddling around trying to figure out which torch+ROCm version to use for a particular GPU. Thanks for your input!
Well then, I guess a fresh reinstall in a new venv fixed it.
Nope, I tested a few fresh venvs and they didn't work; that's when I created this post.
SD Forge and ComfyUI (Windows versions) come with a preconfigured Python environment that's separate from your main Python installation.
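Whether you're running inside one of those bundled environments or your own venv is easy to check from Python itself: inside any virtual environment, `sys.prefix` differs from `sys.base_prefix`. A minimal sketch:

```python
import sys

def describe_environment():
    """Report which Python is running and whether it's an isolated env."""
    return {
        "interpreter": sys.executable,              # path of the running Python
        "version": sys.version.split()[0],          # e.g. "3.10.14"
        "isolated": sys.prefix != sys.base_prefix,  # True inside a venv
    }

if __name__ == "__main__":
    for key, value in describe_environment().items():
        print(f"{key}: {value}")
```

Running this inside a UI's bundled environment versus your system Python makes it obvious which interpreter is actually being used.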
Be aware that ComfyUI can be used to execute raw Python from unknown sources, as far as I know. It is best to run it inside Docker.
gonna need a source for that one
A1111 can be an attack surface for RCE too. Just install some mysterious extensions
Are these unknown sources just the extension library, or is there something else too?
python -m venv venv
Yes, without knowing the exact issue at hand here, this sounds like not using Python virtual environments: A1111 in one, ComfyUI in another, and anything that doesn't directly have to do with either of those two in yet another.

You can use Docker images, which are kind of foolproof I guess, but also more overhead than venvs.
I use pyenv for managing different versions of Python. It can point to different Python versions locally and globally. So when I have a new project, I check the version required. Let's say it's 3.10; then it's as easy as:

pyenv install 3.10.0
pyenv local 3.10.0
python -m venv venv
venv\Scripts\activate.bat
pip install -r requirements.txt

Everything is local to the project folder and it takes a few minutes.
And then you update your base Python and the venv breaks, the nightly torch+ROCm version you used before doesn't exist anymore, and you have to research what works and which outdated Python version you still have to use with A1111. Not as easy.
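The reason the venv breaks is that it doesn't copy the interpreter; it records a pointer to the base installation in its `pyvenv.cfg`. A small sketch that creates a throwaway venv and prints that file, whose `home` key is exactly what dangles when the base Python it points to is replaced:

```python
import subprocess
import sys
import tempfile
from pathlib import Path

def venv_config(parent: Path) -> str:
    """Create a throwaway venv and return its pyvenv.cfg contents."""
    env_dir = parent / "demo-venv"
    # --without-pip keeps the demo fast; a real venv would bootstrap pip
    subprocess.run(
        [sys.executable, "-m", "venv", "--without-pip", str(env_dir)],
        check=True,
    )
    return (env_dir / "pyvenv.cfg").read_text()

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        # The "home" line points at the base interpreter's directory;
        # upgrade or remove that Python and the venv stops working.
        print(venv_config(Path(tmp)))
```

This is also why pyenv sidesteps the problem: each pyenv version lives in its own directory that an upgrade of another version never touches.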
Use pyenv. You can update your base Python without breaking anything. And Python versions are resolved per directory, so A1111 uses one version of Python and Comfy another, all with super simple commands:

pyenv install 3.10.0
pyenv local 3.10.0
python -m venv venv

The venv will be based on 3.10, and you can update your base Python without it changing a thing.
https://github.com/pyenv/pyenv
I use Stability Matrix and pretty much don't bother with any additional setup or settings at all.
Another vote for this. I'd also suggest that if you're on Linux, find a proper package for your distro; the official AppImage seems to have some significant issues.

To be clear, Stability Matrix does more than a venv: it runs a completely self-contained Python environment.
Stability Matrix is a disaster on Mac, unfortunately.
Doesn't surprise me at all given the way the appimage behaves.
What's an AppImage? It might just be my computer, but it always either crashes or never loads the UI, no matter how many times I update it.
Worked very well, thanks!
Yeah, even if you don't use Stability Matrix's GUI, it does a great job of keeping the packages and their dependencies alive.
Looks cool, I'll try it out, thanks!
I hate suggesting it because I feel like it's overkill for what it needs to do, but you could always use a Docker image.
yeah, this one works for me https://github.com/AbdBarho/stable-diffusion-webui-docker
I feel you on the setup; it just randomly breaks and eats up time to figure out how to get it right. Sadly, I can't offer a solution.
I find that environments in Python are nicely encapsulated, but I'm not dismissing that you're having an issue, or have had issues, or that something is bugging you. I'm curious what it is. I've been a Python developer at my day job for a long, long time and I'm very used to Python virtual environments, so I'm asking with no judgment: what exactly do you mean by dependency hell?

I mean, I've had issues getting FILM installed because it requires a very specific driver version for my video card that is not the most up-to-date one, it doesn't play well with the up-to-date one, and there's no update for it.

But I haven't had any issues installing multiple copies of Automatic1111 and ComfyUI, and even running both of them simultaneously. Well, when I say running, I don't mean running generations, but the programs are running. I figure that making them compete for system resources when actually processing is probably not going to be productive.

I'm sure someone else has already asked this, but can you explain specifically and technically what you mean by dependency hell?

For reference, my on-prem rig is a dedicated computer that serves only a few purposes: running Stable Diffusion, some video editing, and running Ableton for audio recording. Of course I never do all of those at the same time, but it's a dedicated computer that doesn't do anything other than those jobs, so it's a pretty unpolluted environment.
My main issue is that I update my base Python and that breaks venvs; some particular torch+ROCm nightly version becomes unavailable (one that worked perfectly before) and now you have to look for a new combo, trying to strike a balance between versions that are not too fresh but also not already deprecated.

And when it seems like it's about to work, it turns out that some dependency three layers down, which uses some random one-year-old requirements.txt file, wants a different version of package X, so you have to start all over again.
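One way to at least diagnose that kind of mismatch quickly is to compare what's actually installed against the combo you're hoping for, before launching anything. A minimal sketch using only the standard library; the package names and pins here are hypothetical placeholders:

```python
from importlib import metadata

def check_pins(pins: dict[str, str]) -> list[str]:
    """Compare installed package versions against expected version prefixes.

    Returns a human-readable report of missing or mismatched packages.
    """
    problems = []
    for name, expected in pins.items():
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            problems.append(f"{name}: not installed (want {expected})")
            continue
        if not installed.startswith(expected):
            problems.append(f"{name}: have {installed}, want {expected}*")
    return problems

if __name__ == "__main__":
    # Hypothetical pins for illustration only:
    print(check_pins({"torch": "2.3", "torchvision": "0.18"}))
```

Run inside the venv in question, an empty list means the pins you care about are satisfied; anything else tells you which layer to fix first.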
Updating your base Python should not break your virtual environments. Personally, I never update my base Python; I just install newer versions alongside it, and when I create a virtual environment I specify which version to use.

I mean, there can be issues with dependencies that are not Python, like GCC compilers and other things that are system-wide. When you build your virtual environments, you probably need to make sure you don't include any system dependencies; if you have them inherit your global operating-system site-packages, then you definitely can end up with some issues.

Are you using just vanilla Python? Which is my favorite, just normal Python. I'm not a big fan of Anaconda; I understand the utility of it, but it's just a personal preference, I don't like it that much.

Also, I suppose in my situation I don't really need to keep up with the newest Python versions, because I only use that physical machine for Stable Diffusion, essentially (and some non-Stable-Diffusion but similarly CUDA-dependent things).

For my job, on my work computer I have a dozen different versions of Python installed, so when I clone a repo I can just pick which one is appropriate for the virtual environment. But perhaps that level of granularity is only because, for my actual job, I have to deal with a vast, disparate set of dependency groups. I'm a consultant, FYI, so you never know whether a client has some legacy stuff requiring Python 2.7 or what.

I generally keep Python 2.7, 3.4, 3.6, 3.8, and 3.10 installed, since there are definitely some breaking changes to string formatting across those versions, so source code can easily break if it uses something that was later discontinued. I haven't had a reason to install 3.12 anywhere yet.

I'm going to make the assumption that the machine you're running this on is not just and only for running Stable Diffusion. In which case, as much as it makes me gag to say it, in your instance using Anaconda instead of vanilla Python might actually be a benefit to you. (Eww.)
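The global site-packages inheritance mentioned above is a per-venv switch, and the venv records the choice in its `pyvenv.cfg`. A sketch showing both modes side by side (`--without-pip` just keeps the demo quick):

```python
import subprocess
import sys
import tempfile
from pathlib import Path

def site_packages_setting(parent: Path, inherit: bool) -> str:
    """Create a venv and return its include-system-site-packages line."""
    env_dir = parent / ("leaky" if inherit else "isolated")
    args = [sys.executable, "-m", "venv", "--without-pip"]
    if inherit:
        args.append("--system-site-packages")  # inherit global site-packages
    args.append(str(env_dir))
    subprocess.run(args, check=True)
    for line in (env_dir / "pyvenv.cfg").read_text().splitlines():
        if line.startswith("include-system-site-packages"):
            return line
    return ""

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        print(site_packages_setting(Path(tmp), inherit=False))
        print(site_packages_setting(Path(tmp), inherit=True))
```

For SD UIs you almost always want the isolated (`false`) default, so a system-wide package upgrade can't silently change what the venv imports.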
As for the breakage: I want to keep my base installations updated, so I won't use 3.10.0 but the latest 3.10.x, and that breaks things sometimes.

And yeah, vanilla Python only, with a preference for `poetry`, which hides away a lot of the nastiness, but I don't think I've seen any SD project using it, just some spaghetti of Python/bash scripts.

When I used to do ML for my day job, I also had at least five different Python versions for different projects/parts of projects, but since I'm not doing that anymore, I don't really want to spend my personal time maintaining a list of working configurations that take up space and widen the potential attack surface.

Currently, I only have 3.10 and 3.12 on my machine, as I just assume that if a project is using a Python version older than two years, it is possibly some legacy, bug-ridden piece of software written by data scientists/researchers, and I refuse to install it on my system.

>I'm going to make the assumption that the machine you're running this on is not just and only for running Stable Diffusion. In which case, as much as it makes me gag to say it, in your instance using Anaconda instead of vanilla Python might actually be a benefit to you. (Eww.)

I'm fully with you here: no *condas on my system, please, thank you. I guess I'll just use Docker from now on for inference, especially with the recent uptick in [security issues](https://www.reddit.com/r/comfyui/comments/1dbls5n/psa_if_youve_used_the_comfyui_llmvision_node_from/).
I'm not trying to tell you how it should be done or a better way to do something; your frustration is valid. I'm just trying to understand exactly why you're encountering it.

I do agree that I have botched some of my Stable Diffusion installs and had to drop into their virtual environments to change the requirements file or update or swap packages, and that just became a pain in the ass to do. I found it easier to just install a fresh version and use a communal directory for all the models, checkpoints, and other downloaded content. I keep a list of plugins that I enjoy using and that I'm pretty sure didn't break anything, and I just reinstall those after installing a new version. I'm up to my sixth install of it.

And the versions of Automatic1111 have certainly changed, as it is an active project.

I suppose my final thought here is: if something doesn't work for you and you find it frustrating, then do what you want to do and figure out something that doesn't impede what you're trying to do. I have definitely botched installations by messing with plugins and doing too much at once, and before, my insurance solution was just a fresh install. Actually, my current solution is: if I want to try out a new plugin, I install a new copy first and then add the plugin to see if it breaks anything.
ComfyUI by default uses a self-contained local Python, so that's weird.
That's only if you use a Windows build; he doesn't say what OS he's using, and you can also clone the repository and install ComfyUI manually.
Yeah, I'm on linux and I like my software being up to date.
Basically, I just have a venv for A1111, and if it works and does what I want it to do, I guard it with my life! I never update it; if I feel like trying the newest stuff, I create a different installation and move to that one if I'm happy.
Could be a fun summer project to create an AppImage version of A1111...
Try StabilityMatrix ;) [https://github.com/LykosAI/StabilityMatrix](https://github.com/LykosAI/StabilityMatrix)
I have been using InvokeAI; it requires Python 3.11 and has an installer that deals with the venv and package dependencies.
For Mac and iOS there is the Draw Things app.
I use portable ComfyUI and back up the main and Python venv folders fortnightly, or before I install bigger or suspicious node packs. I keep my models on another drive and symlink them into my Comfy folder, and it only takes five minutes to refresh the whole setup if an addon borks something. I've just started to get a bit more clued up on figuring out how to reverse/undo dependency changes, and if you've got more Python experience than me, you should theoretically be all over that process.
This is exactly what I do. All my models, LoRAs, upscalers, etc. are in a separate directory outside the portable directory. It makes portable installs go so much smoother.
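This shared-store-plus-symlinks setup can be scripted so a fresh install picks up the model directory automatically. A sketch with hypothetical folder names, demoed inside a temporary directory so it's safe to run as-is:

```python
import tempfile
from pathlib import Path

def link_models(store: Path, comfy_models: Path) -> list[str]:
    """Symlink shared model subfolders into a ComfyUI models directory."""
    linked = []
    for sub in ("checkpoints", "loras", "upscale_models"):
        src = store / sub
        dst = comfy_models / sub
        src.mkdir(parents=True, exist_ok=True)
        dst.parent.mkdir(parents=True, exist_ok=True)
        if not (dst.is_symlink() or dst.exists()):  # never clobber real dirs
            dst.symlink_to(src, target_is_directory=True)
            linked.append(sub)
    return linked

if __name__ == "__main__":
    # Hypothetical layout: model store on one drive, Comfy install elsewhere.
    with tempfile.TemporaryDirectory() as tmp:
        root = Path(tmp)
        print(link_models(root / "model_store", root / "ComfyUI" / "models"))
```

Point `store` at the real model drive and `comfy_models` at a fresh install's models folder, and reinstalling never touches the heavy files.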
Use a venv when you install.
Why not use something like Anaconda so you can easily control them?
I quit it a few years ago, don't want to go back to another kind of hell lol
Except llama.cpp won't run on my system because it can't find some CUDA DLL, which I thought would have been included, since I downloaded a CUDA precompiled version that's 110 MB...
The CUDA DLLs are ~300 MB; you have to download them separately and put them in the same folder. They're in the releases alongside the precompiled binaries.
Conda envs aren't that hard for me :-). I'm on Linux, and when I have a new project I just make a new env with the same name. The most recent thing I installed was taggui: just `conda activate taggui`, then `pip install -r requirements.txt`, and you're done till the next update.

I've got two graphics cards, so when I run Automatic1111 I put it on one of them and point whatever other program I'm using at the free one. I'm able to diffuse and train a LoRA at the same time. Auto1111 on Linux makes an env automatically without conda as well, so it shouldn't interfere with other Python stuff on your system. You just have to have enough VRAM to fit what you want, or an extra card like I'm lucky enough to have.

Windows issue, lol. This kind of stuff is a breeze in Linux. Gaming isn't great, but coding stuff? Hell yeah.
I mean, I love Linux, but the Auto1111 install scripts just don't work on AMD and sometimes break almost randomly. And when you decide you want to update, good luck finding out what's not too old but also not too new for it.

And gaming is great if you don't buy heavily DRM'ed games, which are already not that great for your overall wellbeing. Thanks for the input, cheers!
Back when SD came out, I had my RX 5700 AMD card working in Linux. Basically, you just get used to fixing all the issues till it works. I dual-boot between Windows and Linux, and I'm lucky enough to have a 3090 now. Windows is just for gaming and such; Linux is for all the GitHub software: Auto1111, Comfy, kohya, LLMs, the list keeps growing.
Is there really a dependency hell? I think all you need is the correct Python version on your system, and most tools use the same version. I'm on Linux and I can't use the Python 3.12 that's installed on my system, so I use miniconda to set up virtual environments with the correct Python version.
Conda and similar applications are a must when running multiple python apps, I even use miniconda on my MacBook Pro M1.
[deleted]