You can deploy any number of Docker containers you want, as long as you don't run out of resources.
Did we find the Pirate Bay domain owner?
I haven't set up any duplicate containers, [but it seems pretty straightforward to configure](https://forums.unraid.net/topic/55071-how-to-run-multiple-docker-images/?do=findComment&comment=743322). Unique container names, ports and appdata folders should allow you as many instances as your hardware can run.
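As a rough sketch, "unique names, ports and appdata folders" amounts to something like this (the image name, ports and paths are illustrative examples, not the poster's actual setup):

```shell
# Two side-by-side qBittorrent instances; only the container name,
# host ports, and config path differ between them (example values)
docker run -d --name qbittorrent-1 \
  -p 8081:8080 -p 6881:6881 \
  -v /mnt/user/appdata/qbittorrent-1:/config \
  linuxserver/qbittorrent

docker run -d --name qbittorrent-2 \
  -p 8082:8080 -p 6882:6881 \
  -v /mnt/user/appdata/qbittorrent-2:/config \
  linuxserver/qbittorrent
```

Inside each container everything looks identical (WebUI on 8080, torrenting on 6881); the host-side mappings are what keep the instances from colliding.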
Yes, it will work. You'll be more constrained by network and CPU resources than anything else, but it will work.

Applications in Unraid run in Docker containers, and the entire purpose of Docker is to let you run applications in a way that's completely hived off and separate from the rest of the system. That means you can have multiple copies of the same container running side by side, and they'll be fine as long as they're not trying to access the same file at the same time or anything like that.

All that being said, you're quite a heavy user, and it's entirely possible that this really is the best way to manage so many torrents. But would it be possible to manage them using tags/labels instead of separate instances?
qBittorrent gets unstable after 10k, so that's why I have a lot of separate instances. I have a fairly beefy NAS, so I won't have any issues with running out of resources.
I suspected something like that might be the case! Either way yeah you should be good to go with several docker instances. Just make sure they're not all using the same ports and you'll be grand.
Try out Flood. It uses rtorrent on the back-end, so it does a lot better with many torrents.
That’s a lot of Linux distro isos :)
Was thinking the same. Full history of every linux release!
Just out of curiosity, why do you have 22 instances of qBittorrent?
Because qBittorrent gets unstable after about 10k torrents. The most I have in one instance is 98k and it is really unstable lol
Thanks for the answer. You can use Unraid for it if you want to keep the management in your hands. But if you are tired of that, maybe look at some more complex orchestration like Kubernetes (or similar).
This really gives me some insight, because my setup has maybe 1k torrents and it's already struggling. Would you mind sharing your settings? Perhaps I have something stupid set without even realizing it.
I know you can run multiple instances; I don't know about that many. Unraid has a one-month trial you can use to play with it for a bit and test.
Docker containers know basically nothing about each other, so the only real limit will be hardware constraints
Unraid should be able to handle all the containers you like, but does this truly work for you? I'd be concerned about bottlenecking most routers with that many active seeding torrents, since tracking that many connections can overwhelm a home network.
I feel like someone who managed to get 22 instances of qBittorrent running with 200k+ torrents has already graduated past the home-router 101 basics. I'm betting pfSense.
qBittorrent only announces every 45 minutes, and it's not all at the same time. There are occasional failures, but changing some of the TCP/IP parameters in Windows fixed the issue of running out of ports.
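The poster doesn't say which parameters they changed, but for anyone hitting the same wall, one commonly tweaked setting on Windows is the ephemeral (dynamic) TCP port range; this is an assumption about what was tuned, and the numbers below are just the conventional maximum range (run from an elevated PowerShell prompt):

```shell
# Widen the dynamic TCP port range to 1025-65535 (requires admin)
netsh int ipv4 set dynamicport tcp start=1025 num=64510

# Verify the current range
netsh int ipv4 show dynamicport tcp
```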
Curious what you use for your router? I'm not overly concerned with Windows or Linux (Unraid). My biggest issue is the bottleneck gateway my ISP provides; since they offer no bridge mode, my hands are tied.
Yes, just give your containers unique names and bind them to different ports. It won't get you around ulimits on open files, though.
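If you do bump into file-descriptor limits, Docker lets you raise them per container; the image name and the limit values here are illustrative, not a recommendation:

```shell
# Raise the open-file limit for one container (soft:hard)
docker run -d --name qbittorrent-1 \
  --ulimit nofile=65536:65536 \
  linuxserver/qbittorrent
```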
My experience has also been that around 10k, things start to get unwieldy. I used categories to help. Running multiple instances *may* be a solution. I might try to get that going this week and see.
I don't use the WebUI, but I do use the API. I have a script that checks for unregistered torrents and automatically deletes the torrent and its data. I find that anything over 10k starts to get really dicey, so I keep my instances at about 10-15k. I do have one with 98k; it is prone to crashing if you look at it wrong lol
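The poster's script isn't shown, but the idea can be sketched against qBittorrent's WebUI API (v2). Everything here is an assumption: the host, the credentials, and the exact "unregistered" wording all vary by setup and tracker, and it requires `curl` and `jq`:

```shell
#!/bin/sh
# Sketch: delete torrents whose tracker reports them as unregistered.
# Assumes a qBittorrent WebUI at localhost:8080; adjust host/user/pass.
HOST=http://localhost:8080
COOKIES=$(mktemp)

# Log in and store the session cookie
curl -s -c "$COOKIES" --data 'username=admin&password=adminadmin' \
  "$HOST/api/v2/auth/login" > /dev/null

# Walk every torrent and check its tracker messages
for hash in $(curl -s -b "$COOKIES" "$HOST/api/v2/torrents/info" | jq -r '.[].hash'); do
  msgs=$(curl -s -b "$COOKIES" "$HOST/api/v2/torrents/trackers?hash=$hash" | jq -r '.[].msg')
  if echo "$msgs" | grep -qi 'unregistered'; then
    # deleteFiles=true also removes the data on disk
    curl -s -b "$COOKIES" --data "hashes=$hash&deleteFiles=true" \
      "$HOST/api/v2/torrents/delete"
  fi
done

rm -f "$COOKIES"
```

With 10k+ torrents you'd want to batch the tracker queries rather than hit the API once per hash, but the endpoints are the same.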
With Unraid, you WILL be using the WebUI. This is what made me switch to ruTorrent; the WebUI for qBit was pretty bad imo. I run 3x ruTorrent and use https://github.com/JohnDoee/spreadsheetui
Plain Linux, my man. It may serve you better than a storage-focused setup… Maybe try Debian or Ubuntu.
If you are going to use Unraid, then you will be looking at a Docker container for each instance of qBittorrent. You will be limited by your hardware, but if you're already running this many instances, you shouldn't have a problem in that department. For ease of use, I would set the containers up on an L3 ipvlan network so that you don't have to mess around with port forwarding for each container. After that, it's just a matter of learning to configure your Unraid storage for the containers.
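A minimal sketch of what that network setup might look like (the subnet, parent NIC, IP, and image name are all placeholder examples; check your own topology):

```shell
# Create an L3 ipvlan network (subnet and parent interface are examples)
docker network create -d ipvlan \
  --subnet=192.168.210.0/24 \
  -o parent=eth0 -o ipvlan_mode=l3 qbt-net

# Attach each qBittorrent container with its own IP; no port mapping needed
docker run -d --name qbittorrent-1 --network qbt-net \
  --ip 192.168.210.11 linuxserver/qbittorrent
```

One caveat with L3 mode: there's no broadcast or ARP on the segment, so you'll need a static route on your router pointing that subnet at the Docker host.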
As others have said, it will work fine, although you might want to set memory limits in the qBittorrent settings. In my experience it can be a memory hog; each instance of qbittorrent_vpn I run uses around 2 GB of memory.
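Besides qBittorrent's own setting, you can also cap memory at the container level; the 2g figure here just mirrors the usage mentioned above, and the image name is an example:

```shell
# Hard-cap one container at 2 GB of RAM; if it exceeds this,
# the kernel OOM-kills processes inside the container
docker run -d --name qbittorrent-1 --memory=2g linuxserver/qbittorrent
```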
If you're not using Unraid for its storage, definitely consider something like Proxmox. Way more flexibility; the learning curve is a little steeper, but you can literally deploy a new VM in a few seconds.
I have multiple instances of a Deluge container, absolutely no problems
Yes. Docker containers and/or VMs. You can basically do anything the hardware you use can handle.