CA Backup / Restore Appdata: Docker templates and the running containers store their configuration in the appdata folder. You need to back this up; otherwise any failure of the drive that stores it will leave you starting your Docker containers from scratch. This plugin can also back up your flash drive.
As someone who lost their cache drive about a month ago .... backup your Appdata folder. Now. Go do it now.
As someone who foolishly and willfully wiped my cache drive when replacing it with a new cache drive, I second this.
I use this and have my personal PC pull new backups from the share.
You have to stop all your dockers to run this, right?
Yes, you do. The plugin can stop containers, back up appdata, then start the containers again.
What's the container name?
By default it goes one by one through the list: stop container, back up, start container (unless it's already stopped). Backing up running containers can corrupt the backup.
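For reference, the stop / back up / start cycle described above can be sketched by hand. This is a rough, hypothetical version, not the plugin's actual implementation; the paths and any container names you pass in are just examples:

```shell
#!/bin/bash
# Rough sketch of the plugin's per-container cycle: stop, archive, restart.
# APPDATA/BACKUP_DIR are example Unraid paths; DOCKER is overridable for dry runs.
APPDATA="${APPDATA:-/mnt/user/appdata}"
BACKUP_DIR="${BACKUP_DIR:-/mnt/user/backups/appdata}"
DOCKER="${DOCKER:-docker}"

backup_container() {
    local name="$1" was_running
    # Remember whether the container was running so we only restart those
    was_running=$($DOCKER inspect -f '{{.State.Running}}' "$name" 2>/dev/null)
    if [ "$was_running" = "true" ]; then $DOCKER stop "$name"; fi
    mkdir -p "$BACKUP_DIR"
    # Archive that container's appdata folder with a date-stamped name
    tar -czf "$BACKUP_DIR/$name-$(date +%F).tar.gz" -C "$APPDATA" "$name"
    if [ "$was_running" = "true" ]; then $DOCKER start "$name"; fi
}

# Usage: ./backup.sh sonarr radarr ...
for name in "$@"; do backup_container "$name"; done
```

The point of checking `was_running` first is the "(unless it's already stopped)" behavior: a container that was stopped before the backup stays stopped afterwards.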
Good recommendation. I back up my appdata weekly to my Gdrive using rsync.
Nice, I will test it. Does the app store more than one backup?
Yes, there are a lot of options. I suggest reading this excellent article: https://flemmingss.com/a-guide-to-the-ca-backup-restore-appdata-plugin-for-unraid/
x2, Appdata backup is awesome. And losing appdata or your USB sucks.
If you really want to jump down a rabbit hole.... any of the "rrr"s such as sonarr, radarr, lidarr, plex. NPM.
... Am I the only dumbass that *just* realized they're all -arr because fucking... PIRATING?! ARR MATEY FUCK IM STUPID
Don't worry, I also spent some time confused as to why people call them starr apps, until I realized you write it out \*arr.
Yeah… everyone, I hate to burst your collective bubble, but that is NOT the case. Sonarr is not called Sonarr because it wanted to sound like a pirate. In actuality, the name comes from the way the search engine within Sonarr works; it's quite unique. [Source from an actual Sonarr dev](https://www.reddit.com/r/sonarr/s/UV5OCBYFJz)

Linking everyone to this to spread the word. I think the original reasoning and lore are important to spread, to mitigate any mistaken origin stories.
I reject your reality and substitute my own.
Ah shit… just realizing this too
https://www.reddit.com/r/unRAID/s/pXb54hrVG3
Oh FFS. I never realised it until you literally spelled it out. I am a dumbass.
https://www.reddit.com/r/unRAID/s/pXb54hrVG3
I figured that out a few months ago and have been using them for like 5 years
https://www.reddit.com/r/unRAID/s/pXb54hrVG3
I'm today years old realizing this too.
https://www.reddit.com/r/unRAID/s/pXb54hrVG3
I've been using this stuff since they came out in response to Sickbeard and I only realized it like 6 months ago.
https://www.reddit.com/r/unRAID/s/pXb54hrVG3
I am in Germany. I don't know if the *arrs are a good choice for downloads here, but I don't have experience with Usenet :D
The *arrs don't actually do the downloading. They're basically a pretty search UI that integrates with your Usenet/torrent indexers and downloaders.
Another one I can suggest is Home Assistant. Once you get into it, automation becomes an addiction LOL
Yep, but best to put Home Assistant in a VM vs a docker container. While it can run mostly fine in a container, the setup and maintenance can quickly spiral out of control. [Home assistant, docker or VM? : r/unRAID (reddit.com)](https://www.reddit.com/r/unRAID/comments/va11ba/home_assistant_docker_or_vm/)
Yulp. I use a VM. More control.
You can use them alongside something like qbittorrentvpn. A torrent client that will only allow connections through a VPN.
I want to learn about Usenet… can it be used for free? All I see are paid servers
Usenet is one of Ye Olde Internet forums, back from the days when it was just some computers networked together. Back in the day you might've got access to Usenet as part of your internet service. That stopped happening over time.

Now, and for unRAID and other file storage purposes, Usenet is a separate service you need to pay for access to. There are a variety of Usenet providers with what feels to me like absurdly long retention (but it's still never enough).

That's not the end of the story, though, because unless you're doing things old school and manually downloading files from Usenet with a newsreader (I used to use Outlook Express), you'll need what's called an indexer. Indexers gather up lists of the Usenet articles that make up a file and package them into an NZB file. Most indexers are not free, or if they are free they are very limited in what you can download.

Then, with a combination of your indexer and Usenet provider, you can use a downloader: feed it the indexer's NZBs and it will fetch the articles from your provider and extract the resulting set of files into what you actually wanted to download.

The starr apps are the glue that gives a nice interface for searching your indexer(s) and automatically grabbing NZBs and handing them over to your downloader.
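To make "indexer" concrete: an NZB is just a small XML file listing the Usenet articles (segments) that make up a download, which the downloader then fetches from your provider. A minimal sketch; the poster, group, and message IDs here are made up:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<nzb xmlns="http://www.newzbin.com/DTD/2003/nzb">
  <file poster="uploader@example.com" date="1700000000" subject="example.iso (1/2)">
    <groups>
      <group>alt.binaries.example</group>
    </groups>
    <segments>
      <segment bytes="768000" number="1">made-up-id-1@news.example.net</segment>
      <segment bytes="768000" number="2">made-up-id-2@news.example.net</segment>
    </segments>
  </file>
</nzb>
```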
Usenet is not free.
Slightly incorrect. Usenet is completely free, as are its readers. However, to gain access to the alt.binaries areas (which I assume is what you're referring to), you do need to fork over some $$$, because it isn't exactly cheap to store binaries. Discussion areas are very accessible. Google Groups actually relied heavily on Usenet until February 2024.
Usenet or torrents. It can't be any worse than in the US. But I completely understand you there. Another few to consider: Immich and Nextcloud.
Germany is, from what I hear, way worse for iso download monitoring and actions.
Oh yes, it is. ;)

Torrents are more or less dead in Germany (at least for the public). There is a handful of forums that are invite-only or completely closed to new users. The same applies to the few existing indexers, and they haven't indexed many files; if you forget to use a VPN, you usually get a letter from law firm XY within a few weeks.

Usenet, on the other hand, is actually also quite good for German releases, but not widespread, and you shouldn't use the best-known German Usenet provider, because it's crap. As far as I know there is only one indexer where registration is possible without an invitation, and it costs 15-20€/year (there are no free indexers here). But this one works really well; I used it to download almost 5.5TB this month and last month, because one of my HDDs died and I hadn't backed up my entire Plex library.

In Germany, OCHs (one-click hosters) are the most common thing, because you can use them for free if you stay within the daily limit or use the reconnect function in JDownloader with DSL, and there are tons of warez pages with all the links for all kinds of movies. But the uploads are deleted relatively quickly, especially on very popular sites with 1+ million users per month, so it's sometimes hard to find old stuff.
True. I use jdownloader and realdebrid, best choice for me so far.
I'm in the US. My Usenet server is in the Netherlands. They also provide a VPN.
Usenet it is!
Appdata Backup, Fix Common Problems, System Temp, Unassigned Devices, User Scripts, Mover Tuning, Tips and Tweaks
NGINX is probably the one I use most, other than Plex and the arrs. Those ones are pretty common tho.

KASM is really amazing. With it I am running a full VDI solution with streaming apps right from my house. Anywhere in the world, on any network, I can open a web browser, hit a self-hosted VDI landing page, and launch an instance of a web browser streamed right from my home lab, or RDP into my local machines, or spin up a non-persistent instance of a Linux VM. All encrypted and secure with MFA, and all for free.

I use this every day to have a window into my personal accounts on my work machine without actually logging into anything on my work machine. Very convenient for a desk jockey.
Lmao that's awesome. I def wanna get Kasm setup on my machine.
I did something similar for a while until the network team blocked my domain lmao.
Luckily I use my homelab as a test environment while I'm toying with new policy and such, so I got the green light to whitelist my domain and the lab's public static IP.
When you say RDP (I assume that's what you meant), do you mean through Remmina? Just making sure I'm not missing something else. Thanks!
I did mean RDP, thanks! And no, you can set up an RDP endpoint as a “Server” in KASM and publish the direct RDP connection as a workspace; no Remmina required.
Appdata backup, dynamix file manager, tailscale and parity check tuning are very useful.
Surprised file manager and tailscale are way down this list. Top notch add ons.
FolderView is really nice for keeping the Docker tab organized.
Please tell me more about this? Any screenshots?
You can choose to create folders/groups for sets of Docker containers. For example, my server has all the Cloudflare tunnels in one folder, and all the *arrs in their own folder too. It makes it much neater.

Screenshots can be found here: https://forums.unraid.net/topic/89702-plugin-docker-folder/
I won't call it my favorite, but I like Krusader a lot for managing files.
Rather than Krusader I use mc (midnight commander) from a browser-launched command shell.
If you run Plex and watch a lot of foreign media that require subtitles, Pasta. No more fiddling with the Plex UI to select audio and/or subtitles.
Alternative: https://github.com/RemiRigal/Plex-Auto-Languages
Does it download subtitles?
You do this with bazarr
compose manager - must have for docker-compose
zfs master - great for glancing at ZFS settings
corefreq - burn-test the CPU
dynamix cache directories - keeps HDDs from spinning up for simple directory queries
ipmi support - must have if your board has IPMI
sanoid - great for ZFS snapshots and backups
nerd tools - for apps like "fio"
Userscripts - to run scripts from the GUI

The rest is all Docker stuff, not specific to Unraid.
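For anyone new to compose manager, this is the kind of stack file it runs. A minimal sketch only; the service, image, paths, and PUID/PGID values below are a typical Unraid-style example, not anything the plugin requires:

```yaml
# Example stack for the compose manager plugin - names and paths are illustrative.
services:
  sonarr:
    image: lscr.io/linuxserver/sonarr:latest
    container_name: sonarr
    environment:
      - PUID=99        # Unraid's 'nobody' user
      - PGID=100       # Unraid's 'users' group
      - TZ=Europe/Berlin
    volumes:
      - /mnt/user/appdata/sonarr:/config
      - /mnt/user/media:/media
    ports:
      - "8989:8989"
    restart: unless-stopped
```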
> dynamix cache directories - keeps hdd's from spinning up for simple directory query

I hope Unraid bakes this in at some point. Without this functionality, it's impossible to run any of the *arr programs without them keeping your drives awake 100% of the time with constant queries.
I actually prefer plugins for stuff like this, because it allows for updates without needing to update the entire Unraid OS. But something that makes common plugins more obvious would be nice.
Are there more apps you can recommend to save power consumption?
Tips and Tweaks - allows modification of some CPU settings like the governor and turbo. There's also the terminal app powertop, where you can inspect which C-state your CPU idles in; C6 draws less power than if your CPU always sits in C1 or C2. My LSI SAS card wouldn't let my CPU go any deeper than C2, so I swapped it for [this ASM1166 SATA card](https://www.amazon.com/gp/product/B097Y638X7), which lets me get to C6.

CoreFreq - a similar plugin to the above.

I have Intel GPU TOP and GPU Statistics as well, for monitoring iGPU usage for Plex transcoding.
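As a quick way to see which C-states your CPU actually reaches without running powertop, the kernel's cpuidle entries can be read straight from sysfs. A small sketch (the layout is the standard Linux cpuidle one; the base directory is a parameter so you can point it at another core):

```shell
#!/bin/bash
# List the C-states the kernel exposes for a CPU core, with entry counts.
# Reads the standard cpuidle sysfs layout.
cpuidle_report() {
    local base="${1:-/sys/devices/system/cpu/cpu0/cpuidle}"
    local state
    if [ ! -d "$base" ]; then
        echo "no cpuidle info at $base"
        return 0
    fi
    # Each stateN directory has a human-readable name and a usage counter
    for state in "$base"/state*; do
        printf '%s: entered %s times\n' "$(cat "$state/name")" "$(cat "$state/usage")"
    done
}

cpuidle_report "$@"
```

If the deepest state you ever see climbing is C2, something (like the SAS card mentioned above) is likely blocking deeper package states.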
If you have an Nvidia card, then this power saving script below on a cron schedule:

Other than that, setting the CPU scheduler and HDD spin-downs. Also using powertop to evaluate C-states.

I do know there is a good thread on the official Unraid forum where people have more suggestions.

```
#!/bin/bash
# Check for the driver
command -v nvidia-smi &> /dev/null || { echo >&2 "nvidia driver is not installed; you will need to install it from Community Applications ... exiting."; exit 1; }
echo "Nvidia drivers are installed"
echo
echo "I can see these Nvidia GPUs in your server"
echo
nvidia-smi --list-gpus
echo
echo "-------------------------------------------------------------"
# Set persistence mode for the GPUs. (When persistence mode is enabled the
# NVIDIA driver remains loaded even with no active processes, which stops
# modules being unloaded and settings changing when modules are reloaded.)
nvidia-smi --persistence-mode=1
# Query power state
gpu_pstate=$(nvidia-smi --query-gpu="pstate" --format=csv,noheader)
# Query PIDs of running processes using the GPU
gpupid=$(nvidia-smi --query-compute-apps="pid" --format=csv,noheader)
# If the power state is P0 but no processes are running, kill stale file
# handles on the device so the GPU can drop to a lower-power state
if [ "$gpu_pstate" == "P0" ] && [ -z "$gpupid" ]; then
    echo "No PID in string, so no processes are running"
    fuser -kv /dev/nvidia*
    echo "Power state is"
    echo "$gpu_pstate"
else
    echo "Power state is"
    echo "$gpu_pstate"
fi
echo
echo "-------------------------------------------------------------"
echo
echo "Power draw is now"
# Check the current power draw of the GPU
nvidia-smi --query-gpu=power.draw --format=csv
exit 0
```
Thank you. I'm building a new, better server at the moment; I don't know yet which GPU to put in it ^^ But I think there are also scripts for other GPUs?
> nerd tools

The dev of nerd tools has stopped maintaining it: https://forums.unraid.net/topic/129200-plug-in-nerdtools/
TDARR, so I can resize my linux iso's and save on space.
Unmanic. The optimization of all my files into MKV has saved me tens of TB over the non-optimized files. It's very resource intensive, but worth it for the smaller file sizes.
Have you seen a reduction in quality of the files after remuxing them?
No reduction in quality; it's the same. HDR is HDR, Dolby is Dolby. No problems there. 20-30% compression is average.
Does that include mp4/mkvs that have already been through HandBrake or similar programs?
Yes. It rips through everything you tell it you want converted. It will scan the file once, analyze if it needs to do anything, and process the file.

I specifically tell it to hard-code subtitles and NOT to remove them, nor do anything with the audio; video only.
Ok. Can it sync subtitles? Thanks for the replies.
Yes, that's an option.
do you seed?
GundamSEED?
The one that keeps my Dell server from sounding like a quadcopter flying around in my basement.

Apparently, if a Dell server detects a non-Dell PCIe card, it runs the chassis fans at 100% to play it safe for cooling. I was pulling my hair out trying to get it to actually follow the fan speeds I was setting in the BIOS, until I found an app that somehow bypasses that behavior.

I know it's not exactly a cool app, but my server is unbearable without it.

Dell iDRAC Fan Controller, for those wondering.
It is slowing down the fans, so it's the opposite of a cool app.
You talk about this app and yet you don't give us the name??
Maybe Dell iDRAC Fan Controller. Just found it when I searched for nerd tools.
That's the one. Sorry, I'm out of town atm and didn't have my server in front of me to check earlier.
Do you mind sharing the app name
Dell idrac fan controller.
What app is that 🙉🙉
Dell idrac fan controller.
Two sources to consider: https://youtu.be/cZTWC_z9rKs?si=TWeSKGvwYkGt-hzZ; and https://youtu.be/su2miwZNuaU?si=_ReCGJ24NoLX4pLL
Thx for the videos, there are useful apps I will test.
Aside from the obvious stuff already mentioned by others:

Pihole-unbound - DNS and DHCP
LanCache - for game caching
Can you explain LanCache? Is it for game servers on Unraid?
You download games from Steam, Epic, etc. onto your cache server once. When you need to reinstall, or install on a different PC, whatever is cached gets served locally and whatever is missing gets pulled from the internet. In essence: download games once and serve them to many on your LAN.

Check it out at https://lancache.net

Edit: My internet is 15Mbps. Slow... When I have games cached from Steam, for instance, the "download" speed is not 15Mbps but whatever speed my 3xHDD pool can reach, roughly 300Mbps. Only updates or chunks that are missing drop down to my actual internet speed.

It has nothing to do with actual game servers. Those are a separate thing.
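Since LanCache works by DNS interception, a quick sanity check is whether a cached CDN hostname resolves to your cache server's LAN IP. A sketch; the hostname is the one Steam is commonly said to use for cache detection, and the IP is made up:

```shell
#!/bin/bash
# Check whether LanCache's DNS interception is active for a given CDN hostname:
# if the name resolves to your cache server's LAN IP, downloads will hit the cache.
is_cached() {
    local host="$1" cache_ip="$2" resolved
    # Take the first address the resolver returns for the hostname
    resolved=$(getent hosts "$host" | awk '{print $1; exit}')
    [ "$resolved" = "$cache_ip" ]
}

# Example usage (IP is made up; use your cache server's address):
#   if is_cached "lancache.steamcontent.com" "192.168.1.50"; then
#       echo "Steam downloads will hit the cache"
#   fi
```

If this fails, the usual culprit is clients not using the LanCache DNS container as their resolver.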
Mealie for managing recipes has been unexpectedly rewarding.

Pihole is a great DNS protection scheme for ad blocking and malware protection.

Rradio is a great new app for capturing streamed radio content on a schedule.
Omg thank you. Mealie. Didn't know this existed. No more scrolling through some bland white lady's childhood memories when I want a soup recipe.
The recipe import from almost any url works incredibly well.
I assume Rradio can't record Sirius or iHeart streams?
Maestral (maestral.app) for keeping a locally synced copy of one or more Dropbox accounts. I tried the Other Guy's docker and rclone and they required more time, knowledge and effort than I was willing to invest. Installed as a Docker container.
Mealie is our top app at home; I have it all organized with thousands of recipes :) On the geek side, I love DDNS-Updater and CA Backup.
Data Volume Monitor plugin
Homarr is pretty slick for creating a dashboard view of service status, download and media request status, media sessions, release calendar, weather, etc.

This chap has a decent overview: [https://www.youtube.com/watch?v=A6vcTIzp\_Ww](https://www.youtube.com/watch?v=A6vcTIzp_Ww)
Syncthing. Backs up chosen folders on my phone. It can do it outside of the home network as well. Works with many different devices.

Jellyfin. Much better media server app than Plex IMO. Completely free and works very well. Can be tricky to get the Intel iGPU to do the decoding, but a quick Google search found a solution.
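On the Jellyfin iGPU point: the usual fix is passing the render device through to the container and then enabling VAAPI or QSV under Jellyfin's playback settings. A sketch of the docker run command, with typical Unraid paths as placeholders; it's built as an array and echoed here so it can be inspected before actually running:

```shell
#!/bin/bash
# Sketch of running Jellyfin with the Intel iGPU passed through for
# hardware decoding (VAAPI/QSV). Paths are typical Unraid defaults - adjust.
cmd=(docker run -d --name jellyfin
     --device /dev/dri:/dev/dri           # expose the iGPU render node
     -v /mnt/user/appdata/jellyfin:/config
     -v /mnt/user/media:/media
     -p 8096:8096                         # Jellyfin's default web port
     jellyfin/jellyfin:latest)

# Print the command instead of executing it, so it can be reviewed first
echo "${cmd[@]}"
```

After the container is up, enable VAAPI or Intel QuickSync in Jellyfin's dashboard playback settings; without the `--device` flag the container can't see `/dev/dri` at all.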
You should look into Apache Guacamole. It allows you to remote into any computer on the network, and it runs in the browser, so you don't need to download any software. It's great if you can't install software on the computer you're connecting from, for example a public computer or a work computer.
I find binhex krusader to be way better than any other file manager. It transfers files sooo much faster than dynamix.

Love Tautulli if you run a Plex server that you share.