What's up, selfhosters? - Sunday thread
-
Presuming you can put OpenWRT on it, it'll be fine as a single box
IMHO, I just prefer having it all as separate boxes so I can fix / change / upgrade parts as I go - but I soon run out of places to hide them
-
I know this isn't sexy, but I've been working on my documentation: getting configs properly versioned in my Gitea instance, READMEs updated, and so on. My memory is not what it once was and I need the hints when things break.
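For anyone doing the same, the per-directory setup is tiny (the Gitea URL and repo name below are placeholders for my actual instance):

```bash
# Turn a config directory into a repo and push it to Gitea
cd /opt/myapp/config
git init
git add .
git commit -m "Initial snapshot of configs"
# Assumes an empty repo was already created in the Gitea web UI
git remote add origin https://gitea.example.lan/me/homelab-configs.git
git push -u origin master
```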
-
Same here. I got Gemini to write a shell script for me that I can run on my Proxmox host, which outputs all of my configs to a .txt file. I asked it to format the output in a way an LLM can understand, so I can just copy/paste it the next time I need to consult AI.
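Something in this spirit, roughly (a sketch, not the actual script Gemini produced; the file list is a guess at what's worth dumping from a Proxmox host):

```bash
#!/usr/bin/env bash
# Dump key Proxmox configs into one .txt with headers an LLM can parse
OUT=/root/proxmox-configs.txt
: > "$OUT"
for f in /etc/pve/storage.cfg /etc/network/interfaces /etc/fstab \
         /etc/pve/qemu-server/*.conf /etc/pve/lxc/*.conf; do
    [ -f "$f" ] || continue
    {
        echo "===== FILE: $f ====="
        cat "$f"
        echo
    } >> "$OUT"
done
echo "Wrote $OUT"
```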
-
IMO you should stick with a local device store only. If you're worried about the state getting hold of the data, having any backups is gonna be a liability.
-
Anyone know how to set up NPM (Nginx Proxy Manager) on TrueNAS Scale? I've spent all day trying to get my SSL certs and it fails every damn time. It just says the domain is unknown, or that it can't find my NPM install.
I'm using a FreeDNS domain though, so maybe I'm going to need to try buying a domain.
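For reference, here's roughly how I've been sanity-checking it (the domain below is a placeholder, not my real one); Let's Encrypt's HTTP-01 challenge needs the name to resolve to my WAN IP and port 80 to be reachable:

```bash
# Does the freedns name resolve, and to my actual public IP?
dig +short myhost.mooo.com
curl -s https://ifconfig.me        # compare the two
# Is port 80 forwarded through to NPM? HTTP-01 validation hits this path:
curl -I http://myhost.mooo.com/.well-known/acme-challenge/test
```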
-
What's your alternative? ownCloud? No thank you
-
Pretty cool! I'm also trying to improve my documentation.
-
This sounds interesting. Although I'm not even sure what sort of configuration I'd need to keep between reinstalls, lol.
-
I finally set up Joplin Server. It's a revelation after too long using Syncthing to sync the databases. I wasn't able to use Joplin on Android anymore - syncing to the file system had gotten too slow. Now everything syncs pretty much instantly!
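For anyone curious, the server itself can be this small (a rough sketch of the shape of my setup; the URL is a placeholder, and check the image docs for the right volume path if you want the data persisted):

```bash
# Minimal Joplin Server on its default port
docker run -d --name joplin-server \
  -p 22300:22300 \
  -e APP_BASE_URL=http://joplin.example.lan:22300 \
  joplin/server:latest
```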
-
Managed to set up Immich remote machine learning (old 7th-gen Optiplex offloading to my gaming PC). If only I'd bought an Nvidia card... I wasn't able to get my AMD 7800 XT to work with Immich ML. Next up is setting up microservices, because Immich is crippling my Unraid server
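For anyone wanting to try the same split, it's roughly this (a sketch; the hostname is a placeholder, and the exact env var name is worth double-checking against the Immich docs for your version):

```bash
# On the gaming PC: run only the ML container (3003 is its default port)
docker run -d --name immich-ml \
  -p 3003:3003 \
  -v immich-model-cache:/cache \
  ghcr.io/immich-app/immich-machine-learning:release

# On the Optiplex, point the Immich server at it via its .env file:
#   IMMICH_MACHINE_LEARNING_URL=http://gaming-pc.lan:3003
```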
-
System updates have been a faff. I'm SSHing over Tailscale, and when Tailscale updates it kicks me out, naturally. Which interrupts the session, naturally. Which stops the update, naturally.
Have a look at screen. You can start your update in a persistent terminal, disconnect (manually or by losing the connection), and reattach when you're back - the update keeps running, or even finishes, while you're gone.
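It boils down to a couple of commands (assuming a Debian-ish host for the update itself):

```bash
screen -S upgrade          # start a named, persistent session
sudo apt full-upgrade      # kick off the update inside it
# ...tailscale restarts and the ssh connection drops...
screen -r upgrade          # reattach once you're back in
# Ctrl-a d detaches on purpose; 'screen -ls' lists live sessions
```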
-
Sending is someone else’s problem.
It becomes my problem when I'm the one who wants the files, and no free service is going to accept an 80 GB file.
That's exactly my point: I shouldn't have to deal with third parties, or with something as massive and monolithic as Nextcloud, just to do the internet equivalent of smoke signals. It's insane. It's like someone telling you they don't want to bike to the grocer 5 minutes away because it's currently raining, and you recommending them a monster truck.
-
I set it up a couple weeks ago. It's alright; facial recognition works pretty well, the files are easy to manage, and setup was pretty straightforward (using docker).
Searching for images works fairly well, as long as you're searching for content and not text. Searching 'horse' for example does a pretty good job showing you your pictures of horses, but often misses images containing the word horse. Not always, but it's noticeable to me.
The mobile apps work well too; syncing files in the background as they appear, optionally creating albums based on folders. Two things I find missing though are the ability to edit faces/people in an image (you've gotta do that from a browser), and the ability to see what albums an image is in and quickly navigate to one.
It's a developing project that's well on its way. A good choice imo.
-
I'm having some crazy déjà vu reading this 5-comment thread...
It's been a few months since I visited one of these general "how's everyone's week been" threads, but the last time I did, someone else was talking about having just set up Paperless, struggling to get their scanner to save to FTP, and thinking about email, and someone had suggested Wireshark; it feels like I just re-read that exact conversation, but these are new comments...
Freaky.
-
Mostly the stuff in /etc/pve, plus the configs for whatever additional software you installed
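A quick way to snapshot those before a reinstall (a sketch; /etc/pve is a pmxcfs FUSE mount, so reading it out with tar as root works fine):

```bash
# Grab the Proxmox cluster config plus a couple of common extras
tar czf /root/pve-backup-$(date +%F).tar.gz \
    /etc/pve /etc/network/interfaces /etc/fstab
```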
-
Thanks for the mention
Yeah, copyparty was my attempt at solving this issue - a single Python file for receiving uploads of arbitrarily large files, usually much faster than the alternatives (FTP, SFTP, Nextcloud, etc.), especially when the physical distance to the uploader is large (hairy routing).
I’m not gonna put an upload on my site, that’s a security nightmare waiting to happen.
curious to hear your specific concerns on this; maybe it's something that's already handled?
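fwiw, the receive-only mode I'd suggest for this looks something like the following (port and path are placeholders; the `w` permission makes the share upload-only, so visitors can't browse or download what others sent):

```bash
# Share /srv/inbox at the web root, write-only for everyone
python3 copyparty-sfx.py -p 8080 -v /srv/inbox::w
```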
-
OK 80 GB is for sure an edge case. Nextcloud won't even work for that due to PHP memory limits, I think.
Interesting problem. FTP is an option, with careful instructions to an untutored user. Maybe rsync over a VPN connection if it is always the same sender.
Not even sure what else would reliably work, except Tanenbaum's adage ("never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway").
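If it's always the same sender, the rsync route can be as simple as this (a sketch; the VPN IP and paths are placeholders, and --partial makes an 80 GB transfer resumable after drops):

```bash
# Resumable push of one huge file over a VPN / tailscale link
rsync -av --partial --progress bigfile.iso user@100.64.0.2:/srv/incoming/
```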
-
You're describing the world wide web, except giving others write access
-
Sure! I mostly followed this random YouTuber's video for getting the Wyoming services (Whisper/Piper) offloaded, but he didn't get Ollama to use his GPU: https://youtu.be/XvbVePuP7NY.
For getting the Nvidia/Docker passthrough, I used this guide:
https://www.bittenbypython.com/en/posts/install_ollama_openwebui_ubuntu_nvidia/. It's working great at this point!
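For anyone who'd rather skip the video: once the NVIDIA Container Toolkit from that second guide is in place, the GPU-enabled Ollama container boils down to roughly this:

```bash
# Assumes nvidia-container-toolkit is already installed on the host
docker run -d --gpus=all --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama
# Quick sanity check that the API is up:
curl http://localhost:11434/api/tags
```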
-
Setting up Let's Encrypt auto cert renewal with ACME. Also looking to set up some monitoring, basic stuff like CPU and memory usage, etc.
If anyone has recommendations that have an Android app available, that would be awesome.
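For the cert half, in case it's useful to anyone, an acme.sh setup can be as small as this (a sketch; Cloudflare DNS-01 is just an example provider, and the domain, paths, and token are placeholders):

```bash
# Issue a cert via a DNS-01 challenge (Cloudflare as the example provider)
export CF_Token="<api-token>"            # placeholder, if using Cloudflare DNS
acme.sh --issue --dns dns_cf -d example.com

# Install it where the web server expects it, reloading on each renewal
acme.sh --install-cert -d example.com \
  --key-file       /etc/ssl/private/example.com.key \
  --fullchain-file /etc/ssl/certs/example.com.pem \
  --reloadcmd      "systemctl reload nginx"
# acme.sh registers its own cron job, so renewals are hands-off from here
```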