I’ve been tempted to try to install Plasma Mobile on a tablet.
Why not an Arch install?
Also, on the few points where others mention needing other people: there’s a group finder, and I’d say most people running those raids in group-finder groups don’t talk at all, so you can just pretend they’re NPCs if you want.
I will never get tired of comedy responses to photoshop requests. It’s just a timeless classic.
Nah. There are some nvidia issues with wayland (that are starting to get cleared up), and nvidia’s drivers not being open-source rubs some people the wrong way, but getting nvidia and cuda up and running on linux is pretty easy/reliable in my experience.
WSL is a bit different but there are steps to get that up and running too.
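For anyone curious, a rough sketch of those WSL steps (assuming Windows 11 with a recent NVIDIA driver installed on the Windows side; on WSL2 the driver is passed through to Linux automatically):

```shell
# On Windows, in an admin PowerShell: install WSL2 with the default Ubuntu distro
wsl --install

# After rebooting, inside the Ubuntu shell: the Windows NVIDIA driver is
# exposed to WSL2 automatically, so this should list your GPU
nvidia-smi
```

From there you install the CUDA toolkit inside Ubuntu as usual, just skipping the driver package since it comes from Windows.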
Agree with others, this guide is a bit more work than you probably need. I don’t really run windows much anymore but I did have an easier time with WSL like the other poster mentioned.
And just to check, are you planning on fine-tuning a model? If so then the whole anaconda / miniconda, pytorch, etc… path makes sense.
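If you do go the fine-tuning route, the usual setup is something like this (a sketch, assuming miniconda is already installed; the exact pip command for your CUDA version is on pytorch.org):

```shell
# Create and activate an isolated environment for fine-tuning work
conda create -n finetune python=3.11 -y
conda activate finetune

# Install PyTorch plus the common Hugging Face libraries
pip install torch transformers datasets
```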
But if you’re not fine-tuning and you just want to run a model locally, I’d suggest ollama. If you want a UI on top of it, open-webui is great.
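Getting started with ollama is about as simple as it gets (sketch for Linux, using the install script and docker command from the ollama and open-webui docs):

```shell
# Install ollama (one-liner from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a model locally
ollama run llama3

# Optional: open-webui as a web UI on top, via docker (serves on port 3000)
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```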
Hopefully you’re only forwarding a minimal selection of network ports and not all ports/traffic? If so then you’re good; like someone else said, if you’ve got a router and it’s only forwarding selected traffic, then there’s no need for anything else.
Tons of remote jobs out there, probably a higher percentage among startups. Most remote places will have people in different time zones and some sort of core hours they expect people to be around for, but with some discussion you’ll probably be able to find one that’s accommodating.
One good site to start looking:
Good luck
Respect, but…