Anna’s Archive would be the go-to, I think. You can choose the language in the sidebar.
Also probably for the planet and a lot of animals.
Yeah, I know, I just prefer to use the distro’s native package manager. That said, I use the Jellyfin client from Flathub, and that one now warns me as well that it depends on Qt 5.15 (works fine though, since Flatpak can have multiple versions of dependencies).
Qt was stuck on 5.15 because the Telegram app depended on it (sigh). Had to do a dnf upgrade --best --allowerasing for the update to Qt 6 (and the removal of Telegram, lol).
But now everything works fine.
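For reference, this is roughly the command that did it (a sketch; --allowerasing is what lets dnf remove the conflicting Telegram package so the Qt 6 update can go through):

# upgrade everything, allowing dnf to erase packages that block the upgrade
sudo dnf upgrade --best --allowerasing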
I hope it went well :) I was completely ready to roll back by changing the image tag to v2, but didn’t need to.
I will try Cyberpunk one day if it’s on sale and my pile of shame has gotten smaller.
I made the switch to Heroic from Lutris because the integration is just better. I used both for a while, because The Witcher 3 worked better on the legacy version for me, and Heroic didn’t let you choose the version (legacy or nextgen), while Lutris only had the legacy version. But now you can install any version you want on Heroic (looking at you, every other platform with forced updates). Also, while Lutris downloads the offline installers off of GOG, Heroic installs games via the GOG Galaxy redistributable. This also makes it possible to sync playtime and savegames, although that is experimental right now. As soon as they start implementing achievements (which I think they have planned), it’s feature complete for me.
Updates of Heroic itself and the games have always gone fine, although it must be said that the most demanding titles I have on GOG right now are The Witcher 3 and Metro Exodus.
The codec thing really is a bummer. But that’s one of the few things you would have to do manually on Fedora, while there are plenty of pitfalls with other distros too, like an older kernel or having to manually configure drivers for some hardware on Debian, or having to deal with Canonical’s shenanigans on Ubuntu.
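For what it’s worth, the codec fix on Fedora usually boils down to enabling RPM Fusion and swapping in the full ffmpeg; a rough sketch (check the RPM Fusion docs for the exact steps for your release):

# enable the RPM Fusion free repo matching your Fedora release
sudo dnf install https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm
# replace the stripped-down ffmpeg with the full one
sudo dnf swap ffmpeg-free ffmpeg --allowerasing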
Maybe one of the more niche distros is a better fit for some, like Nobara or Bazzite for gaming.
I hope so. I’ve been using Linux for 10 years for everything except gaming, and two years ago I went full-time with Proton and Lutris (switched to Heroic since, though).
And let me tell you, we’re at a point where it’s far more straightforward to just install something like Fedora KDE and do almost anything Windows can than to deal with whatever the hell Microsoft is up to these days.
The biggest problem still is software discoverability. It is our duty to guide newcomers where they want to go instead of gatekeeping.
The choice of gestures vs. nav buttons is usually part of Android itself. In my case (a Pixel, and therefore nearly stock Android) it’s in Settings -> System -> Navigation mode (or something similar, since it’s in German on my phone). If you can’t find it, search for “navigation button your phone model”.
Edit: sorry, I just realized you meant the app drawer, not the overview of currently opened apps. I don’t know the answer to that.
Edit edit: OK, I found something in Lawnchair’s settings. The last setting is called Gestures. If you have navigation buttons enabled instead of nav gestures (see above), you could bind the home button to open the app drawer.
Lawnchair is a fork of the original Pixel launcher that adds some quality-of-life upgrades to an already good piece of software, like being able to remove the search bar, resizable/reshapable icons, custom fonts, etc. It’s also open source, so feel free to check out the GitHub.
There is no app store release right now, but they are working on it. I’ve used the alpha versions for two years now and they have worked fine so far.
I hope it will change software discoverability on Linux for the better.
Babe wake up, new flathub frontend just dropped
Take that snap!
I’m so looking forward to this. When I tried to use a tmpfs/ramdisk, the transcoding would simply stop because there was no space left.
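For anyone who wants to try it anyway, this is roughly how such a ramdisk gets mounted (a sketch; the path and the 8G size are just examples, and you point the transcode directory at it afterwards). The catch is exactly that the size limit fills up on long transcodes:

# mount a size-limited ramdisk for transcodes; it disappears on reboot
sudo mkdir -p /mnt/transcode
sudo mount -t tmpfs -o size=8G tmpfs /mnt/transcode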
Yes, since we have similar GPUs, you could try the following to run it in a Docker container on Linux, taken from here and slightly modified:
#!/bin/bash
model=microsoft/phi-2
# share a volume with the Docker container to avoid downloading weights every run
volume=<path-to-your-data-directory>/data
docker run -e HSA_OVERRIDE_GFX_VERSION=10.3.0 -e PYTORCH_ROCM_ARCH="gfx1031" --device /dev/kfd --device /dev/dri --shm-size 1g -p 8080:80 -v "$volume:/data" ghcr.io/huggingface/text-generation-inference:1.4-rocm --model-id "$model"
Note how the ROCm version has a different tag and that you need to pass your GPU devices into the container. The two environment variables are specific to my (and maybe also your) GPU architecture. It will need a while to download, though.
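Once the container is up, you can sanity-check it from the host with a request against TGI’s generate endpoint (the prompt and parameters are just examples):

curl 127.0.0.1:8080/generate \
    -X POST \
    -d '{"inputs":"What is Deep Learning?","parameters":{"max_new_tokens":50}}' \
    -H 'Content-Type: application/json'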
Hugging Face TGI is just a piece of software for serving the models, like gpt4all. Here is a list of models officially supported by TGI, although they state that you can try different ones as well. You follow the link and look at the files section. The size of the model files (safetensors or pickle binaries) gives a good estimate of how much VRAM you will need (e.g., a 7B model in fp16 is roughly 14 GB of weights, so you need at least that much VRAM plus some overhead). Sadly, this is more than most consumer graphics cards have, except for santacoder and microsoft phi.
I tried Hugging Face TGI yesterday, but all of the reasonable models need at least 16 GB of VRAM. The only model I got working (on a desktop machine with an AMD 6700 XT GPU) was microsoft phi-2.
I have a 5800X processor and a 6700 XT GPU, which the site claims is barely OK. In reality, my CPU is hardly ever doing anything when gaming, while the GPU is at 100% usage all the time.
For a faster GPU, like the 7900 XT (which I believe will be roughly as powerful as the speculative 8800 XT I want to buy), the site claims the CPU would be the bottleneck.
Now I know that there is more to it than just the usage shown in the control center, but does this seem sketchy to anyone else?
Nope, there’s a .rar file of 61 MB.
See https://archive.org/download/discworld-complete-43-books-by-pratchett-terry-z-lib.org
Like one of the comments mentioned: there is yt-dlp for now at least.