• 0 Posts
  • 89 Comments
Joined 2 years ago
Cake day: April 23rd, 2023

  • In general, Bazzite being immutable just means the core system isn’t modular to the end user to the degree that Arch is. You can of course use Flatpaks or AppImages like on any distro, and there are still several ways to install traditional rpm/deb/aur programs (the usual Fedora method doesn’t work because dnf doesn’t exist on the host). If it’s just an app that doesn’t require significant integration with the OS, the recommendation is to install it into a distrobox container (where dnf does exist) and then run distrobox-export [program] to make it visible to the host system. VPNs need a little more integration, so those are installed by layering with rpm-ostree and then enabling the systemd service(s); there are example commands at the end of this comment. Layering makes updates take longer to install, so it should be avoided when possible.

    One of the interesting things about Universal Blue’s images like Bazzite: if you want the benefits of an atomic system but more customization than they offer, without layering a bunch of things in rpm-ostree, the process to build a custom image based on one of theirs is apparently quite easy to do and automate, though I haven’t done it myself.
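
    A rough sketch of the two install routes mentioned above, using made-up names (someapp, some-vpn, some-vpn.service) purely as placeholders:

        # App that doesn't need deep OS integration: install it inside a distrobox container
        distrobox create --name fedora-box --image registry.fedoraproject.org/fedora:latest
        distrobox enter fedora-box
        sudo dnf install someapp          # inside the container, dnf works normally
        distrobox-export --app someapp    # exposes the app's launcher to the host

        # Something that needs real OS integration (e.g. a VPN): layer it, reboot, enable the service
        rpm-ostree install some-vpn
        systemctl reboot
        sudo systemctl enable --now some-vpn.service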


  • In general, yes. Most of the difficulty is due to being on Linux and running games through the Proton/WINE compatibility layer, so there can be an extra layer of jank involved, but it’s very possible.

    If modding consists of dropping files into the game directory, it will work almost exactly the same as on Windows. However, if some of those files replace the game’s DLLs, then whatever WINE runner you use might need to be told, via a DLL override, to use the DLLs in the game directory instead of its own built-in versions (see the example at the end of this comment).

    If you need to use a mod manager, the situation is still not ideal - the only native Linux mod managers I know of are the Nexus Mods app (very new; there’s some talk of it being integrated directly into the Heroic launcher) and Limo. For everything else, you’ll be running whatever bespoke Windows mod manager your game uses through Proton/WINE, probably with Steam Tinker Launch, possibly Lutris.

    tl;dr There can be an extra layer of complexity over modding on Windows, but it’s otherwise comparable.
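
    For a concrete example of that DLL override: if a mod replaces dinput8.dll in the game folder (dinput8 is just an illustrative name here), a Steam launch option like the one below tells Proton/WINE to prefer the game directory’s copy over its built-in one. Outside Steam, the same thing can be set in winecfg’s Libraries tab or by exporting the variable before launching the game.

        WINEDLLOVERRIDES="dinput8=n,b" %command%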


  • “During boot, you’re presented with 4 snapshots you can choose between so if an update did happen to break something, it’s as easy as just choosing an older snapshot after a reboot.”

    Those are actually just two snapshots; there’s a bug in GRUB that displays them twice. It’s purely visual, and you can fix it with a ujust script, run in the terminal as ujust configure-grub. There are lots of little scripted tweaks and installations available; you can get most of the list by running ujust by itself. Incredible work by the maintainers.
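
    For reference, both are run in a terminal on the host:

        ujust                   # lists the available recipes
        ujust configure-grub    # fixes the duplicated GRUB entries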







  • You’re entirely correct, but in theory they could give it a pretty good go; it just requires a lot more computation, developer time, and non-LLM data structures than these companies are willing to spend money on. For any single query, they’d have to get dozens if not hundreds of separate responses from additional LLM instances spun up on the side, many of which would be customized for specific subjects, as well as from specialty engines such as Wolfram Alpha for anything directly requiring math.

    LLMs in such a system would be used only as modules in a handcrafted algorithm - modules that do exactly what they’re good at, in a way that is actually useful. To give an example, if you pass a specific context to an LLM with the right format of instructions and then ask it a yes-or-no question, even very small and lightweight models often give the same answer a human would. Used like this, human-readable text can be converted into binary switches for an algorithmic state machine with thousands of branches of pre-written logic (toy sketch at the end of this comment).

    Not only would this probably use an even more insane amount of electricity than the current approach of “build a huge LLM and let it handle everything directly”, it would also take much longer to generate responses to novel queries.
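
    To make the “binary switch” idea concrete, here’s a toy shell sketch. The endpoint variable, model name, question, and the route_to_* functions are all placeholders I made up for illustration; it just assumes some OpenAI-compatible server is reachable at $LLM_ENDPOINT and that jq is installed.

        # Ask a small model a yes-or-no question and reduce the reply to "yes" or "no".
        classify_yes_no() {
          curl -s "$LLM_ENDPOINT/v1/chat/completions" \
            -H "Content-Type: application/json" \
            -d "{\"model\": \"some-small-model\",
                 \"messages\": [{\"role\": \"user\", \"content\": \"$1 Answer only yes or no.\"}]}" \
            | jq -r '.choices[0].message.content' \
            | tr '[:upper:]' '[:lower:]' | grep -o -m1 -E 'yes|no'
        }

        # The binary answer gates pre-written logic instead of letting the LLM answer the user directly.
        if [ "$(classify_yes_no "Is this support ticket about billing? $TICKET_TEXT")" = "yes" ]; then
          route_to_billing_queue "$TICKET_TEXT"    # placeholder branch
        else
          route_to_general_queue "$TICKET_TEXT"    # placeholder branch
        fi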




  • Webtoon is still shitty in other ways. When they adapt a property, they want it their way, regardless of the author’s original vision. I’ve seen several stories that originated on Royal Road get Webtoon adaptations, and the adaptations always seem to change or leave out important parts of the story, making characters look stupid or completely replacing entire sets of them. The story is then forced to diverge substantially when something they got rid of inevitably turns out to have been critically important to where the author was taking things. They turn great stories into middling slop every single time.




  • The purpose of this plant is in fact not long-duration storage but the secondary functions you mentioned, and it’s also meant to be a proof of concept. Per an article on CNESA’s English site from when the plant’s construction began in June 2023:

    This project represents China’s first grid-level flywheel energy storage frequency regulation power station and is a key project in Shanxi Province, serving as one of the initial pilot demonstration projects for “new energy + energy storage.” The station consists of 12 flywheel energy storage arrays composed of 120 flywheel energy storage units, which will be connected to the Shanxi power grid. The project will receive dispatch instructions from the grid and perform high-frequency charge and discharge operations, providing power ancillary services such as grid active power balance.





  • A router-level VPN is going to be more difficult to configure and cause more problems than just having the VPN on all your devices. There are some games where online play just refuses to work when connecting through a VPN, and some mobile apps are the same. When a website blocks your currently selected server, the usual solution is switching to another server, which is more difficult and more tedious when the VPN is configured at the router level. In addition, if you do something like running a self-hosted VPN to connect remotely to a media server on your home network, that becomes more difficult if your home router is on a different VPN.

    If you’re trying to keep local devices in the building from phoning home and being tracked, a Pi-hole or a router-level firewall might be a better solution. I think if you’re running a pfSense or OPNsense router and are a dab hand with VLANs, then maybe you could get what you’re looking for with a router-level VPN, but it’s a huge hassle otherwise. Just put Mullvad on your computers and phones and call it a day.


  • Unfortunately I can’t even test Llama 3.1 in Alpaca because it refuses to download, showing some error message with the important bits cut off.

    That said, the Alpaca download interface seems much more robust, allowing me to select a model and then pick any version of it for download, rather than apparently just grabbing whatever version it thinks I should use. That’s an improvement for sure. On GPT4All I basically have to download the model manually if I want one that’s not the default, and when I do that there’s a decent chance it won’t run on the GPU.

    However, GPT4All lets me plainly see how I can edit the system prompt and many other parameters the model is run with, and even configure multiple sets of parameters for the same model. That lets me effectively pre-configure a model in much more creative ways, such as programming it to be a specific character with a specific background and mindset. I can get the Mistral model from earlier to act like anything from a very curt and emotionally neutral virtual intelligence named Jarvis to a grumpy fantasy monster whose behavior is transcribed by a narrator. GPT4All can even present an API endpoint on localhost for other programs to use (example request at the end of this comment).

    Alpaca seems to have some degree of model customization, but I can’t tell how well it compares, probably because I’m not familiar with ollama and I don’t feel like tinkering with it since it doesn’t want to use my GPU. The one thing I can see that’s clearly better in Alpaca is using multiple models at the same time; right now GPT4All will unload one model before it loads another.
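
    For reference on that localhost endpoint: once the local API server is enabled in GPT4All’s settings, it speaks an OpenAI-compatible protocol, so a request along these lines should work. The port (4891 by default, as far as I know) and the model name will depend on your settings and which models you’ve downloaded - the name below is just a stand-in for the Mistral model mentioned earlier.

        curl -s http://localhost:4891/v1/chat/completions \
          -H "Content-Type: application/json" \
          -d '{
                "model": "Mistral Instruct",
                "messages": [{"role": "user", "content": "Say hello in one short sentence."}],
                "max_tokens": 60
              }'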