It was a great adventure. But yeah, that setup was on 24/7. Not because of compilation, but it definitely made a lot of this more feasible
Gentoo unstable was a little bit tiring in the long run. It was bleeding edge, but I often needed to downgrade packages because the rest of the libraries weren't ready
Gentoo stable was really great. Back then pulseaudio was quite buggy. Having a system where I could tell all applications and libraries to not even link to it (so no need to have it installed at all) made avoiding its problems really easy
But when my hardware got older and compiling LibreOffice started to take 4 hours, I remembered how nice it was on Slackware, where you just reinstall the package you broke and you're done
Arch looked like a nice middle ground. Most things are packaged, there's a big focus on pure Linux configurability (plain /etc files, no Ubuntu (or SUSE?) "you need a working X.org to open the distro-specific graphics card settings"), and AUR covers anything without an official package. Turned out it was a match :)
Windows (~6 years) -> Mandriva (Mandrake? For I think 2-3 years) -> Ubuntu (1 day) -> Suse (2 days) -> Slackware (2-3 years) -> Gentoo unstable (2-3 years) -> Gentoo stable (2-3 years) -> Arch (9 years and counting)
The only span I'm sure about is the last one. When I started a job I decided I didn't have the time to compile the world anymore. But the values after Windows sum up to 21 when they should be 20, so it's all more or less correct
If you want to access your computer from outside your LAN, it would be a good idea to at least secure it or, better yet (though more work), learn to understand what you are doing
Coming back to the topic, though, I’d start with checking these out
But the machine will not do the creative part. It can only fill in the time sinks around our creative ideas. Ask an LLM to tell you a joke no one has ever heard before and then google it. The creative part still has to come from humans
EDIT: and the truth is that we very rarely come up with something creative. We mostly just recombine things we've seen before
trying to weasel out of putting some effort into something that sounds worth putting some effort into
But that depends on what they need it for
Personally I don't see a difference between legalese boilerplate and a 10k-word story. But that discussion might lead us nowhere
What have you learned about text creation?
In many cases I don’t want nor need to learn that. I just need volume about the key points
Why is an LLM any different?
Let's say I want my RPG players to find a corporate mail that gives them some plot info. Why not ask an LLM to write the boilerplate around the info I want to give them? Just as an example
Let’s not put any effort into anything: the machine will do it for me
So you are not using a calculator, I presume? Only math done on abacus is not being lazy?
If you want something local and open source, I think your main problem will be the number of parameters (the "b" thing). GPT-3 is (was?) noticeably big, and open-source models are usually smaller. There is, of course, a debate about how much model size matters versus how much the quality of the training data affects the results. But when I did a non-scientific comparison ~half a year ago, there was a noticeable difference between smaller models and bigger ones.
Having said all of that, check out https://huggingface.co/ which aims to be like GitHub for AIs. Most of the models are more or less open source; you will only need to figure out how to run one and whether you have bottlenecks on the Pi
My mom and grandma are using Manjaro. With my grandma I'm the only one doing the updates, of course, but my mom can usually do them herself just using pamac-tray. If that fails, a phone call is usually sufficient. Once every few years I have to come over and do something myself
And when that happens I work with a distro that just works, instead of some broken crap
EDIT: I tried having Mint on their computers. Big mistake, it’s as broken as Debian and Ubuntu
EDIT: Xfce is very nice in such cases. It looks familiar for them while being manageable for me
In the end everything is maintained by the community; the only difference is that AUR is "everyone can maintain" and official is "we have a team of official maintainers who decided to maintain these packages". Personally I can't imagine running without the AUR
But it’s fair if it doesn’t count for you
Pure speculation:
I think it's just not that popular. Does it do something more than rclone or simple rsync? If not, then its main selling point would be the GUI. But then, I think, either one can use the remote location via their file manager (like thunar with MEGA, for example) or there is not that much difference between opening another app and using the web. And if the selling point were pausing and resuming downloads, torrents are probably more versatile
It is available in the AUR, though, so maybe I'm just the only one who hadn't heard about it earlier
Also, it's a Java application. There is not much to package or depend on, I guess
Tinkering is all fun and games, until it’s 4 am, your vision is blurry, and thinking straight becomes a non-option, or perhaps you just get overly confident, type something and press enter before considering the consequences of the command you’re about to execute… And then all you have is a kernel panic and one thought bouncing in your head: “damn, what did I expect to happen?”.
Nah, that’s when the fun really starts! ;)
The package refused to either work or install complaining that the version of glibc was incorrect… So, I installed glibc from Debian’s repos.
:D That one is a classic. Most distributions don't include packages from other distros because 99% of the time it's a bad idea. But with Arch you can do whatever you want, of course
My two things:
Edit Controllers/CorsairPeripheralController/CorsairPeripheralControllerDetect.cpp and change 0x1B7C -> 0x1B7D, then run make install
Haven't tested it, but it seems so. The Android client has the button too