In case of Gnome it was addressed, just by different people. Gnome 2 continues to live on as MATE, so anyone who doesn’t like Gnome 3 can use it instead.
To provide features that Xorg can’t.
If you don’t need features like fractional scaling, VRR, touchscreen gestures, etc. you won’t notice a difference.
People who do use those will, because for them those features would be missing or incomplete on Xorg.
You’re linking a post… from 2010. AMD replaced the radeon driver with their open-source AMDGPU driver in 2015. That’s what pretty much any AMD GPU released in the last 10 years uses now.
Furthermore, AMDGPU is an in-tree driver, and AMD actively collaborates with the kernel maintainers and the developers of other graphics-related projects.
As for Nvidia: their kernel modules are better than nothing, but they don’t contain a whole lot in terms of actual implementation. If before we had a solid black box, now, with those modules, we know that this black box has around 900 holes and what comes in and out of those.
Furthermore, if you look at the page you’ve linked, you’ll see that “the GitHub repository will function mostly as a snapshot of each driver release”. While the possibility of contributing is mentioned… Well, it’s Nvidia. It took them several years to finally give up trying to force EGLStreams and implement GBM, which had already been adopted as the de facto standard by literally everybody else.
The modules are not useless. Nvidia tend not to publish any documentation whatsoever, so they’re probably better than nothing and probably of some use to the nouveau driver developers… But it’s not like Nvidia came out and offered to work on nouveau to bring it up to par with their proprietary drivers.
k, so for the least used hardware, linux works fine.
Yeah, basically. Which raises a question: how can companies with much smaller market shares justify providing support, while Nvidia, a company that dominates the GPU market, can’t?
The popular distros are what counts.
Debian supports several DEs with only Gnome defaulting to Wayland. Everything else uses X11 by default.
Some other popular distros that ship with Gnome or KDE still default to X11 too. Pop!_OS, for example. Zorin. SteamOS too, technically. EndeavourOS and Manjaro are similar to Debian, since they support several DEs.
Either way, none of those are Wayland exclusive and changing to X11 takes exactly 2 clicks on the login screen. Which isn’t necessary for anyone using AMD or Intel, and wouldn’t be necessary for Nvidia users, if Nvidia actually bothered to support their hardware properly. But I digress.
Worked well enough for me to run into the dozen of other issues that Linux has
Oh, it’s in no way perfect. I never claimed it was.
I like most people want a usable environment. Linux doesn’t provide that out of the box.
This depends both on the distro you use and on what you consider a “usable environment”.
If you extensively use Office 365, OneDrive, need Active Directory, have portable storage encrypted with BitLocker, etc., then, sure, you won’t have a good experience with any distro out there. Or even if you don’t, but you grab a geek-oriented distro (e.g. Arch or Gentoo) or a barebones one (e.g. Debian), you, again, won’t have the best experience.
A lot of people, however, don’t really do a whole lot on their devices. The most widely used OS in the world, at this point in time, is Android, of all things.
If all you need to do is use the web and, maybe, edit some documents or pictures now and then, Linux is perfectly capable of that.
Real life example: I’ve switched my parents onto Linux. They’re very much not computer savvy, and Gnome, with its minimalistic, mobile-device-like UI and very visual, app-store-like program manager, is significantly easier for them to grasp. The number of issues they ask me to deal with has dropped by… a lot. Actually, every single issue this year was the printer failing to connect to the Wi-Fi, so I don’t suppose that counts as a technical issue with the computer, does it?
wacom tablets
I use Gnome (Wayland) with an AMD GPU. My tablet is plug and play… Unlike on Windows. Go figure.
Both Intel and AMD GPUs work fine on Linux. Both work fine with Wayland.
Wayland has been around for over a decade and has been in a usable state for the last 3 or so years.
Attributing the fact that Nvidia stuff still barely works to the fact that some distros have made Wayland the default is just plain wrong.
Besides, the Nvidia experience isn’t/wasn’t the smoothest even on Xorg. The Linux desktop is simply not a priority for Nvidia.
To be honest, most things in Nobara can be installed/done on regular Fedora. And, unlike Nobara, Fedora has more than one maintainer: good for the bus factor.
Focusing on the things I need to actually do.
I swear, even if I was forced to do something at gunpoint, I’d manage to get distracted anyway.
Almost everything that’s not Gnome can be considered lightweight, to be honest.
“Our goal is knowledge, so we’re going to obfuscate everything to fuck and make things unreadable”
1k USD. Should be enough to leave my shithole of a country, if I’m lucky.
Corporations have been trying to control more and more of what users do and how they do it for longer than AI has been a “threat”. I wouldn’t say AI changes anything. At most, maybe, it might accelerate things a little. But if I had to guess, the corpos are already moving as fast as they can with locking everything down for the benefit of no one, but them.
“AI” models are, essentially, solvers for mathematical systems that we, humans, cannot describe and create solvers for ourselves.
For example, a calculator for pure numbers is a pretty simple device, all the logic of which can be designed by a human directly. A language, though? Or an image classifier? Those are not possible to create by hand.
With “AI”, instead of designing all the logic manually, we create a system which can end up in a finite, yet still near-infinite, number of states, each of which defines behavior different from the others. By slowly tuning the model using existing data and checking its performance, we (ideally) end up with a solver for some incredibly complex system.
If we were to try to make a regular calculator that way, and all we gave the model was “2+2=4”, it would memorize the equation without understanding it. That’s called “overfitting”, and that’s something people training AI are trying their best to prevent. It happens when the training data contains too many repeats of the same thing.
However, if there is little to no repetition in the training set, the model is forced to actually learn the patterns in the data, instead of the data itself.
Essentially: if you’re training a model on a single copyrighted work, you’re making a copy of that work via overfitting. If you’re using terabytes of diverse data, overfitting is minimized; instead, the resulting model ends up with an actual understanding of the system you’re training it on.
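The memorization-vs-generalization point can be shown with a toy sketch (my own illustration, not how real models are trained): a tiny linear model y = w1·a + w2·b tuned by gradient descent, once on a single repeated “2+2=4” and once on diverse sums.

```python
# Toy illustration of overfitting vs. generalization.
# Model: y = w1*a + w2*b, trained with plain gradient descent on MSE.

def train(examples, steps=5000, lr=0.01, w=(0.9, 0.1)):
    w1, w2 = w  # deliberately asymmetric starting weights
    for _ in range(steps):
        g1 = g2 = 0.0
        for (a, b), y in examples:
            err = w1 * a + w2 * b - y
            g1 += 2 * err * a
            g2 += 2 * err * b
        w1 -= lr * g1 / len(examples)
        w2 -= lr * g2 / len(examples)
    return w1, w2

def predict(w, a, b):
    return w[0] * a + w[1] * b

# Training set with a single repeated fact: 2 + 2 = 4.
memorizer = train([((2, 2), 4)] * 10)

# Diverse training set: many different additions.
diverse = train([((1, 3), 4), ((2, 5), 7), ((4, 1), 5),
                 ((0, 2), 2), ((3, 3), 6)])

# Both models fit "2+2" perfectly, but on an unseen sum like 3+1
# the memorizer is noticeably off, while the diverse model has
# actually converged to w1 ~ w2 ~ 1, i.e. it "learned addition".
print(predict(memorizer, 2, 2), predict(diverse, 2, 2))
print(predict(memorizer, 3, 1), predict(diverse, 3, 1))
```

The memorizer only learns “any weights summing to 2 reproduce my one example”, which is exactly the copy-via-overfitting situation described above; the diverse data leaves a single solution that generalizes.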
So… You say nothing will change.
OpenSUSE + KDE is a really solid choice, I’d say.
The most important Linux advice I have is this: Linux isn’t Windows. Don’t expect things to work the same.
Don’t try too hard to re-configure things that don’t match the way things are on Windows. If there isn’t an easy way to get a certain behavior, there’s probably a reason for it.
I’ve heard that he passed some of his duties onto other people.
However, I’m not aware of anyone within the team criticizing his behavior or statements, which, while it might be a bit of a stretch, likely implies that everyone involved with the project, at the very least, tolerates, if not outright shares, those views.
I find it practically impossible to trust claims of people like that, to be honest.
“Accusing with no concrete proof” is exactly what GrapheneOS developers are doing in regards to other projects. Claiming other products are a scam, particularly when those products somewhat compete with yours, is a pretty big red flag.
Reviewing the source code of an entire operating system is not a task doable by a single person, particularly when that person is not an expert in the field.
A proper code audit needs to be done by a team of professionals capable of spotting things like actual security vulnerabilities and logic errors that might result in more data being exposed, than advertised.
Considering the lead developer of GrapheneOS bans anyone from their chat for asking how an Android phone with GrapheneOS compares to a non-Android phone, such as a PinePhone or Librem 5, in terms of security, because, according to said developer, the PinePhone and Librem 5 are “scam products” and even asking questions about them is “spreading misinformation” and “promotion of fraud”, I’d be quite, quite wary of the claims GrapheneOS developers make about its security.
I have a 120 gig SSD. The system takes up around 60 gigs, plus BTRFS snapshots and their overhead. I have around 15 gigs of wiggle room, on average. Trying to squeeze some /home stuff in there doesn’t really seem that reasonable, to be honest.
Wayland has its fair share of problems that haven’t been solved yet, but most of those points are nonsense.
If that person lived a little over a hundred years ago and wrote a rant about cars vs horses instead, it’d go something like this:
The rant you’re linking makes about as much sense.