Maybe; it does sound like reducing the size of the driver is potentially possible as well https://www.phoronix.com/news/AMDGPU-Headers-Repo-Idea
Hiker, software engineer (primarily C++, Java, and Python), Minecraft modder, hunter (of the Hunt Showdown variety), biker, adoptive Akronite, and general doer of assorted things.
See my reply to funtrek’s reply.
If a “safe C++” proposal truly proposes a safe subset, then yes, your C++ code would have to opt in to doing unsafe things. For the purposes of this discussion of a safe subset … the point is moot.
Rust still allows people to do (basically) whatever they want via unsafe blocks.
Right; any solution they come up with presumably needs to be more scalable than “new drivers” and “old drivers”. Eventually there will be too large a set of “old drivers” and we’ll end up in the same situation with a small “new drivers” driver and a large “old drivers” blob.
Yeah… Even as a third party, I definitely have not been enjoying the smell when I’ve bumped into it. I don’t think it should be a criminal offense, but I hope we can move past “I need to light a thing on fire and just screw up the air for everyone in my vicinity.”
That’s really not even close to the optimistic scenario. It’s arguably not even the pessimistic scenario, if you’re not just in the “make stuff up” club.
We’re talking at most half a meter of rise by 2050, at most 2 meters by 2100, and at most 4 meters by 2150. The intermediate projection is a third of a meter by 2050. The optimistic projection (which we’re not going to hit) is 0.15 meters.
Climate change is real. The risk of famine is real. The risk of global conflict is real. The risk of stronger storms is real. However, “doomsday everybody dies” is not really in any serious projections. The worst case is “a lot of people in a lot of poor nations die and rich nations have more wars and more immigration.”
Yeah, this is either projection (because they didn’t see the mainstream media doing it) or an attempt to drive a wedge and create controversy where there shouldn’t be any.
I was going to defend “well, ray tracing is definitely a time saver for game developers because they don’t have to manually fake lighting anymore.” Then I remembered ray tracing really isn’t AI at all… So yeah, maybe for artists who don’t need to use textures as detailed, because the AI models can “figure out” what it presumably should look like with more detail.
I’ve been using FSR as a user on Hunt Showdown and I’ve been very impressed with it as a 2k -> 4k upscale… It really helps me get the most out of my monitors, and it’s approximately as convincing as the native 4k render (it’s not nearly as convincing from lower resolutions … but that’s kind of how these things go). I see the AI upscalers as a good way to fill in “fine detail” in a convincing enough way and do a bit better than traditional anti-aliasing.
I really don’t see this as being a developer time saver though, unless you just permit yourself to write less performant code … and then you’re just going to get complaints in the gaming space. Writing the “Electron” of gaming just doesn’t fly like it does with desktop apps.
For my grandfather… The issue wasn’t the shows, but he specifically wants a few news programs and will not under any circumstances go without them.
This was a problem for even going to Internet based streaming options because he just will not accept anything without those shows for more than a few months.
Meanwhile he also complains he doesn’t have enough to watch and says he can’t afford it (he can, he just doesn’t like what it costs)… But those dang news channels… and just his outlook on TV in general.
Sure, there’s a cost to breaking things up; all multiprocessing and multithreading comes at a cost. That said, in my evaluation, single-file “unity builds” are garbage; sometimes a few files are used to get some multiprocessing back (… as the GitHub repo you mentioned references).
They’re mostly a way to minimize the number of translation units so that you avoid the “I changed a central header that all my files include and now I need to rebuild the world” problem, where the world consists of many, many small translation units. (This is arguably worse on Windows, where process spawning is more expensive.)
Unity builds as a whole are very, very niche, and you’re almost always better off doing a more targeted analysis of where your build (or, often more importantly, incremental build) is expensive and making appropriate changes. Note that large C++ projects like LLVM, Chromium, etc. do NOT use unity builds (almost certainly because they are not more efficient in any sense).
I’m not even sure how they got started; presumably they were mostly a way to get LTO without LTO. They’re absolutely awful for incremental builds.
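For anyone who hasn’t seen one: a unity build is just a translation unit that #includes other source files, so the compiler sees them all at once. A minimal sketch (file names hypothetical, shown as one fragment rather than a buildable project):

```cpp
// unity.cpp — the ONLY file handed to the compiler.
// Each #include below pastes an entire .cpp into this one translation
// unit, so shared headers get parsed once and the optimizer sees
// everything together (the "LTO without LTO" effect). The flip side:
// editing either a.cpp or b.cpp forces recompiling all of it, which is
// why incremental builds suffer.
#include "a.cpp"
#include "b.cpp"
```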
Slow compared to what exactly…?
The worst part about headers is needing to reprocess the whole header from scratch … but precompiled headers largely solve that (or just using smaller more targeted header files).
Even in those cases there’s something to be said for the extreme parallelism in a C++ build. You give some of that up with modules for better code organization and in some cases it does help build times, but I’ve heard in others it hurts build times (a fair bit of that might just be inexperience with the feature/best practices and immature implementations, but alas).
There’s no precompiler in C++. There’s a preprocessor but that’s something entirely different. It’s also not a slow portion of the compile process typically.
C++ is getting to the point where modules might work well enough to do something useful with them, and they remove the need for #include preprocessor directives to share code.
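For context, a minimal C++20 modules sketch (module name and file names hypothetical; the two files are shown together for brevity but would be compiled separately):

```cpp
// math.ixx — module interface unit (the file extension varies by compiler)
export module math;
export int square(int x) { return x * x; }

// main.cpp — consumer: no header, no preprocessor #include involved
import math;
int main() { return square(3) == 9 ? 0 : 1; }
```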
A co-op Infamous game for PC I would drop full price on, no questions.
Full disclosure: I never played the sequels (I play co-op almost exclusively now), but I absolutely loved the original game. It would be great to be able to go back to that dystopian world and rediscover my powers (or maybe other variants of them that appeared from disasters) in say … London, DC, or LA (instead of NYC) with a friend.
It doesn’t help that they have said a lot of just straight up anti-consumer stuff in the last year.
It’s not a new problem; I remember back in 2016 looking up one MAGA supporter on Twitter who seemed EXTREMELY enthusiastic, via reverse image search.
I ended up finding a girl in Brazil who had presumably never set foot in “Nebraska,” where this MAGA supporter was allegedly “born and raised.”
I’ve never heard of using protobuf in an HTTP API… But, I guess that should be fine.
That’s when you use different exit codes: 1 for an error during the simulation (the run itself broke), 2 for the simulation failing (it ran, but reported failure).
Shame they wouldn’t listen.
What did you hate about it? I mean, CentOS was fine, other than IBM killing it.
That’s a laudable difference /s. Using Rust is also an “opt-in” option.