![](https://programming.dev/pictrs/image/170721ad-9010-470f-a4a4-ead95f51f13b.png)
Your hands and wrists must not hurt yet. You’ll eventually come to see writing code as tedium.
Other way around, actually; C was one of several languages proposed for implementing UNIX without having to write assembly on every line, and it has steadily increased in abstraction since. Today, C is specified relative to a high-level abstract machine and doesn’t really resemble the capabilities of any modern processor.
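A minimal sketch of that divergence, assuming gcc or clang on ordinary two’s-complement hardware: the CPU would happily wrap a signed add, but the abstract machine calls signed overflow undefined, so the optimizer is allowed to delete the check entirely.

```c
#include <limits.h>
#include <stdio.h>

/* In the abstract machine, signed overflow is undefined, so the
 * compiler may assume x + 1 never wraps and fold this whole test
 * to "return 0" -- even though the hardware add would simply wrap. */
int will_overflow(int x) {
    return x + 1 < x;
}

int main(void) {
    /* At -O0 this often prints 1 (the add wrapped); at -O2, gcc and
     * clang typically print 0, because the check was optimized out. */
    printf("%d\n", will_overflow(INT_MAX));
    return 0;
}
```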
Incidentally, coming to understand this is precisely what the OP meme is about.
C’mon, I think you have better reading comprehension than that. He’s a professional data scientist specializing in machine learning. He went to grad school, then to big industry, then to startups, and is currently running a consultancy. He is very clearly not “on the side of the road.” He’s merely telling executives to fuck off with their AI grift.
I think they’re saying that e.g. you shouldn’t index a natural key unless you know that you’re going to search/collate by that key as a column. Telling the database that a certain column contains (a component of) the primary key adds a constraint to that column.
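A minimal sketch of the tradeoff using SQLite’s C API; the `users` tables and `email` column are hypothetical. Declaring the natural key `UNIQUE` builds an index and a constraint that every insert must pay for, which only earns its keep if you actually look rows up by that column.

```c
#include <stdio.h>
#include <sqlite3.h>

int main(void) {
    sqlite3 *db;
    char *err = NULL;
    if (sqlite3_open(":memory:", &db) != SQLITE_OK) return 1;

    /* "email" is a natural key.  UNIQUE builds an index and checks
     * the constraint on every insert -- a cost worth paying only if
     * we actually search by email. */
    const char *with_constraint =
        "CREATE TABLE users ("
        "  id    INTEGER PRIMARY KEY,"   /* surrogate key          */
        "  email TEXT UNIQUE NOT NULL"   /* natural key, indexed   */
        ");";

    /* If we never query by email, the plain column avoids the
     * per-insert index maintenance and constraint check. */
    const char *without_constraint =
        "CREATE TABLE users_plain ("
        "  id    INTEGER PRIMARY KEY,"
        "  email TEXT NOT NULL"
        ");";

    if (sqlite3_exec(db, with_constraint, NULL, NULL, &err) != SQLITE_OK ||
        sqlite3_exec(db, without_constraint, NULL, NULL, &err) != SQLITE_OK) {
        fprintf(stderr, "schema error: %s\n", err);
        sqlite3_free(err);
    }
    sqlite3_close(db);
    return 0;
}
```

Compile with `cc sketch.c -lsqlite3`; which schema is right depends entirely on the query workload.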
This shit is why I cannot recommend Truffle/Graal. Yes, it’s cool technology. Yes, it works well. Yes, I remember Chris Seaton. Yes, most of it is Free Software. However, Oracle is still the fucking lawnmower, and it’s not safe to build upon anything they can convince a judge they might own.
Alternatives include RPython (my preference) and GNU Lightning.
Direct rendering infrastructure in Linux predates widespread use of “digital rights management” as a term of art by about two or three years. “We were here first,” as the saying goes. That said, the specific concept of direct rendering managers is a little newer, and probably was a mistake on its own merits, regardless of the name.
Oracle Ruined America’s Cup (Larry Ellison)
Yes, if that’s the only reason one is using fail2ban. Honestly, I won’t miss it.
Well put. And this is a generic pattern; for example, GPUs are only faster than CPUs if the cost of preparing the GPU and retrieving the result is lower than the cost of evaluating the algorithm directly on the CPU. This also applies to main memory! Anything outside of the CPU can incur a latency/throughput/scaling tradeoff.
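A back-of-envelope sketch of that break-even point; every constant below is a made-up illustrative figure, not a measurement.

```c
/* Break-even model for offloading work to a GPU: the device only
 * wins when fixed transfer overhead plus device compute beats pure
 * host compute.  All numbers are illustrative. */
#include <stdio.h>

int main(void) {
    double n = 1e6;                 /* elements to process           */
    double host_ns_per_elem = 2.0;  /* CPU cost per element (ns)     */
    double dev_ns_per_elem  = 0.1;  /* GPU cost per element (ns)     */
    double transfer_ns      = 5e5;  /* fixed cost to ship data over  */
                                    /* the bus and fetch the result  */

    double cpu_time = n * host_ns_per_elem;
    double gpu_time = transfer_ns + n * dev_ns_per_elem;

    printf("CPU: %.0f ns, GPU: %.0f ns -> %s wins\n",
           cpu_time, gpu_time, gpu_time < cpu_time ? "GPU" : "CPU");
    return 0;
}
```

Shrink `n` and the fixed transfer cost dominates, so the CPU wins; the same inequality governs any trip outside the CPU, main memory included.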
There is no evidence that any human understands computers.
Define your terms before relying on platitudes. Mutability isn’t cleaner if we want composition, particularly in the face of concurrency. Being idiomatic isn’t good or bad, but patterned; not all patterns are universally desirable. The only one which stands up to scrutiny is efficiency, which leads to the cult of performance-at-all-costs if one is not thoughtful.
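A minimal sketch of the concurrency half of that point, assuming POSIX threads: two individually-correct loops stop being correct the moment they share mutable state.

```c
/* The unsynchronized increments below race, and the program
 * typically prints less than 2000000: shared mutable state does
 * not compose under concurrency. */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;            /* shared mutable state */

static void *bump(void *arg) {
    (void)arg;
    for (int i = 0; i < 1000000; i++)
        counter++;                  /* read-modify-write, not atomic */
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, bump, NULL);
    pthread_create(&b, NULL, bump, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("counter = %ld\n", counter);  /* expected 2000000 */
    return 0;
}
```

Compile with `cc -pthread race.c`; the lost updates are the composition failure in miniature.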
It has nothing to do with knowing the language and everything to do with what’s outside of the language. C hasn’t resembled CPUs for decades and can’t be reasonably retrofitted for safety.
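To make the second claim concrete, a minimal sketch; nothing about it is exotic, which is exactly the problem: the compiler accepts it, the type system has no objection, and the read is undefined behavior.

```c
/* Well-formed C per the type system, undefined at runtime.  The
 * pointer arithmetic below looks just like the legal kind, which is
 * why bolting bounds checks onto the language after the fact is so
 * hard: a safe retrofit would have to reject oceans of existing
 * code shaped exactly like this. */
#include <stdio.h>

int main(void) {
    int buf[4] = {1, 2, 3, 4};
    int *p = buf;
    printf("%d\n", p[4]);  /* out of bounds: reads past the array */
    return 0;
}
```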