Your hands and wrists must not hurt yet. You’ll eventually come to see writing code as tedium.
Other way around, actually; C was one of several languages proposed to model UNIX without having to write assembly on every line, and it has steadily increased in abstraction. Today, C is specified relative to a high-level abstract machine and doesn’t really resemble the capabilities of any modern processor.
Incidentally, coming to understand this is precisely what the OP meme is about.
C’mon, I think you have better reading comprehension than that. He’s a professional data scientist specializing in machine learning. He went to grad school, then to big industry, then to startups, and is currently running a consultancy. He is very clearly not “on the side of the road.” He’s merely telling executives to fuck off with their AI grift.
I think they’re saying that e.g. you shouldn’t index a natural key unless you know that you’re going to search/collate by that key as a column. Telling the database that a certain column contains (a component of) the primary key is adding a restriction to that column.
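To make that concrete, here’s a minimal sketch in Python with sqlite3. The table and column names (`users`, `email`, `idx_users_email`) are my own invention for illustration: the natural key is stored as a plain column under a surrogate integer primary key, and an index is added only because we know we’ll look rows up by it.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Surrogate primary key: the natural key (email) carries no PRIMARY KEY
# restriction; it's just data in a column.
cur.execute("""
    CREATE TABLE users (
        id    INTEGER PRIMARY KEY,
        email TEXT NOT NULL,
        name  TEXT
    )
""")
cur.execute("INSERT INTO users (email, name) VALUES (?, ?)",
            ("alice@example.com", "Alice"))

# Add an index on the natural key only once we know queries will
# actually filter or collate on that column.
cur.execute("CREATE INDEX idx_users_email ON users (email)")

plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT name FROM users WHERE email = ?",
    ("alice@example.com",),
).fetchall()
print(plan)  # the plan mentions idx_users_email rather than a full table scan
```

If you had instead declared `email` as the primary key, you’d be committing to uniqueness, NOT NULL, and an index on that column forever, whether or not any query needs it.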
This shit is why I cannot recommend Truffle/Graal. Yes, it’s cool technology. Yes, it works well. Yes, I remember Chris Seaton. Yes, most of it is Free Software. However, Oracle is still the fucking lawnmower, and it’s not safe to build upon anything they can convince a judge they might own.
Alternatives include RPython (my preference) and also GNU Lightning.
Direct rendering infrastructure in Linux predates widespread use of “digital rights management” as a term of art by about two or three years. “We were here first,” as the saying goes. That said, the specific concept of direct rendering managers is a little newer, and probably was a mistake on its own merits, regardless of the name.
Oracle Ruined America’s Cup (Larry Ellison)
Yes, if that’s the only reason one is using fail2ban. Honestly, I won’t miss it.
PyPy exists. (This is news to around 95% of the community.)
Sarcasm needs to be humorous; you’re merely rattling off insults. Anyway, it’s pretty uncommon that somebody literally “can’t contribute code;” anybody who can learn how to use a computer and post juvenile horseshit to Lemmy can learn how to write code. I’m a former professional musician; writing code is my backup career, taking less practice and effort than playing the piano. I encourage you to try putting in some effort; in the time it takes to write around 500 comments/month on Lemmy, you could probably build a program that automates or simplifies some portion of your life.
And seriously, by doubling down on the idea that being Neanderthal is bad or deficient, you’re spouting some nasty rhetoric. It doesn’t matter whether you’re serious or not; eventually, you’ll forget that you were being ironic. “Those who play with the devil’s toys will be brought by degrees to wield his sword” and all that.
As a hardware hacker, I’ve experienced Apple’s anti-FLOSS behavior. I was there when Apple was trying to discourage iPodLinux. In contrast, when we wanted to upstream support for the Didj, LeapFrog gave us documentation and their kernel hackers joined our IRC channel. It’s the same reason that people prefer ATI/AMD to nVidia, literally anybody to Broadcom, etc.
Your “entire fucking point” is obvious from the top-level comment you replied to; you’ve taken offense to somebody pointing out that writing FLOSS on Apple hardware is oxymoronic. And it’s a bad point, given that such a FLOSS hacker is going to use Homebrew or Nix in order to get a decent userland that hasn’t been nerfed repeatedly by an owner with a GPLv3 allergy and a fetish for controlling filesystem layouts. Darwin is a weird exception, not one of the easy-to-handle BSDs.
Also, what, are you not anti-Apple? Do you really think that a fashion company is going to reward you for being fake-angry on Lemmy?
Steps 8, 9, and 11 assume that the filesystem is in a Linux-compatible state and Windows-compatible state simultaneously.
You’re literally posting from the SDF’s instance. If you’re not going to support FLOSS, then consider migrating to a server which reflects your beliefs. (Also, go take an anthropology course so that you don’t embarrass yourself by dehumanizing people online.)
Mattermost is the most obvious option; it’s a clone of Slack. IRC is another good option, although I know a lot of people hate it because they prefer features to freedom. I cannot recommend Matrix; the UX is fine but the cryptography has a few issues, as documented by Soatok here.
The biggest barrier is writing lots of formatted data to disk without a pre-existing filesystem structure. Look at nixos-anywhere for an example; the first thing it does is ensure that it’s booted into Linux, because otherwise it can’t trust that the disks are laid out properly.
You’re cheering for exploitation of a commons.
And for anybody thinking of implementing M-expressions, look at Wolfram Mathematica, which is the only popular M-expression-oriented language. It turns out that high-level graph-rewriting semantics are difficult to make efficient! (If you want to try, you might also want to look at GHC or other efficient graph-rewriters to see what the state of the art is like outside Planet Wolfram.)
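To see why naive rewriting is slow, here’s a toy term rewriter over nested tuples, a sketch of my own (the names `rewrite_once`, `normalize`, and `fold` are mine, not anything from Mathematica). The engine rescans the whole expression after every single rewrite, so normalization is quadratic or worse in the term size; avoiding exactly that rescanning is where engines like GHC’s earn their keep.

```python
import math

# Terms are nested tuples with a head symbol, e.g. ("Plus", 1, 2).

def rewrite_once(expr, rules):
    """Apply the first matching rule anywhere in expr (innermost-first)."""
    if isinstance(expr, tuple):
        head, *args = expr
        new_args = []
        changed = False
        for a in args:
            if not changed:
                new_a, did = rewrite_once(a, rules)
                changed = changed or did
                new_args.append(new_a)
            else:
                new_args.append(a)
        expr = (head, *new_args)
        if changed:
            return expr, True
        for rule in rules:
            out = rule(expr)
            if out is not None:
                return out, True
    return expr, False

def normalize(expr, rules, max_steps=1000):
    """Rewrite to a fixed point, restarting the scan after each step."""
    for _ in range(max_steps):
        expr, changed = rewrite_once(expr, rules)
        if not changed:
            return expr
    raise RuntimeError("no fixed point reached")

def fold(expr):
    """Rule: constant-fold Plus/Times whose arguments are all integers."""
    head, *args = expr
    if head in ("Plus", "Times") and all(isinstance(a, int) for a in args):
        return sum(args) if head == "Plus" else math.prod(args)
    return None

e = ("Plus", ("Times", 2, 3), ("Plus", 1, ("Times", 2, 2)))
print(normalize(e, [fold]))  # → 11
```

Each call to `normalize` walks the full tree from the root again after every rewrite; a production rewriter instead tracks which redexes a rewrite could have created and revisits only those.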
Ah, no worries. There should be an introduction-to-literature course in your native language, covering the classics and important works of your native culture. I still stand by the rest of the recommendations. By “bachelor of arts” and “bachelor of sciences” I mean how your college/university accredits degrees; computer science and engineering are usually “science” degrees but many universities have an alternative “art” version that you can choose.
The article is hilariously ill-researched, to the point where it might well be a marketing post. GCC used to have Java support; it was discontinued due to a lack of interest and usage. There is no fundamental barrier to implementing AOT Java. Java’s designers were intentionally making a statically-typed first-order Smalltalk that could be efficiently JIT’d; the language is designed for JIT instead of AOT.
It has nothing to do with knowing the language and everything to do with what’s outside of the language. C hasn’t resembled CPUs for decades and can’t be reasonably retrofitted for safety.