

Are you actually arguing that the US should take Greenland? What in the actual fuck?


What are you talking about? What citations?


https://en.wikipedia.org/wiki/Algebraic_data_type
Some reading material for you. Sum types allow for proper, compiler-enforced error handling and optionality, rather than the unprincipled free-for-all that is exceptions and nullability.
Tony Hoare, the person who originally introduced null references to the programming world, is oft-quoted calling them his “billion-dollar mistake”. Here’s the talk: https://www.infoq.com/presentations/Null-References-The-Billion-Dollar-Mistake-Tony-Hoare/.
Nulls are absolutely pervasive in Java and NPEs are not avoidable. At minimum, most of the ecosystem uses nulls, so almost any library will have nulls as part of its interface. Null is an inhabitant of every reference type in Java (even Optional, ironically). You cannot escape it. It’s a fundamental flaw in the design of the language.
Btw, you can’t fully escape it in TypeScript either, due to the unsoundness of the type system and the fact that many library type definitions are bolted on to the original JS implementation and may be inaccurate. Still, it’s a lot less likely than in Java.
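To make the sum type point concrete, here’s a minimal sketch in modern Java (21+). All the names are made up purely for illustration: a sealed interface plays the role of a sum type, and the exhaustive switch is what you get instead of hoping everyone remembered their null checks. The last couple of lines are the Optional irony I mentioned.

```java
// Minimal sketch of a sum type in Java 21+ via a sealed interface.
// All names here are invented for illustration.
sealed interface LookupResult {
    record Found(String value) implements LookupResult {}
    record NotFound(String reason) implements LookupResult {}
}

class Demo {
    static String describe(LookupResult r) {
        // Exhaustive switch: leave out a case and it's a compile error,
        // unlike a forgotten null check.
        return switch (r) {
            case LookupResult.Found f -> "found: " + f.value();
            case LookupResult.NotFound n -> "missing: " + n.reason();
        };
    }

    public static void main(String[] args) {
        System.out.println(describe(new LookupResult.Found("hello")));

        // The Optional irony: Optional is just another reference type,
        // so null still inhabits it and this compiles without complaint.
        java.util.Optional<String> oops = null;
        System.out.println(oops == null); // prints true
    }
}
```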
Why are you talking about functional programming? Python sure as hell isn’t FP.
Setting aside the fact that that is not even remotely true, do you think Linux = Red Hat? What about almost every other distro being run by volunteers?
I’ve only ever seen Red Hat used by government and some corporations. As far as the broader community goes (especially the FOSS community), they’re a pretty minor player.
It’s honestly insane that you can sit there and shill for Microsoft these days. They’ve always been pretty evil, but now they’ve gone so far off the deep end they’re even driving away people who have been all-in on Microsoft their whole lives. Even non-tech people are simply getting fed up with all of the spying and intrusive, AI-infested bullshit. Linux market share has been steadily increasing over the last couple of years, and it doesn’t look like it’s slowing down anytime soon. And all of it is, ultimately, because Windows is forcing people away.
Eh, git is never really that fucked. If you understand how it works (and know that the reflog keeps a trail of where your branches have been), it’s generally not hard to get back to a state you want (assuming everything has been committed at some point, ofc).
I would much rather people try to spend some time trying to understand and solve a problem first. I had a “senior” engineer who would message me literally every morning about whatever issue he was facing and it drove me absolutely nuts. Couldn’t do anything for himself. Unsurprisingly, he was recently laid off.
My time should be respected.
cat file.txt | grep foo is unnecessary and a bit less efficient, because you can do grep foo file.txt instead. More generally, piping from cat spawns an extra process and copies the data through a pipe, so it’s less efficient than redirecting the file into stdin with <, like grep foo < file.txt.
I mean, among people that use terminals, it’s very normal. Commonplace, even.
Because that’s a perfectly normal and reasonable thing to do?


Wtf are you talking about? It doesn’t have a fucked up name origin at all. It was named “master” as in “master recording”, like in music production. Proof: https://x.com/xpasky/status/1271477451756056577.
Master/slave concepts were never a thing in git. The whole renaming thing was really fucking stupid. Caused plenty of breakage of scripts and tools for absolutely no good reason whatsoever.


It’s great for non-HTML markup, like https://hyperview.org/.
A lot of the hate is undeserved. It has had awful paradigms built around it (like SOAP), but that doesn’t make XML inherently bad by any means.


Yup. It’s insanity that this is not immediately obvious to every software engineer. I think we have some implicit tendency to assume we can make any tool work for us, no matter how bad.
Sometimes, the tool is simply bad and not worth using.


Admittedly I’m not sure if it works for Japanese, but for English there are online tools that let you print out a sheet, write out every character, and scan it to turn into a font file. I’d be surprised if the same didn’t exist for Japanese.
So ultimately you probably just need someone with neat handwriting.
Naturally, vim is still acceptable.


Except it’s not seamless, and never has been. ORMs of all kinds routinely end up with N+1 queries littered all over the place, and developers using ORMs do not understand the queries being performed or what the optimal indexing strategy is. And even if they did know what the performance issue was, they can’t even fix it!
Beyond that, because of the fundamental mismatch between the relational model and the data model of application programming languages, you necessarily induce a lot of unneeded complexity with the ORM trying to overcome this impedance mismatch.
A much better way is to simply write SQL queries (with parameterized inputs, ofc), and for each query you write, deserialize the result into whatever data type you want to use in the programming language. It is not difficult, and it greatly reduces complexity by letting you write queries suited to the task at hand. But developers seemingly want to do everything in their power to avoid properly learning SQL, resulting in a huge mess as the abstractions of the ORM inevitably fall apart.
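To be concrete about what I mean, here’s a rough sketch with plain JDBC (every table and column name is invented for the example): one parameterized query written for the task, with the rows deserialized straight into a record. The ORM lazy-loading version of this tends to be one query for the orders plus N more for the customers and items, i.e. the classic N+1.

```java
import java.sql.*;
import java.util.*;

// Plain record to deserialize rows into (names are made up for the sketch).
record OrderSummary(long orderId, String customerName, int itemCount) {}

class OrderQueries {
    // One query suited to the task: a single join + aggregate instead of the
    // 1 + N queries an ORM's lazy loading tends to generate for the same data.
    static List<OrderSummary> summariesSince(Connection conn, Timestamp since) throws SQLException {
        String sql = """
            SELECT o.id, c.name, COUNT(i.id) AS item_count
            FROM orders o
            JOIN customers c   ON c.id = o.customer_id
            JOIN order_items i ON i.order_id = o.id
            WHERE o.created_at >= ?
            GROUP BY o.id, c.name
            """;
        List<OrderSummary> out = new ArrayList<>();
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setTimestamp(1, since); // parameterized, never string-concatenated
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    out.add(new OrderSummary(
                        rs.getLong("id"),
                        rs.getString("name"),
                        rs.getInt("item_count")));
                }
            }
        }
        return out;
    }
}
```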


Access modifiers are definitely something I despise about OOP languages, though I understand that OOP’s nature makes them necessary.


The encryption thing is definitely weird/crazy and storing the SQL in XML is kinda janky, but sending SQL to a DB server is literally how all SQL implementations work (well, except for SQLite, heh).
ORMs are straight trash and shouldn’t be used. Developers should write SQL or something equivalent and learn how to properly use databases. eDSLs in a programming language are fine as long as you still have complete control over the queries and all queries are expressible. ORMs are how you get shit performance and developers who don’t have the first clue how databases work (because of leaky/bad abstractions trying to pretend that databases don’t require a fundamentally different way of thinking from application programming).
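For what I mean by an acceptable eDSL, here’s a deliberately tiny, completely hypothetical sketch (this Query class is made up, not any real library): every call maps to a visible chunk of SQL, parameters are collected for binding, and you can always print exactly what will hit the database. Nothing gets generated behind your back.

```java
import java.util.*;

// Hypothetical, deliberately minimal query "eDSL": just a thin typed wrapper
// over the SQL you would have written by hand anyway.
class Query {
    private final StringBuilder sql = new StringBuilder();
    private final List<Object> params = new ArrayList<>();

    Query select(String cols)            { sql.append("SELECT ").append(cols); return this; }
    Query from(String table)             { sql.append(" FROM ").append(table); return this; }
    Query where(String cond, Object arg) { sql.append(" WHERE ").append(cond); params.add(arg); return this; }

    String sql()          { return sql.toString(); }
    List<Object> params() { return params; }
}

class Example {
    public static void main(String[] args) {
        Query q = new Query()
            .select("id, email")
            .from("users")
            .where("created_at >= ?", java.time.LocalDate.of(2024, 1, 1));

        System.out.println(q.sql());    // SELECT id, email FROM users WHERE created_at >= ?
        System.out.println(q.params()); // [2024-01-01]
    }
}
```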
I think the point is not that it’s a MacBook, but that the senior is using a single laptop instead of a full multi-monitor setup.
Personally as a senior, I use 4 monitors. My eyes are too shit to stare at a tiny laptop screen all day, and I want slack/browser/terminal windows on their own screens. It’s much more comfortable as well.
Not sure what you’re trying to say. Are you saying this is a good thing?