I don’t mean BETTER. That’s a different conversation. I mean cooler.
An old CRT display was literally a small-scale particle accelerator, firing angry electron beams at a sizeable fraction of light speed towards the viewer, steered by electromagnets alternating at tens of kilohertz, stopped by a rounded rectangle of glowing phosphors.
If a CRT goes bad it can actually make people sick.
That’s just. Conceptually a lot COOLER than a modern LED panel, which really is just a bajillion very tiny lightbulbs.
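Not quite light speed, but fast enough that you need relativity to do the math. A back-of-the-envelope sketch (the ~25 kV anode voltage is an assumption, typical for a color CRT, not something from this thread) puts the beam around 0.3 c:

```python
import math

# Relativistic speed of an electron accelerated through a CRT's anode voltage.
# 25 kV is an assumed, typical figure for a color CRT.
ELECTRON_REST_ENERGY_EV = 510_998.95  # electron rest-mass energy in eV

def beam_speed_fraction(anode_volts: float) -> float:
    """Electron speed as a fraction of c after accelerating through anode_volts."""
    gamma = 1.0 + anode_volts / ELECTRON_REST_ENERGY_EV  # Lorentz factor from kinetic energy
    return math.sqrt(1.0 - 1.0 / gamma**2)

print(f"{beam_speed_fraction(25_000):.2f} c")  # prints "0.30 c"
```

So about 30% of light speed, which is still a genuinely relativistic particle beam pointed at your face.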
The protocol of communication of computer parts is open source? Since when?
What the fuck is USB? And why is that proprietary?
Regardless, AMD and Nvidia hardware might work together, but not optimally these days.
Since forever. Which protocol do you think isn't? For a couple of examples, here are PCI and DDR5.
USB is a standardized connector with, again, an open protocol. Here's the specification in case you're interested: https://www.usb.org/document-library/usb-20-specification
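That spec is concrete enough that you can parse its data structures in a few lines of code. A sketch (the 18-byte device descriptor layout is straight from the USB 2.0 spec, Table 9-8; the vendor/product IDs below are invented for illustration):

```python
import struct

# Standard USB device descriptor: 18 bytes, little-endian, as defined in
# Table 9-8 of the USB 2.0 specification.
DEVICE_DESCRIPTOR = struct.Struct("<BBHBBBBHHHBBBB")

def parse_device_descriptor(raw: bytes) -> dict:
    (bLength, bDescriptorType, bcdUSB, bDeviceClass, bDeviceSubClass,
     bDeviceProtocol, bMaxPacketSize0, idVendor, idProduct, bcdDevice,
     iManufacturer, iProduct, iSerialNumber, bNumConfigurations) = \
        DEVICE_DESCRIPTOR.unpack(raw)
    return {
        "usb_version": f"{bcdUSB >> 8}.{(bcdUSB >> 4) & 0xF}",  # BCD: 0x0200 -> "2.0"
        "vendor_id": idVendor,
        "product_id": idProduct,
        "max_packet_size_ep0": bMaxPacketSize0,
        "num_configurations": bNumConfigurations,
    }

# A fabricated example descriptor: USB 2.0, vendor 0x1234, product 0x5678.
raw = bytes([18, 1, 0x00, 0x02, 0, 0, 0, 64,
             0x34, 0x12, 0x78, 0x56, 0x00, 0x01, 1, 2, 3, 1])
print(parse_device_descriptor(raw))
```

Every USB device on the planet answers a GET_DESCRIPTOR request with exactly this layout, which is how any OS can enumerate any vendor's hardware.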
I would need a source for that, I’ve had AMD +Nvidia up until very recently and it worked as expected.
USB is absolutely not a standardized connector, otherwise it would only be one type of connector, not the dozen or so they’ve made over the decades. There’s nothing universal about it.
And if it was open source, then why doesn’t VirtualBox release the source code for their USB extension package?
USB is absolutely standardized, I even sent you the 2.0 spec, you can get the spec for the other versions on the same website.
Different versions/connectors have different specs, all of them open, otherwise different manufacturers wouldn’t be able to create devices that use it.
That's ridiculous. First of all, the name refers to the fact that it can be used for any data transfer as long as it's serial. Secondly, the sheer number of different devices from different manufacturers that can be plugged in via USB should give you a hint of just how universal and open the standard is.
The standard is open, implementations of it are not, it’s like OpenGL or Vulkan.
USB 1.0, 1.1, 2.0, 3.0, 3.1, A, B, C connectors, large and small.
Not even counting the various charging rates and voltages…
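Even the charging rates aren't a free-for-all, though: USB Power Delivery negotiates from a fixed menu of voltages. A rough sketch (the voltage/current pairings below are the standard fixed profiles from the pre-EPR PD spec, as I understand it, not an exhaustive list):

```python
# Standard USB Power Delivery fixed voltage levels (pre-EPR) with
# typical maximum currents; assumed from the PD spec, not exhaustive.
PD_FIXED_PROFILES = {  # volts -> max amps
    5: 3.0,
    9: 3.0,
    15: 3.0,
    20: 5.0,  # 5 A requires an e-marked cable
}

def max_watts(volts: int) -> float:
    """Maximum power deliverable at a given PD fixed voltage."""
    return volts * PD_FIXED_PROFILES[volts]

for v in sorted(PD_FIXED_PROFILES):
    print(f"{v:>2} V -> {max_watts(v):.0f} W")
# tops out at 20 V x 5 A = 100 W
```

The charger and device negotiate which profile to use over the same standardized protocol, which is why one brick can fast-charge phones and laptops from different manufacturers.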
And yet most of the time in the past two years the best choice for a gaming PC would be a 3D V-Cache Ryzen with an Nvidia GPU. Is there something particular you have in mind that supposedly doesn't work with an AMD chipset and an Nvidia GPU?
PCI-Express is not an open standard, but both AMD and Nvidia are members of the PCI-SIG, and it's what both use for their GPUs and what AMD (as well as Intel) uses for its chipsets. It's certainly not a secret cabal.
It’s all in the same family, literally…
https://www.cnn.com/2023/11/05/tech/nvidia-amd-ceos-taiwan-intl-hnk/index.html
This supports your claim of AMD vs Nvidia not working optimally together how?