For those veteran Linux people, what was it like back in the 90s? I saw and heard of Unix systems being available, but I didn't see much in use apart from old versions of Debian.
Were they prominent in education, like universities? Was it mainly a hobbyist thing at the time compared to the business needs of Windows 95/98 and classic Mac OS?
I ask because I found out that some PC games I owned were apparently also released for Linux, even on CD, by a firm named Loki.
Slackware and Red Hat were the two main distros in use in the mid-90s.
My local city used proper UNIX, and my university had IRIX workstations and SunOS servers. At my ISP we used Linux to handle the modem pools and the web/mail/news servers. In the early 2000s we had Linux labs and Linux clusters to work on.
Linux on the desktop was a bit painful. There were no loadable modules, and RAM was scarce, so you'd roll your own kernel with just the drivers you needed. XFree86 was tricky to configure: you had to supply the timings for your CRT monitor, and if you got them wrong you could break the monitor.
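From memory, and hedging a bit since this is decades old, the kernel ritual for a 2.0-era tree went something like this (paths assume i386 and LILO):

    make config                       # answer y/n for every single driver
    make dep && make zImage           # 'make dep' was mandatory before 2.6
    cp arch/i386/boot/zImage /boot/vmlinuz
    lilo                              # re-run the bootloader or it won't boot

And the XFree86 side meant hand-writing modelines in XF86Config. The numbers below are the standard VESA timings for 1024x768 at 75 Hz, shown just to illustrate the format; with a random 90s monitor you were often guessing:

    # Modeline "name"   clock(MHz)  hdisp hsyncstart hsyncend htotal  vdisp vsyncstart vsyncend vtotal
    Modeline "1024x768" 78.75       1024  1040       1136     1312    768   769        772      800  +hsync +vsync

Nine numbers plus flags per mode, with the monitor's tolerance range buried in a manual you may or may not still have had.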
I used FVWM2 and Enlightenment for many years. I miss Enlightenment.
You mean your graphics drivers, right? Not your actual hardware…
No. The wrong timing parameters could definitely break your hardware.
Me too! Has E17 come out yet? 😆
E16 was better
That was the last time I contributed; I created an LCARS port and now there are hundreds of them everywhere.
LCARS interface… that is something I haven’t seen in a loooooooong time
Enlightenment is on version 26
Guess you missed the joke that it was 13 years between E16 and E17 🙂
SGI workstations had the best GUI. That shit looked straight out of Hollywood
I used Enlightenment on Arch Linux for a year, in 2020-21. The PC had 4 GB of RAM and an HDD, and Enlightenment was blazing fast. I could type enlightenment_start at a tty and reach a Wayland desktop in under a second, with 250 MB of RAM used in total. E is still alive and kicking.
How wrong did you have to be to break your monitor? Because I’m positive I got it very wrong a whole lot of times and never managed that.
By the late 90s, most monitors were smart enough to detect when the sync rate was too far off and refuse to try to display an image.
It was the old monitors that only supported a single scan rate, or a fixed set of them, that you had to worry about damaging. Some could be very picky and others were more tolerant.
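To put rough, illustrative numbers on it (not from any specific monitor): the rates the monitor sees fall straight out of the modeline arithmetic.

    horizontal sync  = dot clock / horizontal total     e.g. 78.75 MHz / 1312 ≈ 60.0 kHz
    vertical refresh = horizontal sync / vertical total e.g. 60.0 kHz / 800   ≈ 75.0 Hz

Guess a made-up 110 MHz clock over that same 1312-pixel line and you're demanding roughly 84 kHz horizontal, way outside what a fixed-frequency tube was built to sync, and the horizontal deflection circuitry took the abuse.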
Thank goodness I had a newer monitor then, because I would definitely have toasted several.
I managed to make mine produce some very worrying noises, but none of my monitors broke either, even though the bandwidth I based my calculations on was often kinda made up.