Dodoid wrote:Hey, that means we have a new unsuspecting employee to drown in emails about open sourcing IRIX.

I would certainly hope someone will take advantage of the opportunity.

I do agree modern computing is rather boring. You don't see groundbreaking innovations anymore. It seems the days of expensive, exotic systems with groundbreaking, mind-blowing performance have come to an end.

Technological progress has largely stagnated, and it's a terrible shame.

Krokodil wrote:I do agree modern computing is rather boring. You don't see groundbreaking innovations anymore. It seems the days of expensive, exotic systems with groundbreaking, mind-blowing performance have come to an end.

Technological progress has largely stagnated, and it's a terrible shame.

Krokodil wrote:[...] I do agree modern computing is rather boring. You don't see groundbreaking innovations anymore. It seems the days of expensive, exotic systems with groundbreaking, mind-blowing performance have come to an end [...]

I agree with what you have observed, but I think the homogenization of computing (converging to x86 / Windows / Linux) is the sign of a mature technology. That isn't to say that x86 (or, say, Linux) was the best of what was technologically available -- far from it -- but a combination of luck and market forces caused things to settle out. Generally economics will favor the lowest cost tool that gets the job done most of the time.

When technologies are in their infancy you see a much wider ecosystem of competing products and unique offerings, but when they mature, things homogenize and are treated as a foundation. The details are shoved under the hood and people move on to the next level of abstraction. All the excitement of early electrical delivery and the "War of the Currents" has ended, and we now just expect to get our watts at 120V AC and 60Hz (at least in this country). Or in telecom, everything from Strowger and crossbar switches to TDM circuits and ATM has given way to boring old IP transit and Metro Ethernet.

It is hard to find good examples, because the pace of innovation for computers has been so quick. I don't think there has been anything similar in recorded history where humans were able to witness the birth and maturity of a world changing technology (and all the upheaval in-between) within a single generation. But the gist of it is that units of computing power are now as generic as watts from the wall. And there are, of course, the equivalent "power companies" of computing: Amazon Web Services, Google Cloud, Microsoft Azure, etc. Unless you are an engineer at Intel or ARM there probably isn't a lot to get excited about with hardware any more. Computing has become a commodity.

I think the same is true of the operating system; between configuration management tools (Chef, Puppet), containerization (Docker, J2EE), and VM-based or interpreted languages (JVM, .NET CLR, Python, JavaScript), there isn't really much reason to get excited about, or to interact with, the OS directly anymore in a modern context. With the assumption of a Windows or Linux environment, the OS has been turned into yet another generic layer in the stack (like a machine's BIOS).

With the base layers "solved" to some extent, I think the innovation has moved into the software domain. I have been very impressed with the developments in image recognition. As another example, AWS has made massively parallel computing accessible to everyone, and there is still a long way to go in taking advantage of that, with a lot of the foundational pieces still missing. Horizontally scalable databases (Cassandra, Voldemort, DynamoDB, etc.) are still in their infancy. I don't think people are very good at programming for truly horizontal/parallelized environments, but in any case, I think software is where the excitement is these days. The focus is no longer on how the watts are generated (which I say knowing that this still fascinates me, as well as most others here), but rather on what they are used for.
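To make the "horizontally parallel" point concrete, here is a toy Python sketch (all names and numbers are mine, not taken from any particular product) of the shard/map/reduce shape that horizontally scalable systems rely on: split the data into independent shards, process each shard with no shared state, then merge the partial results. Threads stand in for the separate machines a real cluster would use.

```python
# A toy sketch of the shard/map/reduce shape behind horizontally
# scalable systems: split the data into independent shards, process
# each shard with no shared state, then merge the partial results.
from concurrent.futures import ThreadPoolExecutor

def shard(data, n):
    """Split data into at most n roughly equal, independent shards."""
    size = (len(data) + n - 1) // n
    return [data[i:i + size] for i in range(0, len(data), size)]

def map_shard(nums):
    """Per-shard work: depends only on this shard's data."""
    return sum(x * x for x in nums)

def parallel_sum_of_squares(data, workers=4):
    """Fan the shards out to workers, then reduce the partial sums."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(map_shard, shard(data, workers)))

print(parallel_sum_of_squares(list(range(10))))  # 285
```

The hard part the post alludes to is exactly what this toy hides: real workloads have shared state, and the moment shards must coordinate, the clean map/reduce shape falls apart.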

Oh well, at least we have our beloved machines to enjoy and remember the golden age.

Krokodil wrote:I do agree modern computing is rather boring. You don't see groundbreaking innovations anymore. It seems the days of expensive, exotic systems with groundbreaking, mind-blowing performance have come to an end.

Technological progress has largely stagnated, and it's a terrible shame.

I don't think that's the problem. We've taken an important step backwards, but it's in the area of standards. A few years ago, everybody (except Microsoft) was pushing hard for standards: OpenGL and OpenCL, just to name two. Today everybody is pushing their own proprietary APIs: NVIDIA trying to drop OpenCL in favour of its proprietary CUDA API, and Apple pushing Metal while ignoring Vulkan and even introducing a "new" language.

So I strongly disagree that we are at a mature point in technology. Quite the contrary: we're at the most immature point in decades.

I disagree with the whole race for lower power consumption, and I actually find it rather un-ecological.

We are all witnesses that capable hardware is being tossed out and marked as obsolete for no practical reason. Whilst I agree that lower power consumption sometimes makes a lot of difference, nowadays we are deploying more and more renewable power sources (solar, wind), while a large share of the rare-earth metals that go into making computers is not recoverable.

We live in a time when it's okay to throw away usable, working things.

My IntelliStation 285 can hold up to 32GB of RAM; my G5 Quad can hold up to 16GB. Nowadays computers are only SOMETIMES sold with 16GB of RAM, even the so-called "workstations".

And for the few folks with a REAL need for more than 32GB of RAM: yes, go ahead, be my guest, go for a newer machine. (I'm not talking about consolidated virtualised environments, with all their advantages.)

We won't always be able to produce or earn more with a faster computer. You can't write your book faster. A photographer won't be editing that many more photos at the same time. A database won't necessarily get much larger just because a faster, more capable computer is available. And yet we generate a LOT of trash.

Software is bloated. A web page with a 500-word news article used to eat 2-3MB of RAM, with large pictures loaded separately. Nowadays we see the same page loading with 300MB. An operating system boots up eating 1000x more memory, and doesn't offer 1000x more functionality. What does Windows 10 offer that Windows 95 doesn't? Bluetooth? Wifi? IPv6? DirectX/OpenGL/whatever? An indexed file system and fast searches? Cortana? Does it really need to eat that much more memory and that many more system resources?

Developers deal with so many abstraction layers when coding that they are totally oblivious to what happens underneath, and everything is irritatingly slow. My G5 Quad with Debian reacts instantly to almost anything I throw at it (besides the Web). If it is stuck processing something (like a video or a picture), I can do something else in the meantime with real-time responsiveness.

My work Windows 8 computer, with all the Symantec security bloat, software to block USB storage, software to do incremental cloud backups (which renders the computer unusable while it runs), and all the other crap, is unresponsive to clicks and interaction (even though it may encode videos faster). On Windows 10, I hate when Microsoft sends me a popup asking me to try Edge, or whether I remembered to configure my wireless network - I DON'T WANT EDGE, AND I'M ON A WIRED NETWORK - please don't waste processor cycles and RAM on this crap! And please stop switching Bluetooth on all the time. I don't use it, and it eats battery!

This is getting long; no one is going to read this far. Computing is frustrating nowadays, and I'm not even that old. I only use my newer computers when I have to (for work, sometimes), or to browse the Internet. All my music, my photography, my videos, my writing - all is done on older computers, with simpler software.</rant>

Shiunbird wrote:[...] And please stop switching Bluetooth on all the time. I don't use it, and it eats battery! [...]

I understand if you don't like Windows on your work computers - that sounds awful - but if you like Debian on your PowerMac G5, and you mention something eating battery, which suggests you own a laptop with Windows 10, why not try Debian on that as well? I don't mind Windows - I use it alongside a Hackintosh - but I have a lot of friends who didn't like it, switched to Linux on their main machines, and were fine.

Shiunbird wrote:This is getting long; no one is going to read this far. Computing is frustrating nowadays, and I'm not even that old. I only use my newer computers when I have to (for work, sometimes), or to browse the Internet. All my music, my photography, my videos, my writing - all is done on older computers, with simpler software.</rant>

I got there...

Turning your back on Windows... possibly turning your back on all three! I know how you feel.

Shiunbird wrote:This is getting long; no one is going to read this far. Computing is frustrating nowadays, and I'm not even that old. I only use my newer computers when I have to (for work, sometimes), or to browse the Internet. All my music, my photography, my videos, my writing - all is done on older computers, with simpler software.

I do agree. I find my SGI systems much more pleasant to work with. I actually use the Octane II as the work computer of my "DSNET3" network. Almost everything on it is quick, and it even plays DVD-quality movies with no trouble. Even Xpdf loads almost instantly.

You know, it's nice; that's why I use my SGIs. When I open a piece of software on my Octane, it's ready to go just about instantly. For the programs it does run, it does a much better job than a modern PC. Even with the limited I/O bandwidth to disk, which seems to be the limiting factor, the software and operating system have an overall responsiveness that is unmatched by the top-of-the-line PC next to it that is 10x more powerful in every department.

The software and operating system are much less bloated, and the hardware is robust and, for its age, powerful. I feel like increasing software bloat has tracked right along with increasing computing power, so the net increase in usability we have received is minimal, because programmers find ways to waste the extra computational power almost as fast as it reaches us. Just imagine how responsive a modern computer could be if programmers managed to keep their applications as trimmed and streamlined as something like Maya 6 on IRIX, or even Photoshop 3.

The things I can still do on an SGI I am grateful for, as the software is snappy, responsive, and just plain works. Of course, as time goes on, the things we can do under IRIX become more and more limited, but it never ceases to amaze me how well a nicely written piece of software can perform under IRIX on a 20-year-old computer, doing the same task as a current piece of software under Windows 7. The Windows 7 version manages to load far slower, be less stable, and offer what would seem to be not many more features for all the bloat.

The Unix philosophy of software being designed as a succinctly targeted tool - making it do one thing perfectly, rather than many things half-assed - has stood the test of time well. It's sad to see it lost to time, as even Linux these days has become a bloated mess. A CDE desktop, or an IRIX desktop from 20 years ago, had all the features I would want in a desktop / window manager; anything else just wastes CPU cycles and gets in my way.
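As a toy illustration of the "one thing perfectly" idea (my own made-up example, not any real tool): small single-purpose stages composed into a pipeline, in the spirit of grep err logfile | sort | uniq -c, can be mimicked even outside the shell:

```python
# Toy single-purpose stages composed like a shell pipeline,
# in the spirit of: grep err logfile | sort | uniq -c
def grep(lines, needle):
    """Keep only the lines containing needle (like grep)."""
    return [line for line in lines if needle in line]

def sort_lines(lines):
    """Sort lines lexicographically (like sort)."""
    return sorted(lines)

def uniq_c(lines):
    """Collapse adjacent duplicates into (count, line) pairs (like uniq -c)."""
    out = []
    for line in lines:
        if out and out[-1][1] == line:
            out[-1] = (out[-1][0] + 1, line)
        else:
            out.append((1, line))
    return out

log = ["err: disk", "ok", "err: disk", "err: net"]
print(uniq_c(sort_lines(grep(log, "err"))))  # [(2, 'err: disk'), (1, 'err: net')]
```

Each stage knows nothing about the others; the composition is the program. That is the property that keeps each piece small enough to get exactly right.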

Shiunbird wrote:I only use my newer computers when I have to (for work, sometimes), or to browse the Internet. All my music, my photography, my videos, my writing - all is done on older computers, with simpler software.

</rant>

I have had the same realization when it comes to creative work, or even general office-drone work, done with computers. A great example: George R.R. Martin does all his writing in WordStar on MS-DOS. Fewer distractions from useless add-ons let you focus better on your creative process.

IBM's Meyerson is very good at what he does and at giving lectures. One of the most fascinating things he talked about was that peak clock speed and general CPU performance were reached around 2003-2005. The reason is that we can only shrink transistors so far before manufacturing defects become more prevalent in the finished product. Of course, multi-core designs and improved manufacturing processes have worked around this problem, and we certainly have better-performing CPUs than 10 years ago, but this only takes us so far.

The real purpose of his talk was IBM's neurosynaptic chip and Watson, which makes use of said chip. I found the architecture of the neurosynaptic chip fascinating, and it is truly amazing that such a thing has been developed. The tl;dr is that the chip computes/processes/"thinks" much like our brains do. This allows it to identify objects in images far more accurately than traditional CPUs and supporting software.

Meyerson's talk was definitely inspirational to me. With this, I feel there is still interesting computing going on out there, just in different ways and through different avenues.

pilot345 wrote:The Unix philosophy of software being designed as a succinctly targeted tool - making it do one thing perfectly, rather than many things half-assed - has stood the test of time well. It's sad to see it lost to time, as even Linux these days has become a bloated mess. A CDE desktop, or an IRIX desktop from 20 years ago, had all the features I would want in a desktop / window manager; anything else just wastes CPU cycles and gets in my way.

I also like the commercial UNIX desktops for usability. 4Dwm and CDE have a minimalist, functional aesthetic and user experience that I haven't yet found a match for in the Windows, OS X, or Linux worlds.

Linux (and its 30 different window managers) is, I feel, particularly a mess. I have been using Linux desktops off and on since the late 90s. Every few years I go back and try again, but it never seems to have gotten its act together. While the open source community may have people who can write code, it seems sadly lacking in the UX and graphic design areas. The user experience of a Linux desktop is simply horrible. With the various components coming from so many disparate groups, no competent UX paradigm, and no concrete vision for the overall system and its usability, it is unsurprising that the whole thing feels like a patchwork (and an often visually displeasing one at that). I also think the split personality of trying to be a bad desktop OS and an acceptable server OS at the same time has been to Linux's detriment, producing things like systemd or NetworkManager that end up being wholly inappropriate for one use or the other.