Posted by samzenpus on Friday February 24, 2012 @01:46AM
from the pentium-80-how-I-miss-you dept.

An anonymous reader writes "A look back at two articles from 1995, touting high end computers and 'must haves.' How times have changed... ...'Memory (RAM): We seem to have convinced most manufacturers to adopt eight megabytes as standard, compared with four megabytes in 1994. Don't buy less than eight. The difference in performance between an eight megabyte machine and a four-megabyte machine can be dramatic.'"

This is why I think we all need to just look around and be amazed every once in a while (no, not at the porn, although I admit the new HD porn is quite impressive visually) at how far we have come. I was late getting into x86, with the VIC and Trash-80 lasting me most of the '80s, so when I finally did get an x86 it was a whopping 60MHz Pentium with 8MB of RAM, and hard drives were...what? 4200RPM? I remember them being slow as Xmas and more than a little prone to head crashes, and mine was a huge 200MB. Graphics of course were 2D, I wouldn't be getting my first Voodoo for another couple of years, and finally the Internet was a 28k modem that frankly, on a good night at 3AM, might get you a quarter of that speed, and you had to run a background mouse program to keep the ISP from kicking you off while you were trying to read.

Now I type this on a computer with 6 cores at 2600MHz. I've gone from 8MB to 8GB on the RAM front; hell, my $50 GPU has more memory and faster clocks than my first four PCs combined, and the thing has 3TB of capacity and can even run every OS I used from '81 until today at the same time! And of course laptops then were these heavy, power-sucking "backpack busters," as we called them, and frankly if you didn't have some serious money to spend, good luck getting one. Now across from me is a dual-core netbook that weighs 3 pounds and cost less than my VIC, and maxing it out at 8GB of RAM cost less than I paid for the floppy for my VIC.

So I think we should all stop and look around once in a while at all we take for granted now, because it's truly amazing how fast and far we have come. Now even the machines I shitcan because they are simply too old are 10 times faster than my first x86; it's truly amazing. Now most of us have crazy pipes that hardwire us instantly to the world, HD screens, surround sound; it's just nuts how much we all have now.

Eh. There's not much of a difference. We're still using the same hardware and architecture as in 1995. Heck, I can run the same OS on a computer made in 1995 or in 2012. Yeah, hard drives are bigger, and Intel's chips are faster, and yeah, PCs have a bit more RAM, but other than that, it's just more of the same. If anything, I'm amazed at how little computers have changed in the past 18 years.

Yes, but the difference has consequences. Capabilities have increased by a factor of a thousand or more in several areas. This has made certain things practical--such as effectively removing these resources as important limiting factors on most programs. In addition, it has made practical things that were previously almost impossible because of these limitations--such as complex digital video editing on a normal microcomputer.

Not to mention playing video of good quality on a normal microcomputer.

Sure, there are some outliers in terms of improved capabilities, like video editing and even watching TV. But 90% of us are using PCs 90% of the same way now that we did in 1995: working with MS Office documents, handling email, web surfing, moving files around, etc. It may be prettier, easier, and faster, but it isn't dramatically different.

Me and nearly all of my neighbors did, but then again we lived in a suburb close to many large IT employers at the time. The internet was prevalent enough in 1995 for it to be featured in popular media. http://www.imdb.com/title/tt0113957/ [imdb.com]

If you're using Wikipedia as a reference for how many web sites there were in '95 then you're doing it wrong. And standing on my lawn.

That list you're referencing is the number of sites founded prior to '95 that are still operating today, which is a much shorter list than what was available back then. The internet in '95 was an exciting place, it seemed like everything was available if you knew where to look. Early adopters were rewarded with the opportunity to be part of building the new cyberspace, and

I was doing both, but even there the differences are huge. For example, back then I would check my email once or twice per day. As in, people would send me an email and it would be stored on a server for a while, and then some time later I would get it. Downloading my mail often took a minute or two - and most of it was plain text. Now, my mail client is basically always connected to the server. I get notified as soon as mail is available and I read it as soon as I want a break from whatever I'm doing. If I wanted to send someone a picture, I had to upload it to some FTP or web space and then they'd download it (and I'd just hope no one guessed it was there).

The web back then was purely static. There was no JavaScript (depending on when in 1995, it was either not released, or so new that hardly anyone was using it). Frames were all the rage - they reduced bandwidth, which was useful, but also broke the back button, which wasn't. Animated gifs and embedded midi tracks were the height of dynamic behaviour. Most companies had a little bit of marketing information online, if anything. Things like online shopping were pretty rare - Amazon existed, but I couldn't order groceries online, for example. I could get news from the BBC, but not very much.

Such as using unsuitable or bad algorithms, wasting enormous amounts of memory, disk space and bandwidth on trivial tasks, using layer upon layer of badly structured APIs and on top of that a browser with an interpreted language running software we use daily (like gmail). Who would have thought it possible back then?


Either you weren't around back then or you are too young to remember but...

The lavish 33MHz and 8MB of RAM (compared to the older generations of 16-bitters and 8-bitters) allowed lazy programmers to write such terrible algorithms and waste vast numbers of cycles on interpreted languages like Visual Basic, etc. My god, I mean Windows 95 wasted so much CPU just to look a bit prettier. Real programmers still did everything in DOS.

Also, while some programmers have got lazier, others have not. Many algorithms have got much, much faster.


And those layered APIs that the grandparent complains about make this easier. Now we don't have everyone implementing searching and sorting themselves, someone does it once and it's shoved into a shared library. The same with more complex things like image compositing.

I sincerely hope that you're being sarcastic, or at the very least, trolling. Back in 1995, there were plenty of "lazy", inexperienced and just downright poor programmers. However, aside from a few cases here and there, the objective was always the same then as it is now - get the job done in a reasonable time. In 1995, we had to invest a lot of time optimising and hand-coding ASM to meet that objective due to the mentioned limitations of PCs. These days, hardware is so fast and plentiful that we can get on with doing other things and spend less time optimising. It doesn't matter how much memory the program is using or how many CPU cycles are being wasted when the job gets done in 2s versus 1.4s.

Sure, you might see it as wasteful or even lazy, but all you're really doing is substituting one form of inefficiency for another - the inefficiency of the program for the inefficiency of the programmer's time. Hardware is cheap; good programmers are not. If a company is spending £40,000 a year on a single programmer, they'll get far more value spending an extra £1000 on a faster processor or more RAM than they will out of having him spend weeks hand-coding and debugging ASM ops for every application/routine he writes.

Yes, there will always be the exception and "throwing hardware at the problem" isn't the right solution, either, but saving time is saving money and that's why we have "inefficient" programs.

In 1995, Visual Basic 4 was released. Anyone who thinks that there were no bad programmers around then was either not alive or not paying attention.

That said, there are now a lot more programmers and, more importantly, the number of tasks where slow code is fast enough has increased and speed has stopped being the main concern. Software projects often live for over a decade and being able to continue to modify the code to meet new requirements in ten years is a lot more important than having it run very fast now (and what does 'very fast' mean? If it completes the day's processing in 0.5 seconds instead of 0.005 seconds, who cares?). Back in 1995, throwing away your code after a couple of years was only just going out of fashion.

A script I can code today in 30 minutes and run in 5 minutes is better than an application I had to write 15 years ago that took 4 hours to write just to get the processing time under an hour.

I don't dispute that when it concerns code you write for yourself. But when "optimized developer time" results in e.g. 5% of millions of Thunderbird users having to wait 3+ minutes to read their email because they have large inboxes and TB is terrible at sorting/storing/displaying mails in large folders, it does not seem to be a good trade-off. As one of the affected users, I'd much prefer it if they stabilized their ever-growing bloatware feature set (that has translated into no visible gain for users) an

True, things may only be a thousand or so times faster/larger than 18 years ago. This might sound like slow progress, until you also realize that progress was made along other vectors such as physical size and power consumption. You do realize that the tiny smartphone in your pocket is significantly better than the humongous desktop PC of 1995, right?

This is actually what impresses me the most. My smartphone (HTC Desire) has more computing power than the PC I used back in 1994 did.

If I were in one of those bad "time traveller" movies and brought my cellphone, they wouldn't believe I came from only 18 years in the future, given the amount of power my cellphone has.

They wouldn't believe you anyway. What can you do with a modern smartphone in 1994? No YouTube, Twitter, Facebook, Gmail, no 3G, not even GPRS, no, not even GSM, and after three days (because your battery lasts 3x longer since you aren't using video and 3G) it goes dead because there is no USB to charge the battery. The only thing left is to use it to sniff some coke from the smooth touchscreen, but that's it.

If you brought an iPhone back to 1995 they wouldn't believe you were from the future because it has an Apple logo on it, and everyone knew they were always just a few months from going out of business.

If I wanted to type a 10,000-word document or play a complex RPG, I'd take that 1995 PC over a smartphone any day. Sure, the smartphone has much better hardware, but from a usability point of view it leaves a lot to be desired compared to a PC of any era.

The only reason you can run the same OS is that the x64 architecture supports backward compatibility with the old 32-bit x86 architecture, which in turn supports backward compatibility with the 16-bit architecture that came before it. Maybe you didn't notice these jumps, but they were there. There's another jump happening right now: the move from magnetic hard disks to solid state disks. That's again one you don't notice unless you know about the technical difference, but it's still a pretty big difference.

And yes, we have more RAM, and yes, that's even an example of something that's essentially still very similar to 1995 RAM, but even then, miniaturization is kind of a big deal. The chips may still work in the same way, but there were huge advances in the technology used to produce them, which are hidden from most normal users.

The basic idea of how a computer works is still the same, of course, but then, that hasn't changed in almost a century. And it probably won't change anytime soon - the next big change is probably the move to smaller, portable devices that require even less inside knowledge to operate. Maybe, ten years from now, you'll look at your phone and say "why, this is so different from the computers we used to have to put up with - finally they changed something!" because the package looks different, but the overall architecture will still be the same.

While you appear to have a solid technical knowledge base, it is clear you have little to no practical knowledge or experience with SSDs other than off the cuff comments you've read here or there.

Let's go through some of your misconceptions shall we...

Price. Yes, they are more expensive than mechanical hard drives. But the speed boost is substantial and worth it. I remember paying $200 for a 30GB HDD a long time ago. Now I can get a 128GB SSD for $160. My 128GB Crucial M4 is limited by my 3Gb/s SATA 2 connect

I have a Toshiba T1910 from 1994 on my desk; I found it in a cupboard after a clear out at work. 4MB RAM, monochrome screen, 200MB HDD, 486SX 25MHz processor, Windows 3.11.

Boot time, from power on to ready-to-work (no HDD activity after boot), including a 3 second memory test, is 51 seconds. Yes, I can do a lot more with my 2GHz dual core 4GB RAM workstation (get prettier graphics, browse the internet) but this laptop has Word, Excel, Powerpoint, networking.

Really? Because I used Word 6 in 1995 on a machine with 5MB of RAM. It showed the splash screen for well over 30 seconds before launching, and I couldn't then run an image editor without exiting Word or the machine would thrash and become unusable. Saving a multi-page document would often take 5-10 seconds, during which time Word froze. Word 2 was a bit faster (although saving was still slow). Oh, and Word 2 took about 15% of my total hard disk space just for a standard install...

Bidets my anonymous friend! You haven't experienced high culture until you've had a warm jet of water shot between your ass-cheeks and a nice, gentle breeze across your recently wetted-and-washed rear end!

A few months ago a friend and I went to Japan for a week and a half of tourist-ing it up. I had been before, he hadn't. When we got off the plane and he had to go the bathroom, I made sure to follow him in and stand outside the stalls so I could hear the scream as he used a Japanese toilet for the first time. That alone was worth the price of my plane ticket.

Think about the RISC processor. It was developed by the guys at Acorn to run their RISC-OS.

This made me cringe. The RISC processor was developed at UCB. The ARM processor was developed at Acorn, inspired by the RISC processor and the 6502. Given that ARM processors now outsell Intel processors about 10 to 1, I don't think it's so unthinkable.

The article is a bit outdated, but I mean that in the opposite sense of it reporting computer stats from 1995. It seems about a year out of date on its stats. Am I nitpicking? Sure.

The 28.8 modem was introduced in 1994, and I recall it being in fairly wide use by summer 1994. Likewise, 17" monitors were not unusual or prohibitively expensive back then. I had a decent enough 17" that ran maybe $300 or so. The Apple repair tech knocked it off my table, and I ended up with a really nice 17" Sony CRT and a mass

I do recall that CRT monitors were for a very long time much cheaper than LCD/TFT screens. And for an even longer time faster (especially in refresh rates). Also CRT never really came down in price - stayed more or less the same, as materials/manufacturing/transportation are the bulk of their cost.

Indeed, back in the day 17" was not expensive; back in 1995 I was already using a 15". I got a cheap second-hand one, a few years old, in excellent condition. And in the early 2000s I switched to a flat screen.

A 24" CRT is still massive. Never ceased to be massive. I mean, ever tried to lift such a beast? You may have had to reinforce your desk before putting one of those on it! That huge chunk of glass just won't get any lighter, no matter what.


I still have a 32" CRT TV, and one of the main things that's keeping me from getting a flat screen of some kind is WTF am I going to do with this beast? It's 150 lbs, but that's deceptive. It's 150 lbs of poorly-balanced, somewhat fragile dead weight. One person cannot carry it anywhere, at least nobody I've seen has figured out how. Two can manage, but I don't own a car. Funny how people are willing to deliver stuff for next to nothing, but you can't find someone to haul it back out again.

Grab it at the screen. Hold the screen against your chest, with your hands around the bottom of the screen. About 80% of the weight is in the glass at the front of the screen, and the rest of the monitor will balance properly. It's kinda counter-intuitive, but it works.

I started moving 30" screens around in the late '80s when they cost a few thousand dollars. Never had to pay for dropping one.

I still have a 32" CRT TV,..... It's 150 lbs of poorly-balanced, somewhat fragile dead weight. One person cannot carry it anywhere, at least nobody I've seen has figured out how.

I've worked in IT support for about a decade and a half now, and the move from CRT to TFT is an absolute godsend.
My personal favourite was when someone wanted their PC under their monitor to save desk space - you had to lift 50-odd lbs of monitor and then brace it with one hand to slide the desktop underneath, cos' there was no way the desktop would slide into place with the monitor resting on top.
When we migrated to TFTs I wrecked my back for about a week lifting all the old monitors as we got rid of them, but the pain was worth it to never see those b4stards again...

CRTs are great for playing light gun games. I have an old 26" CRT TV in my basement hooked up to my PS2, and it works great with my GunCon light guns. The light guns they have for LCD are junk and not accurate.

For me, the most dramatic example of the progress of hardware in the intervening years is Emacs.

It used to be regarded as a heavyweight editing environment, comparable in scope and resource requirements to a full programmer's IDE. There was even a special server designed just to allow several editing windows (aka frames) to coexist.

Now, it's so lightweight and fast to load up, my web browser launches a completely independent Emacs for each comment field in a web page, my MUA launches its own Emacs for writing mails, I have multiple independent Emacs processes for editing code, and another for writing LaTeX.

One of the things that infuriates people who like GUIs is when Cut'n'Paste doesn't work properly in some applications. Even when it works, the data transfers through cut and paste depend a lot on what an application will recognize or let you copy. E.g. you might copy an image from the web browser and yet you can't insert it as a background for your music player.

I'm the same way with editing text. Nearly every application requires some text input

My 486 only had four megabytes of RAM. I had to reboot and bypass CONFIG.SYS and AUTOEXEC.BAT to run Doom. The reason? My mouse driver took up too much memory. And this was in DOS, where you only had three or four drivers to begin with.

(Before any other old folks ask -- I already had other drivers in upper memory so the mouse driver wouldn't fit there.)

Doom used a DOS extender. As such, you could pretty much have all your drivers in base memory without any of that UMB mangling.

Ultima VII and the Voodoo Memory Manager (http://ultima.wikia.com/wiki/Voodoo_Memory_Manager), on the other hand... required a lot of base memory, and you really couldn't run anything like EMM386 reliably. Was... interesting to get Ultima VII working with 2MB of RAM.

Doom was a lot easier to run than games that used earlier DOS extenders.

Remember Zone 66? Just to run Zone 66's crazy DOS extender I had to use a config.sys menu [rsvs.net] to boot into a separate configuration that only loaded my sound driver but not the memory manager. Total pain to set up.
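For anyone who never wrestled with this: the config.sys menu trick described above used the multi-configuration syntax introduced in MS-DOS 6.x. A rough sketch of what such a file looked like (menu names, driver paths, and the sound driver name here are illustrative, not from the poster's actual setup):

```
[MENU]
MENUITEM=NORMAL, Normal boot (memory manager, mouse, sound)
MENUITEM=ZONE66, Zone 66 (sound driver only, no EMM386)

[NORMAL]
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB
DEVICEHIGH=C:\SOUND\SBDRV.SYS

[ZONE66]
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\SOUND\SBDRV.SYS

[COMMON]
FILES=30
BUFFERS=20
```

At boot, DOS displays the menu and loads only the chosen block plus [COMMON]; AUTOEXEC.BAT could branch on the selection via the %CONFIG% variable (e.g. GOTO %CONFIG%) to skip loading the mouse TSR and friends.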

I'm saying this not because the power was so good, but because nothing compares to Red Baron, Secret Weapons of the Luftwaffe, and Xwing. EA/Bioware could have scored big with SWTOR by using Xwing vs TieFighter style combat in an MMO context where you can upgrade your ship. Instead the space combat is a gimmick and the game is barely an MMO with so few people on each server.

What if they brought back Stunt Island as Stunt Island 2? Allow people to autoshare videos on Youtube. Allow people to share/rate missions like they do on Little Big Planet. Have multiplayer with watchers/chatters. Have car racing too if you want to go all out.

Maybe I'm not in the mix anymore, but when I played some modern flight sims they showed an out of cockpit view and you just flew around using the mouse. Maybe someone could point me to where the good competitive gaming flight sims are that I am not aware of?

Another thing we're missing from the early/mid 90s is adventure games, but I don't miss them any further than I can get without the blue key.

Yep. And if you go with the informal version of Moore's law, "X doubles every year and a half," where X is just about any measure of computer capability, we're still almost on track. 2^10 = 1024, as any /.er should know by heart; strictly speaking, this should mean about a thousandfold improvement between 1995 and 2010 rather than 2012, but everything you list was available two years ago, if at a somewhat higher price. And yes, X may just as well be boot time as RAM or processing power. ;)
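The back-of-envelope arithmetic is easy to sanity-check (a quick sketch; the 1.5-year doubling period is the informal figure quoted above, not a measured constant):

```python
def growth_factor(years, doubling_period=1.5):
    """Capability multiplier after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# 1995 -> 2010 is 15 years: exactly ten doublings, i.e. the ~1000x figure.
print(growth_factor(2010 - 1995))          # 1024.0
# 1995 -> 2012 is 17 years: closer to 2600x, so 2012 overshoots "a thousandfold".
print(round(growth_factor(2012 - 1995)))   # 2580
```

Which matches the comment's point: the thousandfold mark lands around 2010, and by 2012 the rule of thumb predicts somewhat more.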

32 GB flash drives are now pretty common and at roughly the same $/GB as 16 GB units. In the last month I have bought a 32 GB micro SD card and two 32 GB flash drives, all at or below $1/GB. 64 GB and 128 GB units are also already available at retail, just not as common, and the pricing scales up fast.

I just built out a 4 Ghz 8 core machine with 16GB RAM and a pair of 1.5TB drives for right about $600.

I recall back in '91 my 486DX/33 with 4MB RAM cost somewhere in the ballpark of what you cited ($2,500).

Boot times are relative. My Linux machine with everything I don't need removed boots faster than the 486 with DOS, DESQview, and a fairly dirty config.sys and autoexec.bat. :) You can make anything boot slowly if you try hard enough.

I noted the article still thinks a CD/DVD/BluRay player is normal. Aren't they obsolete already?

It's been five years or more since I had a working DVD player in any of my PCs. Except my iBook which has one built in, and that's also some six years old now, and the DVD player in it has barely been used in that time.

I used to burn CDs with photos and so - still have some, from many years ago, and really should copy them to a USB stick or so before I really don't have a CD drive any more. I used to burn CDs for Linux installation; now that's done from USB stick. I used to burn CDs as archive as my hard disk got full. Modern hard disks are so big, they don't fill up. And if they do, the capacity of a CD-R or even DVD-R doesn't do much to solve that problem. A bigger hard disk is the only reasonable solution.

And monitor - well I still use 15". It's good enough, and my desk isn't that big. Those also didn't come down in price as drastically as the other components did.

What I also noticed is that in the US just 85% of adults have a mobile phone, and 90% live in a household with at least one mobile phone. I think that's a really low number. Where I live there's close to a 200% (yes, that's two phones per person, not only per adult - many people have indeed multiple mobile numbers, and many are used by regular visitors) penetration of mobile phones.

I was 14 years old in 1994
I had a Macintosh LC 475 [everymac.com] back then. It had a 25 Mhz Motorola 68040 CPU [wikipedia.org] and had come pre-installed with Microsoft Virtual PC for the Mac [microsoft.com] which emulated x86 architecture on the Motorola 68040.
A magazine called PCQuest [ciol.com] ( It was a geek-focussed magazine then; it's a CIO-focussed magazine now ) came out with Slackware on the CD. ( I cannot remember the version)
I managed to install Linux as a VM on my Mac 18 years ago using this. [slashdot.org] ( That's a link to my blog post with more details on how I did it )
Of course I did not know what Virtualization was. I did not have an internet connection even!
It took me a year to get X running - just by reading the man pages and configuring modelines and hsync and vsync values
My proudest moment was when I wrote my own man page using nroff ( IIRC ) and it showed me bold fonts in a terminal. I did not even know what a terminal was, except that Jeff Goldblum destroyed the aliens by uploading a computer virus through one ( Movie: Independence Day )
I am nostalgic

33MHz would have been low end back then; 66MHz or 120MHz were probably more likely in a new computer (in late 1995 I got a pretty beefy 150MHz Pentium). Meanwhile 4GHz is probably very high end by today's standards (people tend to get more cores rather than more MHz).
Soundcard: not many people get a high-end soundcard like the one listed; the real equivalent of the SB 16 is probably onboard sound. A high-end AdLib card or Roland Sound Canvas could probably stand up to today's high-end cards in terms of sound qualit

When I think back to 1995, I expected a machine/OS which has/uses *lots of cores and bandwidth to RAM*, where everything is reasonably multi-threaded and where programs can exchange data in a reasonable, transparent way.

Nothing came true. Applications still freeze when waiting for something, a massive CPU still has to be running to do simple background operations, we still exclude bitmaps from text documents because nothing else works, and my CPU is still waiting for the RAM, even longer than before.

I was probably still using a 386 SX/16 with 4MB of RAM. Installed OS/2 on it and it actually worked reasonably well. After a while I installed Linux on it. Early Slackware. That also worked reasonably well, except it didn't have enough graphics prowess to actually run X11 in VGA. I just used terminal mode. It was fine. Ran slirp for PPP off Florida's gate.net and did a lot of MUDding.

I eventually upgraded from that system to a dual 486/66 with 16 MB of RAM and an S3 video card. That thing was a beast. Ran X

I love the column on video, where the 1995 columns says "24-bit", and the 2012 column...oh wait, we're still 24-bit. Everything else has advanced by several orders of magnitude, but we're still limited to just 8 bits per color channel (RGB = 24 bits in total) going out over the DVI cable (and the display itself). Sure, now you can drop a few G's on a 10-bit (30 total) monitor (if your software can even make use of it), but it's kind of sad that progress has been so slow.

More than 8 bits per channel is simply not important for output; internal memory support for HDR matters, but there just isn't a need for higher color depth, especially since things like the color temperature of the display mean no image looks the same unless it's on a fully calibrated system.


What's worse, human vision is still limited to about 10 million nuances and can't even take advantage of 24 bits. Time for an upgrade!
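The numbers in that quip check out (a trivial sketch; the ~10 million figure for human color discrimination is the comment's own estimate):

```python
# 24-bit color is 8 bits for each of the R, G, B channels.
bits_per_channel = 8
channels = 3
total_colors = 2 ** (bits_per_channel * channels)

print(total_colors)               # 16777216 distinct 24-bit values (~16.7M)
print(total_colors > 10_000_000)  # True: already more than ~10M distinguishable colors
```

So even 24-bit output encodes more values than the eye can reportedly tell apart, which is part of why the jump to 10 bits per channel has been slow to matter outside HDR and calibrated workflows.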

Of course computers have N times the speed and memory. But regarding computer science concepts and algorithms, where is the real progress in that field? Most of the concepts used today were designed before 1995 - and a lot of them even before modern computers existed.
CPU and memory growth is comfortable progress - but it is not a revolution. That is still to come.

might as well have another one of these threads, it's been a few years since the last one I think...

Acorn Atom circa 1980 ish. 12KB of RAM with the expansion pack. No storage at all (you could link to a tape recorder to very slowly store and recall data). No display (it plugged via a PAL lead into a TV). BASIC language and operating system fitted into a 2K ROM module if I remember correctly. I still have it on my shelf but haven't been able to plug it into a power source or TV for years.

What surprises me is that most of the older games from around this era have yet to be rivalled even today. Nevermind the fact that games back then didn't have EULAs, DRM restrictions, or DLC. You got what you paid for, and that came in a full sized box adorned with awesome artwork- and on the inside, you got a CD in a jewel case and a manual as thick as your thumb.

We had gems like Descent, Descent II, Command and Conquer, Warcraft 1, Warcraft 2, Tyrian, Raptor: Call of the Shadows, Duke Nukem 3D, Crusader: No Remorse and Crusader: No Regret, Mass Destruction, Wipeout (the original Psygnosis game was a MS-DOS release- it ran straight off the CD and had an absolutely awesome soundtrack from Cold Storage), Star Wars: Dark Forces, X-Com, SimCity 2000, etc.

Just after that era we got gems like C&C: Red Alert, Total Annihilation, and Starcraft.

Not a single game had any kind of grinding wankery in the form of "achievements" or "trophies". You bought a game, you got 10 to 20 hours of entertainment in a box. It was that simple.

Today, you're lucky if: A) $69.99 gets you something even remotely worth playing (since demos and shareware are long forgotten), and B) you get maybe 2 hours of actual entertainment wrapped in 20 hours of fucking around in a giant sandbox to boost some stupid number so you can proceed with the main quests/missions. Oh, and you don't actually "own" games anymore. You're licensing them, they only work 5 times (if you're lucky), and the disks often come in paper envelopes, publishers have gotten so goddam cheap.

But hey, EA's releasing the next big version of MW or CoD! So whoopie! Nevermind the fact that they've driven Westwood Studios and Origin into the ground, and now they've done the same to Maxis and have focused their attention on Bioware. CRANK THAT FRANCHISE WHORING FACTORY TO FULL THROTTLE BOYS, WE HAVE CONSUMERS TO EXPLOIT!

When I started, I had 4K and saved programs I typed to cassette tape! The differences between then and 1995 are orders of magnitude greater than 1995 to now.

I clearly recall the last three jaw-dropping moments:

circa 2001: Seeing AMD beat Intel to market with a 1GHz processor
circa 1997: Being able to download a music file in less time than it took to play
circa 1991: Seeing a postage-stamp video of the moon launch in QuickTime, from the Apple Developer CD they distributed monthly

Other than that, it's all more of the same, or far enough back in history as to be a blur.

I worked for the Computer Graphics lab at the University of British Columbia in 1992, and we had one machine we called Brutus. It had about 4 boards of memory (each larger than a desktop motherboard) for a total of 380 megabytes of RAM. When I mentioned that I worked with a machine with 380MB of memory, most people would go "ooooh, that's a really big hard drive!".
Nevermind..

(( The PC with 4MB of RAM running OS/2 was considered a toy, even though it was more than most normal people had on their desktop. ))

Heh. The other day, I helped a friend set up his new USB wireless modem on his laptop. When he plugged it in, the ISP's app loaded automatically, but Windows 7 couldn't recognize the hardware and brought up a window to search for a driver on the internet...

Ah... 1995. I remember back then talking to my girlfriend (now wife) about how things "used to be back in the day."

One of the things I noted even then was the reliance on the Internet. I recall stating something like, "back in the 80s, I could spend an entire stretch of days at a time, stuck in my room writing stupid home-brewed programs in my Commodore 64, with very little sleep; I could always find something to do with that little machine without any network connectivity or external communications. Nowadays, I sit at my computer desk, and if the 'Net is down, can't check my e-mail, can't use my browser, can't log into the BBS... it's useless, and I turn the fucker off."

Today, if my cable-modem connection goes down, I just grab my iPad and play Bejeweled or some other game, watch a movie, or catch up on my reading.

My, how times change.

It is not that I've grown less reliant on my Internet connection. I think it's just that modern machines are much more pleasant to use for many other use cases.

You see, in the 80s I was discovering computers and every silly "GOTO 10" statement was an adventure. In the 90s, I was exploring the vast frontiers of the Internet, and while using a PC was a fscking pain, I endured it for the value of the network and communications.

Now, the device is not a pain to use, and I use it for many other things than just exploring the Internet or communicating with others. This is the actual progress of our technologies: Convivial machines to fit human lifestyles.

It is amazing what we have now. I truly feel like I live in The Future.