
The sockets that didn’t match

Both my home and work PCs are at this very moment equipped with 640 megabytes of memory. For quite a while that number seemed oddly familiar, and then it suddenly hit me – my first PC also had 640 of RAM, only it was kilobytes...

My first memory-related problem occurred at a colleague’s house, when we couldn’t figure out why “Test Drive” wouldn’t even start on his computer while it ran in perfect order on mine. After several more or less strange hypotheses, we found out that, despite his belief, he only had 512 kilobytes of RAM in his PC/XT machine. An upgrade to 640 kilobytes the following week solved the problem, but if I got the impression that it would always be that smooth, I couldn’t have been more wrong.

Back then, compared to the 8-bit Atari’s mere 64 kilobytes, ten times that much RAM seemed a vast amount of space for data (hell, even the 130 XE’s doubled RAM was way too big for my imagination). Apparently I wasn’t alone in this feeling, judging by Bill Gates’ arguably most famous quote. But be it this unfortunate quote or not, from that day forward it was one hell of a bumpy ride, and it doesn’t look like it will get any smoother anytime soon.

Soon the average computer configuration came equipped with 1 or 2 megabytes, and we got to know all the strange flavours of computer memory. Suddenly it was all differently sized and overlapping chunks – conventional memory, high memory, extended memory, expanded memory... – and some of us spent long hours trying to figure it all out. Or died trying, because we didn’t really have a choice.

The reason was simple: games. We wanted to play them, and – as someone once said – games, being the most cutting-edge software, were also always the most demanding. So we were memorizing all the strange acronyms such as XMS, EMS, UMB and HMA, creating different versions of CONFIG.SYS and AUTOEXEC.BAT, playing with DEVICEHIGH and LOADHIGH, carefully choosing between EMM386 and QEMM, and dealing with other similarly nonsensical issues.
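
To give you an idea – this is not my actual setup, just a rough sketch of the sort of thing we fiddled with, with illustrative paths and drivers – a CONFIG.SYS/AUTOEXEC.BAT combo of the era could look more or less like this:

  CONFIG.SYS:
    REM Load the XMS driver first, then EMM386 for EMS and upper memory blocks
    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE RAM
    REM Move DOS itself into the high memory area and enable upper memory blocks
    DOS=HIGH,UMB
    REM Push drivers into upper memory, away from the precious 640 KB
    DEVICEHIGH=C:\DOS\ANSI.SYS

  AUTOEXEC.BAT:
    REM Load TSRs high as well
    LOADHIGH C:\DOS\MOUSE.COM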

Yes, nonsensical. The Amiga guys laughed at us, and they had every right to, because even if they had their own slow memory and fast memory problems, they were nowhere near the total madness we had on our DOS-based PCs. The total time wasted on resolving issues which were just a byproduct of some engineers’ blind vision must have been huge. Not to mention the recurring frustration of there always being one kilobyte of conventional memory too little to run something – and sticking in even 32 new megabytes wouldn’t help one iota, because that, obviously, was a different type of memory. A different chunk.

And it wasn’t only other people’s games. At that point I was writing my own game, and I remember doing all kinds of special tricks in order not to cross the 600 KB border – which I believe I went past anyway, only at some point the project simply got abandoned.

Having that in mind, it’s somewhat funny how I’m starting to miss those times. The task of squeezing out every single byte of lower memory was usually very rewarding when finally accomplished (even if the new game – the direct reason – was crap), and I’m still a little bit proud of my polished AUTOEXEC/CONFIG combo with two dozen options which allowed me to run every single game I wanted. Because of all the nonsense, to the average Joe Sixpack it looked just like magic, and let’s face it – who doesn’t want to be perceived as a magician?

During the next couple of years, with the invention of DPMI and 32-bit Windows slowly taking over the world, the problems slowly diminished. But we got other things to occupy our minds with – for example, virtual memory with its not-so-virtual problems (every couple of days my work partner moans about not being able to increase the VM size on his Windows Me). Or parity which sometimes turned out to be fake parity (we rather inappropriately used to call it “boolshit”). Or the never-ending story of “the sockets that didn’t match” – there were DIPs, SIMMs, DIMMs, EDO RAMs, SDRAMs, DDRs and whatnot, and usually what you were about to put into your computer was different from what the mainboard manufacturer had in mind.

Quite recently I faced that problem once again. I was trying to get the best out of 15 or so computers in our company’s laboratory. Rather old computers, I must add, dating back to the beginnings of the Pentium era. Some of them had too little memory, whereas some had too much of it, and I thought it might be good to try to even things out.

The result? A complete fiasco. I accomplished precisely nothing. I ran around for three hours with all the different chips in my hands, and even when the edge connectors matched, usually only one computer would recognize the memory properly – quite obviously, the one I had originally removed the chip from.

And I’m wondering... will the software-hardware cycle repeat itself? I hear there’s a new problem on the horizon – the 4-gigabyte boundary, the natural ceiling of 32-bit addressing, since 2^32 bytes is exactly 4 gigabytes – and some workarounds have already been created. Hopefully, things won’t get as complicated this time, and maybe one day adding memory to a computer will become as simple as choosing the desired size in a shop, bringing it home and putting it in a standardised slot? Because the first time I heard about the modular structure of PCs which IBM was so proud of, I thought it would look exactly like that...

by Marcin Wichary



Page added on 4th January 2003.

Copyright © 2002-2005 Marcin Wichary