29 October 2012

270. Artificial limits

I don't normally care about Windows or Mac. It's been a long time since I bothered trying to convert people to Linux, and I read news about Windows like I would read news about BSD -- with only mild interest.

But since I recently upgraded one of my nodes to 32 GB of RAM, I spent some time googling what I could do with it (the purpose of all that RAM is computational chemistry -- in particular, frequency calculations) and stumbled across a post: http://forums.anandtech.com/showthread.php?t=2234771

The thread is called "How much RAM is too much?", and someone answered "193 gb", which turned out to be a reference to the artificially imposed limits in Windows 7: http://www.zdnet.com/blog/hardware/max-memory-limits-for-64-bit-windows-7/4254

Apparently you can use 8 GB in the low-end editions, 16 GB for 'normal' home use, and 192 GB in the high-end versions. I guess the reason there's a 192 GB limit at all (I don't think there are any single-board boxes that can take much more than 64 GB, or possibly 128 GB, at the moment) is to avoid a repeat of XP, where the OS stopped making MS money long after they had tried to deprecate it.

The number of physical CPUs is limited to 1 in the low-end editions and 2 in the 'professional' versions. When it comes to logical cores, the limit is 32 on 32-bit and 256 on 64-bit.
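For comparison, there are no such edition tiers on Linux: the kernel simply uses what the hardware offers. A minimal sketch (assuming a Linux or other POSIX box -- it won't work on Windows) of querying how much physical RAM and how many logical cores the OS actually sees:

```python
import os

def mem_and_cores():
    """Return (total physical RAM in bytes, logical core count).

    Uses POSIX sysconf values, so this works on Linux and most
    Unix-likes, but not on Windows.
    """
    page_size = os.sysconf("SC_PAGE_SIZE")    # bytes per memory page
    phys_pages = os.sysconf("SC_PHYS_PAGES")  # pages of physical RAM
    return page_size * phys_pages, os.cpu_count()

if __name__ == "__main__":
    total, cores = mem_and_cores()
    print(f"RAM: {total / 2**30:.1f} GiB, logical cores: {cores}")
```

On my 32 GB node this should report something just under 32 GiB (the kernel reserves a little for itself), with no license deciding how much of it I'm allowed to use.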

I'm not sure whether the latter is an artificial or a real limit, but the former is certainly artificially imposed.

Oh well, it doesn't hurt to be reminded every now and again of the frankly absurd things the commercial world of software comes up with. You don't often see 'pro' versions in the FOSS world...
