www.schnapple.com


November 9, 2002
 
8:50 PM
Chinese Democracy to be released in February. I'll be very interested to see if lightning can strike twice...

November 8, 2002
 
4:55 PM
I think I'll take advantage of this lull at work (the boss goes to the doctor right as things go stale) to dispel another myth or two that have been bugging me.

General Computing Myth #1: The latest (whatever) is always necessarily the best.

I'm running two PCs at home. If you count the one my wife runs as well as the dinky one in the garage running Lindows, that's four. My system runs Windows XP. We decided it would be good to build a new system from spare parts to do the things we don't want to trouble our main PCs with, such as being a print server, a test web server, and a backup server (to this end we moved the tape drive over to it).

Since I've seen the light of the Whistler line of operating systems (Windows XP), I don't want to go back to Windows 98, which was notorious for locking up during network operations. However, I also didn't want to lay down the wad for a new copy of XP, nor did I want to break the law and pirate one. So I did some looking and discovered the Windows.NET Server 2003 Customer Preview Program. Sign up with Microsoft and you can download and install Windows.NET Server 2003 RC1. This copy will function for 360 days and is the successor to Windows 2000 Server. I signed up, they let me download it, and I'm very much pleased with the results.

Now there are a few drawbacks. For starters, no one makes Windows.NET drivers for anything, so you have to guess whether to use Windows 2000 drivers or XP drivers. Also, some devices don't have equivalent drivers for Windows 2000 Server, or the makers outright say they don't support it. It's kind of a crapshoot. Plus there are always the curious things about a Microsoft server operating system - like why it bothers to include Solitaire and Minesweeper, or certain features that pose small risks on consumer OSes but would be huge liabilities on production servers, like "simple file sharing".

Nonetheless, it does all I want it to and I'm happy. But I see on the private newsgroups they give you access to that a number of people decided to download this operating system and install it as their new home operating system. I can almost understand why you might think this would be a better deal than, say, Windows 98, but some people blew away their Windows XP Professional partitions and opted instead to install .NET Server. Then they complain (and get flamed immediately) that "this thing won't play Counter-Strike, WTF???"

This is the embodiment of the myth that the latest whatever is always the best. Windows XP Pro was the best OS Microsoft ever did, so Windows.NET Server must be even better, right? Complaining that Windows.NET Server (or even Windows 2000, if you get right down to it) won't run your games is like complaining that your new car won't fly - sure, it'll leave the ground if you drive it off a cliff, but then it crashes.

The last place I worked would never apply the Windows NT 4.0 Service Packs until a few months had gone by. Without fail, a new service pack would break something vital to the operation, and then you're stuck backtracking and (ugh) reinstalling. I'll never forget when a new student worker (like me) was hired - he couldn't understand why all the PCs in the library were running Windows 3.1 in 1999. I could never seem to make him "get" that these were old machines, new licenses were expensive, and that (at that point) Windows 3.1 still did everything necessary to access the online card catalogs and such. It blew his mind when we upgraded almost all of them to Windows NT 4.0 instead of Windows 2000 (literally a few weeks old at that point).

General Computing Myth #2: A bigger number means better.

This myth can be dispelled by anyone running an 800MHz Macintosh that can outpace a 2GHz PC in some areas (most of which involve Adobe code).

I blame this on two things - the mass public's misunderstanding of what a version number means, and game consoles in the early 1990's.

The ruler of the roost in 1989 was the Nintendo Entertainment System (NES). Atari and Sega might as well not have tried. But that year Sega unveiled a competitor in the form of the Genesis. The NES, they pointed out, was an 8-bit system. The Genesis was a 16-bit system, so it was therefore better. Anyone can see that 16 is twice as much as 8, so the notion instilled in the public was that the Genesis was (at least) twice as good as the NES. This wasn't necessarily untrue, either. The Genesis could run faster games (though that was more a function of clock speed than processing bandwidth), could run bigger game levels (more a function of memory than of processing power), and could display more colors (more a function of its video hardware than of the CPU being 16-bit).
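
To put the 8-vs-16 arithmetic in perspective, here's a quick sketch in C (purely illustrative - obviously nobody wrote console games this way) of what a bit rating actually measures: the range of values the CPU can handle in a single operation.

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* A console's "bits" is (roughly) its CPU register width -
           the biggest number it can chew on in one operation. */
        printf(" 8-bit max: %llu\n", (unsigned long long)UINT8_MAX);  /* 255 */
        printf("16-bit max: %llu\n", (unsigned long long)UINT16_MAX); /* 65,535 */
        printf("32-bit max: %llu\n", (unsigned long long)UINT32_MAX); /* ~4.3 billion */
        printf("64-bit max: %llu\n", (unsigned long long)UINT64_MAX); /* ~18.4 quintillion */
        /* Doubling the width squares the range - but it says nothing
           about clock speed, memory, or the video hardware, which is
           where games actually live. */
        return 0;
    }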

Of course, in two years the Super NES (SNES) followed, and by all accounts it was a superior system to the Genesis. Playing catch-up, it had to be better to even stand a chance, but it allowed for even more colors and more elaborate games (it was the system of choice for Japanese RPG developers). Still, being a 16-bit system, it was seen as "equal" to the Genesis by the general public.

A few years later it came time for the next generation of game systems. Naturally, the console makers started working on 32-bit systems. Atari and Nintendo both had the bright idea of trying to leapfrog that generation and play on the public's fascination with higher numbers, and came out with 64-bit systems. Atari even contributed to this notion by unveiling a "DO THE MATH!" campaign (decreeing that 64-bit was always necessarily better than 32-bit). Of course, by this point the clear notion of what a "bit" meant was getting muddied. The NES had an 8-bit CPU that did everything; the Genesis and SNES had 16-bit CPUs that did almost everything (usually sound was covered by a second processor). The Atari Jaguar, however, had a 16-bit CPU and four other chips - two 32-bit and two 64-bit. Since the 64-bit chips were the graphics chips and graphics were everything, the highest number won out in the marketing. But when 32-bit PlayStation games started looking and playing better, the Jaguar lost out. The Nintendo 64 had more components that were 64-bit, but not too many more. Sega even tried to parlay the fact that the Sega Saturn had three different 32-bit processors into an advantage. They didn't point out that developers hated juggling those multiple processors, or that the single PlayStation processor was still more powerful.

So when the next generation got primed (Dreamcast, PS2, etc.), lots of people were asking "how many bits?" and no amount of explaining could convince them that bits didn't matter anymore and that things like polygon count were more important. It didn't help that over-zealous game store employees trying to sell the Dreamcast pushed it as a "128-bit" system, and then later employees trying to make the Dreamcast look crappy next to the PS2 sold that one as a "256-bit" system.

Truth be told, the Dreamcast does have what could be considered 128-bit guts, same as the PS2, but I don't think the makers of either system bothered with the bit thing. Now here's the fun part - the Microsoft Xbox, which most people agree is the most powerful cat on the block, is a 32-bit system. Seriously. The Intel Pentium III processor inside is a 32-bit CPU, the Nvidia chip does 32-bit color, and the sound components are at best 24-bit.

Now on the PC front, Intel is putting out a chip called the Itanium (there's already an Itanium 2 on the way), and AMD is coming out with a chip called the Hammer. These are 64-bit CPUs. This means the data paths are twice as wide, and since pointers double in size, programs need more memory - it does not mean that the chips (at comparable clock speeds) are faster. They require everything to be recompiled. They require specialized operating systems. And since they want to keep their instruction sets lean, optimizations have to be done on the compiler end. One person at QuakeCon 2002 asked Carmack if DOOM 3 would be available for 64-bit processors and was surprised to learn that the answer was no - a 64-bit CPU would actually make the game slower (since at this point DOOM 3's performance is more a function of the video card than of CPU speed). The only real advantages a 64-bit processor might have are in the server arena.
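
To make the memory point concrete, here's a minimal C sketch (assuming nothing more than a compiler for each target) of why the same program gets fatter when recompiled for 64-bit: every pointer doubles in size, so pointer-heavy data structures grow even though the data in them doesn't.

    #include <stdio.h>

    /* A toy linked-list node: one payload int plus one pointer. */
    struct node {
        int value;
        struct node *next;
    };

    int main(void) {
        /* Typical 32-bit build: pointer = 4 bytes, node = 8 bytes.
           Recompile for 64-bit: pointer = 8 bytes, node = 16 bytes
           (alignment padding included). Same data, more memory -
           and no extra speed. */
        printf("pointer: %zu bytes\n", sizeof(void *));
        printf("node:    %zu bytes\n", sizeof(struct node));
        return 0;
    }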

Then there's the version number conundrum. The public believes that a higher version number means better. There is a little bit of logic to this - experience brings something to the table, and the latest consumer OS should be better than the previous one. Of course, history is full of exceptions - Windows ME was by all accounts a disaster compared to Windows 98, and as we've mentioned, in production environments people aren't always itching to upgrade to the latest version of anything (an extreme example of which is why I have a job programming COBOL on a 1985 mainframe).

However, people have played on this notion for some time now. AOL runs lavish ads to tout new versions of its client software - the public seems not to mind that it took them eight major versions to get certain things working. One WWII flying game (whose name eludes me) came out with numerous patches within its first six months, culminating in a Version 5.0, which they then advertised on new boxed copies of the game. Why a consumer would see five major revisions in the six months after shipping as a selling point befuddles me.

Microsoft contracted to provide PC-DOS to IBM back in the 1980's. They put in their contract that they could also sell it as MS-DOS. When the arrangement expired, Microsoft released MS-DOS 6.0 but did not deliver a PC-DOS 6.0. Instead, IBM took PC-DOS, made its own improvements, and released it as PC-DOS 6.1, skipping 6.0. Microsoft then released MS-DOS 6.2 and 6.22, followed by IBM's PC-DOS 6.3. These guys were trying to one-up each other to play on the public's notion that a higher version number equals better. It was all for naught, though, since PC-DOS lost (and IBM killed it in favor of the similarly ill-fated OS/2 2.0 project).

Microsoft Word was released as Word for Windows 1.0 and then Word for Windows 2.0. The next version, however, was named Word for Windows 6.0, skipping 3.0, 4.0 and 5.0. The official line was that it would match the version number of the DOS version, but some say it was to make the version number as high as or higher than that of WordPerfect, the then market leader. Of course, WordPerfect debuted on Windows at 5.0 or 5.1 to match its own DOS version, despite debuting the 5.0/5.1 equivalent on other platforms (Mac, OS/2) as 1.0. At one point a single Microsoft Office release shipped with programs at different version numbers (Word 2.0, Excel 5.0, etc.), but Office 95 brought all the products to 7.0 - not that it mattered; by that point it was Word 95, Excel 95, etc. Office XP is 10.0, and Office.NET (or whatever they end up calling it) is 11.0.

Microsoft's development products aren't immune, either. Visual InterDev debuted at 1.0 as part of Visual Studio 97 (itself version 5.0, I think), but with the next release of Visual Studio it was bumped up to 6.0 to match the other products. InterDev has since pretty much dissolved into the ASP.NET-handling features of Visual Studio.NET.

And the year numbering scheme isn't impervious to the public's notions either. Windows 95 was released, and so was Office 95, which required Windows 95. Makes sense. Microsoft then released Office 97, so the joke was that if someone said they were running "Windows 97" they didn't know the difference between an OS and an office suite. So you can imagine how difficult it was for people to get that even though there was a Windows 2000 and an Office 2000, they didn't have to have Windows 2000 to run Office 2000. I wonder how many sales Microsoft missed out on because of that. Didn't matter - they did the same thing with Office XP and Windows XP, and since XP was aimed at the average user (whereas 2000 wasn't), they don't mind the association one bit.

The one place the year numbering scheme makes the most sense is money management software - every year there's a new version of Microsoft Money and Quicken, the same way there's a new model-year Ford Explorer. Some people buy into the notion (myself included, a little) that you should buy the new software every year - it's more like a donation toward your continued convenience and financial health. Plus the rebates are usually pretty healthy.

Anywho, another day, another couple of pieces of misinformation dispelled (I hope).

November 6, 2002
 
5:02 PM
I read a statistic the other day - apparently in the early 1990's, music sold on cassette tape made up 66% of total music sales; today it's less than 4%. Personally, I'm wondering what is up with those 4% of people. Who is it that hasn't "upgraded" to CD yet? Some people might still have cassette decks in their cars, but how many people don't own a boom box that can copy a CD to a 99¢ blank tape?

I actually pondered this a few weeks back when I went into a record store in the mall. For some reason it just hit me that I didn't even know if they bothered to carry cassettes anymore, so I turned around. Yup, they still do. They only take up one portion of one wall and their availability is spotty, but you can still get cassettes of the latest albums.

Part of me misses the cassette - before I went CD in 1991, I had to buy these things as vinyl became scarce. Album cover art is square, so either only a portion of the cassette cover gets used for the art, or it has to be reworked to fit the rectangular shape.

One of the things cassettes had going for them was the idea that they were more durable. True, throw a cassette on the pavement and the odds of it being playable afterwards are better than those of a CD landing face-up. On the other hand, play a CD 10,000 times and it will sound the same every time (working hardware notwithstanding). But the thing that always got me was this - the CD is a piece of aluminum and plastic, and it costs more than the cassette, which takes longer and is more expensive to make (it even has moving parts). Originally people had no problem paying more for CDs - new technology and all. But CDs never got cheaper (they got more expensive), and so the original premise - that CDs would go down in price after certain R&D costs had been recouped - never materialized.

I find it interesting that vinyl is still around, sort of. I don't know which record labels still press vinyl copies of new albums, but even among the artists that can, not all of them bother. What I do see a lot of is artists putting out 12" singles - full-sized vinyl records with a song and maybe a few remixes of that song. DJs use them. I don't pretend to understand the whole notion of DJing, but other than the concept and the look of spinning a record, I'm not sure there's any real advantage over just cueing up a CD.

Now I see that BMG, a mega-conglomeration of record labels, is about to unveil CD copy protection on all discs sold in Europe (Europeans are apparently less vocal about their rights). If it goes well they'll do it here (USA), too. They claim the CDs are Red Book compliant, meaning that if your player doesn't play them, you need to get a new player that can. This would be like the tire manufacturers all deciding to make new tires a certain way and telling everyone whose cars they no longer fit to buy a new car - not their problem.

The irony is that the CD is destined to go away as well. Picture being able to buy your music as MP3s - no more manufacturing costs, no more shipping costs, no more retail middleman. People could buy their music online instead of sharing it (if it's cheap enough, people will buy it), and the record companies could stand to make more money. Dvorak had a column on this.

Funny that I worry about formats and it's all destined to turn into air eventually.


