However, it turns out the Dreamcast-centricness is unwarranted - Shenmue began development for the Sega Saturn! Here's a video clip that proves it - it's apparently on one of the Japanese Shenmue II discs and it's unlocked when you finish the game.
What blows me away about it is how far along this title was in development for the Saturn before it was shifted to the DC. Also, it makes me do a double-take on that machine. I had always heard the Saturn thought in "quad"s (i.e., 2-D) and the PlayStation thought in "tri"s (i.e., 3-D) and that it was mainly luck that 3-D took off. I had also heard that it was akin to jumping through flaming hoops trying to get the Saturn to do 3-D environments - clearly AM2 is a very talented developer.
This really makes me want to go home and finish the first game (I never got past the "wandering around aimlessly" part - which is most of the game).
There are these two words, digital and analog. Digital roughly means "ones and zeroes" and analog roughly means "not ones and zeroes". With me so far?
Picture a vinyl record player (for you children of the '80s, ask your parents). You take your needle and you place it in the groove. The vibrations of the needle cause the sound you hear from the record. This is an "analog" method of sound reproduction at its most basic. Now picture a Compact Disc player. The CD spins and a laser bounces off the microscopic pits on the disc. The pits are either there or they're not. When a pit is there, it's considered a "1", and when there's no pit it's a "0" (or vice versa, I can't remember which). The 1's and 0's are collected together to create a sample, a very quick "burst" of sound. The music you hear is composed of lots and lots of these samples - 44,100 per second, to be exact (double that if you count the left and right channels separately). This is "digital" sound reproduction.
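If you want to see what "44,100 samples per second" actually means, here's a quick Python sketch. The 440Hz tone and 16-bit quantization are just illustrative choices on my part (and real CD mastering involves filtering and dithering that this completely ignores):

```python
import math

SAMPLE_RATE = 44_100  # CD audio: 44,100 samples per second, per channel

def sample_tone(freq_hz, duration_s):
    """Approximate a continuous (analog) tone with discrete (digital) samples."""
    n_samples = int(SAMPLE_RATE * duration_s)
    # Each sample is a snapshot of the waveform at one instant, quantized
    # to a 16-bit integer (-32768..32767), the way CD audio stores it.
    return [round(32767 * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE))
            for t in range(n_samples)]

samples = sample_tone(440, 1.0)  # one second of an A-440 tone
print(len(samples))              # 44100 snapshots for that one second
```

Between any two of those snapshots, the digital version simply has no information - whereas the groove on a record is one continuous curve.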
So what's better? The quick answer is the CD. If you count a sample per second as a hertz (the common way of doing so), then a CD has a "sampling rate" of 44.1kHz. This is pretty much adequate - no one complains that a CD sounds subpar when they hear it. However, what is the "sampling rate" of a vinyl record? Well, since the record doesn't consist of data but rather a single continuous groove, the sampling rate paradigm doesn't really apply, but if you were to force it to apply, the rate would be infinity. Now compare infinity to 44,100. Which one comes out bigger?
Of course, this doesn't take into account real-world concerns. Vinyl records are prone to scratches and dust - their storage is "naked" or "exposed" - as opposed to the plastic coating on a CD. You can, of course, scratch a CD so badly it screws up, but it's harder to do than with a record. The error correction is such that the laser can usually neglect minor scratches (unless one is so bad as to act as a "prism") and dust. Also, a vinyl record is prone to wear from simple friction - the grooves wear down over time and with excessive use. Finally, the vinyl record usually has to have a cushioned, static-free surface to spin on and a $1000 tonearm to perform at optimum conditions - while the $50 boom box you can get at Target produces similar results to the $300 CD stereo component. Plus there are aesthetic concerns as well - more CDs can fit in a store, and consumers decided they liked the smaller discs too - you can take them in your car without having to convert them to cassette.
So the official answer to the question is that the vinyl record has the potential to sound better. However, the CD is more practical. It doesn't have the pops or scratches a record has and it doesn't have the "tape hiss" which plagues analog cassettes. For all practical purposes it's superior, but here's the rub: it's not an absolute superiority.
Why is this important? Well because an absolute superiority implies that it wins hands down and that, ipso facto, the characteristics which make it what it is make it a superior medium. Translation: the CD is better because it's digital.
Step in the wayback machine to 1986. You're sitting there playing Nintendo (once again, ask your parents if you're not sure). You have your little crappy controller that came with it. Now look down at it - there's a "D-pad" - the official name for the directional portion of the controller resembling a "+" sign. You hit left, Mario goes left. You hit right, Mario goes right. Now if you want him to go to the right faster, you don't hit the D-pad harder, you actually have to hold down a different button in addition to the direction. Why is that? Because it's a digital pad - the directions have values of either "1" (you pressed it) or "0" (you didn't).
Now it's a decade later and you're playing your Nintendo 64. You see two direction pads - one looking like a "+" and one looking like a tiny joystick. You're playing Super Mario 64. You move the little stick a little bit in one direction, Mario moves in that direction. If you move it all the way, Mario runs in that direction. The little joystick is an analog controller, and the Nintendo 64 was the first console to bother with one. Later PlayStation models (the "Dual Shock" ones) had it and every console since does, but Nintendo innovated it first. In this case analog once again means "not ones and zeroes" - the stick has a whole range of positions. The PlayStation 2's Dual Shock 2 controller even has analog buttons - they're pressure sensitive, with 256 levels (though I don't know if anything takes advantage of them yet).
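The difference is easy to see in code. Here's a little Python sketch (the speed numbers are made up for illustration - this isn't how any actual console reads its controller):

```python
# Digital D-pad: a direction is a single bit - pressed (1) or not (0).
def dpad_speed(direction_pressed, run_button_held):
    if not direction_pressed:
        return 0.0
    # The direction alone can't say "how fast" - that takes a second button.
    return 2.0 if run_button_held else 1.0

# Analog stick: the axis reports a whole range of positions, so how far
# you push it *is* the speed information.
def stick_speed(deflection):      # deflection ranges from -1.0 to 1.0
    return abs(deflection) * 2.0  # tilt a little to walk, all the way to run

print(dpad_speed(True, False))  # 1.0 - walking
print(dpad_speed(True, True))   # 2.0 - running, via the extra button
print(stick_speed(0.3))         # 0.6 - partial tilt, partial speed
```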
This pretty much clinched the fact that digital was not absolutely better than analog - until Microsoft unveiled a new force feedback joystick with a tagline touting "advanced digital technology!".
Now think back to the mediums in which video is delivered. In the late 1970's to early 1980's, there were two different paradigms being pushed: the magnetic-tape-based mediums of VHS and Betamax, and the large, compact-disc-like medium of Laserdisc. Since VHS and Betamax were based on the same principles as the analog audio cassette, calling them "analog" mediums seems an easy fit. Laserdisc, then, as it was a larger parallel of the compact disc (it stored movies in terms of ones and zeroes), was a digital medium. Laserdiscs never took off beyond devout movie buffs for various reasons, none of which singly doomed the format - they were more expensive than VHS, it was a non-recordable medium, movies often spanned multiple sides, the public was being sold on the CD with the tagline of "smaller is better" and here was an LD the size of a vinyl record - the list goes on.
Today we have DVD. DVD is superior to LD in nearly every way - the discs are smaller, movies often fit on one side, the picture and sound are better than LD thanks to newer technology, etc. DVD also winds up being a slap in the face of everyone who supported LD all these years. However, DVD employs something known as MPEG-2 compression to work its magic. A movie is still too big to fit on a DVD untouched, so people figured out that if you only draw the portions of the screen which change from frame to frame, you can save space. For example, if you watch CNN you'll notice the little CNN logo in the corner never goes anywhere. Were this compressed on a DVD, the little CNN logo would only get drawn once (it works a little differently due to the use of "key frames", but you get the idea). A laserdisc never used any sort of compression - the frames were just presented one after another (which is why, even with a larger disc, movies often spread to two sides - the Star Wars movies spread to four sides each). As a result, for some reason Laserdisc is now seen as an "analog" medium - DVD is the new "digital" medium.
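The "only draw what changed" trick is simple enough to sketch in a few lines of Python. Real MPEG-2 is far fancier - it works on blocks of pixels with motion compensation and periodic key frames - but this is the core idea, with tiny four-"pixel" frames that are obviously just for illustration:

```python
def delta_encode(prev_frame, frame):
    """Record only the pixels that changed since the previous frame."""
    return {i: new for i, (old, new) in enumerate(zip(prev_frame, frame))
            if new != old}

def delta_decode(prev_frame, delta):
    """Rebuild the full frame from the previous frame plus the changes."""
    return [delta.get(i, pixel) for i, pixel in enumerate(prev_frame)]

# The static logo ("pixel" 0) stays put while the rest of the screen changes.
frame1 = ["CNN", "a", "b", "c"]
frame2 = ["CNN", "x", "b", "z"]
delta = delta_encode(frame1, frame2)
print(delta)  # {1: 'x', 3: 'z'} - the logo never gets re-sent
assert delta_decode(frame1, delta) == frame2
```

The savings come precisely from everything that *doesn't* move: the unchanged pixels cost nothing to store.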
This is wrong in my opinion - both mediums are digital.
Which brings us to the reason I made this post. There are cell phones out there, and there are a number of different methods of "doing" cell phones. Apparently in the last couple of years a new type of phone network has been rolled out, so once again the new communication method is called "digital", and the older method is known as "analog". I know this because my wife had a cell phone, and then she was lured into trading her phone in for a new digital one. I never knew there was a difference (read: there wasn't one before they came up with the new ones). But what I started to notice in subsequent cell phone conversations was a "lag" - a fraction of a second passed between the time she would say something and when I would hear it. Also, anything I said took a little while to get to her. This was annoying but tolerable.
Now I have a cell phone, and of course it's a digital one. The lag is now twice as bad - maybe even worse, since the phones are on different providers. As a result, a cell phone conversation is not entirely unlike a walkie-talkie conversation. I miss half of what my wife says and she misses half of what I say, because we're talking over each other - since nothing is heard instantaneously, whatever I'm talking over isn't actually being spoken at that moment, and vice versa - and the minutes are drained on asking each other to repeat ourselves. Also, when the other person stops talking you have to wait a second or two to see if they really are done - a pause which can easily be misconstrued as an awkward or pissed-off one.
The final irony is that I can't tell what's better - the sound quality is basically the same as the old "analog" phone. For that matter, if this is a digital phone, what was the old phone receiving - a cassette from the sky? This is the final straw for me in the whole "digital = better" debate - a clear "no".
So, to summarize - I hate the fact that there is a misconception that "digital = better", I hate the way "digital" and "analog" are thrown around as buzzwords instead of useful terms, and I hate the fact that because of all this, a cell phone conversation more than a minute or two long is an exercise in pain.
When the 286 came out, demand was so high that Intel farmed out some of its production to a startup chip maker named AMD. After the demand settled, Intel told AMD their services were no longer needed. Imagine Intel's surprise when AMD started coming out with chips whose architecture mimicked the x86 architecture. Obviously, being privy to special Intel documents gave AMD the knowledge needed to make these cloned chips. Interestingly enough, AMD's case held up in court - the judge figured they could have made the cloned chips even without the Intel knowledge (it just might have taken longer).
AMD even named their chips the 386, 486, etc., for which they were sued again. Intel claimed they owned the trademarks on those names, but of course they didn't have a case - you can't very well trademark a number. Intel quickly tried to rename their chips the i386 and i486 ("i" for Intel - this was long before the iMac), but those names never took (and wouldn't have helped them in court anyway). Intel made the argument that the numbers were conjured from thin air and that the sequential-ness of them all was coincidence. They also argued that SX and DX were similarly conjured from thin air, that they didn't stand for anything, and that the DX2 was not a "clock doubled double precision" chip. This went so far that the clock-tripled 486DX chip went under the name 486DX4, not the logical 486DX3, so they could help the name argument (and make a quick buck off those who assumed it was a quadruple-speed chip).
It didn't work - what Intel needed to do was to put their money where their mouth was and conjure up a new name for the 586, so they dubbed it Pentium - a made up word whose root was "penta-" (five). This worked - Pentium took off as a strong brand name. AMD's 586 chip, which they dubbed the K5, looked puny in comparison. Even their quick follow-up, the K6, bombed.
AMD's weakness was that their chips were never as fast as Intel chips, even at the same clock speeds. Compounding this, AMD entries never debuted at a clock speed as high as their Intel counterparts.
Intel, meanwhile, unveiled the Pentium successor, the Pentium II. They had pretty good ties to the strong "Pentium" brand, and since the Latin prefix for six is "sext-" (a "Sextium" wouldn't have worked, they surmised), and since sequential naming got them in trouble in the past, they kept the Pentium name, following up the Pentium II with the Pentium III and Pentium 4. They also came out with another "budget" chip, the Celeron (whose name originally became synonymous with "celery"), which was a Pentium II without cache. This was in response to the fad notion of the Network Computer - that people en masse would give up their desire to own pricey machines and instead all make do with dumb terminals. The cacheless Celeron was woefully slow, so Intel eventually gave it a limited cache, and it became an attractive option for low-priced computing.
Then, just after the introduction of the Pentium III, AMD finally released their Athlon chip (what would have been the K7). At last they had a chip with a higher clock speed than an Intel chip, and one that could benchmark faster as well. In addition to being marginally faster on real-world applications, it was also less expensive. Ever since then, Intel and AMD have been waging a war on clock speed and price.
The other half of hating Microsoft is hating Intel, since the "Wintel" architecture is the market beast. Therefore, those who love to hate Microsoft love to love AMD and their Athlon. Personally, I went with Intel's Pentium III two years ago when I built my system, since the Athlon was untested in the market and the added cost of locating an Athlon motherboard (at the time) negated any price difference the chip provided. And I think that while the Athlon has proved itself (more or less) in the marketplace, I'll probably stick with Intel - but as my next processor purchase is down the road some, I'll keep an eye on both.
One of the things that would happen within the cozy Microsoft/Intel alliance was that Microsoft would think of an operation it would be great for the processor to perform, so they would phone up Intel and suggest it. Intel would agree and place it in their next chip. Now Microsoft is happy, since they have software that uses the new operation (and they're the only ones so far who know how to use it), and Intel is happy, since now the most popular software code in the world runs better on their chips. This is why in the end Intel let AMD do whatever without further litigation - they figured they could do one better next time and AMD would just keep playing catch-up.
Now history is repeating itself in a few ways. The Register is reporting that AMD is naming their next Athlon the Athlon XP and has curiously delayed it to hit at the same time as Windows XP. They're also doing away with megahertz as a method of naming chips: the 1.3GHz model will be named the Athlon XP 1500+, the 1.4GHz model the Athlon XP 1600+, and so on. This gives the consumer the illusion of added speed, and the name change gets them all chummy with Microsoft, who probably doesn't have a problem burning bridges with Intel. Will it work? Who knows - the economy is going to shit as we speak (both before and because of the WTC incident), and so clinging to someone you think is going to come out alive isn't unheard of - but this does put another fun wrinkle on which chip I should go for.
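For what it's worth, from the two models The Register names (1.3GHz becomes 1500+, 1.4GHz becomes 1600+), the model number looks like the clock in MHz plus 200. That's purely my extrapolation from those two data points, not AMD's official formula:

```python
# Extrapolated only from the two models named above (1300MHz -> 1500+,
# 1400MHz -> 1600+); AMD's real rating scheme may well differ.
def athlon_xp_model(mhz):
    return f"Athlon XP {mhz + 200}+"

for mhz in (1300, 1400):
    print(mhz, "MHz ->", athlon_xp_model(mhz))
```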
However, Handspring decided to come out with a Visor called the Edge. Its gimmick was that it was slimmer than a standard Visor - so much so that you couldn't use the Springboard modules - the main gimmick of the Handspring line - without an adapter. It sold disastrously, with most (including Handspring) believing it had to do with the fact that it wasn't color.
Now Handspring is coming out with two new Visors, the Pro and the Neo. The Neo is basically the Platinum repriced and with three different translucent cases. The Pro is the Platinum but with 16MB of RAM, unprecedented on a PalmOS device (PalmOS can't address more than 8MB simultaneously, so either it's a modified PalmOS or there's a switching trick involved). Neat, but they're still not color. Not that I'm in the market for a new PDA anytime soon - I still hold to the notion that they're only for organizing, and my bottom-rung PDA does just fine with that - but whenever I buy a new PDA it's going to have color. I hope Handspring figures that out and these things get affordable soon.