By Tom Kidd
I'm writing this article so that I have better footing explaining that I know what I'm talking about in interviews, so it's not really for anyone's benefit but my own. However, I also wrote it to fill what is, in my estimation, a gap - there aren't really any good explanations of .NET out there that aren't filled with marketing jargon or programming specifics. Though I may have some programming specifics in here, I try to keep it hype-free.
There's this thing called .NET from Microsoft. People know that it's something Microsoft does. Or makes. Or something. Most people, I believe, chalk it up to something they'll never do or mess with, and to some degree they're right. Kinda like when some mega-huge-conglomerate corporation places an ad on TV with slow piano music and families dancing through cornfields at sunset, followed by the corporate logo - not really advertising anything at all. Microsoft's ad campaign is the one featuring "one degree of separation", and it makes sense to someone like me; the commercial is obvious in what it's saying - that .NET will allow everyone to work and compute better, faster, and easier - but it doesn't really tell anyone anything specific. In fact, they actually removed the .NET reference from the commercial at one point, but it's back in there now.
Now this would be fine if it weren't for one thing - Microsoft, for some reason, decided to place the ".NET" moniker on everything. Everything from the little messenger program shipping with Windows XP to a set of online services to the next version of their server operating systems. So people thought .NET was the little messenger program, or that it was the next server operating system. Compounding this problem is that reportedly no one at Microsoft except for the marketing team knew what .NET was, and it's uncertain if even they knew. They went a little nuts with the .NET name. On the one hand this hurt .NET since it created confusion, but it did make everyone know the name, so it's unclear if the mission has been accomplished or not. And finally the craziest part is that those that do "get" what .NET is have a hell of a time explaining it to others.
So let's clear up some things about .NET right off the bat, shall we? First of all, here's what .NET is not:
.NET is not a messenger program. Specifically, the MSN IM program that comes with Windows XP is "Windows Messenger". You sign in using a ".NET Passport". This used to be called "MSN Passport" and I think it was also called "Microsoft Passport", "Hotmail Passport" and simply "Passport" at various points in time. Passport is basically a universal login service which, depending on whom you listen to, is either a complete waste of time or something that will potentially become very important. Basically, instead of creating accounts on every website you go to - and relying on the security of all of them - you log in using your Passport and only have the one username/password. No biggie, right? Well, that's what Passport is - a universal login. Instead of having to sign up everywhere, you sign up in just one place. So what does this have to do with .NET? Nothing - though it almost did. Microsoft had this idea called Hailstorm or My Services, which would extend the Passport idea, adding other things onto the universal login. For example, you would give Microsoft your credit card numbers, and then instead of entering your number at every website where you buy something, you'd just say "use this card" and Passport/Hailstorm takes care of it - so the site never has to see your credit card, nor do any hackers or people within eyeshot. The idea was that this would help spur on e-commerce. The problem with this, of course, is trusting your credit card numbers to someone - especially when that someone has the security track record of Microsoft. Consumers didn't like the idea, sites didn't go for it, so now Hailstorm is "shelved" pending retooling.
.NET is also not a server or an operating system, but it almost is and almost was. The "almost is" I'll explain later, but the "almost was" I'll explain now. Basically, Microsoft, as everyone knows, makes operating systems. At some point years back their operating systems forked off from each other: one line evolved into the Windows 3.1/95/98/ME line, which all basically consisted of DOS with a window manager on top. The other fork was called Windows NT, a 32-bit operating system rebuilt from scratch on newer technology, with compatibility limited to a specified list of software - which gave it much better stability. They released various versions of this operating system, some of which were tailored to be servers, like Windows NT 4.0 Server and Windows 2000 Server (after the name evolved into Windows 2000). There was also a workstation line - Windows NT 4.0 Workstation and then Windows 2000 Professional - whose successors wound up being the two released versions of Windows XP, aimed at the consumer desktop. The code name for the technology used in the XP systems was Whistler, and the Whistler server operating system was to be called Windows .NET Server, once again reinforcing the .NET moniker. They then decided to rename it Windows .NET Server 2003, then just recently they decided on the name Windows Server 2003, dropping the .NET moniker entirely (though ".NET" will still be on the box in sticker form).
So what would Windows .NET Server have had to do with .NET? Well, this will become more apparent in a minute when we explain what .NET is, but Windows .NET Server would have all the latest and greatest improvements in the NT/Server line, plus it would have the .NET Framework as an install option off of the CD. Currently, you have to download the Framework off of Microsoft's site and install it. This is pretty much the way Internet Information Server (IIS), Microsoft's built-in web/FTP server, evolved - it was a free download for NT 3.51/4.0 and nowadays it's an install option on the installation CD of Windows 2000/XP. I'm not sure if the .NET Framework is an opt-in or opt-out option, but I presume it will be an option of some sort, since the practice of not letting people decide what software to have installed on critical machines has gotten MS in some hot water in the past. So basically Windows Server 2003 will be the same either way - whether you choose to install the .NET Framework off the CD or download it off of the Internet. So why was it in the name? Simple - to make people more prone to want to use .NET as a platform. But since it didn't need to be there, and may have created the false impression that the new operating system was useless or unnecessary if you didn't use .NET or didn't plan to, they opted to change the name again. Of course this had its problems too, since a number of people think that because the ".NET" was dropped from the name, Microsoft has "cancelled" .NET. They haven't.
Finally, .NET is not a plot by Microsoft to make you run all your software over the Internet. Recently the buzzword (or buzz-acronym) "Application Service Provider" came into vogue, which is itself confusing since its acronym, ASP, was already taken by the unrelated technology Active Server Pages. The Application Service Provider is this idea of running software over the Internet. For example, your ISP hands you an email account, so you have to use an application to check it, like Outlook or Eudora. Now this is all well and good, but you have to be at your PC or have access to it to check your mail, which you don't always have. Plus any time the application needs to be upgraded you have to do it yourself. Plus any time you do something like a hard drive reformat you have to reinstall it. Now compare that to Hotmail. You can access Hotmail from any PC, you never have to upgrade Hotmail, and - though it can be used from within Outlook Express - Hotmail never requires you to install anything. Hotmail is an example of an Application Service Provider - the application in this case being email. Imagine if you ran your word processor this way. No more installing, upgrading, etc. The downside being that the server with your application on it has to be available. Plus the portability of something like a word processor from one web browser to the next loses its luster when you want to save and transport files - you either have to have removable media handy or save your files on a server somewhere, and if the documents are touchy, that might not be a good idea. Plus right now web technology just can't do what fat clients can. Finally, Microsoft is actually against this idea, since their entire market motivation centers around selling CDs of software - it was even rumored that continuous Hotmail outages were deliberate, to make people not want the Application Service Provider idea to take off.
Right, so if that's what .NET isn't, then what is .NET? Well, before we get started, let's look at one more thing involving semantics. Namely, how do you pronounce ".NET"? Simple, as you may have guessed - "dot net". So how do you write it? That's less clear. Microsoft usually writes it in all caps (as I have): ".NET". However, every time you see a logo for it, the .NET looks lowercase: ".net". This is especially weird when you see things like job postings, since abbreviations like this have traditionally been written in uppercase - I've seen every combination: Microsoft.Net, ASP.net, VB.NET, etc. I think it's safe to write it in all caps, ".NET". Of course, the tacking of it onto other things is a crapshoot. The product Visual Studio .NET is oftentimes written Visual Studio.NET or VisualStudio.NET. Same with the language Visual Basic .NET. I'm still not sure how Windows .NET would have been written.
In some ways the name ".NET" is kinda dumb, or at least not the best thought out. It starts with a period, so every sentence beginning with ".NET" starts with punctuation, and every mention of ".NET" drops a period in the middle of your sentence, which chokes most word processors' grammar checkers, even Word's. Plus it's difficult to search for. Google usually does alright with it, but it's easy to get a bunch of results that ignore the period and instead give you the word "NET", which has tons of other meanings. Plus when searching for job listings (which, as you can no doubt tell by now, I do a lot of), you tend to get a lot of results that have nothing to do with Microsoft programming technologies, and instead merely have ".net" in the URL of the company.
The .net TLD is traditionally used for businesses involved with Internet infrastructure activities, so it's obvious why Microsoft picked it. It conveys a sense of doing things over the Internet and using the Web to compute. Plus when they came up with it, the dot-com era was hot. I suppose it's not impossible that ".NET" was almost ".COM", except for the fact that "Microsoft .COM" would have created confusion with the website, and Microsoft already has a legacy technology called COM, for Component Object Model. In some ways, .NET is an evolution of COM as we'll see next. In hindsight, it's lucky that they went with something other than ".COM" since .com's have such a stigma right now.
Okay, finally we're at the good part - What is .NET? Well, the best starting point is this simple statement: .NET is a platform. Great, so what's a platform? Well that's another abstract concept but fortunately we have a point of reference for it.
Let's say you're like 90% of the computer-using public and you're reading this on a PC with Windows installed. You're reading this in a web browser, which is a program, right? That program runs on a platform. What platform? Well, a few actually, depending on how fine a point you want to put on it. In a broad sense the platform is Windows. The program is written for Windows, so Windows is the platform. Without Windows the program couldn't run. Of course Windows is itself running on something - most likely an Intel or AMD processor. Intel calls the architecture of the chips dating back to the 8086 the "x86 architecture", so this x86 processor (which the AMDs of the world seek to mimic with theirs) coupled with Windows is a platform in and of itself. It's sometimes referred to as the "Wintel" (WINdows, inTEL) platform. You could write something to run on AMD processors only, but few people bother to do that (since it limits your market).
So by the above logic, the Macintosh is a completely different platform - it's got a different operating system and a different set of hardware. That's not a hard concept to grasp. Of course then there's Linux, which is a different operating system but not necessarily a different set of hardware - it can run on the aforementioned x86 processors and (in some distributions) on a host of others. Linux is a different platform in the operating system sense, but not necessarily in the hardware aspect. I'd comment on the idea of compiling a Linux application and being able to run it on different hardware configurations and distributions, but that's out of my league.
Which brings us to Java, which as we'll see is an important analogy for .NET. Java is a programming language. Java programs on the whole, however, aren't compiled down into platform-specific binary files. If you compile a C program with a C compiler you get an executable (EXE) file which will run on whatever platform the compiler targets (usually the same one it's running on), so compile it for Windows and it runs on Windows. It won't run on the Macintosh or in Linux. Java programs, however, are compiled into class files. The class files are then run in a Java Virtual Machine (JVM). This JVM then runs on other platforms, like Windows, Linux or Macintosh. It can also run on things like cell phones and set-top boxes. There's a lot of qualification to this statement, but the mantra of Java is "write once, run anywhere".
So Java is a platform, in addition to being a programming language. The platform in this case is the notion that Java programs run in the JVM.
So why then do we even care about Windows, Linux or Macintosh? If we could run the same program anywhere, why does it matter that we get the latest version of Windows? Well, there's several reasons, but the biggest for a long time was the fact that Java programs, because they're not compiled for a specific platform (other than Java), were slow. This has been alleviated by the march of technological progress (processors run faster now) and things like "Just In Time" compiling (where the Java class files are compiled into native code for the platform they're on, right as they run). There are other reasons as well (Java programs generally don't look or feel like Windows programs, a turnoff for some), and the fact that, by accident or design, Microsoft's actions undermined Java - their JVM was fast and very available, but it added some things and removed others, breaking the portability of Java - its main draw.
Which brings us back to .NET. .NET is a platform. The equivalent of the JVM is something called the CLR, or Common Language Runtime. .NET is not a language in the way that Java is a language - rather, .NET allows for a wide array of different languages. Microsoft makes three .NET-enabled languages - C#, Visual Basic .NET, and J# - the third of which is new and is designed to be like the Java programming language, in an attempt to woo Java developers away. C# ("C Sharp") is essentially a new language built on C/C++-like syntax. Visual Basic .NET is the successor to Visual Basic 6, but it changes a LOT of things in that language - so many that sufficiently complicated VB6 programs can't be easily ported over, and some VB6 programmers refuse to move to the new language. We'll get into the "why" of this later.
Getting back to Java for a second, Java also has (in addition to being a programming language and a platform) a set of class libraries available. What this means is that there's already a set of procedures and functions in place to do common things - all languages have something like this. .NET is no exception - it has a set of class libraries called the Framework Class Libraries. It is a very robust set of libraries with well-fleshed-out functions to do pretty much anything you want to do. For example, in Visual Basic 6 a function to determine if a year was a leap year took some ten lines of code or more - and even then the year 2000 would never pan out correctly. In the FCL, however, there's an IsLeapYear function as part of the DateTime object, so the same thing can be done in one line of code - and the year 2000 even works right.
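For the curious, here's what that looks like in C# - a quick sketch of my own, but DateTime.IsLeapYear is the real FCL function:

```csharp
using System;

class LeapYearDemo
{
    static void Main()
    {
        // The FCL handles all the calendar rules in one call, including
        // the century exception that hand-rolled VB6 code tended to botch.
        Console.WriteLine(DateTime.IsLeapYear(2000)); // True  (divisible by 400)
        Console.WriteLine(DateTime.IsLeapYear(1900)); // False (divisible by 100, not 400)
        Console.WriteLine(DateTime.IsLeapYear(2004)); // True  (divisible by 4)
    }
}
```

The same call works from VB.NET or any other .NET language, since the FCL sits beneath all of them.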
Microsoft makes a program called Visual Studio .NET. Microsoft's always made compilers (their first product was a BASIC compiler), but when they unleashed Windows they realized that the only way to get people to use it would be to make it easier to develop programs for it. Whereas simple programs in character-based DOS could take mere lines of code to accomplish, Windows' graphics-based nature made even simple affairs thousands of lines. Microsoft took many steps to make this sort of thing easier, but one thing they did was to jump on a new bandwagon called Rapid Application Development - automate certain tasks and, if possible, make parts of programming graphically based. They came out with a tool, Visual Basic, to do this. Instead of writing tons of code to come up with a button, you just dragged and dropped one onto a form. They came up with a rudimentary scripting language based loosely on BASIC (a language which never really had anything like an ISO standard behind it) and made that the programming behind it. At some point they extended this logic to all their development programs, even their C/C++ compiler.
Even before all of that, it used to be that you had to write your source code in a text file, save it, then compile it with a command-line compiler. You can still do this if you like (more on that later), but at some point Borland decided to come out with a new version of their Pascal compiler, calling it Turbo Pascal. In it, the editing and compiling functions were all handled in one program - they called this an Integrated Development Environment, or IDE. This also allowed them to do things like integrate debugging, radically changing how people developed programs. Flash back to the mid-90's - Microsoft now makes Visual Basic, Visual C/C++, Visual FoxPro, you name it, it's Visual. However, it seemed like a huge waste to have separate IDE's, so they combined them into one they called Visual Studio. With the 97/5.0 versions of their Visual line, developers balked, so their 6.0 line got away from this idea (they still called it "Visual Studio" but the IDE's were all separate) - but with Visual Studio .NET they've once again put them back together.
Microsoft also came out with a product a few years back called Visual InterDev. Since Microsoft was instrumental in creating ASP (Active Server Pages in this case) it made sense that they would then try and extend the visual programming model for web pages as well. The first release of Visual InterDev (labelled 6.0 to match the rest of the Visual Studio line at the time) was a moderate success. Those that used its advanced layout tools discovered that their creations wouldn't work in non-IE browsers. In addition, it was designed to allow things like debugging and project control, but very few were able to get it to work.
So that brings us back to Visual Studio .NET. Like I mentioned, VS.NET has the programming language IDE's integrated again, and it also integrates and extends the model of visual programming started in Visual InterDev, eschewing ASP for ASP.NET. The creation of executable files to run in the CLR is referred to as creating "WinForms" or "Windows Forms", whereas the creation of ASP.NET projects is "Web Forms" (or "WebForms"). To the confusion of even developers, Visual Studio .NET also creates non-.NET code - it has the next iteration of the C/C++ compiler technology in it as well.
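To give a feel for what a WinForms program looks like under the hood, here's a minimal sketch - the form and its caption are my own invention, but Form, Button and Application are the real FCL classes the designer generates code against:

```csharp
using System;
using System.Windows.Forms;

class HelloForm : Form
{
    public HelloForm()
    {
        // The VS.NET designer generates code much like this when you
        // drag and drop a button onto a form.
        Text = "Hello, WinForms";
        Button button = new Button();
        button.Text = "Click me";
        button.Click += new EventHandler(OnClick);
        Controls.Add(button);
    }

    void OnClick(object sender, EventArgs e)
    {
        MessageBox.Show("Hello from the CLR!");
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new HelloForm()); // starts the Windows message loop
    }
}
```

Compare that with the thousands of lines raw Win32 programming used to take for the same window-plus-button.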
Following all this so far? Good, since there's more. OK, so we have Visual Studio .NET which uses C# and Visual Basic .NET which can borrow from the Framework Class Libraries in order to create .NET programs that can be run in the CLR, which runs in Windows. Compare this to the "old days" when we wrote the program, compiled it, and ran it on Windows. Now we have extra steps and different languages to do it in. So what have we gained? What is it about .NET that's supposedly better?
Well, here's a shocker - in some ways .NET isn't better. The aforementioned CLR has to be called into memory every time a .NET program is run. This takes time and resources. It also has to be installed on any target machines, a consideration to be sure. Plus, although progress is being made in this area on a few fronts, at this point in time .NET development more or less ties you down to Windows as a platform and Microsoft as a vendor. Plus like every new technology there's always a risk from an IT perspective that it won't "take". Microsoft's not sinless in this regard - they've had some programming paradigms (WinG comes to mind) that they've ditched.
So then how is .NET better? Well, the #1 thing that makes .NET "better" is language independence. I can write code in C#, you can write code in VB.NET, and our code can talk to each other. Microsoft came up with this thing called MSIL - Microsoft Intermediate Language. All .NET languages compile down to MSIL, so code from different languages can be glued together; the CLR then compiles the MSIL down to native code as the program runs. Consequently this opens up the possibility of third-party vendors making more .NET languages. To this end, some 30+ languages have been ported to .NET, including old favorites like COBOL and Fortran, along with recent entries like Python and Perl. Of course, in the quest for MSIL compatibility, sometimes the languages in their .NET variations aren't "complete", but this isn't really the point.
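To make that concrete, here's a sketch - the Greeter class is made up for illustration, but the principle is real: once it's compiled to MSIL, a VB.NET (or COBOL, or Python) program can call it exactly like the C# code below does:

```csharp
using System;

// Compiled with "csc /target:library Greeter.cs", this produces a .dll
// full of MSIL that any other .NET language can reference and call.
public class Greeter
{
    public string Greet(string name)
    {
        return "Hello, " + name;
    }
}

class Demo
{
    static void Main()
    {
        // A VB.NET caller would write: New Greeter().Greet("world") -
        // different syntax, same MSIL, same CLR underneath.
        Console.WriteLine(new Greeter().Greet("world")); // Hello, world
    }
}
```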
So what's the point of multiple languages? Most developers are acclimated to the idea of having to learn a lot of languages, so the gut reaction for some is to scorn those who let a language preference hold them back. However, from an IT staffing viewpoint, language preference is a nightmare. Let's say C/C++ is the best solution to Problem X. But C/C++ is a rare skill to have, and the programmers you do find want serious money. But VB programmers are easy to find and want more reasonable salaries. So what do you do? You hire VB programmers and hope for the best. But let's say you had a project using .NET. You could hire C# programmers, COBOL programmers, Python programmers, etc. Plus to some degree there's a code investment to protect - with slight conversion, the decades of Fortran code written for the scientific and meteorological professions can continue to be used.
Right, so what else is good about .NET? Well, the existence of the CLR does some nice things. Windows is notorious for the "blue screen of death" - one rogue program does something horribly wrong and takes down the entire system. Granted, sometimes it's the operating system that screws up (which is less of a problem with the 2000/XP lines) but oftentimes it's a program error that does this. The operating system is increasingly "catching" these errors, but since a program has direct access to the OS, it's easy to take down a system. Well, with .NET, programs don't run in the OS directly, they run in the CLR, ergo the CLR catches the errors that would ordinarily have caused chaos or crashes. This is what I was referring to earlier with the notion that .NET "almost is" an operating system.
For that matter, .NET code is considered "managed". Traditionally, programming languages have left it up to the programmer to de-allocate the memory they allocate for their objects and such. However, it's too easy for a lazy programmer to not de-allocate their objects, or to miss some. Consequently the memory is never de-allocated, which can cause all kinds of problems. I'm not sure which language/platform did it first (Java keeps being mentioned) but somewhere along the way the concept of "garbage collection" came around. As your .NET program runs, the CLR performs garbage collection - it goes through and de-allocates memory that nothing refers to anymore. So if in your program you allocate a ton of memory for something, don't de-allocate it when you're done with it, and your application needs more memory, the CLR de-allocates the unused stuff for you automatically. It's still recommended to release things yourself as good coding practice, but the safety net is there nonetheless. It's more complicated than that, but that's the gist of it - since the CLR manages all the programs running in it, .NET programs are more stable.
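Here's a contrived C# sketch of the idea - the 10 MB buffer is just for show, but the GC class is the real FCL interface to the garbage collector:

```csharp
using System;

class GcDemo
{
    static void Allocate()
    {
        // Allocate ~10 MB and simply drop the reference. In C this would
        // be a leak; in .NET the CLR's garbage collector reclaims the
        // memory once nothing refers to it anymore.
        byte[] buffer = new byte[10 * 1024 * 1024];
        buffer[0] = 1;
    }

    public static bool MemoryWasReclaimed()
    {
        long before = GC.GetTotalMemory(true);
        Allocate();
        GC.Collect();                    // ask the CLR to collect right now
        GC.WaitForPendingFinalizers();
        long after = GC.GetTotalMemory(true);
        return (after - before) < 10 * 1024 * 1024;
    }

    static void Main()
    {
        Console.WriteLine(MemoryWasReclaimed()); // True
    }
}
```

Normally you never call GC.Collect() yourself - the CLR decides when to collect - but it makes the effect visible here.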
And then there's all the other things .NET does or improves upon, or in some cases replaces outright but with a name implying an upgrade. There's WinForms or Windows Forms, which is the standard effort to create GUI programs for Windows. Then there's ASP.NET, which is .NET programs running over the web with web pages as the interface - basically an upgrade to ASP, but using .NET-enabled languages instead of scripting. There's ADO.NET, which is an upgrade to ADO, or ActiveX Data Objects, a way of interfacing with databases. ADO.NET's shipping objects interface mostly with Microsoft data technologies like Access and SQL Server.
.NET claims to alleviate something known as "DLL Hell". Microsoft created this thing known as the Dynamic Link Library, or Dynamically Linked Library. Essentially it was code compiled into a file that could then be accessed by other programs. This achieved two purposes - one, it didn't require people to distribute their source code (these .dll files were object code), and two, it meant that multiple programs could access the same .dll file, reducing redundancy and saving disk space. Microsoft, for example, compiled common Windows functionality (drop-down menus, dialog boxes, etc.) into something they called the Microsoft Foundation Classes, or MFC, so all one had to do was place something like mfc42.dll (the number was the version number) in the Windows System directory, and then they could use everything in it. However, a problem arose when multiple versions of a file were needed. Say that Program A needs Function X. However, Program B overwrites the .dll file with its own version, whose Function X works differently (or is broken). So now when Program A uses Function X, it crashes. This is DLL Hell. .NET claims to alleviate this - every .dll has an explicit version and, with clever directory usage, they can all coexist alongside each other. In some ways this defeats the point of the .dll file, since now multiple versions of the "same" .dll file are on your system, taking up the space they were designed to save - but it's still not as bad as every program carrying redundant code, and with today's hard drive sizes, space isn't really a problem.
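In .NET, the version number lives right inside the compiled file as metadata, readable at runtime. A quick sketch (the version number here is made up):

```csharp
using System;
using System.Reflection;

// Every .NET assembly carries an explicit version as metadata. Two
// versions of the "same" library can then sit side by side (e.g. in the
// Global Assembly Cache), and each program binds to the exact version it
// was built against - no more silently overwriting mfc42.dll with a
// broken replacement.
[assembly: AssemblyVersion("1.2.3.4")]

class VersionDemo
{
    static void Main()
    {
        Version v = Assembly.GetExecutingAssembly().GetName().Version;
        Console.WriteLine(v); // 1.2.3.4
    }
}
```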
Finally, the aspect or capability of .NET which is, depending on whom you listen to, either the biggest facet of .NET, giving it a make-or-break position in the server end of the market, or just one tiny negligible aspect of .NET, is Web Services. "Web Services" is another one of those terms that has different meanings based on who you listen to or how fine a point you put on it (i.e., some would say the ability to buy books on Amazon.com is a "web service"). In this context it means being able to execute code and get returned values on another machine using the Internet as an intermediary. This sounds like nothing big, until you start thinking of the implications of it.
Pretend you have two systems, both running the same hardware, software, etc. You have a database on one, and a server program on the other. When someone hits the server program with something like a web browser or a terminal client, they want data - specifically, they want data from the second system. Well, as long as the two systems are networked together and able to talk to each other, the exchange is easy.
Now pretend that they're not running the same thing. Let's say one of them is a Macintosh and the other is a PC. Surely they can talk to each other. Or can they? They can if one or both of them has software to talk to the other. This involves things like protocols and companies agreeing with each other (i.e., Microsoft may go out of its way to make their software not talk to the Macintosh since they want you to run more copies of Windows).
Now pretend they're not connected to each other at all. Or, more reasonably, there are more than two systems involved - like hundreds. It's crazy to come up with connections of all systems to all other systems. Better yet, let's say that the systems are not in the same room. Or building. Or state. Or maybe you don't even own the second system. How do they communicate now?
But let's say they're connected to the Internet. This is reasonable, pretty much every operating system has Internet capabilities. So now we need a way to have these systems be able to do their business over the Internet with each other, even if they run different operating systems, hardware, or encoding schemes. This is where Web Services come in handy.
Web Services is an agreed-upon way of sending and receiving data based upon open standards and agreed-upon protocols. Most of it revolves around something called XML - eXtensible Markup Language. If you know anything about HTML, you know it revolves around a specified set of tags to do things like make words bold or make text into links. However, you can't make your own HTML tags, partly because the language doesn't work that way and partly because you can't expect to change every web browser to handle your new tag. XML, however, allows you to make up the tags, and then you can author the handling application to be able to handle the data contained within the tags.
So, continuing our example, the server system could come up with a stream of XML-formatted data - essentially a request. It would make a call over the Internet to the second system, which would then do its processing to come up with the response (basically querying the database), format the response into more XML, then send that back to the first system, which has been programmed to be able to handle the returned data in XML format.
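Here's a contrived C# sketch of that exchange - the lookup/result tag names are invented for illustration (real web services wrap payloads like these in the standardized SOAP envelope format), and the "Internet" is simulated by a plain method call:

```csharp
using System;
using System.Xml;

class XmlExchangeDemo
{
    // System #2: parse the incoming XML request, pretend to query the
    // database, and format the reply as more XML.
    public static string HandleRequest(string requestXml)
    {
        XmlDocument request = new XmlDocument();
        request.LoadXml(requestXml);
        string zip = request.SelectSingleNode("/lookup/zipcode").InnerText;

        string city = (zip == "90210") ? "Beverly Hills" : "Unknown";
        return "<result><city>" + city + "</city></result>";
    }

    static void Main()
    {
        // System #1 sends its request and parses the XML that comes back.
        string responseXml = HandleRequest("<lookup><zipcode>90210</zipcode></lookup>");
        XmlDocument response = new XmlDocument();
        response.LoadXml(responseXml);
        Console.WriteLine(response.SelectSingleNode("/result/city").InnerText); // Beverly Hills
    }
}
```

The key point: both sides only have to agree on the XML, not on each other's operating system, hardware, or language.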
The possibilities for this include everything from companies being able to make previously impossible-to-network systems talk to each other, to vendors making certain kinds of data available to the public (the USPS, for example, has a web service to verify zip codes - but at a max of 10 lookups per day, it's not something for production).
The funny thing about the Web Services market is that it's a market that simply can't exist unless there is someone else to talk to. It's no secret that Microsoft's strategy in many sectors is to simply run the competition out of business, that way they can pretty much do what they want the way they want. If that worked everywhere, there would be no need for .NET's language interoperability - Microsoft already had a somewhat awkward way of doing this in the past: the aforementioned Component Object Model, or COM, where code compiled in one language could interoperate with code in another language. If Microsoft could make every server run Windows then they could just extend the COM notion forever. However, while Microsoft has a more or less 90% hold on the desktop market, they're considerably less of a presence in the server market, where security bugs are more than just a nuisance - they're a legitimate problem. For that matter, it's unfeasible to continuously rewrite your code in the "flavor of the month" language, which is why 75% of the world's business code (billions of lines of it) is still maintained in COBOL on mainframes. Joe Public's perception was that COBOL experienced a brief renaissance when the Y2K bug hit - the truth was they just needed more programmers for a short while.
So Microsoft realizes that they're up against an impossible task if they try and convince everyone to replace everything they've ever done with Windows technology. They'll not only fail at it, but by refusing to work with anyone else, they stand to lose a lot with an "all or nothing" stance in the market. So that's why web services are so important. Don't replace all your systems with Windows servers - just use them on the new systems and have them interface with your older systems through web services. Other than the extra programming to get the systems to send and receive the right things, there's no need to install anything hairy or reconfigure everything else. Consequently, Microsoft is actually working with IBM and others on standards.
Speaking of standards, as I mentioned, languages like C and C++ have standardized implementations. BASIC doesn't, and never did. Sun has claimed for years (six of them) that Java is going to have a standard at some point, but they've never done it. Microsoft, however, has submitted and received approval for standardization of C# from ECMA (twice) and is set to receive ISO standardization soon, making C#, for all intents and purposes, an open standard. Of course, Microsoft has been criticized for this, too. For starters, C took ten years to standardize via a committee, and C++ took eight, but C# was decided on by Microsoft and submitted - at best a few years of work by one company. Plus, apparently, standardization doesn't necessarily preclude Microsoft from being able to collect royalties.
And on the open front, Microsoft apparently wants .NET to be bigger than the scope of Windows. To this end, they contracted Corel to port the CLR and C# compiler to FreeBSD - and they got it working on Mac OS X 10.2 and (unofficially) Linux as well. They released the source code to this as well - unprecedented for Microsoft. The "shared source" license forbids commercial use, but it's still a big change from Microsoft. For that matter, there are at least two other projects - the Mono project, and Portable.NET initiative, based on the idea of implementing .NET on other platforms (Linux in particular).
And the giving doesn't stop there. Microsoft has the .NET Framework SDK, complete with C# and VB.NET command-line compilers and the FCL's available for free on their website. Assuming you already own Windows, you can develop a .NET application for free.
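And I do mean develop for free - here's the classic first program (the file name and message are mine), needing nothing beyond Notepad and the SDK's command-line compiler:

```csharp
using System;

// Save as Hello.cs and compile with the SDK's free command-line compiler:
//   csc Hello.cs
// then run the resulting Hello.exe - no Visual Studio required.
class Hello
{
    public static string Message()
    {
        return "Hello from the .NET Framework SDK";
    }

    static void Main()
    {
        Console.WriteLine(Message());
    }
}
```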
So if Microsoft is giving all of these things away for free, then how does it expect to make money off of .NET? Well, for starters, while they gave away the source to the CLR and C# compiler, they didn't give away the source to the FCL, so most applications can't be ported away from Windows easily. Plus, while the command-line compilers are free, it's difficult to design graphical programs with them, so Visual Studio .NET is still preferred. So this leaves Microsoft having not only designed and implemented the next-generation technology, but conveniently selling the best operating system to run it on, as well as the best tool to design applications for it. To say nothing of the hundreds of books on all things .NET that Microsoft has lined the shelves of Barnes & Noble with.
So that, my friends, is .NET. Now you know what .NET is (assuming you followed any of this), and you know why it's so difficult to describe in one phrase or two - and why Microsoft hasn't done too well in this regard. You should also know now why, unless you're an IT planner or a programmer, .NET isn't going to mean anything to you at all. And if you're very technically inclined and you have some sort of supreme issue with something I wrote here (i.e., you know of something on which I am dead wrong here) you can reach me here.