Introducing Microsoft .NET

  Introduction

    
What is .NET? The term is, essentially, a new marketing label which Microsoft is sticking on
  existing and future products. The .NET label now features on server products such as
  BizTalk Server 2000 and Application Center 2000, which are based on Windows DNA 2000
  technology. The most interesting feature of .NET, however, lies in the development
  platform, languages and protocols which it emphasizes.

    By bringing us .NET, Microsoft is presenting us with a new platform designed to facilitate
  development of interoperable Web applications, based on a totally new architecture. For
  Microsoft, .NET will be a way of "programming the Web," no less. Today, the first versions of
  Visual Studio .NET are available, and they enable us to sketch out a relatively accurate
  profile of how the .NET platform is likely to look in the long run.


  Aims and objectives

   
The goal that Microsoft has set itself is ambitious, to say the least, both in technical and
  strategic terms. The new .NET platform has not evolved from the DNA 2000 technology
  currently available; rather, it is a totally new technology which is likely to shake up more
  than a few deep-rooted ideas.

    .NET is an entirely new platform and technology which introduces a host of new products,
  whose compatibility with existing technology is not always guaranteed. It offers support for
  27 programming languages, which share a hierarchy of classes providing basic services. .NET
  applications no longer run in native machine code, having abandoned Intel x86 code in favor
  of an intermediate language called MSIL, which runs in a sort of virtual machine called the
  Common Language Runtime (CLR).


    
In addition, .NET makes intensive use of XML, and places a lot of emphasis on the SOAP
  protocol. Thanks to SOAP, Microsoft is hoping to bring us into a new era of programming
  which, rather than relying on the assembly of components or objects, is based on the reuse
  of services. SOAP and Web Services are the cornerstones of the .NET platform.
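As a rough illustration (the service name, namespace and parameters here are hypothetical), a SOAP request to such a Web Service is nothing more than a small XML document, typically carried over HTTP:

```xml
<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <!-- invoke a hypothetical GetQuote operation on a stock-quote service -->
    <GetQuote xmlns="http://example.com/stockservice">
      <Symbol>MSFT</Symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>
```

The reply comes back as a similar XML envelope, which is what makes the mechanism usable from any platform able to parse XML and speak HTTP.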


   
However, there is no need to start worrying yet about the future of DNA applications
  currently in production: as Microsoft themselves have admitted, the final version of .NET will
  not be available until early 2002, and .NET can run existing applications in native mode,
  albeit without giving them all the benefits of .NET.

   
Contrary to what Microsoft would have us believe (apparently in an aim to reassure current
  customers), changes run very deep and affect almost every component in the Microsoft DNA
  architecture:

 The IIS Web server has dropped its effective but fragile multi-threaded model in favor of a multi-process model reminiscent of Apache's.
 ASP technology gives way to ASP.NET (initially called ASP+), where interpreted scripts are replaced by code compiled on first invocation, as with JSPs.
 Win32 APIs and C++ libraries such as ATL and MFC are replaced by a coherent set of Base Framework classes.
 VB.NET no longer ensures backward compatibility with VB6, as the language gains many additions (inheritance, among others) in order to comply with the Common Language Specification (CLS) agreement.
 COM+ 2.0 is a totally new distributed component model which retains nothing from the COM/DCOM/COM+ line. In particular, COM+ 2.0 no longer uses the Windows Registry to register local or remote components: deploying components in .NET takes you back to the good old days when installing a program meant copying files into a directory, and uninstalling involved nothing more complicated than deleting them.
 A new programming language called C# ("C sharp") is born: a modern object-oriented language, something of a cross between C++ and Java.
 The new programming model, based on SOAP and Web Services, fundamentally changes the way applications are designed, and opens the way for a new profession: the online provision of Web services.
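The registry-free deployment described above can be sketched in two commands (directory names hypothetical):

```bat
rem "Install": copy the application directory, private assemblies included
xcopy /E /I MyApp C:\Apps\MyApp

rem "Uninstall": delete the directory; there are no Registry entries to clean up
rmdir /S /Q C:\Apps\MyApp
```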
   With .NET, Microsoft is sending us a vision of an Internet made up of an infinite number of
  interoperable Web applications which together form a global service exchange network.
  These Web Services are based on the Simple Object Access Protocol (SOAP) and XML.
  Today, a number of vendors, including IBM, are greatly involved in SOAP.


   
Not only are these Web Services likely to develop on the Internet, but they may also
  change the way we plan enterprise information systems, by making SOAP the systematic
  choice for application integration middleware, playing the role of a simple but efficient,
  standard EAI. An enterprise information system could then itself become a network of front-
  and back-office applications which interoperate through SOAP and mutually consume the
  Web Services they implement.

    
In the meantime, however, other vendors are not sitting back: IBM and, more recently,
  Oracle have announced offerings which enable the creation of Web Services. IBM, which has
  long been a supporter of SOAP, offers its "Web Services Development Environment" on its
  Alphaworks site, while Oracle has also just adopted SOAP, within 9i. Oracle has dubbed its
  offering "Dynamic Services", but it does not seem to be clearly defined as yet.

  Unpacking the .NET architecture

    
We will be looking in most detail at the architecture components and tools used to design
  and create enterprise Web applications.

  With this in mind, we can describe the .NET architecture as follows:
 It is a set of common services which can be used from a number of object languages.
 These services are executed in the form of intermediate code that is independent of the underlying architecture.
 They operate in a runtime (Common Language Runtime) which manages resources and monitors application execution.
    The primary goal of .NET is to provide developers with the means to create interoperable
  applications using "Web Services" from any sort of terminal, be it a PC, PDA, mobile phone,
  and so forth.
  .NET is multi-language

   
With the .NET platform, Microsoft will provide several languages and the associated
  compilers, such as C++, JScript, VB.NET (alias VB 7) and C#, a new language which emerged
  with .NET.

   Third party vendors working in partnership with Microsoft are currently developing compilers
  for a broad range of other languages, including Cobol, Eiffel, CAML, Lisp, Python and
  Smalltalk. Rational, vendor of the famous UML tool Rose, is also understood to be finalizing a
  Java compiler for .NET.

  Applications are hardware-independent

    
All these languages are compiled into an intermediate binary code which is independent of
  hardware and operating systems. This language is MSIL: Microsoft Intermediate Language.
  MSIL is then executed in the Common Language Runtime (CLR), which fulfills essentially the
  same role as the JVM in the Java platform: at run time, MSIL is translated into machine code
  by a Just-in-Time (JIT) compiler.
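As a sketch of what this means in practice (the exact IL varies with the compiler version), a trivial C# method compiles not to x86 instructions but to stack-based MSIL, which the ildasm.exe tool supplied with the SDK can display:

```csharp
public class Calculator
{
    // C# source: an ordinary static method
    public static int Add(int a, int b)
    {
        return a + b;
    }
}

// Approximate MSIL emitted for Add, as shown by ildasm.exe:
//
//   .method public static int32 Add(int32 a, int32 b) cil managed
//   {
//     ldarg.0    // push the first argument onto the evaluation stack
//     ldarg.1    // push the second argument
//     add        // pop both values, push their sum
//     ret        // return the value on top of the stack
//   }
```

It is this MSIL, not the C# source, that the JIT compiler turns into native code the first time Add is called.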

  Applications are portable

   
Applications compiled as intermediate code are presented as Portable Executables (PEs).
  Microsoft will thereby be able to offer full or partial implementations of the .NET platform
  over a vast range of hardware and software architectures: Intel PCs running Windows 9x,
  Windows NT4, Windows 2000 or future 64-bit versions of Windows, PDAs running PocketPC
  (built on Windows CE), and no doubt other operating systems too.
  All languages must comply with a common agreement

    Computer languages are numerous. Traditionally, new languages have been created to
  respond to new needs, such as resolving scientific problems, making calculations for
  research, or meeting strong needs in terms of application reliability and security. The result
  is that existing languages are heterogeneous: some are procedural, others object-oriented,
  some authorize use of optional parameters or a variable number of parameters, some
  authorize operator overload, others do not, and so it goes on.

    For a language to be eligible for the range of languages supported by the .NET platform, it
  must provide a set of possibilities and constructions listed in an agreement called the
  Common Language Specification, or CLS. To add a language to .NET, all that is required in
  theory is for it to meet the requirements of the CLS, and for someone to develop a compiler
  from this language into MSIL.

     This seems fairly innocuous at first glance, but the restrictions imposed by
  CLS-compliance on the different .NET languages mean that, for example, Visual Basic .NET
  ends up becoming a new language which retains little more than the syntax of Visual Basic 6.

     The fact that all the .NET languages are compiled in the form of an intermediate code also
  means that a class written in a language may be derived in another language, and it is
  possible to instantiate in one language an object of a class written in another language.
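A minimal sketch of this cross-language inheritance (class names hypothetical): a base class written in VB.NET, once compiled to MSIL, can be derived from in C# exactly as if it had been written in C#:

```csharp
// Base class written in VB.NET and compiled into its own assembly:
//
//   Public Class Account
//       Public Overridable Function Balance() As Decimal
//           Return 0
//       End Function
//   End Class

// Derived class written in C#: both languages meet in MSIL and share
// the common type system, so inheritance crosses the language boundary.
public class SavingsAccount : Account
{
    public override decimal Balance()
    {
        return base.Balance() + 100;
    }
}
```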

    Today, if you want to create a COM+ object, you generally have the choice between VB6
  and Visual C++. But VB6 does not give access to all possibilities, and for certain
  requirements, you are restricted to VC++. With .NET, all languages will offer the same
  possibilities and generally offer the same performance levels, which means you can choose
  between VB.NET and C# depending on your programming habits and preferences, and are no
  longer restricted by implementation constraints.

    To put it frankly, in order to be able to provide the same services from languages as
  different as Cobol and C#, you have to make sure these languages have a common denominator
  which complies with the demands of .NET. This means that the .NET version of Cobol has
  had to receive so many new concepts and additions that it has practically nothing left in
  common with the original Cobol. This applies just as much to the other languages offered in
  .NET, such as C++, VB, Perl or Smalltalk.

     So what we need to understand is that when Microsoft announces the availability of 27
  languages, we should interpret that as meaning there are 27 different syntaxes.

     The most symptomatic example concerns Java. It is one of the intended .NET languages,
  thanks to Rational, who are currently working on a Java to MSIL compiler. But what kind of
  Java are we talking about? It is a Java which runs as MSIL code, not byte-code. This Java
  does not benefit from the traditional APIs offered by the J2EE platform, such as JMS, RMI,
  JDBC, JSP. This is a Java in which EJBs are replaced by .NET's distributed object model. The
  label says Java, the syntax says Java… but Java it ain't!

    Microsoft is cleverly entertaining certain rumors, such as the recurring whispers predicting
  the eventual availability of .NET on Unix systems, even Linux. Linux is increasingly popular
  among developers, and is becoming a potential alternative to Windows NT as far as server
  architectures are concerned. By keeping details hazy around the issue of Linux support for
  .NET, Microsoft can win over fans of the free operating system.


  Close-up on the CLR

    As has been mentioned already, the CLR is, like the Java virtual machine, a runtime
  environment that takes charge of resource management tasks (memory allocation and
  garbage collection) and ensures the necessary abstraction between the application and the
  underlying operating system.
     In order to provide a stable platform, with the aim of reaching the level of reliability
  required by the transactional applications of e-business, the CLR also fulfills related tasks
  such as monitoring program execution. In DotNet-speak, one speaks of "managed" code for
  programs running under CLR supervision, and "unmanaged" code for applications or
  components which run in native mode, outside the CLR.

     The CLR watches for the traditional programming errors that for years have been at the
  root of the majority of software faults: out-of-bounds array accesses, access to unallocated
  memory zones, and memory overwritten because a buffer's size was exceeded.
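A minimal sketch of the difference this makes: in managed C# code, an out-of-bounds access which would silently corrupt memory in native code is detected by the CLR and surfaces as a catchable exception:

```csharp
using System;

public class BoundsDemo
{
    public static void Main()
    {
        int[] values = new int[3];
        try
        {
            values[5] = 42;   // out of bounds: the CLR checks every array access
        }
        catch (IndexOutOfRangeException)
        {
            // the fault is reported cleanly instead of overwriting memory
            Console.WriteLine("Out-of-bounds access caught by the CLR.");
        }
    }
}
```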

      This monitoring of the execution of managed code comes at a price, however. Although
  the performance of the current beta versions makes it impossible to quantify the overhead
  incurred by this monitoring, Microsoft admits we can expect performance to slip by at least
  10%. Of course, we might ask whether a 10% reduction in performance is such a bad thing if
  it leads to new levels of reliability and availability...

 






Copyright © 2001. All rights reserved.