Special reports - Sweet dreams - June 1999
Is true interoperability merely a pipe dream or is it a real possibility? Ian Murphy investigates.

Interoperability is one of the holy grails of the computer world yet it can sometimes appear as a poisoned chalice to vendors. For many years, vendors made their money by locking users into their systems and making it prohibitively expensive to change either the hardware or the software (which was all too often dependent upon the hardware).

What interoperability used to be


Within the general office environment, interoperability didn’t really refer to the PCs that were starting to appear, because many corporate customers avoided clone compatibility problems by purchasing their PCs from the same vendor that supplied their larger computers. Ironically, this allowed many of the established computer manufacturers to keep their prices high. Interoperability was defined more by the ability to connect a PC to the corporate mainframe or mini-computer and extract data for use locally.

Those early systems often used a software terminal emulation package provided by the mainframe/mini-computer vendor, enabling the PC to appear to the other system as a simple terminal. The users would log on, run a batch file prepared by the computer department and then download the resulting file locally to reprocess. Later evolutions of this mechanism were enhanced by the introduction of query languages on the corporate systems, such as Querymaster from ICL. Over the years, we have used many names to describe this process, and today a large proportion of those using client/server systems are doing little more than this.
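In modern terms the extract-and-reprocess pattern is easy to sketch. The minimal shell script below simulates it: a here-document stands in for the batch file downloaded from the corporate system, and a local tool totals a column, just as a PC-side package would. The file layout, field names and figures are illustrative assumptions, not taken from any real system.

```shell
#!/bin/sh
# Simulated "download": a here-document stands in for the extract file
# that the batch job on the mainframe would have produced.
cat > /tmp/extract.dat <<'EOF'
ORDER001,WIDGETS,120
ORDER002,SPROCKETS,45
ORDER003,WIDGETS,80
EOF

# Local reprocessing: total the quantity column on the PC side.
awk -F, '{ total += $3 } END { print total }' /tmp/extract.dat
```

The remote half of the pattern (logging on and running the batch job) is simply replaced here by writing the file locally; everything after the download step is unchanged.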

Using terminal emulation software was never a simple matter, and the introduction of Unix into the general corporate market made it even more difficult, with developers trying to take advantage of the capabilities of the local computer. Yet Unix did provide us with the ability to access very expensive printing devices as if they were attached to the local computer.

The LAN and the NOS


As the personal computer market evolved, more and more information began to be stored on individual computers and this information had a habit of getting lost or of being unavailable to those who wanted it. The solution was the introduction of the Local Area Network and the idea of interoperability really took off. The early systems, just like their larger counterparts, tended to lock you into software from one specific vendor. Yet as Novell and IBM, along with a number of smaller players, showed that this market could work, they found that there was increasing demand from users for real interoperability between the emerging Network Operating Systems (NOS).

We are now 14 years on from these early NOS, and Microsoft is now the dominant player in the LAN market with Windows NT. Over the years, Microsoft has faced a considerable amount of criticism of its interoperability, or rather the lack of it, with competing NOS and other enterprise operating systems such as Unix. Much of that criticism has been well founded, yet behind the scenes there is a recognition that more needs to be done, and each new release and service pack adds more capabilities and improves those that are already there. Recently, as Microsoft has begun to focus on the enterprise computing market, it has substantially improved the software available, and other vendors have begun to actively service this market.

The NT/Novell linkage has been the most important to many corporate environments, and it has been a long and tortuous route. Early on, Microsoft took the ground away from Novell by being the first to release a 32-bit version of IPX/SPX, the network protocol on which Novell NetWare was originally intended to run. Today, however, most sites work with TCP/IP as their primary protocol, and there can still be problems between NT and Novell, proving that the goal of a vendor-independent protocol suite remains hostage to the implementations of the vendors concerned.

Access all areas


Today, users can connect to either operating system and access resources such as file and printer sharing on the other. They can access applications such as databases on either operating system, and the underlying software is stable and generally well written. In addition, Microsoft and Novell have introduced their own solutions to enable this to take place. For some time, this was only true of Novell’s NetWare 3.x operating system, but good solutions are now available for NetWare 4.x as well. Cross administration is also possible between these two vendors, and that is a critical issue, particularly for large corporate sites and those spread across multiple locations. The last thing any administrator needs is multiple logins to manage their environments. Yet this is still not quite complete, and the sticking point here is intellectual property rights and openness.

Whilst cross administration itself is possible, migration of users is not quite so simple, because the account databases of the different operating systems work very differently. As this goes to the heart of each NOS, it is not something that is likely to be solved anytime soon, if at all. From the user perspective, there has been a positive benefit: simplified access to the corporate networks, data and resources using a single logon, single password mechanism. This means that whenever users change their password it is propagated across the different NOS, so they do not need to worry about where they have logged on.

Microsoft’s attention, not unreasonably, has been heavily focused on providing interoperability with Novell, because NetWare is a major competitor. Just recently, Microsoft has made it clear that it sees NT moving into the enterprise computer market and is now firmly targeting Unix. There have always been some Unix-like tools within the Windows NT Resource Kit, but they were little more than minimal attempts to show willing.

Unix Services for Windows NT


Having accepted the need to get serious, Microsoft has recently released version 1 of Unix Services for Windows NT. This includes some rewritten code, some licensed code and commands, and a first attempt at providing an enterprise password approach. Microsoft began by rewriting the Telnet client and server that it had previously provided; the existing versions of both were awful, and the newer versions are much improved and actually usable. There is a password synchronisation module, although it works one way only, from NT to Unix, and is very dependent on the Unix vendor using the Microsoft toolkit to write the relevant code. The two key elements, however, are both licensed from outside organisations. The first is a set of 25 Unix command-line utilities licensed from MKS. These give administrators used to working on a Unix box the ability to use a small set of standard commands across both Unix and NT.

More importantly, Unix is an environment where a lot of scripting takes place, and Microsoft has been working hard on scripting engines over the last few years. This approach makes it easier to write small command files to automate administration of both environments. The last, and arguably the most important at this first stage, are the NFS client and server that Microsoft has licensed from Intergraph. Early on, there were conflicting stories from Microsoft suggesting that it had purchased, rather than licensed, these utilities. Intergraph is adamant that it has simply licensed the source code to Microsoft, although this could cause problems down the line when both vendors choose to enhance the code and the two versions start to diverge. Users of Microsoft’s SQL Server have been down this route, from the days when Microsoft and Sybase were jointly developing the product, and will know how rocky this road can be. Perhaps the most important point here is that Microsoft has ensured that the NFS services are integrated into the Distributed File System (DFS) add-on for Windows NT.
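The attraction of the scripted route is that one command file can drive both platforms. The sketch below is a hypothetical example of the kind of portable administration job the MKS-style utilities make possible: plain POSIX text tools that run unchanged under a Unix shell or under the licensed commands on NT. The account list is invented purely for illustration.

```shell
#!/bin/sh
# Hypothetical admin chore: summarise accounts by group using only
# standard POSIX tools, available natively on Unix and, via the
# MKS-licensed utilities, on Windows NT. The data is an illustrative
# assumption, not a real account database.
cat > /tmp/accounts.txt <<'EOF'
alice:staff
bob:staff
carol:admin
EOF

# Identical command syntax on either platform: count members per group.
cut -d: -f2 /tmp/accounts.txt | sort | uniq -c | sort -rn
```

Nothing here is NT- or Unix-specific, which is precisely the point: the administrator writes the script once and carries it, unedited, between the two environments.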

Future enhancements


What Microsoft hasn’t done yet is make a decision on where to take the Unix Services. It is talking up future enhancements but, unlike the Novell interoperability products, the team responsible for Unix Services does not see the product being incorporated into the core Windows NT code in the Windows 2000 timescale. There are a number of other vendors providing NT-to-Unix interoperability, and many of these are lined up behind AT&T’s Advanced Services for Unix (ASU). This product first saw life during the NCR days and came from the same development team that initially produced Star Services, a product designed to be a Unix port of Microsoft’s earlier operating system, LAN Manager. ASU has won numerous awards and, given a choice, it is certainly what I would install in the first instance.

If you are seeking a wider enterprise management tool to provide users with a single logon to all your operating systems, then you are unlikely to get that from Microsoft anytime soon. Solutions such as Tivoli, UniCenter TNG from Computer Associates and OpenView from Hewlett-Packard do exist. Whilst they provide a much-needed answer for larger companies, all three vendors are working on smaller versions to satisfy those mid-sized clients who need an integrated logon and management environment but who can’t afford some of the more extensive features.

At the end of the day, true interoperability is still some way off and is likely to remain so.

