Test Drive - LANVault 200 (January 2000)
Ian Murphy test drives the LANVault 200 from Quantum | ATL.


There is little doubt that the amount of data held by companies is increasing month on month, a trend backed up by the continuing profits reported by the big hard disk drive vendors. At the same time, our almost insatiable demand for storage has pushed the price per MB to its lowest level ever. It is frightening to think that less than 10 years ago a 300 MB SCSI hard disk would have cost you around £350, when the same money today would buy a pair of 27 GB EIDE drives with far superior performance to that of the SCSI drive. The same amount will also get you an 18 GB Ultra2 SCSI drive with very high speed performance.

Most stored information is duplicate


Although we store more information on computers today than at any other point in the history of the computer, much of it is duplicate information. During the growth of computer networks we suffered heavily from server crashes which, when they didn't destroy data, at the very least made it unavailable. As a result, even those who practised saving their work to the corporate file servers tended to keep copies on the local hard disk. Unfortunately, there were even more people who didn't trust the file servers at all and only saved their data locally. This situation hasn't changed much over the last few years, despite the massive efforts of IT departments to consolidate storage onto network servers. Yet even when you can get the information onto the server, how do you make sure that it will be available when requested and that it won't be damaged? There is an increasing incidence of virus writers maliciously targeting data, and of members of staff going out of their way to hide or damage data to protect their position.

This combination of deliberate, accidental and machine-failure-induced loss is costing business huge sums of money. Legato and Stac, two of the big names in data storage, recently released a report they commissioned which shows that in the US alone over 6 percent of personal computers lost data during 1998. The overall cost of lost, damaged or stolen data was put at a massive US $11.8 billion, with each incident calculated at US $2,557 to fix. We currently have no figures for the UK or Europe, but it would be reasonable to assume similar if not higher costs given the higher salaries enjoyed by UK IT professionals.

So we have identified the reasons and the costs, but why it happens and what can be done about it is an embarrassing question for a lot of technology companies. Despite our willingness to store data and the huge rise already mentioned, the response from the backup vendors, both hardware and software, has been fairly pathetic. For most organisations the solution to data security has been to install RAID (Redundant Array of Inexpensive Disks). Backup is therefore done from one set of hard disks to another set, often sharing key components such as the processor or interface card. More expensive solutions are used by the big IT departments, but their costs are often beyond the reach of most SMEs. Other alternatives such as clustering have their own problems, not least the licensing of software and the need for identical machines if it is to work properly.
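As a rough sanity check, and purely as a back-of-the-envelope sketch using only the figures quoted above, dividing the total cost by the cost per incident gives the implied number of incidents behind the estimate:

    # Back-of-the-envelope check on the Legato/Stac figures quoted above.
    total_cost = 11.8e9        # US $11.8 billion of lost, damaged or stolen data in 1998
    cost_per_incident = 2557   # US $2,557 to fix each incident

    implied_incidents = total_cost / cost_per_incident
    print(f"Implied incidents: {implied_incidents / 1e6:.1f} million")
    # Roughly 4.6 million incidents - consistent with "over 6 percent of personal
    # computers" only if the US install base is assumed to run to tens of millions
    # of machines, which for 1998 is plausible.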

Against this background we have already started to see working SAN (Storage Area Network) solutions using Fibre Channel connectivity. Whilst we have yet to see any of the vendors demonstrate industrial-strength interoperability with each other, we have to consider the impact that such high-speed, large-scale storage will have on data security and integrity.

Let's turn, then, to backup solutions.


Whenever a server is specified today, part of the automatic approach is to cost in the backup solution. For many sites that means including an internal tape backup device which, given the capacities involved, means a DAT drive. Whilst a DAT tape streamer has a backup capacity of around 24 GB with compression, the compression automatically lowers the backup speed and increases the problems of recovery if anything goes wrong. Few servers have less than 20 GB to back up in a full backup, and you are unlikely to pay less than £500. Taking the step up from this to DLT (Digital Linear Tape) requires a significantly higher investment, and you then get 20 GB uncompressed per tape with a potential for 40 GB compressed.

In addition, if the server is installed as a local device within a department or a small office away from the main IT department, you need to rely heavily on the users to manage the tape library and to rotate the tapes effectively. The most important issues, arguably, are educating people to change the tape daily, to check the backup log for errors, to send the tape off-site for security, and not to leave the spare tapes sitting on top of the server or near an electromagnetic source such as a monitor.

None of these issues is new; I can recall all of them being major concerns from the time tape backup solutions first appeared for personal computers back in the mid-eighties. There is also the issue of security, as any thief who takes your backup tape is able to rebuild your system on another machine, and with Windows NT, if that tape includes your SAM database, the thief has all he needs to break your passwords. Having painted a dark picture, let's go for real blackness.

Tape backup solutions are a partnership between the operating system vendor, the tape drive vendor and the backup software vendor. All three are likely to issue patches and fixes to their products, and will do so without spending a huge amount of time (if any) consulting the other two. When you apply patches that affect your backup solution, there is a considerable risk that you will no longer be able to read older tapes, yet few sites actually bother to check an upgrade for this problem. When a server crashes and you attempt to rebuild it, you will suddenly find that you need an identical tape drive to stand any chance of recovering data from tape. You will then need to bring the operating system up to a set level including patches, apply all relevant patches to the tape drivers, and install and patch the backup software to the same level. A failure in any of these steps and your data stands a good chance of being unrecoverable.
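A minimal sketch of the discipline this implies, with an entirely hypothetical record format and illustrative version strings: note the drive model, OS patch level, driver version and backup software version alongside each tape set, and compare the rebuilt server against that record before you attempt a restore.

    # Hypothetical pre-restore checklist: compare the rebuilt server's environment
    # against the environment recorded when the tape set was written.
    # All version strings below are illustrative, not vendor data.
    recorded = {                    # noted at backup time and stored with the tapes
        "tape drive":       "DLT 7000",
        "operating system": "Windows NT 4.0 SP5",
        "tape driver":      "4.21",
        "backup software":  "Backup Exec 7.3",
    }
    current = {                     # gathered from the freshly rebuilt server
        "tape drive":       "DLT 7000",
        "operating system": "Windows NT 4.0 SP6a",
        "tape driver":      "4.21",
        "backup software":  "Backup Exec 7.3",
    }

    mismatches = {item: (recorded[item], current[item])
                  for item in recorded if recorded[item] != current[item]}

    if mismatches:
        print("Do NOT attempt the restore yet - the environment differs from the tape set:")
        for item, (want, have) in mismatches.items():
            print(f"  {item}: tapes were written under {want}, server now has {have}")
    else:
        print("Environment matches the recorded backup environment.")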

Another problem is that people apply security to the backup tape to prevent the theft of tapes and therefore data. Unless you have a stringent policy of using a set of predetermined passwords that are recorded separately from the tapes, you could find yourself attempting a restore and discovering that you don't have the correct password. Finally, when was the last time you practised a complete restore of a critical system? The last time the moon turned blue perhaps? Despite the considerable sums spent on disaster recovery plans, unless the brown stuff gets into the air conditioning, few people actually take practical steps to ensure that their procedures work.

So why the Quantum | ATL LANVault solution?


That was the question I was most interested in after a hyped-up launch at Disneyland Paris, with lots of dry ice, early customers waxing lyrical and the announcement of deals which, it turned out later, hadn't actually been signed, although I'm now assured that they have! One compensation for going to France was being allowed access to one of the first units to be shipped in the UK. The LANVault is a network-attached DLT solution comprising two key components: a base unit (SP200) and a tape library (L200/L500). It will be sold in two configurations, with the L200, using the DLT 4000 drive, as the low-end, lower-speed solution and the L500, using the DLT 7000 drive, as the high-end alternative. The difference is in backup speed: 90 MB/min compared to 300 MB/min, or 5.4 GB/hr compared to 18 GB/hr. For larger installations, or if you intend to take full backups very regularly, you will need to invest in the LANVault 500. The backup capacity of the LANVault 200 is 160 GB; the LANVault 500 holds 280 GB.
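To put those figures in context, here is a quick back-of-the-envelope calculation, using only the throughputs and capacities quoted above, of how long a full backup would take on each model:

    # How long does a full backup take at the quoted LANVault throughputs?
    models = {
        # name: (throughput in GB/hr, library capacity in GB)
        "LANVault 200 (DLT 4000)": (5.4, 160),
        "LANVault 500 (DLT 7000)": (18.0, 280),
    }
    server_gb = 20  # a typical full server backup, per the DAT discussion earlier

    for name, (gb_per_hour, capacity_gb) in models.items():
        print(f"{name}: a {server_gb} GB server takes {server_gb / gb_per_hour:.1f} hours; "
              f"a full {capacity_gb} GB library takes {capacity_gb / gb_per_hour:.1f} hours")
    # A 20 GB server takes around 3.7 hours on the 200 but just over an hour on the
    # 500 - which is why regular full backups of larger installations point to the 500.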

The SP200 is a Windows NT Server with the keyboard, mouse and monitor removed. Unfortunately, ATL forgot to suppress the standard Windows NT error messages generated when the operating system can't find these components, so don't panic if you check the Event Viewer and see them. One of the reasons for underpinning the system with Windows NT Server is that you can combine all of the network and server functionality in a single solution without having to dedicate a machine from your own network.

Unpacking and connecting all of the components takes a little time if you do it carefully, although once you have installed the first LANVault, the others will be remarkably simple. One thing to be very careful about is that you will need either two people, or one person prepared to lift heavy weights, as when assembled the LANVault weighs over four stone (58 lbs or 26 kg). This weight also restricts where you can place the unit because, whilst most people might consider a shelf, the average office does not have load-bearing walls able to cope with such weight. This stage of the process took a little over 15 minutes. The LANVault also comes with seven tapes and one cleaning tape to get you started. Loading them is very simple, but be sure to record where you place the cleaning tape, as you will need to tell the management software where it is so that it doesn't report it as a bad tape. The final step is to plug the SP200 into a nearby network socket; Quantum | ATL provides a good-quality CAT5 cable for this.

Off you go


Once you have assembled the LANVault you need to power it up and start the configuration process. This can be done either through the OCP (Operator Control Panel), the front panel to the rest of us, or through a Web-enabled interface. One of the installation requirements is that the LANVault is on the same subnet as the computer from which the software installation will be done. There are therefore two choices: either mess about with the IP settings on a workstation, configure the LANVault and then change all the settings back, or use the OCP. I chose the OCP, which was relatively simple, but be very careful with the operator password. The default is 1000, but if you change this and forget the new password, ATL is adamant that there is no way to reset it.
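The subnet requirement simply means that the installing workstation's address and the LANVault's address must share the same network prefix. A short sketch of the check, using made-up addresses and a made-up mask:

    # Is the workstation on the same subnet as the LANVault?
    # The addresses and mask below are made-up examples.
    from ipaddress import ip_interface

    workstation = ip_interface("192.168.10.25/24")
    lanvault = ip_interface("192.168.10.200/24")

    if workstation.network == lanvault.network:
        print("Same subnet - the Discovery Utility should be able to find the LANVault.")
    else:
        print("Different subnets - use the OCP, or temporarily re-address the workstation.")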

Software installation was fast, simple and almost painless. You place the Management Console installation CD-ROM in your computer and follow the installation wizard. The Discovery Utility locates the LANVault, and this is where a little confusion arose. The default computer name for the device is LANVAULT; although I was offered the chance to change it during initial configuration, I didn't, and I suspect that many other people won't either, so you must be very careful here.

Once the LANVault is installed, use the Web administration utility to configure it properly as part of your domain. This ensures that you can use domain accounts to manage the backup and, if you are installing the LANVault in a remote office, allows your operators to manage backups at night. You can also monitor log files from a central location and pick up problems.

At present, only two software vendors have ported their software to the LANVault, Computer Associates and Veritas, and ported is the correct description. Because of the way the LANVault works, with the SP200 underpinning the device, Quantum | ATL needs to make sure that device drivers, operating system and backup software come together. This is a real solution to the management nightmare described earlier. It does, however, place a real onus on Quantum | ATL to ensure that patches and new releases from any of the three parts of this triangle are applied, quality tested and distributed quickly to the channel. It currently expects to do this via its website, which seems an ideal medium.

In turn, this places a restriction on corporate IT departments, who need to manage carefully the versions of the software clients they have installed. Any clients must be part of the LANVault-approved releases to prevent backup/restore incompatibilities, and this becomes critical when dealing with business-critical data such as databases and email/messaging servers.

We would have liked to compare the performance of both Veritas and Computer Associates, but unfortunately the only company to get licences to us was Veritas. We tested Backup Exec across a range of servers and operating systems and found that we could back up and restore data easily. Veritas has also ported its disaster recovery software as part of the package, although we didn't have a licence key to test it properly.
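As a footnote to the point about approved client releases, here is a small, entirely hypothetical sketch of the housekeeping it implies: keep the list of LANVault-approved client versions somewhere central and flag any installed agent that has drifted off it before it causes a backup/restore incompatibility.

    # Hypothetical check: flag backup clients whose installed version is not on the
    # approved-release list. Product versions and host names below are illustrative only.
    approved = {
        "Backup Exec agent": {"7.3", "7.3a"},
        "ARCserve agent": {"6.61"},
    }
    installed = [               # from an inventory of departmental servers
        ("MAIL01", "Backup Exec agent", "7.3"),
        ("SQL01", "Backup Exec agent", "7.0"),
        ("FILE02", "ARCserve agent", "6.61"),
    ]

    for host, agent, version in installed:
        if version not in approved.get(agent, set()):
            print(f"{host}: {agent} {version} is not an approved release - "
                  f"upgrade it before the next backup run.")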

No competition


At present, I see no competitor to the LANVault on the market and, for those considering equipping small offices, particularly with NT Terminal Server, it is probably the best way to ensure a managed backup. If you have several small departments, one additional feature is the ability to add extra network cards and attach directly to each segment rather than backing up across the main backbone. Not only does this mean a more balanced network, it keeps all subnet traffic contained. In departments where you may need regular small restores, this is likely to be greeted as good news by those responsible for the corporate infrastructure. For a large-scale backup solution there are alternative and better-positioned products, and I would suggest that when considering the LANVault you treat the maximum capacity of a single set of tapes as the defining limitation. Finally, would I buy one to protect my network here? Yes, because it is cheaper than the equivalent backup capacity in DAT systems, it is a single point of management, it runs over the network rather than impacting each machine, and it is faster.