Wednesday, June 22, 2011

Yes, SAN Does Suffer from Fragmentation

A SAN brings many benefits to an enterprise. Because stored data does not reside directly on the company's servers, business applications get back the server power, and end users the network capacity, that would otherwise be consumed by storage. Administration is more flexible, because there is no need to shift storage cables and devices in order to move storage from one server to another. Servers can even be booted from the SAN itself, greatly shortening the time required to commission a new server.

There are numerous technologies employed to make SAN efficient, including RAID, I/O caching, snapshots and volume cloning, and these have led some to believe that SANs do not suffer the effects of file fragmentation. Fragmentation is the splitting of files into pieces (fragments), a file system behavior originally developed to make better use of disk space on direct-attached storage devices.

The problem is that data is read and written by the operating system, and this is done on a logical, not a physical, level. The OS's file system, by its very nature, fragments files. While the data may appear efficiently arranged from the viewpoint of the SAN, from the viewpoint of the file system it is severely fragmented, and it will be treated as such.
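
You can see this for yourself: the fragment count of any file can be pulled straight from the file system. The short Python sketch below assumes a Linux host with the e2fsprogs "filefrag" utility installed; on Windows/NTFS the same question is answered through the FSCTL_GET_RETRIEVAL_POINTERS control code. It is only an illustration, not any vendor's tooling.

    import re
    import subprocess
    import sys

    def extent_count(path):
        """Return the number of extents (fragments) filefrag reports for path."""
        out = subprocess.run(["filefrag", path], capture_output=True,
                             text=True, check=True).stdout
        # filefrag's summary line looks like: "/var/log/syslog: 7 extents found"
        match = re.search(r":\s+(\d+) extents? found", out)
        if match is None:
            raise RuntimeError("unexpected filefrag output: " + out)
        return int(match.group(1))

    if __name__ == "__main__":
        print(extent_count(sys.argv[1]), "extent(s)")

Run it against a few large, frequently updated files (databases, mail stores, log files) and the counts speak for themselves.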

Fragmentation affects computer operations in numerous ways. Chief among them is performance: because files must be written and read in thousands or even hundreds of thousands of fragments, performance slows severely. In a fragmented environment, unexpected system hangs and even disk crashes are common. A heavy toll is taken on hardware, and disks can lose 50 percent or more of their expected lifespan due to all the extra work.
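
The cost is easy to approximate. The rough Python sketch below reads the same data once in contiguous order and once as scattered 64 KB pieces, which is roughly what a badly fragmented file forces the disk to do. The file name and sizes are made up for the illustration; on an SSD, or when the whole file fits in the operating system's cache, the gap will be far smaller than on a busy mechanical disk.

    import os
    import random
    import time

    PATH = "frag_demo.bin"            # scratch file, name chosen for the demo
    SIZE = 1024 * 1024 * 1024         # 1 GB of test data
    PIECE = 64 * 1024                 # treat each 64 KB chunk as one "fragment"

    # Build the scratch file piece by piece so it is never held in memory at once.
    if not os.path.exists(PATH):
        with open(PATH, "wb") as f:
            for _ in range(SIZE // PIECE):
                f.write(os.urandom(PIECE))

    def timed_read(offsets):
        """Read one PIECE at each offset and return the elapsed seconds."""
        start = time.perf_counter()
        with open(PATH, "rb", buffering=0) as f:
            for off in offsets:
                f.seek(off)
                f.read(PIECE)
        return time.perf_counter() - start

    offsets = list(range(0, SIZE, PIECE))
    contiguous = timed_read(offsets)       # pieces in on-disk order
    random.shuffle(offsets)
    scattered = timed_read(offsets)        # same pieces, scattered order

    print("contiguous: %.2fs   scattered: %.2fs" % (contiguous, scattered))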

In the past, the solution to fragmentation was a defragmenter. Because of the many innovations in today's computing environments, such as those used with SAN, a higher-level solution is needed: an optimization solution, one that addresses a broader scope of issues than fragmentation alone.

Such a solution addresses numerous aspects of file read and write I/O in addition to fragmentation. The majority of fragmentation is prevented before it ever occurs, files are intelligently ordered for faster access, and other advanced technologies work to automatically maximize system performance and reliability.

The best proof of fragmentation's effects on SAN is to test an optimization solution within the enterprise. Doing so makes it clear that fragmentation does indeed affect SAN operations, and that they can only benefit from its elimination.

Wednesday, June 15, 2011

Free Software: Not for the Big Time

There are many free utilities and applications out there, and if what you are doing is meant to be insignificant or small, they're probably adequate. For example, there are a few free music recording apps available that will allow you to record multiple instruments and vocals to your computer, and then mix the results into a song. But if the resulting track is to be used for professional purposes, you'll find them sorely lacking; you need Logic Pro, Pro Tools or the like to even come close to competing in the major markets.

There are numerous free accounting programs, probably fine for keeping track of bake sale income or the like. But used in a company or corporation to track income, accounts payable, expenditures, and profit and loss? Hardly. The same could be said for databases; a free download won't hold a candle to Oracle or SQL Server when it comes to business use, and no IT professional would even consider it.

On the utilities side, there are free defragmenters. Unlike the examples given above, these may not even be worth it for the home user, simply due to the nature and quantity of today's fragmentation. That argument aside, it becomes obvious on examination that these freebies are definitely not meant for business or corporate use.

Fragmentation is the splitting of files into pieces (fragments) on a hard drive, done to better utilize disk space. A defragmenter is meant to solve this problem by reassembling those files into a whole, or nearly whole, state. In the corporate environment, fragmentation levels are far beyond what a free defragmenter can handle.

A free defragmentation utility must also be scheduled, and anyone who has actually had to try this has discovered that scheduling is practically impossible in today's enterprises, simply because systems are constantly up and running.

But the primary problem with a free defragmenter is that, today, it takes more than defragmentation to truly tackle the resource loss associated with I/O reads and writes. Multi-faceted optimization is, by far, the best approach.

Technology is now available that, instead of defragmenting, actually prevents a majority of fragmentation before it ever occurs. This same technology also orders files for faster access, and performs a number of other vital actions that greatly increase performance and maximize reliability. All of these functions occur completely automatically, with no scheduling or other operator intervention required.

Free software is definitely not meant for the big time. This is doubly true in addressing fragmentation.

Wednesday, June 8, 2011

Safeguarding Performance of Virtual Systems

A company implementing virtual machine technology can expect to reap great rewards. Where before a new server would have meant a new physical machine (at the very least a rack mount), along with the power to run it and the space to house it, a server can now be fully deployed and run on an existing hardware platform. It will have everything the physical server would have had, including its own instance of an operating system, applications and tools, but with no footprint and a tiny fraction of the once-required power.

In addition to the footprint savings, virtual machines also bring speed to the table. A virtual machine can be deployed and up and running in minutes instead of hours, and users can deploy their own machines, something unheard of in the past. That means great time savings for users and IT personnel alike.

Virtual technology is now being used for many purposes. For example, it brings a great boost to Storage Area Network (SAN) technology, which in itself takes an enormous amount of stress off a system by moving storage traffic off the main production network.

Without proper optimization, however, virtual technology cannot deliver the full benefits on which an enterprise depends. A major reason is that virtual technology, along with SAN and other recent innovations, relies, in the end, on the physical hard drive. The drive itself suffers from file fragmentation: the state of files and free space being scattered in pieces (fragments) all over the drive. Fragmentation causes severe I/O bottlenecks in virtual systems, because it accumulates at every layer of the platform, within each guest's file system and again in the storage underneath it.

Virtualization also suffers from other issues that result from a lack of optimization: virtual machine competition for shared I/O resources is not effectively prioritized across the platform, and virtual disks set to grow dynamically do not shrink when data is deleted, so free space is wasted.

It is vital that any company implementing virtual technology—and any technology in which it is put to use, such as SAN—employ an underlying solution for optimizing virtual machines. Such a solution optimizes the entire virtual platform, operating invisibly with zero system resource conflicts, so that most fragmentation is prevented from occurring at all. The overall effect is that unnecessary I/Os passed from the OS to the disk subsystem are minimized, and data is aligned on the drives for previously unattainable levels of speed and reliability.

Additionally, such a solution provides a tool to recover space on virtual disks that have been set to grow dynamically.
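
For a sense of what that recovery involves when done by hand, the sketch below drives the stock Hyper-V PowerShell cmdlets (Mount-VHD, Optimize-VHD, Dismount-VHD) from Python to compact a detached, dynamically expanding VHDX. The disk path is hypothetical, and this is only an illustration of the manual chore, not the optimization product's own tool.

    import subprocess

    VHDX = r"D:\VMs\app-server\disk0.vhdx"   # hypothetical virtual disk path

    # Mount the disk read-only, compact it, then detach it again.
    script = (
        "Mount-VHD -Path '{0}' -ReadOnly; "
        "Optimize-VHD -Path '{0}' -Mode Full; "
        "Dismount-VHD -Path '{0}'"
    ).format(VHDX)

    subprocess.run(["powershell", "-NoProfile", "-Command", script], check=True)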

Such a virtualization optimization solution should be the foundation of any enterprise's virtual machine scheme.