Wednesday, April 27, 2011

Free Defragmenter: For the Enterprise?

When most people buy a large-screen TV, they usually do so in an effort to bring the experience they might have in a movie theater to their homes. And bringing it home is usually somewhat of an event: everyone plops down on the couch, perhaps popcorn is made, and the new acquisition is fired up. Everyone waits...and there it is! The giant, lifelike crystal-clear picture that everyone wanted!

But something is wrong. While that picture is amazing, it is accompanied by sound that is tinny and has no depth at all. This is because the family, in their rush to expand their home entertainment experience, didn't stop to think that you not only see great things on such a television, you also hear them. Now all they have for sound is the tiny, free built-in speakers, which were never meant to deliver a sound experience matching the picture. If they're going to really do it right, they'll need to go back out and get a good-quality surround-sound system.

Moving over into the field of computers, a similar thing could be said about a free or included defragmenter. A computer system in an enterprise is robust, can store and retrieve millions of files, and can serve them up from servers to users near-instantly. High-tech innovations such as SAN, NAS, and virtual machines are consistently brought on board to increase that experience and functionality.

File fragmentation, however, can slow that experience and functionality to a crawl. In order to address it, an enterprise-ready fragmentation solution needs to be brought to bear. If not, fragmentation compounds unchecked, and all of its symptoms, such as slow performance, unexpected hangs and freezes, and reduced hardware life, continue unabated.

Much like the tiny speakers were never meant to provide a true home theater experience, the free defragmenter was never meant to address fragmentation on an enterprise level. It must either be run manually or be scheduled, neither of which is an option in today's enterprise, with servers that must remain up and running 24/7. Additionally, such a defragmenter lacks the functionality to keep up with the hectic rates of fragmentation found in today's business environments.

The only true solution for an enterprise is one that prevents a majority of fragmentation so that it never impacts the system or users. It never requires scheduling or any kind of intervention from IT staff—which leaves them free to address more urgent priorities. Best of all, maximum performance and reliability are assured.

Wednesday, April 20, 2011

Evaluating an Enterprise Fragmentation Solution

There is an old expression: “Never bring a boy to do a man’s job.” Crude and politically incorrect though this saying may be, it does convey a truth: if you want to get a task accomplished, make sure the person you assign to it has the necessary know-how to get it done (be they man, woman or juvenile). For example, you wouldn’t bring a high-school student who has just started auto shop classes to make repairs on a passenger jet plane. Or, you wouldn’t put an apprentice fireman in charge of putting out an oil well fire.

Similarly, any software application you employ needs to be adequate to the task at hand. For instance, if you were required to create a full color brochure for your business, you wouldn’t use Microsoft Paint or another free, very basic graphics program. It’s not that any of these are bad products, it’s just that you really need something like Adobe Illustrator to obtain the degree of flexibility, options and detail that the job actually requires.

When it comes to solutions to fragmentation, there are basic, free or inexpensive solutions available as well. Some enterprises seek to cut licensing costs by employing one of these—only to discover down the road that fragmentation is worse than ever, simply because the “solution” they employed was not adequate, and IT costs have spiraled out of control. Again, these aren’t “bad” solutions—they’re just not meant to address the critical levels of fragmentation that occur in today’s corporate computing environments.

The first consideration in evaluating an enterprise-level solution to fragmentation should be: is it enterprise tested and proven? Can it stand up in real-world environments? Read reviews and evaluations. Talk to sites that have used it. Ask them if it really does the job, or if they are constantly having to tweak it or find "workarounds" for a fragmentation problem that persists.

It’s easy to tell if the utility you’re evaluating isn’t doing the job: check the fragmentation levels before, during and after. See if the utility actually finishes running and accomplishes the job. Many utilities, if they aren’t up to the challenge, will simply grind endlessly without achieving the goal of files in a defragmented state. An adequate solution will eliminate fragmentation and return a system to its native maximum performance.
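The before-and-after check described above can be reduced to a simple number. The sketch below is a hypothetical illustration, not output from any real utility: it assumes you can obtain an extent (fragment) count per file, and measures fragmentation as the percentage of extents beyond the ideal of one contiguous extent per file.

```python
def fragmentation_pct(fragments_per_file):
    """Percent of extents beyond the ideal of one contiguous extent per file.

    `fragments_per_file` is a list of extent counts, one entry per file.
    A wholly unfragmented volume (every file in one piece) scores 0.0.
    """
    total = sum(fragments_per_file)
    ideal = len(fragments_per_file)  # one extent per file if nothing is fragmented
    return 100.0 * (total - ideal) / total if total else 0.0

# Hypothetical extent counts sampled before and after running a utility.
before = [1, 14, 203, 7, 1]
after = [1, 1, 2, 1, 1]

print(f"before: {fragmentation_pct(before):.1f}%  after: {fragmentation_pct(after):.1f}%")
```

If the "after" figure is not dramatically lower than the "before" figure once the utility reports completion, the tool is grinding without doing the job.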

A third (but no less important) consideration is, how much human intervention is required? Today, technology exists to prevent a majority of fragmentation before it ever occurs—fully automatically. Valuable IT time is not taken up trying to schedule or run a fragmentation solution; the solution is simply employed, and no one needs to worry about fragmentation from there on out. A basic solution will come nowhere near that level of functionality.

File fragmentation can be a costly problem, if not properly addressed. Make sure the solution chosen is truly enterprise-level and does the job.

Wednesday, April 13, 2011

How Efficient Does Your System Have to Be?

The word “efficient” is defined as, “Performing or functioning in the best possible manner with the least waste of time and effort.” It could also be defined as, “The extent to which time or effort is well used for the intended task or purpose.”

The word certainly can be, and usually is, applied to a business process. The objective of any business process is to get the highest-quality work done in the least amount of time, with a minimum of effort. Similarly, "efficiency" also applies to mechanics, and has for hundreds of years: getting the most mechanical output for the least amount of energy input. Efficiency in business processes, mechanics and energy also has a keen impact on economics.

A computer system is unique in that it directly impacts all of these elements: business processes, mechanics, energy and therefore economics. Hence a computer system must be as efficient as possible in every aspect of its operation.

Many innovations have contributed to the highest-ever efficiency we see in systems today. Components use the least amount of power to render maximum processing and storage. Form factors have become increasingly small so as not to over-utilize another aspect of efficiency: space. Probably the most interesting of these innovations is the virtual machine, which relies only on the power of its host and takes up no physical space at all.

One aspect of computer system efficiency that is still sometimes overlooked, however, is the use of I/O resources. Inefficient use of these impacts all other computer resources: drive space, hardware life, processing and backup speed, and—worst of all—performance.

A primary cause of I/O resource inefficiency is file fragmentation. Fragmentation, a natural function of the file system, is the splitting of files into pieces (fragments) in order to better utilize drive space. It is not uncommon for a file to be split into thousands or tens of thousands of fragments, and the fact that each and every one of those fragments must be retrieved whenever the file is accessed is what wreaks such havoc on performance and resources.
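The cost described above can be made concrete with a minimal model. The numbers below are illustrative assumptions (a fixed seek penalty per fragment and a fixed sequential transfer rate), not measurements of any particular drive, but they show why the fragment count, not the file size, comes to dominate read time.

```python
# Illustrative model: reading a file costs one seek per fragment plus the
# sequential transfer time for the data itself.
SEEK_MS = 9.0              # assumed average seek + rotational latency per fragment
TRANSFER_MS_PER_MB = 10.0  # assumed sequential transfer time per megabyte

def read_time_ms(file_mb: float, fragments: int) -> float:
    """Estimated time to read a file of `file_mb` MB split into `fragments` pieces."""
    return fragments * SEEK_MS + file_mb * TRANSFER_MS_PER_MB

contiguous = read_time_ms(100, 1)       # one extent: a single seek, then streaming
fragmented = read_time_ms(100, 10_000)  # heavily fragmented: 10,000 seeks

print(f"contiguous: {contiguous:.0f} ms, fragmented: {fragmented:.0f} ms")
```

Under these assumptions the same 100 MB file takes roughly ninety times longer to read when split into 10,000 fragments, with nearly all of the extra time spent seeking rather than transferring data.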

For many years, defragmentation was the only method of addressing fragmentation. But because of today’s complex technical innovations, efficiency now comes in the form of optimization technology, which both maximizes performance and eliminates wasted disk I/O activity. The majority of fragmentation is now prevented, while file optimization and other innovations are combined to complete the solution.

This solution puts the final touch on—and completely maximizes—computer system efficiency.

Wednesday, April 6, 2011

True Efficiency in Computing

Efficiency has always been a key factor in running a business venture. But since the industrial revolution and the amazing cost reductions that accompanied it, efficiency has become a fine art. This was made even more true with the advent of computers—now it wasn't just manual labor that could be mechanized, but also tasks that actually required thinking. Add to that innovations such as lean manufacturing, and efficiency has been moved into whole new realms.

In today’s economy, focus on efficiency is as much of an issue as it ever was, perhaps even more so. Equipment is kept in repair and made to last as long as possible. Business processes are scrutinized for waste of time and resources. Job performance is paid careful attention for the same reasons.

In the area of computers, efficiency has always played a major role. Probably the most obvious advances have been in the realm of form factor: how much computing could be done with how little physical material? Chips became smaller and lightning-fast; storage capacity grew enormously while media steadily shrank; cabinets became rack-mounts, and then came virtual machines which occupy no physical space at all.

There is an area of computer efficiency that can be overlooked, however, and that is the waste of I/O resources. Such waste has a heavy impact throughout an enterprise: drive space is wasted; hardware becomes overworked and fails before its time; processes hang; backups fail; and, worst of all, performance is drastically slowed, affecting production at every quarter.

One of the primary causes of I/O resource waste is file fragmentation. Fragmentation is a natural function of a file system: files are split into pieces (fragments) in order to better utilize drive space, and unless it is specifically addressed, it only compounds. It is not uncommon for a file to be split into thousands or tens of thousands of fragments, and the fact that each and every one of those fragments must be retrieved whenever the file is accessed is what wreaks such havoc on performance and resources.

For many years, defragmentation was the only method of addressing fragmentation. But because of today’s complex technical innovations, enormous volume and file sizes and unheard-of rates of fragmentation, defrag has now itself become inefficient at tackling the issue. Efficiency now comes in the form of optimization technology, which both maximizes performance and eliminates wasted disk I/O activity. The majority of fragmentation is now prevented, while file optimization and other innovations are combined to round out the solution.

It is by these methods that true computer efficiency, in all aspects, is fully brought to reality.