Wednesday, July 27, 2011

How Good is the “Free” or “Included” Product?

There have always been all kinds of “free” or “included” items meant to sweeten the sale of a main product. For example, you buy a new home in a subdivision, and the kitchen appliances—oven, stovetop, microwave, dishwasher—are already built in. The house might even come with a “free” washer and dryer. Or you buy a new computer, and it has a built-in camera. On top of that, it may even include free video editing software.

The upside is that these items were free, or included in the overall price. The downside, however, is that you’re now stuck with trying to make them work for the functions you intend. Are those kitchen appliances going to be what you really need for cooking—or would it have been better if you’d been able to pick your own, after you’d thoroughly checked them out? Or, how good is that camera included with your computer going to be? Could you do a professional shoot with it? And, is it possible to perform a competent editing job with the free video software?

Chances are, these items are nowhere near what you actually require in terms of functionality and features.

The same holds true for a “free” or “included” defragmenter. Fragmentation—the splitting of files into pieces, or fragments, for better utilization of disk space—is the primary drain on computer system performance and reliability. If it is possible to obtain a fragmentation solution for free, and that solution does the job, it’s certainly a winning situation.
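Curious how fragmented your own files actually are? On Linux, the standard filefrag utility (part of e2fsprogs) reports how many extents, i.e. fragments, a file occupies, and the short Python sketch below simply wraps that tool to count them. It is purely illustrative, not part of any defragmentation product, and the sample path is just a placeholder.

import subprocess

def count_fragments(path: str) -> int:
    """Count a file's on-disk extents (fragments) via Linux's filefrag tool.

    filefrag prints a summary line such as:
        /var/log/syslog: 7 extents found
    """
    out = subprocess.run(
        ["filefrag", path], capture_output=True, text=True, check=True
    ).stdout
    # The summary ends with "N extents found" (or "1 extent found").
    return int(out.strip().split()[-3])

if __name__ == "__main__":
    # Placeholder path; point this at any large file on your system.
    print(count_fragments("/var/log/syslog"), "fragments")

On Windows, the built-in analysis report (defrag C: /A from an elevated prompt) gives an equivalent picture.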

The problem, however, is that because of the many innovations in today’s computing environments—such as thin provisioning, replication, snapshots, Continuous Data Protection (CDP) and deduplication, to name but a few—it takes more than a defragmenter, free or otherwise, to do the job. An optimization solution, one that addresses a broader scope of issues than fragmentation alone, is required.

Another issue is that, even as a defragmenter, a free product has severe limitations: it cannot cope with the enormous file sizes, massive drive capacities and high rates of fragmentation inherent in today’s systems.

A robust optimization solution addresses several aspects of file read and write I/O beyond fragmentation alone, and it prevents the majority of fragmentation before it ever occurs. It includes intelligent ordering of files for faster access, along with other advanced technologies designed to automatically maximize system performance and reliability. Most importantly, it is truly up to the job of handling these issues in today’s systems.
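As a purely conceptual illustration of what “prevention” can mean (a toy model, not any vendor’s actual algorithm), consider an allocator that always places a new file into the largest contiguous run of free blocks, so the file never needs to be split in the first place:

def largest_free_run(bitmap):
    """Find the longest run of free blocks (True = free) in an allocation bitmap.
    Returns (start_index, length); length is 0 if no block is free."""
    best_start = best_len = 0
    run_start = run_len = 0
    for i, free in enumerate(bitmap):
        if free:
            if run_len == 0:
                run_start = i
            run_len += 1
            if run_len > best_len:
                best_start, best_len = run_start, run_len
        else:
            run_len = 0
    return best_start, best_len

def allocate_contiguous(bitmap, n_blocks):
    """Place an n_blocks-long file in the largest free run, if one is big enough.
    Returns the starting block, or None (a real allocator would then split)."""
    start, length = largest_free_run(bitmap)
    if length < n_blocks:
        return None
    for i in range(start, start + n_blocks):
        bitmap[i] = False  # mark the blocks as used
    return start

# Example: a 16-block "disk" with a few blocks already in use.
disk = [True] * 16
for used in (3, 4, 9):
    disk[used] = False
print(allocate_contiguous(disk, 5))  # -> 10: blocks 10-15 form the largest run

A production file system juggles far more than this (caching, metadata, concurrency), but the principle is the same: write each file contiguously up front, and there is nothing to defragment later.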

So carefully check out “free” or “included” items. Upon testing and inspecting their features and functionality, you’ll likely find you’re better off paying the relatively modest up-front cost of a full solution to avoid untold waste of time and money down the road.

Wednesday, July 13, 2011

Fine-Tuning Computer System Efficiency

Mechanical efficiency—beginning as a way to save effort, time and expense—has today become a fine art. An excellent example is the aircraft. In 1903, after years of research and experimentation with wing types, controls, propellers and many other elements, Wilbur and Orville Wright managed to get a 605-pound plane to fly for 12 seconds, roughly 10 feet off the ground. Today, a little over 100 years later, aircraft efficiency has become so advanced that we can take an enormous machine weighing over 500 tons, bring it up to a cruising speed of more than 500 miles per hour, and fly it at an altitude of 35,000 feet.

The advances that made this possible span aerodynamics, fuel efficiency, use of space, weight distribution and much more. And of course it doesn’t stop there: NASA has recently announced experimental aircraft designs that move more people and cargo, yet use far less fuel and are even more aerodynamically efficient.

Similarly remarkable innovations have been made in the field of computer efficiency. ENIAC, the first general-purpose electronic computer, unveiled in 1946, weighed more than 27 tons, occupied 1,800 square feet and contained 17,468 vacuum tubes. Incredible as that was at the time, today’s computers, occupying a tiny fraction of the space and consuming a tiny fraction of the power, accomplish many times more work.

Yes, we have come a long way. Today we store enormous multi-gigabyte files on media that fits in the palm of a hand yet offers capacity in the terabyte range. We can even run powerful servers that are entirely virtual, taking up no physical space and consuming no power beyond that of their hosts.

An aspect of computer efficiency that has not been completely conquered, however, is the use of I/O resources. Improper use of these resources has ramifications that extend across an enterprise, affecting processing speed, drive space and overall performance.

File fragmentation—the splitting of files into pieces (fragments) in order to better utilize drive space—is a fundamental cause of read and write I/O inefficiency. When a file is split into thousands or tens of thousands of fragments, the file system must locate and retrieve every one of those fragments each time the file is read. Because free space becomes fragmented as well, file writes suffer just as badly. Overall, havoc is wreaked upon performance and resources.
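A rough back-of-envelope model shows why. On a mechanical disk, each fragment can cost an extra seek plus rotational latency, so read time grows roughly linearly with fragment count. The figures below are assumed round numbers for a typical drive of this era, not measurements of any particular hardware:

# Illustrative model of read time vs. fragmentation on a mechanical disk.
SEEK_MS = 9.0          # assumed average seek time per fragment
ROTATION_MS = 4.2      # average rotational latency at 7,200 rpm
TRANSFER_MB_S = 100.0  # assumed sustained sequential transfer rate

def read_time_ms(file_mb: float, fragments: int) -> float:
    """Estimate the time to read a file split into `fragments` pieces."""
    positioning = fragments * (SEEK_MS + ROTATION_MS)  # per-fragment cost
    transfer = file_mb / TRANSFER_MB_S * 1000.0        # raw data transfer
    return positioning + transfer

for frags in (1, 100, 10_000):
    print(f"{frags:>6} fragments: {read_time_ms(500, frags):,.0f} ms")

Under these assumptions, a 500 MB file reads in about 5 seconds when contiguous but takes over two minutes when shredded into 10,000 fragments; the positioning overhead, not the data transfer, dominates.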

Defragmentation—for a long time the sole method of addressing fragmentation—is no longer adequate on its own. Today’s complex technical innovations require that efficiency take the form of optimization technology, which both maximizes performance and eliminates wasted disk I/O activity. With this technology, the majority of fragmentation is prevented before it ever occurs, while file optimization and other innovations round out the solution.

Optimization of I/O reads and writes is the final step in making today’s computing environments completely efficient.