Wednesday, March 30, 2011

The Liabilities of Free Software

Anything obtained for free can certainly seem like a benefit. And sometimes it is, like the rare occasion when you stumble upon someone giving away an item that turns out to be worth much more than originally suspected, or to be much more useful than it appeared at first glance. In the case of software, however, free isn’t necessarily good, for a number of reasons.

There are different varieties of free software. The variety that is actually free, with no strings attached, may or may not suit your needs. Many times it doesn’t, and when examined, the reasons are pretty obvious. Prime among them is the fact that you can’t hire great software engineers for free, nor obtain the necessary development and testing hardware; these take money. Hence, free products built by an individual or developed through an open-source effort often lack the robust engineering that goes into paid-for software.

Another prime reason such software may not work well is that the developers involved, while well-intentioned, are not necessarily experts in the area the software is designed to address. Companies that specialize in, say, anti-virus software have the budget to pour into researching the most effective ways to combat computer viruses using the least amount of computer resources. They are also able to stay constantly abreast of the latest viruses and update their users. Similarly, developers who specialize in defragmentation have found ways to keep systems free of system-crippling file fragmentation, and will consistently stay on top of operating system changes and anything else that affects the efficiency of their product.

The same reasoning applies to software companies attempting to be “one size fits all,” designing and selling software in areas where they are not necessarily expert, simply in an effort to retain customers who have purchased their other products.

Another variety of free software is the free trial. These can be more helpful, especially if all features are available. They are not always, though, so it is advisable to check. And in any case, trials almost always have time limitations on them. Eventually, you’ll do better to purchase the full version of the product.

In finding the right software, the best advice that can be given is, free or not, check the functionality. Make sure it has the features you actually need. Seek out people who have used the product, whether through personal contact or through forums. If you can find something that genuinely does the job for free, great. But most of the time, you’ll find that purchasing the full version of a product from a company expert in the area you’re aiming to address will be your safest and, in the long run, most economical choice.

Wednesday, March 23, 2011

Eliminating Waste of Computer Resources

Up until fairly recently, we have been a wasteful society. Cutting down forests, piling trash into landfills, burning as much energy as we could possibly consume—these have been the hallmarks of our lifestyles. It is only in the last few years that we as a culture have begun to realize that eventually such waste comes back to haunt us in the worst ways imaginable, and how vital conservation of resources is to our survival.

Computer resource management has paralleled the trend in waste. A relatively short time ago, “more power” was the mantra of computing—with no regard whatsoever to the burning of energy. As hardware prices dropped, hardware resources could be added and wasted without a thought, just as long as the system was kept up and running as efficiently as possible.

Today, most of us have realized the error of our ways. Form factors have shrunk to accommodate data and processes that used to require ten times as many resources. Virtualization technology has now evolved so that new servers can be launched without any extra hardware whatsoever. Computers have become far more energy efficient, so that the same or even more work can be accomplished using less power.

One area of waste in computer resource management still being addressed is the waste of I/O resources. This particular type of waste has significant ramifications, including wasted drive space, unnecessary hardware purchases, premature wear on hardware in general, and even wasted manpower.

Such waste comes about when files are left in a fragmented state and not properly optimized. Fragmentation is the splitting of files into thousands or even tens of thousands of pieces, done by the file system in order to better utilize drive space. Many extra I/O requests are required to read—and to write—files that exist in such a state. Processes take far longer, users have to wait, and performance generally suffers.
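To make that cost concrete, here is a minimal toy model in Python of reading a file laid out contiguously versus in many fragments. The seek-overhead and throughput figures are invented round numbers for illustration, not measurements of any real disk:

```python
# Toy model (illustrative numbers only): a mechanical disk streams data
# at ~100 MB/s once it is reading, but pays ~10 ms of seek/rotation
# overhead for every discontiguous extent it must jump to.

def read_time_seconds(file_size_mb, fragments, seek_ms=10, throughput_mb_s=100):
    transfer = file_size_mb / throughput_mb_s   # the same for any layout
    seeking = fragments * seek_ms / 1000        # grows with fragment count
    return transfer + seeking

for fragments in (1, 1_000, 50_000):
    t = read_time_seconds(2048, fragments)
    print(f"{fragments:>6} fragments: ~{t:6.1f} s to read a 2 GB file")
```

Under these assumptions, the contiguous file reads in about 20 seconds, while the same file in 50,000 fragments takes over eight minutes; the seeks, not the data transfer, dominate.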

While defragmentation was for a long time the only method of dealing with these problems, today’s complex technological innovations, along with enormous volume and file sizes and highly increased rates of fragmentation, make simple defragmentation outmoded. Optimization technology now exists to maximize performance while eliminating wasted disk I/O activity. A majority of fragmentation can now be prevented, making it a non-issue, and file optimization and other techniques are combined into a total solution.

With this technology utilized across enterprises, the waste of I/O resources can be eliminated altogether. It is one major step in our overall conservation of computer resources.

Wednesday, March 16, 2011

When “Free” Actually Costs More

This might have happened to you: You’re off on a resort vacation that you’ve looked forward to for months, in some tropical paradise. You’re dreamily exploring the resort on the first day, and you come across a person offering a free champagne brunch for you and your spouse. You think, great! A free high-end meal! When you get there, though, you realize how free it isn’t. The food is great—but you have to sit there for two hours listening to a sales pitch on condo sharing. And for the rest of your vacation, some salesman is pursuing you all over the resort, popping up every time you turn around, trying to close you on buying into his “great plan.” That “free brunch” was anything but—it put a serious damper on your dream vacation, which you paid substantial money to take.

In the world of computing, the same could be said for a free or inexpensive defrag utility. Yes, it costs little to nothing at the outset—but fragmentation is a serious performance problem, and soon you’re going to have to run that utility. The first problem you’ll have is that it needs to be scheduled. Most corporate systems need to be up and running constantly, so finding a time window in which to schedule defragmentation is a major problem. And the time that the system is offline is the first major cost of the free or inexpensive utility.

When you finally can schedule and run it, you find that it runs, and runs, and runs, consuming system resources the whole while, and never seems to actually complete a defrag job. If you add up the number of hours IT staff have spent trying to get useful work out of this utility, you’ll arrive at its second major cost.
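To put rough numbers on it, here is a back-of-envelope sketch in Python. Every figure in it (the downtime window, the staff hours, the hourly rates, the license price) is a hypothetical placeholder to swap for your own:

```python
# Back-of-envelope comparison of a "free" defrag utility versus a paid
# solution. All inputs below are hypothetical placeholder figures.

downtime_hours_per_month = 8      # offline window needed to run the utility
downtime_cost_per_hour = 500.0    # value lost while the system is unavailable
staff_hours_per_month = 6         # IT time spent scheduling and babysitting it
staff_cost_per_hour = 75.0

hidden_cost_per_year = 12 * (downtime_hours_per_month * downtime_cost_per_hour
                             + staff_hours_per_month * staff_cost_per_hour)

paid_license_per_year = 2_000.0   # hypothetical annual price of a paid solution

print(f'"Free" utility, hidden cost per year: ${hidden_cost_per_year:,.0f}')
print(f"Paid solution, license per year:      ${paid_license_per_year:,.0f}")
```

Under these invented assumptions, the “free” utility quietly costs more than $50,000 a year in downtime and staff time alone.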

There are third-party solutions to fragmentation that are far more efficient. In fact, technology has now evolved to the point that a majority of fragmentation can actually be prevented before it even occurs—completely automatically, and with no impact on system resources. The I/O resources required to defragment files after they have already been fragmented are saved, and peak performance is constantly maintained.

Now, compare the price of the free or “low-cost” utility to the third-party solution. There is an initial cost for the third-party solution—but once it is installed and running, fragmentation is basically a thing of the past. The net result: the bargain utility actually costs far more.

Wednesday, March 9, 2011

Overcoming Waste with Proper Optimization

In today’s business environment, “waste” is a particularly derogatory term. It can apply to money, material, use of personnel, and even production motion. Many innovations, such as lean manufacturing, exist to cut down on and prevent waste, and companies themselves often develop programs that curb waste in just about every business process.

In the particular area of computing, there are categories of waste that can go unnoticed. For example, IT staff hours—which do not come cheap—can be spent chasing problems that never seem to get solved. Backup and other processes running inefficiently waste both staff and system time. Hardware wears out and is replaced before its time, which is a material waste.

A closer look reveals that it is the lack of proper optimization that causes much of this waste. It all begins with wasted I/O traffic: far more I/Os than necessary are spent retrieving files that are in a fragmented or otherwise non-optimized state. Fragmentation means that a file exists in thousands or even tens of thousands of fragments—a natural state of affairs unless a solution is in place to prevent it.

If fragmentation is not prevented, and files are not optimized, a myriad of symptoms can result, such as dramatically slow performance, process freezes, system hangs, and even disk crashes. IT personnel can spend countless valuable hours chasing down these symptoms—but unless the actual root of the problem is located and addressed, those hours are wasted, as the problems won’t actually be solved.

This issue also prevents backups and other processes from completing in a timely manner. The result is a waste of user time—waiting for data or results—and a waste of system time, as these processes take far longer than they should.

Fragmentation also takes a serious toll on hardware and energy expenditures. Because so much extra I/O activity is required, hardware life can be cut by 50 percent or more.

Previously, defragmentation was the only method of dealing with such issues. Due to today’s complex technological innovations, however, along with enormous volume and file sizes and highly increased rates of fragmentation, simple defragmentation is no longer enough.

Optimization technology now exists to maximize performance while eliminating wasted disk I/O activity. A majority of fragmentation can now be prevented, making it a non-issue, and file optimization and other techniques are combined into a total solution.

Such technology delivers one very significant benefit for an enterprise: eliminating the waste.

Wednesday, March 2, 2011

The New Era of Fragmentation Prevention

Probably billions of words have been written on the subject of “reactive versus proactive” in many areas. These include money management, company management, marketing, and just about every important endeavor under the sun. If you can see changes coming on the stock market, you can act accordingly and profit regardless of which way it moves. If you can predict problems within your company, you can head off potential production crunches or work slowdowns before they occur. Even on a personal level, if you can set aside money on a regular basis, you can cushion yourself and your family against emergencies.

In the IT area, the topic of “proactive versus reactive” can be a pretty hot one, simply because of the number of “fires” that seem to present themselves on a regular basis. When the help desk phone is ringing off the hook, getting ahead of the curve can seem nearly impossible.

Interestingly, file fragmentation has been around nearly as long as computing, and it’s only recently that a truly proactive approach to it has been developed. Defragmentation has been utilized for many years as the “proactive” approach—and when that’s all there is, it’s the best anyone can do. The technology has become very advanced, and defragmentation can now run completely automatically, so it can’t be said that defrag is a totally “reactive” approach.

But a truly proactive approach to fragmentation would be to prevent it before it happens. That would save the I/O resources required to defragment files after they have already been fragmented, and would lay files down so the disk can be read smoothly right from the beginning.
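One simple, well-known form of prevention (offered here as an illustrative example, not as the mechanism of any particular product) is for an application to tell the file system a file’s final size before writing it, so the allocator can try to reserve one contiguous run of blocks up front. On POSIX systems, Python exposes this hint through os.posix_fallocate:

```python
import os

def write_preallocated(path, payload: bytes):
    """Reserve the file's full size before writing, giving the file
    system a chance to allocate one contiguous extent instead of
    growing the file piecemeal as data arrives (POSIX systems only)."""
    fd = os.open(path, os.O_CREAT | os.O_WRONLY | os.O_TRUNC, 0o644)
    try:
        os.posix_fallocate(fd, 0, len(payload))  # reserve space up front
        os.write(fd, payload)
    finally:
        os.close(fd)

write_preallocated("example.bin", b"\x00" * (16 * 1024 * 1024))  # 16 MB file
```

Whether the reserved extent actually ends up contiguous is ultimately up to the file system, which is one reason prevention done at the system level goes further than per-application hints.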

One could wonder why the major operating system developers, especially after the considerable years that have passed, have not built OSs in a way that they would not fragment files. It’s a worthy question—but you have to realize that one of an operating system’s prime functions is the efficient utilization of disk space. In that regard, fragmentation can be seen as a “feature, not a bug,” and not something the OS itself would set out to totally eliminate.
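A toy allocator makes that trade-off visible: to avoid wasting the small gaps left behind by deleted files, a space-efficient allocator will happily split a new file across several free runs. The sketch below is deliberately naive and is not how any real file system works:

```python
# Toy first-fit block allocator: '.' marks a free block, '#' a used one.
# Filling the small free gaps first uses space efficiently, but it
# splits the new 10-block file "A" into several fragments.

disk = list("####....##..######......####")

def allocate(disk, blocks_needed, mark):
    fragments = 0
    i = 0
    while blocks_needed and i < len(disk):
        if disk[i] == ".":
            fragments += 1                    # entering a new free run
            while blocks_needed and i < len(disk) and disk[i] == ".":
                disk[i] = mark
                blocks_needed -= 1
                i += 1
        else:
            i += 1
    return fragments

count = allocate(disk, 10, "A")
print("".join(disk))                          # ####AAAA##AA######AAAA..####
print(f"10-block file stored in {count} fragment(s)")
```

Any prevention technology has to beat this default behavior without giving up the space efficiency it exists to provide, which is the engineering challenge the paragraph below refers to.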

It has finally happened, though, that a fragmentation prevention utility has been developed and is now on the market. The challenge before developers was to prevent fragmentation in such a way that it did not impact resources in any way—and this considerable challenge has been met. The majority of fragmentation can now be prevented before it happens, completely automatically. For the first time, fragmentation can be made virtually a thing of the past for enterprises everywhere.

Now, we can truly say we have a “proactive” approach to fragmentation.