
Gaining Control of the Storage Environment

October 11th, 2006

IT organizations today face the difficult challenges of managing exploding data volumes, delivering high service levels, and mitigating business risks while at the same time keeping costs under control. As if that weren't enough, they must do all these things within a data center environment where complexity has grown out of control. This article examines the steps IT organizations can take to gain centralized control across their multi-platform server, storage, and application environments.

Over the past 10 years, organizations have gone from treating email as an alternative communications vehicle to depending on it as their most mission-critical application. According to the Enterprise Strategy Group, more than 60 percent of mid-tier and enterprise businesses rank email as the number one mission-critical business application for their organization.

According to some industry estimates, the volume of email that businesses are storing is increasing by more than 60% each year. An analysis conducted by the Radicati Group helps to put that figure in perspective.

The research firm has estimated that the average corporate email user sends and receives a total of 84 messages per day, and that the average size of a message without an attachment is about 22KB. By 2008, the firm estimates, an average corporate email user will process up to 15.8MB of data per day. For a company with 1,000 users, that works out to an average of 10GB per day, or 200GB per month.
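
To put those figures in concrete terms, the short Python sketch below projects monthly email storage from a per-user daily volume and an annual growth rate. The inputs (a 20-working-day month, roughly 10MB per user per day, 60 percent annual growth) are illustrative assumptions drawn from the figures above, not a reproduction of the Radicati analysis.

    # Back-of-the-envelope projection of email storage growth. The inputs are
    # illustrative assumptions based on the figures cited above, not Radicati data.

    def project_email_storage(users, mb_per_user_per_day, annual_growth, years,
                              working_days_per_month=20):
        """Project monthly email storage (in GB) per year, assuming the per-user
        daily volume compounds at the given annual growth rate."""
        projections = []
        daily_mb = mb_per_user_per_day
        for year in range(years + 1):
            monthly_gb = users * daily_mb * working_days_per_month / 1024
            projections.append((year, round(daily_mb, 1), round(monthly_gb, 1)))
            daily_mb *= 1 + annual_growth
        return projections

    # A 1,000-user company at roughly 10MB per user per day (the "10GB per day,
    # 200GB per month" figure above), with volume growing 60 percent per year.
    for year, per_user_mb, monthly_gb in project_email_storage(1000, 10, 0.60, 3):
        print(f"Year {year}: ~{per_user_mb}MB/user/day, ~{monthly_gb}GB stored per month")

Under those assumptions, the monthly storage requirement climbs from about 200GB to roughly 800GB within three years, which is the kind of trajectory that quickly overwhelms infrastructure sized for today's volumes.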

Of course, this growth in the volume of email coming into the corporate network brings a corresponding increase in hard costs, because it regularly exceeds the available capacity of traditional email gateway systems, mail transfer agents, email storage servers, groupware servers, and network bandwidth.

This explosive growth in data volumes comes at a time when the average enterprise data center is becoming increasingly complex. That's partly because organizations rarely buy all of their servers, routers, switches, and other network hardware and software from a single vendor at one time. If they did, they would be able to implement a truly end-to-end, homogeneous network with some form of centralized console for management and administration.

But as IT departments know all too well, networks have a way of evolving on their own. As Peter McKellar, a Symantec group product manager, recently told the trade publication Processor, networks grow over time, picking up and adding whatever piece makes the most sense or provides the best value at the time. The more the network grows, the more cumbersome it can be to manage and secure.

“For example, if a company uses servers from both Sun and HP, they need to use two different volume managers, two different file systems, two different clustering tools, etc.,” McKellar said. “For companies that have three or four different hardware vendors and dozens of different application vendors, the list of infrastructure software they must support becomes unmanageable.”

Benefits of centralized management

But what if organizations were able to gain more visibility and control over their data center storage environments? What if they were able to eliminate numerous point solutions and instead manage their storage infrastructure with one tool? Wouldn't they be in a better position to manage that explosive data growth, optimize storage hardware investments, and adapt to changing business requirements?
