CTR Exclusives

What’s Your Server Migration MPO and MTO?


By Scott Van Dyke

The terms recovery point objective (RPO) and recovery time objective (RTO) are commonly used when planning how best to protect data and applications in disaster recovery scenarios. For proper disaster planning, a business determines the maximum acceptable level of data loss following an unplanned event (RPO). A responsible business will also determine the maximum amount of time it can tolerate without its critical applications (RTO). Together, RPO and RTO focus planning on how best to protect your critical applications and data in the event of a disaster.
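
The two objectives reduce to simple time comparisons. The sketch below is illustrative only (the function names and schedule are hypothetical, not from the article): the RPO check measures the data-loss window back to the last good backup, and the RTO check measures downtime forward to restored service.

```python
from datetime import datetime, timedelta

def meets_rpo(last_backup, incident, rpo):
    # Data-loss window: time elapsed since the last good backup.
    return (incident - last_backup) <= rpo

def meets_rto(incident, restored, rto):
    # Downtime: time from the outage until service is restored.
    return (restored - incident) <= rto

last_backup = datetime(2024, 1, 1, 2, 0)    # nightly backup at 02:00
incident    = datetime(2024, 1, 1, 14, 30)  # outage at 14:30
restored    = datetime(2024, 1, 1, 18, 30)  # service back at 18:30

print(meets_rpo(last_backup, incident, timedelta(hours=24)))  # True: 12.5 h loss window
print(meets_rto(incident, restored, timedelta(hours=4)))      # True: exactly 4 h downtime
```

A nightly backup can never satisfy an RPO tighter than 24 hours, which is why the choice of protection technology follows from the objective, not the other way around.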

One often overlooked area is what happens to a business’s critical applications and data during a server migration. A server migration typically arises for one of the following reasons:

Read more...
 

Tiering: Scale Up? Scale Out? Do Both -- Part 2


By Mark Ferelli

Our exclusive interview with Hitachi Data Systems’ Hu Yoshida continues in Part 2 of this feature. Click here to read Part 1.

MF/CTR: The more I hear it the more it sounds like the caching capability is at the foundation of the storage pool tiering infrastructure.

Hu/HDS: That’s right. It is the key, because if you have those caches in separate nodes and you can’t use those caches as one global cache, then you’ve got silos. Then you have to split the workload between those silos to be able to utilize them, and that’s more operational cost.

MF/CTR: If you have effective caching, then you’ll be able to set up a foundation to both scale up and scale out without having to go crazy in terms of capacity?

Read more...

Oversubscribed InfiniBand Fabrics Cut Data Center Costs


By Joseph Yaworski

Two factors that determine the cost and performance of a data center’s high-performance computing (HPC) cluster are the choice of network interconnect technology and the design of the fabric. While many people view Ethernet as the least expensive approach, and InfiniBand® as the higher-cost alternative, an InfiniBand fabric can be designed to cost less and perform better than 1-gigabit or 10-gigabit Ethernet (1GE or 10GE). Many applications benefit from InfiniBand’s low latency but do not need the full bandwidth it offers, and for those applications an oversubscribed InfiniBand fabric can be the best-performing and most cost-effective network solution.
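
Oversubscription is usually expressed as the ratio of host-facing (downlink) bandwidth to uplink bandwidth at each edge switch. The numbers below are a hypothetical illustration, not figures from the article: trading uplinks for host ports lets each edge switch serve more hosts and shrinks the core, at the cost of shared uplink bandwidth.

```python
def oversubscription_ratio(host_ports, uplink_ports):
    # Downlink-to-uplink bandwidth ratio at an edge switch,
    # assuming all ports run at the same link speed.
    return host_ports / uplink_ports

# Fully non-blocking: a 36-port edge switch split 18 host ports / 18 uplinks.
print(oversubscription_ratio(18, 18))  # 1.0  (1:1, non-blocking)

# Oversubscribed: the same switch split 24 host ports / 12 uplinks.
print(oversubscription_ratio(24, 12))  # 2.0  (2:1 oversubscription)
```

At 2:1, each edge switch serves a third more hosts and needs a third fewer core ports, which is where the fabric-level cost savings come from when applications do not need full bisection bandwidth.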

Read more...

Data Warehousing—Software as a Service?


By John K. Thompson

The advances we regularly record in technology seem to make new opportunities available almost daily. Those opportunities aren’t always apparent to the people who make the initial advances possible.

For instance, companies struggled for years with the question of which data to keep in their IT infrastructure after it was no longer immediately needed. The cost of storage was prohibitive, of course; keeping gigabytes of data was not seen as economically feasible, especially in an online implementation. So data was regularly trashed, or moved offline onto tape, which was also fairly expensive on a per-megabyte basis.

Read more...

Seven Questions to Ask Before Using Deduplication


By Steve Whitner

Deduplication has been one of the hottest technologies in the storage industry for almost three years. During that time, it has generated marketing wars, industry consolidation, and vendor controversy. IT managers in most midrange data centers typically have limited staff and few backup specialists, and it can be hard for them to figure out how deduplication might fit their situation. Following are important questions for IT managers to ask as they consider deploying deduplication in a midrange data center.

1. Is data deduplication now a mainstream technology?
Yes. Deduplication appliances have absolutely made the transition from experimental to mainstream. Analysts tell us that a little over 30% of IT departments use deduplication for at least part of their data, and vendors now offer products, a couple of technology generations along, that are optimized for simplified, non-disruptive deployment.

However, this doesn’t mean that every solution is equal. Most deduplication vendors go through a learning curve, so it pays to ask about experience, references, and support when evaluating solutions.
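
The core idea these appliances share can be sketched in a few lines. This is a minimal, hypothetical illustration of content-hash deduplication (the function names are mine, not from any vendor’s product): each block is indexed by its digest, so repeated blocks are stored once and a stream is rebuilt from a list of digests.

```python
import hashlib

def dedup_store(blocks, store=None):
    # Keep one copy of each unique block, indexed by its SHA-256 digest.
    store = {} if store is None else store
    recipe = []                           # digests needed to rebuild the stream
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # write the block only if unseen
        recipe.append(digest)
    return recipe, store

backup = [b"alpha", b"beta", b"alpha", b"alpha", b"gamma"]
recipe, store = dedup_store(backup)
print(len(backup), "blocks in,", len(store), "unique blocks stored")  # 5 blocks in, 3 unique
# The original stream is recoverable from the recipe:
assert [store[d] for d in recipe] == backup
```

Real products differ in how they carve blocks (fixed vs. variable size) and whether they deduplicate inline or post-process, which is exactly why the later questions about performance and deployment matter.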

Read more...