Data Center

10+ years of 100% uptime for global video delivery provider

Can you imagine how great you would feel if your applications and websites never went down, even during maintenance windows? What if you could count on your servers, storage devices, and network equipment to always be powered on, plugged into the network, and running at the right temperature and humidity?

Global video delivery provider MobiTV does not have to imagine, because they have experienced it in real life for more than 10 years. Recently, MobiTV and RagingWire announced they had celebrated their 10th anniversary of zero downtime.

10+ years of 100% uptime for global video delivery provider MobiTV

MobiTV is a global leader in live and on-demand video delivery solutions. They collaborate with industry-leading media partners such as NBC, CBS, ESPN, and Disney to deliver cutting-edge “TV everywhere” solutions to mobile devices. They serve a global customer base of broadband/DSL operators, cable and IPTV operators, as well as wireless and over-the-top operators like AT&T, Deutsche Telekom, Sprint, T-Mobile, and Verizon.

When you think of demanding, infrastructure-intensive applications, MobiTV sits near the top of the pyramid. So MobiTV houses their IT systems in one of California’s largest and most reliable data centers, a 53MW campus from RagingWire.

Casey Fann, MobiTV’s vice president of operations and professional services, said, “RagingWire’s Sacramento data center campus is an ideal location for Bay Area companies looking for data centers with minimal earthquake risk, low-latency network access, and reduced power utility costs. By selecting RagingWire, MobiTV has a partner to facilitate the delivery of reliable, world-class live and on-demand media content with 100% uptime and impeccable customer service.”

You may find it surprising that, according to a January 2016 Ponemon Institute study, the average North American data center outage lasts 95 minutes, and the longest outage recorded was 415 minutes. The study found that 25% of these outages were caused by UPS system failures. It also measured the average cost of a data center outage at $740,357, with a maximum downtime cost of $2,409,991.

When you hear stats like this from Ponemon Institute, the ROI on uptime is compelling.
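To put those stats in concrete terms, here is a quick back-of-the-envelope calculation. The cost figures are the Ponemon averages quoted above; the outage frequencies are hypothetical assumptions for illustration only:

```python
# Expected annual outage cost using the Ponemon 2016 figures quoted above.
# The outages-per-year rates are hypothetical assumptions for illustration.

AVG_OUTAGE_COST = 740_357    # average cost per outage (USD, Ponemon 2016)
MAX_OUTAGE_COST = 2_409_991  # maximum observed cost per outage (USD)

def expected_annual_cost(outages_per_year: float, cost_per_outage: float) -> float:
    """Expected downtime cost per year = frequency x cost per event."""
    return outages_per_year * cost_per_outage

# If a facility suffers even one average outage per year...
print(f"1 avg outage/yr:   ${expected_annual_cost(1, AVG_OUTAGE_COST):,.0f}")
# ...or one worst-case outage every two years:
print(f"0.5 max outage/yr: ${expected_annual_cost(0.5, MAX_OUTAGE_COST):,.0f}")
```

Even one average outage a year puts roughly three-quarters of a million dollars at risk, which is the baseline against which any uptime investment should be weighed.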

See the rest of MobiTV’s amazing story about 100% uptime for 10 years


Tech Primer: Dark Fiber, Lit and Wavelengths

Some IT colleagues have asked me, “What is dark fiber and what’s the difference between lit and wavelengths?” Let’s begin by understanding the basic concepts of fiber optics and the difference between dark and lit fiber.

Difference between dark fiber and lit fiber

Unlike wire, which passes electricity through a metal conductor, fiber optic cables use specialized glass or plastic strands that transmit data over great distances by passing light through the fiber. Fiber that isn't currently being used and has no light passing through it is called dark fiber.

Using this fiber, telecommunications carriers can provide something called “wavelength” services, also known as “waves.” This works by splitting the light into various wavelength groups called colors or “lambdas.” Carriers sell these wavelengths to separate customers, then recombine the colors and transmit them across the same fiber. Therefore, lit fiber is fiber that a carrier has lit with light.

To better understand lit fiber’s wavelengths, think of a rainbow where each color is a channel of light. Remember Mr. "ROY G. BIV" from middle school – Red, Orange, Yellow, Green, Blue, Indigo, and Violet?

Wavelengths essentially split a single fiber into channels. Unlike copper wire, which uses an electrical signal, fiber optic communications use either a laser or an LED operating at a very high frequency. Fiber optic cables can carry much higher frequencies than copper cables. Traditional bandwidth throughput (1Gb/10Gb/40Gb/100Gb) will easily fit into a single color channel. Each fiber can be split into hundreds of colors, but a typical lit fiber is split into sixteen colors or lambdas.
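The aggregate capacity math is simple: channels times per-channel rate. Here is a minimal sketch using the illustrative figures from the paragraph above (a typical sixteen-lambda split, and "hundreds of colors" modeled as 160 channels for the sake of example):

```python
# Rough aggregate-capacity math for a lit fiber split into wavelength
# channels (lambdas). All figures are illustrative, per the text above.

def aggregate_capacity_gbps(channels: int, per_channel_gbps: int) -> int:
    """Total throughput of one fiber carrying `channels` lambdas."""
    return channels * per_channel_gbps

# A typical lit fiber: 16 lambdas, each carrying a 100 Gb/s signal.
typical = aggregate_capacity_gbps(16, 100)
print(f"16 lambdas x 100 Gb/s  = {typical:,} Gb/s")   # 1,600 Gb/s = 1.6 Tb/s

# Dense systems can split a fiber into hundreds of colors.
dense = aggregate_capacity_gbps(160, 100)
print(f"160 lambdas x 100 Gb/s = {dense:,} Gb/s")     # 16,000 Gb/s = 16 Tb/s
```

The takeaway: a single strand of glass, once split into lambdas, carries terabits per second of aggregate traffic.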

The business of fiber optics

In the late 1990s, there was an uptick in the number of carriers building out dark fiber networks. In addition, there was a high degree of inter-carrier trades – a practice where carriers would swap dark fiber assets with other carriers in order to gain a foothold in markets where they were not present or had limited capacity. Inter-carrier trades coupled with mergers and acquisitions allowed even the smallest of carriers to offer competitive data transport agreements around the world.

However, a significant portion of this built capacity remained untapped for years. Carriers wanted to avoid potential long-term lost telecommunications revenues and were reluctant to enable competitors in their high margin wavelength services market. In addition, carriers did not want to cannibalize their often oversubscribed and lucrative Ethernet services market with inexpensive high-capacity fiber. For these reasons, many carriers today still do not sell dedicated fiber assets directly to customers.

New demand for bandwidth

Technology needs have changed over time. Enterprises have become more dependent on cloud services, interconnected infrastructures have grown in number, and the Internet of Things (IoT) has grown massively. Together, these trends require a large data communications infrastructure that can scale rapidly, predictably, and on demand.

To fulfill these needs, dark fiber providers have entered the market and are working to provide massive bandwidth, low latency, and high quality connectivity to the end customer in the form of raw glass: dark fiber.

For additional information on the pros and cons of dark fiber versus lit services from carriers, read my blog post titled, “Shedding Light on Dark Fiber for Data Centers.”

White Paper and Webinar from Data Center Knowledge: “Strategic, Financial, and Technical Considerations for Wholesale Colocation”

One of the more interesting developments in the data center industry over the last few years has been the emergence of the wholesale data center market.

Think of wholesale data centers in the context of the traditional retail data center market. Wholesale data centers offer dedicated, multi-megawatt deployments spread out over large footprints of many thousands of square feet. These deployments are configured as secured vaults, private suites and cages, and entire buildings.

In fact, RagingWire has made a strategic shift into wholesale data center solutions as was reported in Data Center Knowledge in the article, “RagingWire Pursuing Cloud Providers with New Focus on Wholesale.”

While RagingWire has been a leader in wholesale data center solutions, we have not seen very much substantive analysis published on the wholesale space. So we decided to underwrite a research project with Data Center Knowledge to study wholesale colocation and publish a white paper and webinar entitled, “Strategic, Financial, and Technical Considerations for Wholesale Colocation.” Both the white paper and webinar are available free of charge.

You can watch/listen to the webinar by clicking here.

You can download the white paper by clicking here.

What will you learn from the white paper and webinar?

From a strategic perspective, there are a number of new applications, such as video, social media, mobile, big data, and content that are leading to new computing paradigms where the design, scale, and location of data centers become increasingly important.

The financial considerations point out how sales tax abatement, scale economics, and targeting top data center markets as part of your data center portfolio can be advantageous with wholesale data centers. For example, one customer of ours said that for every dollar they spend on colocation they spend $10 on computing equipment. Say you are spending $1 million on wholesale colocation leading to $10 million in equipment purchases. At 5% sales tax, that’s a savings of $500,000.  And equipment is often refreshed every 3-5 years!
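That back-of-the-envelope math can be sketched as follows. The 10x equipment multiplier and the 5% tax rate are the illustrative figures from the customer example above, not universal rules:

```python
# Sales-tax-abatement savings from the worked example above.
# Assumptions (illustrative, from the text): every $1 of colocation spend
# drives $10 of equipment purchases, and the sales tax rate is 5%.

def tax_abatement_savings(colo_spend: float,
                          equipment_multiplier: float = 10.0,
                          sales_tax_rate: float = 0.05) -> float:
    """Sales tax avoided on equipment purchases when the tax is abated."""
    equipment_spend = colo_spend * equipment_multiplier
    return equipment_spend * sales_tax_rate

savings = tax_abatement_savings(1_000_000)
print(f"$1M colocation -> ${savings:,.0f} in sales tax savings")

# Equipment refreshes every 3-5 years repeat the savings each cycle.
print(f"Over three refresh cycles: ${3 * savings:,.0f}")
```

Because the equipment is refreshed on a 3-5 year cycle, the abatement is not a one-time benefit: the savings recur with every refresh.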

Finally, the section on technical considerations studies power density, energy efficiency, PUE and ASHRAE standards, DCIM (Data Center Infrastructure Management), and maintenance. Each of these technical elements can have a significant impact on the performance/cost of your wholesale data center, and ultimately on your business.

RagingWire is proud to support this important research and pleased to share it with the industry.

In the Data Center Colocation Industry, Top Markets Matter

You may have seen some articles recently talking about the rise of secondary markets in the data center colocation industry. Cities often mentioned on a list of secondary data center markets include: Cleveland, Jacksonville, Kansas City, Milwaukee, Minneapolis, Nashville, and Pittsburgh.

For those of us involved in the data center industry, whether as suppliers or buyers, it is important to consider secondary markets as part of our data center strategy. But, it is also important to remember that top markets are top markets for a reason. The top six US markets (Northern Virginia, New York / New Jersey, Bay Area / Silicon Valley, Dallas-Fort Worth, Chicago, Los Angeles) represent over 70% of US sales, and the US represents over 50% of the world’s data center sales.

In short, in the data center colocation industry – top markets matter.

Let’s take a look at a few key considerations regarding data center colocation markets.

The Fundamentals of the Colocation Market are Sound

The good news is that whether you are looking at top markets, secondary markets, or a combination of the two, the fundamentals of the data center colocation industry continue to be strong.

Businesses of all sizes are taking advantage of the “pay as you go” model offered by colocation, shifting their financials from up-front capital expenses to ongoing operational expenses. Enterprises looking to replace aging in-house data centers or support the growth of their business applications are increasingly looking to colocation.  Cloud-based companies, both hosting and software applications, need a place for their systems to live. These cloud-based companies typically do not want to design, build, and operate their own data centers.

Economies of Scale Add Up

The data center colocation industry is vast, growing, and commoditizing all at the same time. This combination of attributes tends to drive economies of scale which can be more prevalent in top markets. The colocation providers that win in these market conditions have access to low-cost capital and then spread the infrastructure costs across a diverse and growing customer base. Scale economies are particularly strong in the wholesale colocation markets where multi-megawatt deployments are the norm.

Be Wary of Growth Rates

Most of the analysis of secondary markets talks about growth rates. As always, be wary of comparing growth rates across bases of different sizes. For example, the most recent report from 451 Research on data center supply in secondary markets lists Nashville as having 109,500 square feet of operational data center space. If the entire Nashville data center market grew by 50%, it would still not be as large as one of RagingWire’s data centers in the top market of Ashburn, Virginia.
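The arithmetic behind that caution is worth making explicit. In the sketch below, the Nashville figure is the 451 Research number quoted above; the top-market base size and both growth rates are hypothetical assumptions chosen purely to illustrate the base-size effect:

```python
# Why growth rates mislead across different base sizes.
# Nashville's 109,500 sq ft is the 451 Research figure quoted above;
# the top-market base and both growth rates are hypothetical assumptions.

def absolute_growth(base_sqft: float, growth_rate: float) -> float:
    """Square feet of capacity added in one period at the given rate."""
    return base_sqft * growth_rate

secondary_added = absolute_growth(109_500, 0.50)    # 50% growth, small base
top_market_added = absolute_growth(2_000_000, 0.10)  # 10% growth, large base

print(f"Secondary market, 50% growth: +{secondary_added:,.0f} sq ft")
print(f"Top market, 10% growth:      +{top_market_added:,.0f} sq ft")
# The 'slower' top market still adds far more absolute capacity.
```

A 50% growth rate on a small base adds less absolute capacity than a 10% growth rate on a large one, which is exactly why headline growth rates for secondary markets deserve scrutiny.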

“Competitive Mass” Helps Everyone

We’ve all heard of critical mass being required to get a market growing. The same concept can be applied to competition. Both buyers and suppliers benefit from having multiple providers of similar data center colocation offerings in the same market. Buyers benefit from having multiple options to compare, and the assurance of getting a fair price. Suppliers benefit from having access to the talent, technology, and potential customers that the market attracts.

It can be difficult to find “competitive mass” in secondary markets. For example, according to 451 Research, the top three providers in a secondary market typically have 50-70% of the operational space, and these market leaders vary from city to city. In addition, secondary markets tend to have lower utilization rates and absorption per year when compared to top markets, leading to reduced market efficiencies. According to 451 Research, top markets achieved utilization rates over 80% while secondary markets had an average utilization rate of 68% in 2014.

The Drivers for Secondary Markets: Regulations, Network Optimization, Economic Development

As we have seen, there are a number of forces driving the top data center colocation markets. What’s driving the secondary markets? Regulations can require that data generated within a geography stay in data centers within the geography. For example, some hospitals build on-site data centers as part of their HIPAA regulations. Network optimization might drive you to have a data center in a secondary market as part of your global footprint. Finally, economic development incentives might attract data center companies to build a facility in a secondary market.

Conclusion: Top Markets Matter

Top data center markets are critical as part of your data center portfolio. Top markets offer dense fiber and robust telecommunications, reliable and cost effective utility power, experienced data center staff, and an economic environment that enables a data center ecosystem to thrive.

We expect that top markets will continue to drive the data center colocation industry while secondary data center markets will develop as spokes to the top market hubs, not as a replacement.
