Neither the term "big data" nor its use is new. Increasingly, companies both large and small are adopting big data and the associated analysis approaches as a way to gain information that better supports their business and serves their customers.

Let's put today's data in perspective. One study estimated that by 2024, the world's enterprise servers will annually process the digital equivalent of a stack of books extending more than 4.37 light-years to Alpha Centauri, our closest neighboring star system in the Milky Way Galaxy. That's a lot of data to gather or analyze – let alone understand!

According to Gartner analyst Svetlana Sicular, "Big data is a way to preserve context that is missing in the refined structured data stores — this means a balance between intentionally 'dirty' data and data cleaned from unnecessary digital exhaust, sampling or no sampling. A capability to combine multiple data sources creates new expectations for consistent quality; for example, to accurately account for differences in granularity, velocity of changes, lifespan, perishability and dependencies of participating datasets. Convergence of social, mobile, cloud and big data technologies presents new requirements — getting the right information to the consumer quickly, ensuring reliability of external data you don't have control over, validating the relationships among data elements, looking for data synergies and gaps, creating provenance of the data you provide to others, spotting skewed and biased data."

With the use of big data becoming ever more important to businesses, it is all the more vital that they find a way to analyze the ever faster growing volume of disparate data coursing through their environments and give it meaning.
This article looks at four areas: getting the right information for your business, proving the value in IT to business, the importance of good analytics in business, and bridging the gap between IT and business.

Getting the Right Information for Your Business

Focusing on the right information by asking what's important to the business is key to obtaining better data context. In a presentation at the TeamQuest ITSO Summit this past June titled "The Data Driven Business of Winning," Mark Gallagher, Managing Director of CMS Motor Sports Ltd., shared how Formula One teams successfully analyze data to ensure the safety of drivers and win races.

Gallagher explained how a team of data engineers, analyzing reams of information in real time, can help make strategic decisions for the business during the race. "In 2014 Formula One, any one of these data engineers can call a halt to the race if they see a fundamental problem developing with the system like a catastrophic failure around the corner."

It comes down to the data engineers looking for anomalies. "99% of the information we get, everything is fine," Gallagher said. "We're looking for the data that tells us there's a problem or that tells us there's an opportunity." In a nutshell, it's about finding the anomalies that matter, in the context of the business problem being managed.
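To make the idea concrete, here is a minimal sketch of that kind of anomaly screening. It is not the teams' actual method; the window size, threshold, and telemetry values are illustrative assumptions.

```python
# Minimal anomaly-screening sketch: flag readings that deviate sharply from
# recent behavior. Metric values, window size, and threshold are assumptions
# for illustration, not figures from the article.
from statistics import mean, stdev

def find_anomalies(readings, window=50, z_threshold=3.0):
    """Return (index, value) pairs that sit far outside the recent baseline."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            anomalies.append((i, readings[i]))
    return anomalies

# Example: steady sensor values with one sudden spike.
telemetry = [100.0 + (i % 5) * 0.1 for i in range(200)]
telemetry[150] = 140.0  # injected fault
print(find_anomalies(telemetry))  # -> [(150, 140.0)]
```

In the 99% case nothing is flagged; the value of the screen is that the rare flagged reading gets a human decision quickly.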

A Formula One driver's steering wheel is essentially a laptop, providing the data needed to make the best decision available. Drivers can scroll through a 10-point menu – while driving – and adjust parameters that affect the performance of the vehicle. This works because the driver can get to the right data, at the moment it is needed, to achieve the desired outcome.

IT collects lots of data, shares the data that's important to its customer (the business), and together they use that data to gain an advantage and be successful in the marketplace.

Proving the Value in IT to Business

How can you prove the value of IT to the business? The ability to measure costs is key, but the ability to measure the business results that come from the use of IT services (private cloud environments, for example) is what drives better business conversations with IT management.

Focus on business goals and understand how the use of IT services contributes to business results; that understanding provides the best basis for planning future services. The majority of CIOs believe the IT department can increase the value it delivers to the organization by improving cost measurement.

Traditionally, IT costing efforts have been done, if at all, at a higher or macro level: for example, total capital costs for data center construction along with the associated annual operating costs for power, floor space, cooling, and the like, or yearly budgeting for server and storage resources based on forecasted business growth scenarios.

In today's distributed systems world, cost allocation of any type has been, in most cases, coarse at best. Sometimes IT costs are shared equally by all organizations using the infrastructure, but this approach leads, at best, to political tension and, at worst, drives organizational behavior toward acquiring resources outside the influence and control of IT policies and procedures.
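The difference between those two approaches is easy to show. The sketch below contrasts equal-share allocation with usage-weighted allocation; the cost figure, business units, and usage shares are invented for illustration.

```python
# Sketch contrasting equal-share cost allocation with usage-weighted
# allocation. All figures are hypothetical.
total_infrastructure_cost = 1_200_000  # assumed annual cost in dollars

# Assumed share of measured resource consumption per business unit.
usage = {"sales": 0.55, "marketing": 0.30, "hr": 0.15}

equal_share = {unit: total_infrastructure_cost / len(usage) for unit in usage}
usage_weighted = {unit: total_infrastructure_cost * share for unit, share in usage.items()}

for unit in usage:
    print(f"{unit:10s} equal: ${equal_share[unit]:>10,.0f}  usage-based: ${usage_weighted[unit]:>10,.0f}")

# A unit consuming 15% of resources pays 33% under equal sharing --
# exactly the kind of mismatch that pushes teams toward shadow IT.
```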

Most financial organizations have some type of asset database that includes information on all data center resources, when they were purchased, the price, some type of amortization schedule, and some level of annual operating expenses associated with these assets.

Typically this information is owned and controlled by the financial side of the organization. There is usually also some source of information that relates these assets to the business units, services, and/or applications they support.

Most IT Operations organizations have multiple tools (in most cases too many!) that monitor and measure the availability and performance of all IT technology resources. Furthermore, they have one or more sets of tools and approaches by which they measure their ability to successfully deliver service to their various lines of business as well as customers.

Most data center management teams have a fairly complete understanding of their data center floor: power capacity, equipment footprint layout, total cooling capacity, and costing information such as cost per square foot.

To date, these three disciplines have rarely operated in coordination with anything other than anecdotal, ad hoc, or manual communications. But there is a huge opportunity for added value through close collaboration, the goals of which should include:

Finance places a currency value on the business work that IT resources are actually accomplishing.
Data center management understands how much work the data center is or could support over time.
IT operations cost-effectively ensures the delivery of acceptable service within their ever declining budget constraints.
Each of these three main domains has a very large, multibillion-dollar solution market ecosystem built around optimizing use cases within that domain individually. For example, there are hundreds of server, storage, and network management and monitoring solutions for performance and availability management of IT resources.

There are many dozens of data center infrastructure management (DCIM) solutions for managing the physical data center, and there are a plethora of solutions for financial and asset management. All of these solutions were designed around use cases solely within their own domains and the needs of the associated end users, and are therefore capable of accepting only metrics and data sources from within those domains.

Until very recently, software solutions have not existed that would allow or facilitate more seamless and productive collaboration across these organizations in support of these goals. However, recent technologies in data access and analytics lend themselves to productively attacking this challenge of intelligent and proactive collaboration across these disciplines and tool sets.

The Importance of Good Analytics in Business

How can IT leaders take existing business information and make better-informed, more rapid decisions that allow them to truly optimize the cost and performance of their entire infrastructure? At the end of the day, the analytics use cases of IT are always going to be different from the analytics use cases of the business.

An example business use case of analytics would be something like: "We want to roll out a new marketing campaign in a new geography and we want to understand what a reasonable expectation of sales penetration will be, based on past campaign behaviors and similar demographics. We can correlate it with past sales activities and demographics and forecast that demand for our products or services will change by X amount."
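In code, that kind of forecast might look like the rough sketch below, using a simple least-squares fit. The campaign spend, demographic scores, and sales-lift numbers are made up for illustration and stand in for whatever historical data the business actually holds.

```python
# Rough sketch of the forecasting idea: fit past campaign spend and
# demographic match against observed sales lift, then predict the lift
# for a proposed campaign. All numbers are invented for illustration.
import numpy as np

# Past campaigns: [spend in $k, demographic-similarity score 0..1]
X = np.array([[120, 0.8], [90, 0.6], [200, 0.9], [60, 0.4], [150, 0.7]], dtype=float)
y = np.array([14.0, 9.0, 22.0, 5.0, 15.0])  # observed sales lift (%)

# Ordinary least squares with an intercept term.
A = np.hstack([X, np.ones((len(X), 1))])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

proposed = np.array([110, 0.75, 1.0])  # planned campaign in the new geography
print(f"Forecast sales lift: {proposed @ coeffs:.1f}%")
```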

While this example shows how a person in a business role will see the use of analytics, those in IT know there is much more to it.

The IT side needs to optimize IT costs in support of the business while ensuring a successful customer experience, measured by performance (and availability). IT are still the people who are going to do it: the people who have to decide how to set up configurations, how much resource is needed and when, and what rules to put in place around configuration automation, decommissioning, and all of the other associated technical details. Important questions must be asked and the answers understood: how will performance be delivered, how can throughput and response time be optimized, and how can that be done cost-effectively? All of that is still, at the end of the day, somebody in IT making a decision, and that decision must be informed by analyzing the right data, in a fashion aligned to the business.
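One way to picture that IT-side decision is the sizing sketch below: pick the smallest server count that keeps utilization under a response-time-safe ceiling, then look at what it costs. The throughput, utilization ceiling, and cost figures are assumptions for illustration, not measured values.

```python
# Sketch of the IT-side counterpart: choose the cheapest configuration
# that still protects response time. Capacity, cost, and the utilization
# ceiling are illustrative assumptions, not vendor figures.
import math

def servers_needed(peak_demand_tps, per_server_tps, max_utilization=0.7):
    """Smallest server count keeping utilization under the response-time-safe ceiling."""
    return math.ceil(peak_demand_tps / (per_server_tps * max_utilization))

forecast_peak_tps = 5_400        # from the business demand forecast (assumed)
per_server_tps = 400             # measured single-server throughput (assumed)
annual_cost_per_server = 6_500   # amortized capex + opex (assumed)

count = servers_needed(forecast_peak_tps, per_server_tps)
print(f"{count} servers, roughly ${count * annual_cost_per_server:,} per year")
# -> 20 servers, roughly $130,000 per year
```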

Bridging the Gap between IT and Business

An understanding of application performance must become deeply integrated with data center management tools and data if automatic provisioning of resources is to be simultaneously cost-effective and service-risk minimizing. Automated provisioning of storage, bandwidth, and computing power is a primary benefit of virtualization and a powerful capability of software-defined data centers (SDDCs).

But without integrated business intelligence, all that is likely to happen is that sub-optimal decisions will be implemented automatically, and more quickly than ever! Recurring or compounding inefficiencies quickly drain resources and can inflict damage. The bungled launch of Healthcare.gov is a cautionary tale here.

If advanced analytics had been intelligently applied in the planning and functional testing phases, the disastrous under-provisioning of resources to this nationally deployed service might have been avoided. It doesn't take a data scientist to understand the money, time, and political capital wasted as a result of what was, at its essence, a profound, yet preventable, disconnect between IT and business.

When teams and tools can work across silos, the synergy created becomes the basis for competitive advantage. Gathering good data streams (metrics that matter to both business and IT) and correlating them through powerful analytics will amplify bottom-line results.

By measuring and analyzing more than just power usage effectiveness (PUE), the focus of continuous optimization shifts to risk reduction, revenue growth, decreased capital and operating expenditures, and enhanced customer experience. What does it mean for a datacenter to be as efficient as possible according to the industry-standard PUE?

What are you getting for your use of that efficient power? How much work are you accomplishing? If all that power is going to servers that are not cost-effectively accomplishing useful work in service to the business, is that truly efficiency?
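PUE is simply total facility power divided by the power drawn by IT equipment, so it says nothing about whether the servers are doing useful work. The sketch below contrasts two hypothetical sites with identical PUE but very different work delivered per kilowatt; all the figures are invented for illustration.

```python
# PUE = total facility power / IT equipment power. This sketch shows why an
# identical PUE can hide very different business efficiency; the workload
# and power figures are hypothetical.
def pue(total_facility_kw, it_equipment_kw):
    return total_facility_kw / it_equipment_kw

def work_per_kw(transactions_per_hour, total_facility_kw):
    """Useful business work delivered per kilowatt of facility power."""
    return transactions_per_hour / total_facility_kw

site_a = {"total_kw": 1500, "it_kw": 1000, "tx_per_hour": 9_000_000}
site_b = {"total_kw": 1500, "it_kw": 1000, "tx_per_hour": 2_000_000}  # mostly idle servers

for name, s in (("A", site_a), ("B", site_b)):
    print(f"Site {name}: PUE {pue(s['total_kw'], s['it_kw']):.2f}, "
          f"{work_per_kw(s['tx_per_hour'], s['total_kw']):,.0f} transactions per kW per hour")

# Both sites report PUE 1.50, yet Site A delivers 4.5x the work per kilowatt.
```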

At the end of the day, nobody buys, builds or invests in a datacenter to move electricity around efficiently. They do it to get work done. Instead of only measuring PUE and the like, correlate efficiencies with work accomplished and end-user experience. All these components need to be optimized in a continuous, automated, and integrated manner. That's what true optimization across silos looks like.

Businesses can learn that they can be successful if they're able to look at the right data in combination with powerful analytics. It all comes back to this: Good data + powerful analytics = better business results.