Bio: Kevin Leahy is the chief architect of virtualization and cloud for Data Center Solutions (DCS) at Dimension Data Holdings, plc. In this role, Kevin is responsible for defining the tools and architecture Dimension Data implements to solve client business challenges using virtualization and cloud. He is also responsible for the data center solution roadmaps in these areas, spanning server, storage and desktop virtualization, as well as private cloud and professional services to help clients exploit public and hybrid cloud solutions. He works with Dimension Data global accounts to define strategic approaches to address their unique IT requirements.

An industry veteran based in New York, Kevin joined Dimension Data after a 32-year career with IBM, where career highlights included leading development of IBM's high-end networking and fiber-optic advanced technology, the mainframe channel and I/O subsystem, and Power Systems development for the corporation. He also launched the IBM WebSphere platform and introduced storage virtualization into the market. From there, he led the IBM virtualization strategy, followed by the IT Optimization strategy. His last role at IBM was leading the cloud sales strategy for services. He is a graduate of The Cooper Union in New York City with a degree in electrical engineering.


About Dimension Data: Founded in 1983, Dimension Data, plc is a specialist ICT services and solutions provider that uses its technology expertise, global service delivery capability, and entrepreneurial spirit to accelerate the business ambitions of its clients. The $4.7 billion company helps clients plan, build, support and manage their IT infrastructures. Dimension Data is a member of the NTT Group.


NTT Com: What are the core focus areas of Dimension Data? 

Leahy: Dimension Data operates in 51 countries around the world, focusing on several key areas. One is virtualization, both server virtualization and desktop; we are one of only two global VMware partners. Also, we spend a lot of time focusing on storage, which is a growing component of the data center budget. We have strong partnerships with EMC and NetApp, for example. Then, we also look closely at infrastructure. One of the emerging trends we're seeing is that many clients need help relocating data centers. Clients are rethinking their infrastructure—how they get from the current infrastructure to a new one, whether it's owned by them or someone else.

One thing that is new to our portfolio is cloud activities. Cloud has become a very prevalent part of many companies' IT strategy; one recent survey I saw said the cloud will account for one-third of IT budgets as we go forward, up substantially from just a year or two ago. Whether you think of cloud as just doing a better job of virtualizing or as taking advantage of the full pay-as-you-go model, it's a key strategy element for all infrastructure providers. So naturally, we're both growing Dimension Data's own cloud offerings, which we first introduced in January of this year, and working with our partners to build our clients' private cloud strategies and implementation projects.


NTT Com: As cloud computing becomes more popular, how does that affect decisions related to the data center?

Leahy: Cloud represents a critical juncture for data center strategy. Many clients have done a good job of virtualizing their physical resources, but many have never gotten the savings they sought. This is because, as they virtualized, they ended up with more virtual resources than the physical resources they started with, and most of their operational costs were tied to managing those resources. They have found that without a good automation approach, their costs actually went up.

When we really look at what cloud represents, it not only provides flexible delivery models. The cloud also uses automation to take the levels of labor required down, in many cases, by orders of magnitude. For an IT manager who has been struggling to get the savings he wanted from virtualization, the cloud represents an opportunity to reach that goal. He has to consider whether the cloud might allow him to reach that goal, either by leveraging cloud within his data center or by leveraging cloud outside his data center, using a somewhat hybrid model to optimize across the entire infrastructure. The cloud brings up a lot of questions that need to be rethought. Can I achieve this for myself? Do I have a better model for my business as an operating expense instead of a capital expense? Where are my users?


NTT Com: What has been the challenge of automating these processes to realize the cost benefits of virtualization? 

Leahy: A lot of times, people started in the cloud doing development and test. Where they have struggled is after getting through development and test, when they put the workload into production. This is when they had to face their real business processes. This includes service management processes and processes that ensure they are in compliance. They have to decide whether they can invest in automating these processes to drive savings. If I can technically provision a machine very quickly, but the financial authorization process requires three weeks of manual accounting, I'm not going to get to that level of automation. Cloud providers have pre-invested in this automation. Now it is available to the client, and they can look at their processes and stop investing in those that don't have a unique and critical business value. They can just buy these more standard processes in an automated manner from a cloud provider.


NTT Com: What sorts of business processes seem to be better suited for the cloud versus those that need to stay governed by more time-consuming business processes?

Leahy: Email is a great example of a workload that is easily tuned to take advantage of the automation that exists in the cloud. It is such a standard workload. This doesn't necessarily mean that email should always be run in a public infrastructure. For instance, in a pharmaceutical company, email is often the way they prove who invented a drug first. It ends up being the most critical business asset they have, meaning there are very different criteria for where it runs. But still the processes are the same. We are starting to get a more consistent view of the kinds of workloads that can be automated in a standard way. Other workloads need to be much more custom, and they tend to be the ones that differentiate the customer's business.


NTT Com: What role does the explosion in data that companies own have on the decisions they are making about how to run their data centers?

Leahy: In a way, we are putting "information" back in information technology. Big data is a big issue on the table for most clients. Many clients are spending five times more on servers than they did just a few years ago. Then, on top of this, they are looking for value in this data, ways to mine it for insight. This requires not just storage of the data, but also the ability to do smart things with the data.

If you went to a large financial client four years ago, their business process might have been to keep everything. They might have kept everything on tier one disks. I worked with a large company in Europe two years ago who, for example, kept an image of every new contract they wrote. Their storage growth was effectively driving them to build a new data center because the footprint was getting so large. I asked them what they did with old contracts after a new one had been signed. They were just keeping them. A simple policy decision that moved an old contract to a slower-speed disk once a new contract was written made it possible for them not to buy any more tier one disks for a full year.


This shows the power of making a thoughtful business decision and then automating the execution of that decision. In many cases, that data may not need to stay on the client's premises. You are starting to see businesses make decisions about when data is moved to archives, which may be in a cloud environment. A financial services firm I'm working with right now keeps data for seven days, then it is archived. We see similar examples in healthcare, where they keep radiology records for 30 days and then move them. They will keep this data for 10 years, but that doesn't mean it has to be stored in their data center for 10 years. You can get much cheaper, more space-efficient solutions as you move data to other storage options. The important point is that the technology is there. Companies have to start to decide which information has important business value and how they will smartly move that information.
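Policies like the ones described above reduce to a small set of rules evaluated against each record's age. As a rough illustration only (the tier names, record types, and function are hypothetical, not Dimension Data tooling; the 7-day, 30-day, and 10-year figures come from the examples in the interview), such a rule set might look like:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical retention rules modeled on the interview's examples;
# thresholds are illustrative, not a vendor product's defaults.
@dataclass
class RetentionRule:
    archive_after_days: int   # move off tier-one disk to cheaper storage
    retain_days: int          # end of the required retention period

RULES = {
    "financial_record":  RetentionRule(archive_after_days=7,  retain_days=10 * 365),
    "radiology_record":  RetentionRule(archive_after_days=30, retain_days=10 * 365),
}

def placement(record_type: str, created: date, today: date) -> str:
    """Return where a record should live today: primary, archive, or expired."""
    rule = RULES[record_type]
    age_days = (today - created).days
    if age_days >= rule.retain_days:
        return "expired"          # retention period over; eligible for deletion
    if age_days >= rule.archive_after_days:
        return "archive"          # automated move off premium disk
    return "primary"

today = date(2011, 6, 1)
print(placement("financial_record", today - timedelta(days=3), today))   # primary
print(placement("radiology_record", today - timedelta(days=45), today))  # archive
```

The point of the sketch is that once the business states the rule, the movement itself is a mechanical check a storage system can run automatically.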


NTT Com: Is the ability to automate the movement of data a new development?

Leahy: While the concepts of moving data are not new, the ability to automate that movement is. With some of the newer technology, you see the ability to put multiple tiers of information inside the same device and to automate movement between tiers. Other new technologies like de-duplication and WAN acceleration allow us to move 10 to 20 times less data for each backup than we did before. This makes moving data off premise a very viable solution now. In years past, the amount of data I needed to transport was astronomical. The network bandwidth never would have allowed it. Networks have gotten better, network technologies have accelerated the process, and de-duplication and compression have reduced the amount of data that needs to be sent. All of these things in concert move it from being possible but incredibly hard to being affordable and viable to do frequently.
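Some back-of-the-envelope arithmetic shows why that 10–20x reduction is decisive. The backup size and link speed below are assumed figures chosen for illustration; only the reduction factor range comes from the interview:

```python
# Transfer time for an off-premise backup, before and after de-duplication.
# Assumptions (not from the interview): a 10 TB backup over a 100 Mbps WAN link.
backup_bytes = 10 * 10**12      # 10 TB
link_bps = 100 * 10**6          # 100 Mbps
dedup_factor = 15               # within the 10-20x range cited

def transfer_hours(nbytes: int, bits_per_sec: int) -> float:
    """Idealized transfer time in hours, ignoring protocol overhead."""
    return nbytes * 8 / bits_per_sec / 3600

raw = transfer_hours(backup_bytes, link_bps)
deduped = transfer_hours(backup_bytes // dedup_factor, link_bps)
print(f"raw:     {raw:.1f} hours")      # ~222 hours: over nine days per backup
print(f"deduped: {deduped:.1f} hours")  # ~15 hours: fits an overnight window
```

At full size the backup cannot finish before the next one is due; after de-duplication the same link handles it overnight, which is the shift from "possible but incredibly hard" to routine.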


NTT Com: How comfortable are your clients with all the business process discussions that arise as a result of these technology improvements?

Leahy: They are much more comfortable with the technical decisions. But they are facing increased pressure from the business in terms of what IT needs to deliver from a business perspective. Aligning IT with the business objectives, along with cost and budget alignment to show how IT is supporting new business growth, has changed the way IT staff must communicate. The CIO has much more of a role in driving corporate strategy. I was working with a CIO in Korea a few years ago who told me that his CEO told him that the growth of the business was going to be completely enabled by IT. He viewed his role as growing the business. That is a different position than simply managing costs.

More than 70 percent of cloud activity is spurred by a business objective rather than the desire to cut costs. Building a new data center to support a new service takes a fair amount of time, from laying concrete, cabling, networking to making sure there is enough power available in the power grid. But providing network bandwidth to a cloud somewhere else can usually be done in short order. For IT people, both the role they’re playing within the organization and the expectation the organization has of them has changed.


NTT Com: How has this changed the discussions you have with your clients?

Leahy: I was talking with one large credit card client last summer. When we went in, we started to talk about the technology we could provide. He said he didn’t want to talk about that. He wanted to know what his industry peers were doing to leverage cloud and what he needed to do to lead the competition. Two years ago, they might have asked us about the best cost performance for a blade server for a specific workload. The entire conversation has changed.
