Today, no company can be sure it is protected from hackers, hacktivists, or cyber warfare.
Digitization has transformed the business world. One engine is driving it: information technology.
Today's companies not only rely on information technology, they can't compete without it. IT is integrated into products, it pulls priceless information from customer data, and it is the force behind online shopping. As digital becomes even more prevalent, the demand for IT will continue to rise, which in turn will place a heavier burden on IT organizations. A new A.T. Kearney study, which surveyed more than 150 chief information officers (CIOs) of large global enterprises and leading midsized companies, reveals that by 2020, two-thirds of companies expect to invest more in IT in all areas along the value chain, but especially in sales and customer interactions. As IT gets a bigger slice of the corporate budget, expectations will rise, and even the smallest miscalculation about IT priorities will put the whole company at risk. This paper discusses tomorrow's IT challenges and offers three strategic goals encompassing 10 golden rules to help CIOs find the balance between increasing IT's value and lowering its costs.
A playbook that makes IT integration faster, better, and future-proof can ensure M&A success.
Information technology plays a crucial role in any M&A. When IT fails in M&A, the repercussions are felt far more deeply than the failure of any other function, not least because IT is the lifeline of nearly every aspect of business operations. Our global merger integration survey found that more than one-third of failed acquisitions are the result of unaddressed IT issues. IT issues can sink a perfectly good acquisition, cause post-deal crises, lead to acquisition aftershocks, and result in missed opportunities to innovate.
We believe a “core competency” approach to IT in a merger or acquisition is required not only to enable a business integration, but also to establish an IT organization fit for the future. Companies can achieve these dual objectives by creating their own IT M&A playbook.
The IT M&A playbook an organization develops must take into account each of these objectives, and it should be tailored to the organization’s specific context. All playbooks must involve IT early, address various acquisition scenarios, incorporate connectors with other business functions, accelerate the process, and live by the lessons learned.
The playbook approach turns integration “from an art to a methodical science.” The playbook allows companies to do it quick, do it right, and keep IT going during and after a merger.
Once the IT M&A playbook is formulated, its success depends on its active deployment and continuous improvement. We recommend several steps to embed the playbook within the organization and exploit its value fully:
- Sustain and improve
- Provide easy access
- Integrate across functions
- Train people
Adam Pressman, A.T. Kearney principal and author of “Engaging Multichannel Consumers,” discusses the importance of aligning with consumers' needs through multi-channel engagement and the critical role CIOs play in this transition.
With the right systems engineering framework, there is nothing to fear—and much to gain—from agile development.
Need to develop high-quality software quickly and efficiently? Whether you are a CXO or an engineer, chances are you are either considering agile software development or already in the process of adopting it.
Incremental and iterative, agile development involves breaking down what would have been a large, monolithic project into frequent, time-boxed releases that evolve in line with the end-user's requirements. Initially a small movement in the software community, agile is now becoming a major paradigm for software development. Recent surveys show that about one-third of all development is run with agile methods. The remaining two-thirds apply a "waterfall" development process (13 percent), gated processes other than waterfall (21 percent), or no formal process at all (26 percent). This is true across both large and small projects. Studies of agile implementations consistently show they result in increased focus on value for the customer, shorter time-to-market, lower development costs, and higher quality compared with a conventional software development approach.
But a big challenge remains. Twenty-three percent of agile projects involve teams with more than 200 people, and agile development can be difficult to apply across large projects involving multi-site organizations, complex system interdependencies, and the integration of software and hardware development. Such issues have led large organizations in many industries—aerospace and defense, banking, and telecom, to name a few—to doubt the feasibility of agile implementation.
These challenges, however, can be overcome with the right systems engineering framework in place. When based on well-designed and stable system architectures, large-scale agile development projects will succeed.
Big data offers a big advantage for those who learn how to harness it.
Every day, 2.5 quintillion bytes of data are created, with 90 percent of the world’s data created in the past two years alone. Data production will be 44 times greater in 2020 than in 2009. The enormous data influx is straining IT infrastructures. Poor data management can cost up to 35 percent of a business’s operating revenue.
Big data promises to be transformative. As computing resources have evolved, companies stand to reap many more benefits from big data and analytics. Little wonder that big data is a hot topic in corporate boardrooms and IT departments, with many leading firms doing more than talking. According to a recent A.T. Kearney IT innovation study, more than 45 percent of companies have implemented a business-intelligence or big data initiative in the past two years. Further studies estimate more than 90 percent of Fortune 500 companies will have at least one big data initiative underway within a year. The effective use of this tidal wave of data can deliver substantial top- and bottom-line benefits. Building capabilities in this area will not only improve performance in traditional segments and functions, but also create opportunities to expand product and service offerings.
Yet few organizations have fully grasped what big data is and what it will mean for the future. The greatest rewards will go to those with a clear vision for how it can transform their organization, capabilities, and industry. The hardest part is knowing how to get started. Four questions should be asked at the beginning of this transformation journey:
- Where will big data and analytics create advantage for our company?
- How should we organize to capture the benefits of big data and analytics?
- What technology investments can enable the analytics capabilities?
- How do we get started on the big data journey?
What business can learn from particle physics
In this companion interview to Big Data and the Creative Destruction of Today's Business Models, Fermilab physicist Rob Roser discusses physics' ties to big data.
Rob Roser has been on the front lines of one of the most exciting periods in the history of physics. In his role at Fermi National Accelerator Laboratory (Fermilab) just outside of Chicago, and as the leader of the Collider Detector at Fermilab (CDF) and technical liaison to Europe’s Organization for Nuclear and Particle Research (CERN), Roser led a team of scientists searching for evidence of the Higgs boson, also known as the “God Particle.” In January 2012, Roser was named the head of Fermilab’s Scientific Computing Division, which provides the facilities, tools, and programming necessary for scientists to conduct their experiments and analyze their findings.
Roser recently sat down with A.T. Kearney’s Christian Hagen, Khalid Khan, and Dan Wall to discuss the challenge of big data, the evolution of scientific computing and technology, and the identification and recruitment of needed talent. The interview accompanies their recently released issue paper, Big Data and the Creative Destruction of Today’s Business Models.
The right way to think about network optimization
Networks are complicated, and managing them requires an expansive strategic imagination.
The conventional way of looking at a network is through the direct-value lens: How much does it cost to run the network? Networks are a great deal more complicated than that, and managing them—or, more fittingly, optimizing them—requires an expansive strategic imagination.
No matter what kind of network one manages—hospitality, retailing, banking, leisure, telecommunications, whatever it might be—once the network is built, it immediately begins its evolution. Even within a single local market, the network is evolving all the time. As the network goes through its life cycle, perspectives on sustaining it must change as well. The means for doing this are distilled in A.T. Kearney’s Network Optimization Tools, or KNOTs, comprising eight elements, each focused on a strategic element of the network.
The English word knots translates into French as les nœuds, which also means nodes. This is an apt image for thinking about the symbiosis of the local and the networked—the balance of savoir-faire métier and savoir-faire local, of the collective intelligence of the network and the specific intelligence of the individual.
Think of KNOTs not as a laundry list of best practices used to build an optimal network but as electrons—each one discrete and at the same time interacting around the nucleus. A national bank develops financial products centrally, but the local branch manager manages the relationship with customers. The national bank maintains good relations with the regulators while the branch manager cultivates the good will of the town mayor. A manufacturer’s leverage with suppliers may be directly proportionate to its number of plants, yet procurement is not only about concentrated volume. It is also about expertise the manufacturer owns in a multitude of categories and brings to bear in the local nodes of its network.
A sobering counterexample is the flameout of video retailer Blockbuster, which channeled its energies into adding thousands of stores and tens of thousands of employees in North America and Western Europe only to be caught off guard by competitors such as Netflix and the rapid adoption of streaming video. In hindsight, Blockbuster’s history suggests an unbalanced emphasis on its real estate network and not enough on the customer experience. The result was catastrophic.
We organize our network nomenclature into three types: production networks, service networks, and distribution networks. La Poste, for example, is a production network in that it operates like a factory producing a product: collecting and distributing mail. Taxi companies, railroads, and airlines are other good examples of production networks. The nodes in these networks are more than just infrastructure. One must own the nodes or there is no business to manage. Closely related to the production network is the service network, typified by telecommunications and hospitality. A hotel network, for instance, cannot deliver a night’s sleep over the Internet. The consumption of its product is done at the local level even though each node in the network is supported by the expertise of the whole. The service is the network.
A distribution network is retail in all senses of the word, especially in its tailoring of products to meet the needs of local customers. Distribution networks are high touch and in certain ways are the easiest networks to think about in terms of nodes. The most familiar example, literally the most concrete, is a brick-and-mortar retail chain. Find a Wal-Mart, and its distribution center will not be very far away.
Discussions about cloud computing are often obscured by IT jargon. Clarity reveals the cloud’s silver lining.
There is no question that cloud computing has made a name for itself. Companies worldwide are turning to public and private clouds for their IT needs. In fact, 80 percent of Fortune 1000 companies are expected to be using cloud-computing services as early as 2013, and 20 percent of them will do so without ever owning a single piece of hardware. By 2014, the value of public cloud services is expected to grow to $55.5 billion from $16.5 billion in 2009. That’s an annual growth rate of 27 percent—about five times the growth rate of IT services. The same type of technology resides within most companies. Often called a private cloud, it too is growing fast, influenced by the growth of public clouds and the new capabilities the cloud provides.
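As a quick sanity check on the growth figures above, the 27 percent rate follows directly from the two market-size estimates, assuming simple compound annual growth over the five years from 2009 to 2014:

```python
# Compound annual growth rate (CAGR) implied by the market-size figures
# quoted in the text: $16.5B in 2009 growing to $55.5B in 2014.
start_value = 16.5   # public cloud services market, 2009 ($B)
end_value = 55.5     # projected market, 2014 ($B)
years = 5            # 2009 -> 2014

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied annual growth rate: {cagr:.1%}")  # ≈ 27%, as the text states
```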
What is cloud computing? The most often cited definition comes from the U.S. National Institute of Standards and Technology: The cloud enables convenient, on-demand network access to a shared pool of computing resources—networks, servers, storage, applications, and services, among others. Several key features are crucial to the cloud offering and to today’s dynamic business environment.
Cloud computing represents a significant and needed step toward the “utility-ization” of IT services across all layers of the architecture stack, meaning IT functions are becoming more standardized and services-based. Just as no programmer today would expect to write code for windowing, scroll bars, and button clicks from scratch, many service components in the cloud, such as credit-card verification and billing, are like Lego blocks: pre-built and ready to be snapped into applications.
Access to the cloud is fast and inexpensive. Cloud computing does not require large capital investments in infrastructure, technology, applications, or platforms. The cloud’s architecture offers flexibility, which is essential for accommodating sudden shifts in demand, and is accessible from a broad range of devices. Companies in the cloud can respond faster and more effectively to business needs and customers, which by extension often means improved productivity, more innovation, and faster go-to-market strategies.
So far, the cloud is primarily deployed in areas with minimal concerns, such as email, backup, storage, and testing. In the next few years, as standards are established around data security and storage, we expect to see the cloud in many other areas—and, not surprisingly, creating more profitable businesses.
As knowledge of ERP systems has grown so too have the misconceptions. It is time to set the record straight.
Careers and fortunes have been built and lost because of enterprise resource planning (ERP) systems, and in the process, some reliable advice has emerged on how best to address the challenges and capture the benefits of a standardized system. However, as knowledge has grown, so too have the misconceptions. Many of the myths from earlier days persist today and often dissuade potential purchasers of ERP systems. In this paper, the authors highlight six ERP myths generally accepted as fact and recommend ways to regain control of the ERP journey.
Blanca Treviño, CEO of Softtek, discusses the future of global IT outsourcing and how Mexico can improve its competitive advantage.
A passionate advocate of service over labor arbitrage, Blanca Treviño has focused Softtek on adeptly undertaking complex projects and providing value, not just low costs. She is also a vocal proponent of sponsorship as a means of helping small- and medium-sized Mexican companies in the sector gain a toehold. At a recent A.T. Kearney conference on global competitiveness, Ricardo Haneine and Rodrigo Slelatt interviewed Mrs. Treviño for Executive Agenda to discuss the future of global IT outsourcing, the means by which Mexico can improve its competitiveness, and her aspirations for Softtek.
The value of good capital management is almost priceless, especially in high-spend industries such as telecommunications and utilities.
Managing multibillion-dollar capital expenditures (capex) is a balancing act where it is easy to lose sight of the basics due to the intricacies of the allocation process, the internal politics, or the complexity of the business case. This last distraction is almost always dealt with by quantifying every aspect of the business case, which may give the impression of "managing all the details" but in reality often results in at least three symptoms of poor capital management. The first symptom is failure to prioritize—when capital investments are not linked to corporate strategy and financial targets, it is almost impossible to capture the required level of returns across the portfolio. The second symptom is loss of accountability—when accountabilities are not clearly defined, followed, or enforced, and reviews are not conducted, no one owns the outcome. The third symptom is poor visibility—without a corporate-wide reporting structure there is limited visibility into spending and even less control of the investment portfolio. As cost overruns mount and projects slow down, the economics of the original investment case are often lost. Any one of these symptoms can lead to poor returns. However, four principles can lessen the chance of a fall.
High-growth companies are often ahead of the curve when it comes to IT innovation.
For more than a decade, A.T. Kearney has studied clients and organizations worldwide to determine what drives innovation in IT and how to realize the benefits. Our studies of technological innovation began during the go-go days of the Internet boom and continue to this day. We have seen IT innovation as a standalone industry killer and as an integrated tool that helps orchestrate and enable strategic shifts. One thing that has not changed is the belief among executives that more aggressive technology adoption and investment is a clear path toward growth and profits: A.T. Kearney's 2012 study finds that high-growth companies are frequently ahead of the curve when it comes to IT innovation.
A.T. Kearney defines IT innovation as game-shifting, technology-enabled strategies—the kind that create sustainable competitive advantage or lead to significant cost reductions. A common goal of IT innovation is the delivery of new solutions to customers, products, or supply chains—for example, analytics that predict customers' needs or offer new ways for customers to find and order products or services. Innovation in products includes how automakers design vehicles to improve lifestyles by integrating cars with drivers' mobile devices, homes, and offices. Other examples include supply chains that are continually being transformed by IT as forecasting and tracking improve both the delivery of goods and the monitoring of assets.
Our 2012 study yields several noteworthy findings. For one, executives at major global companies continue to recognize the value of IT innovation and believe they should allocate a larger portion of their IT budgets to innovation. Yet most do not, especially when compared with high-growth companies that invest more in technology innovation and do so earlier in the technology life cycle. These companies rely less on large service providers for innovation than their medium- and low-growth counterparts, and instead build internal IT processes and operating structures to develop their own IT innovation capabilities. Indeed, they look to their IT organizations to drive revenue, not just to reduce costs or streamline operations.
What do these findings mean for the average company? That growth and innovation can be generated in the IT organization, but only if the IT organization gets the necessary funding and resources and steps up to the challenge.
Best-of-suite applications are reclaiming their dominance as the preferred choice for specialized applications.
Companies always face tough decisions, and one of the toughest—and often most contentious—is choosing the right enterprise software solution.
The business and information technology (IT) organizations within a company are often at odds. Business wants software that supports current processes and delivers new features, functions, and interfaces. The IT group wants a simple, easy-to-maintain portfolio of software applications with little or no customization. With disparities such as these, it is little wonder that the software-selection process becomes contentious.
Such hassles can be minimized by analyzing and evaluating three aspects of software: functionality, complexity, and vendor management.