With a plan in place, schools can make sure that their investments in digital technology don’t go to waste.
Digitization has rocked the classroom experience and the education system as a whole with a flood of new technologies that promise to change the way teachers teach and students learn. From tablets and e-readers to interactive whiteboards and online learning programs, classroom technology has become a big business. According to the Center for Digital Education, U.S. elementary, middle, and high schools will spend almost $10 billion on technology in 2013, an increase of $4 billion since 2003.
Yet cutting-edge technology is so enticing that it’s easy to get caught up in the moment—acquiring new devices before truly understanding how they apply to the classroom and enhance the learning experience. The truth is, despite many successes, uncertainty remains about the efficacy of technology in the classroom, and acquiring and implementing it continues to challenge school systems. Without a plan in place, even the best technologies can go to waste.
That’s why today’s schools need a flexible, evergreen process in place that gives students and teachers access to technology while accounting for ever-changing, ever-improving technological advances. This paper examines how those involved in the classroom experience, from administrators to teachers to educational technology firms, can create a structured process for developing a technology strategy and a road map for 21st century learning.
Digitization could be a major opportunity for healthcare industry players—if they take the right steps.
Forecasting the future of any industry is difficult, and none is more difficult right now than healthcare in the United States. There are countless reasons why healthcare will look different in the near future, not least of which is the country’s movement toward national coverage. However, digital transformation—the cumulative change that comes when digital technologies are introduced wholesale into an established industry—is poised to have an even bigger impact. For the U.S. healthcare industry, digital technology will be transformational, cutting healthcare delivery costs, eliminating errors through improved electronic medical records, and establishing routinized, evidence-based approaches to treatment.
Digital forces are pulling at the industry and significantly altering services, products, innovation, delivery, and remuneration (see figure). There are digitally integrated healthcare providers, digital medical devices and technologies, and digital delivery and monitoring of home healthcare. In addition, new ideas are emanating from developing markets, agile competitors are embracing technology, and a digital-friendly federal administration is pushing innovation. And don’t forget the digital consumer who is used to digital banking, digital retailing, and digital education, and expects digital healthcare.
Digital represents a tremendous opportunity—and a significant threat—for the various participants in the U.S. healthcare industry. No one in this industry can afford to fall behind. This paper examines how the healthcare industry can capitalize on digitization.
Today, no company can be sure it is protected from hackers, hacktivists, or cyber warfare.
There is no such thing as absolute security these days. All companies are vulnerable, and it is prudent to assume that each is already the target of an ongoing, successful attack every second of every day. This is the new paradigm for the world of information security. The upshot is that information security cannot be treated purely as a technological issue; it also has an organizational and a strategic dimension.
Information security thus demands the attention of corporate management, with overall responsibility for information security established outside of the IT department. Whether the issue is taken seriously enough is often revealed by the positioning of the chief information security officer (CISO) within a company’s organization. While not usually a member of the board, the CISO or chief security officer should report directly to the board or at most through just one intermediate level. Bottom line: A comprehensively screened system and the right training can make infiltration a lot more difficult, and its detection easier.
Digitization has transformed the business world. One engine is driving it: information technology.
Today's companies not only rely on information technology, they can't compete without it. IT is integrated into products, it pulls priceless information from customer data, and it is the force behind online shopping. As digital becomes even more prevalent, the demand for IT will continue to rise, which in turn will place a heavier burden on IT organizations. A new A.T. Kearney study, which surveyed more than 150 chief information officers (CIOs) of large global enterprises and leading midsized companies, reveals that by 2020, two-thirds of companies expect to invest more in IT in all areas along the value chain, but especially in sales and customer interactions. As IT gets a bigger slice of the corporate budget, expectations will rise, and even the smallest miscalculation about IT priorities will put the whole company at risk. This paper discusses tomorrow's IT challenges and offers three strategic goals, encompassing 10 golden rules, to help CIOs balance increasing IT's value with lowering its costs.
A playbook that makes IT integration faster, better, and future-proof can ensure M&A success.
Information technology plays a crucial role in any M&A. When IT fails in M&A, the repercussions are felt far more deeply than the failure of any other function, not least because IT is the lifeline of nearly every aspect of business operations. Our global merger integration survey found that more than one-third of failed acquisitions are the result of unaddressed IT issues. IT issues can sink a perfectly good acquisition, cause post-deal crises, lead to acquisition aftershocks, and result in missed opportunities to innovate.
We believe a “core competency” approach to IT in a merger or acquisition is required not only to enable a business integration, but also to establish an IT organization fit for the future. Companies can achieve these dual objectives by creating their own IT M&A playbook.
The IT M&A playbook an organization develops must take into account each of these objectives, and it should be tailored to the organization’s specific context. All playbooks must involve IT early, address various acquisition scenarios, incorporate connectors with other business functions, accelerate the process, and live by the lessons learned.
The playbook approach turns integration “from an art to a methodical science.” The playbook allows companies to do it quick, do it right, and keep IT going during and after a merger.
Once the IT M&A playbook is formulated, its success depends on its active deployment and continuous improvement. We recommend several steps to embed the playbook within the organization and exploit its value fully:
- Sustain and improve
- Provide easy access
- Integrate across functions
- Train people
Adam Pressman, A.T. Kearney principal and author of “Engaging Multichannel Consumers,” discusses the importance of aligning with consumers' needs through multi-channel engagement and the critical role CIOs play in this transition.
With the right systems engineering framework, there is nothing to fear—and much to gain—from agile development.
Need to develop high-quality software quickly and efficiently? Whether you are a CXO or an engineer, chances are you are either considering agile software development or already in the process of adopting it.
Incremental and iterative, agile development involves breaking down what would have been a large, monolithic project into frequent, time-boxed releases that evolve in line with the end-user's requirements. Initially a small movement in the software community, agile is now becoming a major paradigm for software development. Recent surveys show that about one-third of all development is run with agile methods. The remaining two-thirds apply a "waterfall" development process (13 percent), gated processes other than waterfall (21 percent), or no formal process at all (26 percent). This is true across both large and small projects. Studies of agile implementations consistently show they result in increased focus on value for the customer, shorter time-to-market, lower development costs, and higher quality compared with a conventional software development approach.
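The incremental approach described above can be illustrated with a toy planning loop: a prioritized backlog is sliced into time-boxed releases, each fitting the team's delivery capacity, so value ships continuously rather than at the end of one monolithic project. This is a minimal sketch for illustration only; the story names, point estimates, and capacity figure are invented, and real agile planning is a team activity, not an algorithm.

```python
from dataclasses import dataclass, field

@dataclass
class Story:
    """A unit of end-user value, estimated in points (hypothetical examples)."""
    name: str
    points: int

@dataclass
class Release:
    iteration: int
    delivered: list = field(default_factory=list)

def plan_releases(backlog, capacity_per_iteration):
    """Greedily slice a prioritized backlog into time-boxed releases.

    Each iteration delivers as many top-priority stories as fit the
    team's capacity, mirroring the frequent, incremental releases
    that agile substitutes for one large, monolithic delivery.
    """
    releases, current, used, n = [], Release(1), 0, 1
    for story in backlog:
        if used + story.points > capacity_per_iteration and current.delivered:
            releases.append(current)       # close the full time box
            n += 1
            current, used = Release(n), 0  # open the next one
        current.delivered.append(story.name)
        used += story.points
    if current.delivered:
        releases.append(current)
    return releases

backlog = [Story("login", 5), Story("search", 8), Story("checkout", 8),
           Story("reviews", 3), Story("recommendations", 13)]
for r in plan_releases(backlog, capacity_per_iteration=13):
    print(f"Iteration {r.iteration}: {r.delivered}")
```

The point of the sketch is the shape of the process: each time box closes and ships whatever fits, and requirements discovered later simply enter the backlog for a future iteration.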
But a big challenge remains. Twenty-three percent of agile projects involve teams with more than 200 people, and agile development can be difficult to apply across large projects involving multi-site organizations, complex system interdependencies, and the integration of software and hardware development. Such issues have led large organizations in many industries—aerospace and defense, banking, and telecom, to name a few—to doubt the feasibility of agile implementation.
These challenges, however, can be overcome with the right systems engineering framework in place. When based on well-designed and stable system architectures, large-scale agile development projects will succeed.
Big data offers a big advantage for those who learn how to harness it.
Every day, 2.5 quintillion bytes of data are created, with 90 percent of the world’s data created in the past two years alone. Data production will be 44 times greater in 2020 than in 2009. The enormous data influx is straining IT infrastructures. Poor data management can cost up to 35 percent of a business’s operating revenue.
Big data promises to be transformative. As computing resources have evolved, companies stand to reap many more benefits from big data and analytics. Little wonder that big data is a hot topic in corporate boardrooms and IT departments, with many leading firms doing more than talking. According to a recent A.T. Kearney IT innovation study, more than 45 percent of companies have implemented a business-intelligence or big data initiative in the past two years. Further studies estimate more than 90 percent of Fortune 500 companies will have at least one big data initiative underway within a year. The effective use of this tidal wave can deliver substantial top- and bottom-line benefits. Building capabilities in this area will not only improve performance in traditional segments and functions, but also create opportunities to expand product and service offerings.
Yet few organizations have fully grasped what big data is and what it will mean for the future. The greatest rewards will go to those with a clear vision for how it can transform their organization, capabilities, and industry. The hardest part is knowing how to get started. Four questions should be asked at the beginning of this transformation journey:
- Where will big data and analytics create advantage for our company?
- How should we organize to capture the benefits of big data and analytics?
- What technology investments can enable the analytics capabilities?
- How do we get started on the big data journey?
- What business can learn from particle physics
In this companion interview to Big Data and the Creative Destruction of Today's Business Models, Fermilab physicist Rob Roser discusses physics' ties to big data.
Rob Roser has been on the front lines of one of the most exciting periods in the history of physics. In his role at Fermi National Accelerator Laboratory (Fermilab) just outside of Chicago, and as the leader of the Collider Detector at Fermilab (CDF) and technical liaison to Europe’s Organization for Nuclear and Particle Research (CERN), Roser led a team of scientists searching for evidence of the Higgs boson, also known as the “God Particle.” In January 2012, Roser was named the head of Fermilab’s Scientific Computing Division, which provides the facilities, tools, and programming necessary for scientists to conduct their experiments and analyze their findings.
Roser recently sat down with A.T. Kearney’s Christian Hagen, Khalid Khan, and Dan Wall to discuss the challenge of big data, the evolution of scientific computing and technology, and the identification and recruitment of needed talent. The interview accompanies their recently released issue paper, Big Data and the Creative Destruction of Today’s Business Models.
- The right way to think about network optimization
Networks are complicated, and managing them requires an expansive strategic imagination.
The conventional way of looking at a network is through the direct-value lens: How much does it cost to run the network? Networks are a great deal more complicated than that, and managing them—or, more fittingly, optimizing them—requires an expansive strategic imagination.
No matter what kind of network one manages—hospitality, retailing, banking, leisure, telecommunications, whatever it might be—once the network is built, it immediately begins its evolution. Even within a single local market, the network is evolving all the time. As the network goes through its life cycle, perspectives on sustaining it must change as well. The means for doing this are distilled in A.T. Kearney’s Network Optimization Tools, or KNOTs, comprising eight elements, each addressing a strategic dimension of the network.
The English word knots translates as les nœuds in French, or nodes. This is an apt image for thinking about the symbiosis of the local and the networked—the balance of savoir-faire métier and savoir-faire local, of the collective intelligence of the network and the specific intelligence of the individual.
Think of KNOTs not as a laundry list of best practices used to build an optimal network but as electrons—each one discrete and at the same time interacting around the nucleus. A national bank develops financial products centrally, but the local branch manager manages the relationship with customers. The national bank maintains good relations with the regulators while the branch manager cultivates the good will of the town mayor. A manufacturer’s leverage with suppliers may be directly proportionate to its number of plants, yet procurement is not only about concentrated volume. It is also about expertise the manufacturer owns in a multitude of categories and brings to bear in the local nodes of its network.
A sobering counterexample is the flameout of video retailer Blockbuster, which channeled its energies into adding thousands of stores and tens of thousands of employees in North America and Western Europe only to be caught off guard by competitors such as Netflix and the rapid adoption of streaming video. In hindsight, Blockbuster’s history suggests an unbalanced emphasis on its real estate network and not enough on the customer experience. The result was catastrophic.
We organize our network nomenclature into three types: production networks, service networks, and distribution networks. La Poste, for example, is a production network in that it operates like a factory producing a product: collecting and distributing mail. Taxi companies, railroads, and airlines are other good examples of production networks. The nodes in these networks are more than just infrastructure. One must own the nodes or there is no business to manage. Closely related to the production network is the service network, typified by telecommunications and hospitality. A hotel network, for instance, cannot deliver a night’s sleep over the Internet. The consumption of its product is done at the local level even though each node in the network is supported by the expertise of the whole. The service is the network.
A distribution network is retail in all senses of the word, especially in its tailoring of products to meet the needs of local customers. Distribution networks are high touch and in certain ways are the easiest networks to think about in terms of nodes. The most familiar example, literally the most concrete, is a brick-and-mortar retail chain. Find a Wal-Mart, and its distribution center will not be very far away.
Discussions about cloud computing are often obscured by IT jargon. Clarity reveals the cloud’s silver lining.
There is no question that cloud computing has made a name for itself. Companies worldwide are turning to public and private clouds for their IT needs. In fact, 80 percent of Fortune 1000 companies are expected to be using cloud-computing services as early as 2013, and 20 percent of them will do so without ever owning a single piece of hardware. By 2014, the value of public cloud services is expected to grow to $55.5 billion from $16.5 billion in 2009. That’s an annual growth rate of 27 percent—about five times the growth rate of IT services. The same type of technology resides within most companies. Often called a private cloud, it too is growing fast, influenced by the growth of public clouds and the new capabilities the cloud provides.
What is cloud computing? The most often cited definition comes from the U.S. National Institute of Standards and Technology: The cloud enables convenient, on-demand network access to a shared pool of computing resources—networks, servers, storage, applications, and services, among others. Several key features are crucial to the cloud offering and to today’s dynamic business environment.
Cloud computing represents a significant and needed step toward the “utility-ization” of IT services across all layers of the architecture stack, meaning IT functions are becoming more standardized and services-based. Just as no programmer would expect to develop code for windowing, scroll bars, and button clicks, many service components in the cloud, such as credit-card verification and billing, are like Lego blocks: pre-built and ready to be snapped into applications.
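The Lego-block idea can be made concrete with a small sketch: an application snaps pre-built service components into a workflow behind a uniform interface instead of writing each capability from scratch. The component names and classes below are hypothetical illustrations (no real cloud provider's API is implied), and the verification logic is a local stub standing in for a hosted service.

```python
from abc import ABC, abstractmethod

class CloudComponent(ABC):
    """Uniform interface every snapped-in service component exposes."""
    @abstractmethod
    def invoke(self, payload: dict) -> dict: ...

class CardVerification(CloudComponent):
    """Stand-in for a hosted card-verification service (hypothetical)."""
    def invoke(self, payload):
        # Production code would call the remote service; here we stub
        # the check as "a 16-digit card number verifies".
        ok = len(payload.get("card_number", "")) == 16
        return {"verified": ok}

class Billing(CloudComponent):
    """Stand-in for a hosted billing service (hypothetical)."""
    def invoke(self, payload):
        return {"invoiced": payload["amount"]}

def checkout(order, components):
    """Compose pre-built components rather than reimplementing them."""
    if not components["verify"].invoke(order)["verified"]:
        return {"status": "declined"}
    bill = components["bill"].invoke(order)
    return {"status": "paid", "invoiced": bill["invoiced"]}

result = checkout(
    {"card_number": "4111111111111111", "amount": 42.0},
    {"verify": CardVerification(), "bill": Billing()},
)
print(result)  # {'status': 'paid', 'invoiced': 42.0}
```

The design point is the uniform `invoke` interface: because every component is pre-built and interchangeable, the application assembles capabilities the way a programmer assembles scroll bars and button clicks, rather than coding them from scratch.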
Access to the cloud is fast and inexpensive. Cloud computing does not require large capital investments in infrastructure, technology, applications, or platforms. The cloud’s architecture offers flexibility, which is essential for accommodating sudden shifts in demand, and is accessible from a broad range of devices. Companies in the cloud can respond faster and more effectively to business needs and customers, which by extension often means improved productivity, more innovation, and faster go-to-market strategies.
So far, the cloud has been deployed primarily in areas where security and compliance concerns are minimal, such as email, backup, storage, and testing. In the next few years, as standards are established around data security and storage, we expect to see the cloud in many other areas—and, not surprisingly, creating more profitable businesses.
As knowledge of ERP systems has grown so too have the misconceptions. It is time to set the record straight.
Careers and fortunes have been built and lost because of enterprise resource planning (ERP) systems, and in the process, some reliable advice has emerged on how best to address the challenges and capture the benefits of a standardized system. However, as knowledge has grown, so too have the misconceptions. Many of the myths from earlier days persist today and often dissuade potential purchasers of ERP systems. In this paper, the authors highlight six ERP myths generally accepted as fact and recommend ways to regain control of the ERP journey.
Blanca Treviño, CEO of Softtek, discusses the future of global IT outsourcing and how Mexico can improve its competitive advantage.
A passionate advocate of service over labor arbitrage, Blanca Treviño has focused Softtek on adeptly undertaking complex projects and providing value, not just low costs. She is also a vocal proponent of sponsorship as a means of helping small- and medium-sized Mexican companies in the sector gain a toehold. At a recent A.T. Kearney conference on global competitiveness, Ricardo Haneine and Rodrigo Slelatt interviewed Mrs. Treviño for Executive Agenda to discuss the future of global IT outsourcing, the means by which Mexico can improve its competitiveness, and her aspirations for Softtek.