“I don’t know what we mean when we say we’re going after AI. Do you?”
“We don’t change to adapt to new technologies anyway… We just push them into our current paradigm.”
“I don’t understand what we’re supposed to be doing right now!”
Twenty officers sit around a table, enmeshed in the awkwardness of an “adaptive leadership” workshop. This framework, developed by Ronald Heifetz and colleagues at Harvard Kennedy School, is designed to help organizations make progress on complex, collective challenges known as “adaptive” challenges. Unlike “technical” problems that can be solved with existing know-how, adaptive challenges require learning and change—adjustment—by the participants themselves.
Digital transformation poses an adaptive challenge for the Department of Defense. As long as the Department of Defense relies on painless, “technical” solutions—what Steve Blank calls “innovation theater”—America will become increasingly vulnerable to exploitation by foreign adversaries, at a cost of both dollars and lives. To make progress on the challenge of digital transformation—and to maintain technological superiority—the Department of Defense must examine and reshape its ingrained values, habits, beliefs, and norms.
The officers in the workshop are a prime example of a group struggling to adapt. As in many groups, they begin by looking outward. One says, “It’s the ‘frozen middle’ that prevents us from doing anything digital,” while another adds, “Our bosses can’t agree on what they want anyway. … What should we do?” The facilitator nudges them: “It seems the group is shifting responsibility anywhere but here. What makes it difficult to look inside?”
Next, the officers drift away from the challenge. They share stories of past successes, critique the facilitator’s references, and make jokes about the workshop itself. Again the facilitator intervenes: “I notice we’re avoiding uncertainty. Can we stay longer in the nebulous space of ‘digital transformation’? Or will we escape the moment it becomes unclear how to proceed?”
They reluctantly return to digital transformation, but after a few minutes they ask the facilitator for help: “Aren’t you going to jump in here…?” The facilitator declines: “You’re asking me to solve a problem that can only be tackled together—by all of you.”
At this point, the room simmers with frustration. But you can’t blame the officers. Their attempts to avoid the work of adjustment—by diverting attention from the problem and shifting responsibility for it elsewhere—are typical of groups faced with a difficult reality.
More specifically, in what Heifetz calls “classic failure,” groups attempt to solve adaptive challenges through “technical fixes”: painless attempts to apply existing know-how rather than working with stakeholders to change the way they work.
Hiring someone, firing someone, increasing the budget, extending the schedule, creating a committee, restructuring the organization, developing a new tool, enforcing a new policy: these are all technical fixes that, while not inherently harmful, are easier than—and can distract from—the internal work of re-evaluating values, habits, beliefs, and norms.
The Department of Defense is already attempting technical fixes for digital transformation. It established the Joint AI Center, partnered with the Massachusetts Institute of Technology (MIT), and created the position of Chief Digital and AI Officer. These steps are not without benefit: the Joint AI Center has developed AI ethics principles and a new procurement process; MIT has produced valuable research and educational content; and the Chief Digital and AI Officer offers the opportunity to integrate various technological functions. But these measures are not enough. In fact, they’re not even the most difficult steps.
The real barriers to digital transformation are deep-seated norms and conflicting perspectives that exist across the organization. “How valuable are technologists, really? Should they be treated differently from others?”; “What about computers: can we trust them to do our jobs as well as we do? If so, what role will humans play afterwards?”; and perhaps most importantly, “How do we move beyond simply articulating new standards to actually living them?” These are difficult questions, touching Department of Defense goals, strategies, and missions at every level—but answers will only come through discussion and earnest experimentation throughout the defense ecosystem itself.
Back in the workshop, the officers have made at least one breakthrough. Toward the end of the session, the facilitator says, “I sense sadness in the room. Does anyone else feel that?” Predictably, everyone shakes their heads—admitting sadness feels like admitting failure—but then a major chimes in: “I’ll bite. Yes, I’m sad. This just feels overwhelming. If we can’t rely on our commanders to handle this…” He pauses. “I have no idea how we’re going to do this. Especially when we’re told to just keep our heads down. It feels hopeless.”
The major’s comment is the most honest moment the group has witnessed, and the shift in the room is palpable: an hour earlier, the officers were barely aware of their own responsibility for the work of adjustment, and when they were, they failed to appreciate its weight. Now they are coming to terms with that responsibility, and they are doing it publicly—vulnerably—where the whole group can learn from individual experiences. This shift is the stuff of real change.
The truth is that no one knows how a digitally transformed Department of Defense will work. But no one will find out without the collective process of trying, failing, and learning. The Department of Defense should therefore get comfortable learning through experience—collecting data through discussion and experimentation—and disseminating that learning throughout the organization. And while the Department of Defense has good reasons for maintaining a risk-averse culture, avoiding learning carries its own risks. The world is changing, and America’s adversaries are sharpening their capabilities. We cannot afford to wait for our enemies to make it clear that they have overtaken us.
Officials can take three actions now to make progress on digital transformation.
First, officials should create and run low-risk experiments: actions that will generate learning for the future, not actions designed to succeed by today’s metrics—who knows whether those metrics will even be relevant post-transformation? For example, at the Department of the Air Force–Massachusetts Institute of Technology Artificial Intelligence Accelerator, we have experimented with different formats for training service members, from live lectures and online classes to interactive exercises and project-based workshops. If an experiment fails, so be it: failure is a key ingredient of learning.
Second, officials should surface as many perspectives on digital transformation as possible. Who fears digitization? Who supports it? Why? And what is the wisdom in each perspective? If everyone is part of the problem, everyone should also be part of the solution—even if that means engaging people across boundaries in ways the Department of Defense never has before.
Finally, officials should prepare those around them for an extended period of uncertainty, in which operational reality dictates that those in charge cannot answer critical questions. This serves two purposes. First, it helps manage expectations, so that those in positions of authority can resist pressure to provide answers where none exist. Second, it empowers those without authority to conduct their own experiments—to try something new and fail—and to report what they learn.
Ultimately, transforming a system requires transforming the people within it. If the Department of Defense is serious about digital transformation, everyone should be involved in the uncomfortable and personal process of change. As the work continues, both the organization and the people within it will be better equipped to deal with new and challenging realities.
The workshop, meanwhile, closes with a note that applies across the Department of Defense: “This moment requires courage. Try better. Fail better. Learn better. One day you will look back and see that you have changed.”
Brandon Leschinsky is an AI Innovation Fellow at the Department of the Air Force-Massachusetts Institute of Technology Artificial Intelligence Accelerator, where he has taught AI to over 600 military personnel, including over sixty generals, admirals, and senior executives. He also works with Ronald Heifetz and others at Harvard Kennedy School, where he has coached over 50 students, ranging from young professionals to senior executives, on complex, collective challenges.
Andrew Bowne is an Air Force judge advocate and chief legal counsel to the Department of the Air Force–Massachusetts Institute of Technology Artificial Intelligence Accelerator. He is also a Ph.D. candidate at the University of Adelaide, studying the link between national security and AI with a focus on the role of industry. He has published numerous articles and book chapters on topics including national security, security cooperation, contract law, the rule of law, machine learning, and intellectual property.
The views expressed are those of the authors and do not reflect the official guidance or position of the US Government, Department of Defense, or US Air Force. In addition, the appearance of any external hyperlink does not imply an endorsement by the Department of Defense of the linked sites or the information, products or services available thereon. The Department of Defense exercises no editorial, security or other control over the information found at those locations.
Image: US Army
https://warontherocks.com/2022/05/digital-transformation-is-a-cultural-problem-not-a-technological-one/ Digital Transformation Is a Cultural Problem, Not a Technological One