All rights reserved. This book or any portion thereof may not be reproduced or used in any manner whatsoever without the express written permission of the publisher except for the use of brief quotations in a book review.
ISBN: 978-1-7357329-1-6
TABLE OF CONTENTS
Foreword
Garry Kasparov
Introduction
Matt Calkins, CEO of Appian
From Big Boxes to Intelligence Everywhere: The Changing Face of Automation
Neil Ward-Dutton, IDC
How to Turn Your Company into a Master of Digital Transformation
George Westerman, Massachusetts Institute of Technology
Survival of the Quickest: How to Hack a Pandemic with Intelligent Automation
Lakshmi N, Tata Consultancy Services
From Hurricanes to COVID-19 and Beyond: How Low-Code Helps our University
Sidney Fernandes & Alice Wei, University of South Florida
FinTech and the Forces of Change in Financial Services
Chris Skinner, FinTech Expert
A Business-minded CIO’s Perspective: Why Low-Code is Indispensable for Transformation
Isaac Sacolick, StarCIO
Low-Code Journey in the Enterprise
John Rymer, Forrester (Emeritus)
People Power: The X-Factor of Digital Transformation
Lisa Heneghan, KPMG
Speed is the Key in Pandemic Response
Darren Blake, Bexley Health Neighborhood Care
Digital Innovation is More than a Side Hustle
Rob Galbraith, InsureTech Expert
A Technology Business Needs Simplicity
Ron Tolido, Capgemini
An Economic Revolution
Michael Beckley, Appian
Foreword
Garry Kasparov
We are in a time of crisis, as a pandemic wreaks havoc on human health and the global economy. With all our focus on the perilous present, it can be difficult to look ahead. We can easily fall into relying on tactical reactions, losing sight of our bigger goals. Strategic planning becomes even more valuable in a crisis—not only for escaping the current one, but for being better positioned for the next.
Often our instinct in a crisis is to go on the defensive, to become more conservative in the hopes of weathering the storm. There is nothing necessarily wrong with this impulse—as long as we are aware of it and refuse to be dominated by it.
If we concede reason to instinct, we forfeit the greatest survival mechanism of all, the ability to adapt. This isn’t a chess insight, or business strategy; it’s basic Darwinism. Natural selection rewards the characteristics that best suit the circumstances, gradually producing change and evolution. A crisis accelerates the process—or ends the line.
Humans and our technology are subject to similar pressures, but we don’t have to wait for random mutations or generations of life and death to evolve. We can observe, analyze, experiment, and strategize. We can build new tools to meet the demands of the moment, whether it’s a financial crisis, pandemic, or both.
This ability doesn’t mean we will always use it correctly, or make the right moves to adapt in the best way. But we have the potential to do that and more—to come through the crisis and thrive by making the right decisions while others falter.
This cannot happen unless we establish the right conditions—or at the very least it will be much more difficult and painful to achieve. Deep preparation was my trademark when I was the world chess champion. There was no way to be prepared for everything, but I knew I could be better prepared than my opponents.
I also discovered that my preparation produced benefits even when it didn’t go as planned, even when my opponent avoided it entirely. My readiness was cumulative, multifaceted, enabling me to adapt on the fly. It turned out that pieces of preparation designed for one scenario applied surprisingly well to other situations that seemed unrelated on the surface.
Adaptation and creating the preconditions for adaptation—that’s the human way of evolution. It applies to individuals, corporations, and our entire society. When a crisis hits, those who possess the right combination of characteristics have the advantage, but those who can acquire those characteristics on demand will thrive in any situation.
Now we’ve arrived at the topic of my first conversations with Matt Calkins, the founder and CEO of Appian. It turned out we were both fascinated by the potential of using increasingly intelligent machines to maximize this ability to adapt. We both see artificial intelligence as a tool that makes us better prepared, better able to meet a crisis—if we use it wisely.
My interest in working with AI came the hard way: my chess matches against the supercomputer Deep Blue in 1996 and 1997. The machine’s victory in the rematch was hailed as an achievement on par with the Wright Brothers’ first flight and the moon landing. It was described as “The Brain’s Last Stand” on the cover of Newsweek. No pressure!
Of course, my loss to a machine was still a human victory. Deep Blue was created by a talented team that spent years of research and work to create a machine that could beat the world champion at a game long considered a nexus of human intelligence.
After losing that final game, I went out to a nice dinner with friends and talked about politics. And what did Deep Blue do? What else could it do, other than play chess? Nothing, of course. It couldn’t even celebrate because it didn’t know it had won. All that work, all that capability and cutting-edge hardware and code, couldn’t be redirected into other tasks in any amount of time. It couldn’t learn and couldn’t adapt. It climbed Mount Everest, but it was also a dead end.
My personal adaptation was a form of “if you can’t beat ’em, join ’em.” If human strategic thinking and understanding could be combined with machine speed and precision, might it not produce the best chess ever? Instead of human versus machine, why not human plus machine? And so, Advanced Chess was born.
My idea was simple, if heretical. Grandmasters would face off, each with a computer by their side running the best chess software available. My brainchild saw the light of day in León, Spain, in June 1998. My opponent was one of the world’s top players, Veselin Topalov of Bulgaria. Playing with computer assistance was a strange sensation, although by then I was quite used to using a machine to help me with analysis and preparation.
It turned out to be far from the best chess ever, although the result was instructive. I had crushed Topalov in a match of regular rapid chess a few weeks earlier, a 4-0 sweep. But in León, it was a 3-3 tie. The machine’s ruthless accuracy had neutralized my advantage in calculating tactics. Topalov and I failed to use our time efficiently, unsure of when to consult our machine partners and for how long.
Advanced Chess could have ended there as a curiosity, but eventually it found its natural home on the internet. In 2005, a popular chess site hosted what it called a “freestyle” chess tournament in which anyone could compete in teams with other players or computers. Lured by the substantial prize money, several groups of strong Grandmasters working with several computers at the same time entered the competition. At first, the results seemed predictable. The teams of human plus machine dominated even the strongest computers.
The surprise came at the conclusion of the event. The winner was revealed to be not a Grandmaster using top-of-the-line hardware, but a pair of amateur American chess players using three regular PCs at the same time. Their skill at manipulating and “coaching” their machines to look very deeply into positions effectively counteracted the superior chess understanding of their Grandmaster opponents and the superior computational power of other participants.
This led to my formulation: Average human + average machine + better process was superior to a strong computer alone and, more remarkably, superior to a strong human + fast machine + inferior process. It was about then that I started to prefer to think of AI as “augmented intelligence.” It doesn’t replace us; it enhances us, and allows us to adapt faster by adapting our tools faster than ever. The human mind is an unmatched analogy engine, able to apply experience and new information to new circumstances almost instantly. Machines can’t do this themselves—not yet—but with our guidance, they can help feed our insatiable appetite for ever-greater agility.
Process is king, a multiplier that turns human plus machine into a transformative advantage. As Matt Calkins says in his Introduction to this book: it’s all about bringing human and digital workers together, to unite them into a workflow that is far greater than the sum of its parts. That’s what those American chess amateurs did to beat Grandmasters and supercomputers, and that’s what every company must do today to survive, and thrive, against unexpected challenges.
I could pick out many of my own favorite parts from this book, such as John R. Rymer’s essential explanation of low-code platforms, or Darren Blake’s real-life example of how adaptable software tools save lives in a pandemic. But I’ll let you read them all yourself without further delay. After all, as every author to follow explains, speed is of the essence!
Garry Kasparov was the world’s top chess player for 20 years and writes and speaks frequently on decision-making and the human-machine relationship. He is the author of Deep Thinking: Where Machine Intelligence Ends and Human Creativity Begins.
Introduction
Matt Calkins, CEO of Appian
Even before the COVID-19 pandemic, big changes were underway in enterprise software.
Corporations had become deeply reliant on software applications to automate their essential behaviors. Those applications, in turn, were stubbornly hard to update or modify.
Software had become the spinal cord of a business. Every action and every signal that passed through the company was carried by software. Every new behavior required software to enable and regulate it. This made software the limiting factor on corporate growth and change.
As companies grew, they needed to create new applications. Demand for these applications rose exponentially, faster than the labor supply could grow, making “software developer” one of the world’s best-paid professions. Budgets rose, but still every company had a long backlog of processes to build and suffered universally from software delays.
Change was slow; costs were high. The stage was set for a revolution.
* * *
In the event of a crisis, every business was going to be dangerously inflexible. A company could move no faster than software allowed, and if new processes were slow to encode, so too would corporate reaction times be slow.
Before 2020, a corporation could change any of its core assets more quickly than it could change its software processes. It could replace its leadership team, rebrand with a new logo, or move to a new physical headquarters faster than it could rewrite the software on which the corporation’s behavior depended. Not only did this create internal inefficiency, it also led customers to feel they were treated impersonally and robotically.
The “digital transformation” movement sought to use new technology to overcome this immobility and unresponsiveness. It was discussed much, but accomplished little, due to a general misunderstanding about the depth of the problem and the urgency of its solution. It would take an exogenous shock to truly focus attention.
* * *
COVID-19 set the spark for the next phase of the enterprise software revolution. In the pandemic, businesses realized that change was a matter of survival. Every corporate relationship depended on an agile response to the crisis: customers and regulators demanded new behaviors, while employees needed assurances of safety.
Most businesses were unable to quickly adapt to the new circumstances. Because they couldn’t change their applications fast enough, they were unable to express their new plans in new behavior. For example, most relied on non-technical systems to return their employees to work, despite the obvious safety and privacy disadvantages.
Companies today need to be ready at all times to write an application on which their business might depend. The new mandate is for agility in all applications, especially the most important ones.
COVID marked a turning point in enterprise software, an event that forced businesses to find ways to change their applications faster. Even when such an event ends, the preference for speed remains. Speed is addictive. Once people experienced Google’s sub-second search times, or Amazon Prime’s 2-day delivery, they were unlikely to go back.
* * *
“Automation” means bringing human and digital workers together in the same workflow. (In an earlier era, “automation” meant replacing people with technology, but now it means complementing them with digital helpers.) Automation is a uniting technology, and it comes along at a perfect time. Today, digital workers (like Artificial Intelligence and Robotic Process Automation) are powerful enough to collaborate with people on real tasks. Today, workers are more separate than ever before, and more in need of being connected.
Automation has a self-evident value proposition: different types of workers have different strengths, so they’ll be better in combination. RPA Bots are fast and inexpensive but cannot handle exceptions or change. AI is great at evaluation, recognition, translation, and giving advice; but generally cannot make the final decision. People are best at making decisions and talking to customers. Without a doubt, these different workers can complete jobs more efficiently as a team, together in a single workflow.
The pandemic has forced people to work remotely from each other and collaborate over a distance. The workflow has replaced the workroom as a coordination technology. We’ve never needed smart workflows as much as we do now, nor the full range of automation technology that fills them.
* * *
Hyperautomation is automation at speed. It’s a combination of technologies that allow faster application authorship (like low-code and no-code) and automation technologies that coordinate different worker types. Both are essential in the new decade. Businesses will want to deploy workers more efficiently, and they will want to invent new work patterns faster.
The world has changed. Tomorrow’s enterprise will need agility, unification, speed, and collaboration. In a word, it will need hyperautomation.
From Big Boxes to Intelligence Everywhere: The Changing Face of Automation
Neil Ward-Dutton, IDC
A look at how the world of business automation has changed through the decades, and how new technology capabilities have created new business possibilities. This chapter examines how new technologies come together with low-code development to deliver hyperautomation.
ABOUT THE AUTHOR:
Neil Ward-Dutton is Vice President, AI and Intelligent Process Automation European Practices, at IDC. Prior to joining IDC, Neil was Founder and Research Director of MWD Advisors, a technology advisory firm focusing on digital technologies and their impacts on business. Neil is recognized as one of Europe’s most experienced and high-profile technology industry analysts. He has regularly appeared on TV and in print media over his 20-year industry analyst career and has authored more than 10 books on IT and business strategy.
From Flour Mills to PCs: 250 Years of Business-Automation History
The history of business automation goes back a lot further than you might think. In 1785, American inventor Oliver Evans built an automated, water-powered flour mill near Newport, Delaware. Using a variety of automated mechanisms, Evans’ invention enabled the mill to operate with just one person rather than four. When it worked optimally, it also produced flour from grain more efficiently than a non-automated mill. In subsequent decades, the invention and refinement of control systems enabled even more automation, of all kinds of manufacturing processes, at greater scale.
Second World War-era military efforts, and later, NASA’s spaceflight program through the 1960s and 1970s, fueled the next major wave of innovation in automation. The first computers were put to work in business administration settings, as well as in manufacturing processes and scientific environments. It was a British food company, Lyons, that operated the first business applications on an electronic computer, starting in 1951 with a custom-built system to calculate valuations, process payroll, and assess inventory. Through the 1960s and 1970s, computers in business were principally used to automate, at scale, the work of clerks in accounting, payroll, and other relatively simple administrative functions: automating “standalone” tasks and creating and managing simple (if large) sets of administrative records.
Through many subsequent inventions and refinements in business computing—the introduction of digital computers, time-sharing systems, mainframe systems, local-area networking (LAN) technology, PCs and so on—businesses continued to focus their automation efforts on distinct administrative procedures and processes, albeit at vastly increased levels of scale and variation. It was only with the emergence of Enterprise Resource Planning (ERP) as a business discipline, in the late 1980s, that IT systems were built and operated to integrate automated business functions at scale: from HR to finance and accounting, production planning, and so on.
From the first introduction of computers in business contexts to the mid-1990s, the story of how businesses introduced automation was one of centralized design and development, high cost, concentrated use of specialized talent, and long gestation periods (with their attendant risks). Large-scale, complex IT delivery models could only be applied to the gnarliest business challenges or the most obviously profitable business opportunities. The resulting systems operated at scale by necessity and could typically only be changed at significant cost and risk.
It’s tempting to fast-forward to the present day and highlight how modern business automation technology has changed the game. But it’s not that simple. We got an early taste of what is now happening at scale during a relatively brief period, from the early 1990s to the early 2000s.
Rapid Application Development in the Client-server Era
In the early 1990s, an explosion of invention in networking technologies, server and PC platforms (together with a major shift in technology spending, away from centralization to distributed spending led by business units and functions) created a huge wave of opportunity to make computing more accessible to a wider range of businesses. This explosion, however, also created a huge wave of complexity for any team wanting to build business software. At the same time, the mass-market availability of new Graphical User Interface (GUI) technology, popularized by Microsoft, was making a massive impact. Combined with the success of the company’s partner-centered business strategy, the result was a fast-growing ecosystem of partners offering new, low-cost, PC-based productivity applications with mass-market appeal.
For the first time, software vendors offered development tools that took advantage of new GUI environments and point-and-click techniques. The result was an explosion of tools that relatively non-technical staff could use to create relatively simple business applications. With the hugely popular Microsoft Visual Basic (introduced in 1991), Access, Delphi and PowerBuilder, together with niche products like Dynasty, Forte, JAM, Progress and Uniface (and many more), teams of business analysts and self-taught programmers were able to participate in (and often develop) end-to-end business applications using visual tools.
This included so-called Rapid Application Development (RAD) techniques, based around the notion of iterative development. Through the 1990s, business function teams (and software development firms contracted to them) built and deployed tens of thousands of relatively simple, team-focused business software applications.
Some of these tools still remain. And perhaps unsurprisingly, many of the applications they were used to create are also still in use in businesses worldwide. At the end of the 1990s, though, a new technology-platform shift happened that most of these vendors struggled to embrace: the shift to web-based applications, where development-platform activity consolidated around Java and Microsoft’s .NET programming platforms.
Digital Transformation: The Imperative that Drives the Story
Curiously, the productivity advances that many of the “first-wave” low-code application development tool vendors had made in terms of visual, model-based development—not only of user interfaces, but also of business logic, data definitions, and so on—were forgotten in the early 2000s. A new wave of developers flocked to new Java- and .NET-based tools that required more technical development skills in order to build a new wave of e-commerce websites and applications that the RAD tools of the time were poorly equipped to help with.
Today, though, the pendulum of demand has swung decisively back from the early 2000s, when web-based application development was dominated by technical developers working with relatively low-level tools. There are many reasons for this swing, but perhaps the most impactful is today’s digital transformation imperative, which exists for businesses large and small in every industry.
The internet-based platforms for application and data hosting that first became usable in the early 2000s have become commodities. Hyperscale public cloud platform providers have created an abundance of scalable computing, storage and network capacity for rent. This has allowed waves of new “born-digital” businesses to compete for market share with established businesses across multiple industries—from banking and retail, to telecoms, utilities and even manufacturing.
For established businesses, the response to new “born-digital” competitors has to be to find ways to leverage digital technologies—not only to implement more sophisticated online marketing and commerce capabilities on the “outside” of their organizations, but also to integrate, streamline and increase the agility and scalability of the core business processes and decisions that drive the “inside” of their organizations.
Of course, the same rental models for computing infrastructure (known as Infrastructure-as-a-Service, or IaaS) and business software development platforms (known as Platform-as-a-Service, or PaaS) that new born-digital industry disruptors have leveraged are also available to established businesses. It’s the global movement to take advantage of these platforms—to digitize business activities inside organizations, as well as outside—that is driving the new automation and that this book is all about.
A New Wave of Agile Business Automation Demand
Most organizations starting their digital transformation journeys begin by aiming to reinvent customer experiences—aiming to match the kinds of omnichannel, immediate, easy-to-use, personalized, and transparent interfaces that born-digital competitors place at the heart of their strategies.