Busting the myths about AI
Michael Xie, Founder, President and CTO at Fortinet, argues that AI doesn't eliminate jobs but creates them.
Over the past two years, as the debate over immigration policies has grown increasingly heated, one argument has often been introduced to counter some of the more abrasive stances: that automation and AI, not immigrants, are eliminating jobs. At first glance, it may appear to be a fact-based response to the animosity and divisiveness that define the debate. For those of us with deeper, first-hand knowledge, though, it is just as fear-based and misinformed.
At Fortinet, we have been investing in artificial intelligence (AI) for years. It is an incredible technology that presents extraordinary opportunities for protecting networks and, ultimately, the internet. As AI becomes more common and more sophisticated, it consistently clarifies an important truth: the value, power and efficiency of AI do not arise from its ability to replace human beings.
In fact, AI does just the opposite. Both automation and AI underscore how central and critical human insight and expertise are to success.
Arguments and headlines that cast technology as an encroaching threat, widening social divides and limiting opportunities may provoke stronger reactions (and more clicks), but in general, innovation is not additive or subtractive — it is multiplicative. It creates exponentially more opportunities for more people in more ways than even those most directly impacted by it can often imagine at first.
Has email replaced the post office? While the total number of career postal employees declined from 2007 to 2016, it now stands just slightly above its 1965 level. The volumes of marketing mail and first-class mail have decreased, but total package shipments have increased from 3.3 billion to 5.2 billion. Delivery points have increased from 148 million to 156 million, and thousands of additional delivery trucks are on the roads.
Did ATMs replace banks? No — by lowering the cost of opening a branch, ATMs helped increase the number of banks by more than 40%. In fact, they didn’t even replace bank tellers, whose ranks increased to meet the demand of more branches.
It is especially important to recognize these facts in light of the particularly callous argument that the only jobs AI kills are the ones nobody would want. All of us value a job that provides for our families and our lives.
If anything, the rise and spread of AI forces us to take a closer look at an outdated, divisive management style: assigning employees only the kinds of tasks that AI is good at, namely repetitive, precise, controlled tasks that require no reasoning, higher-order thinking or even common sense.
It is hard to imagine an industry more heavily reliant on digital technology than cybersecurity. As of Q3 2017, our cybersecurity tools and technologies were responsible for neutralizing 91,000 malware programs, blocking access to 150,000 malicious websites and resisting 4.4 million network intrusion attempts — per minute.
In a digitally driven world that is teeming with threat actors, from malicious pranksters to criminals, ideologically motivated sects to state-sponsored cyber terrorists, threatening everything from our individual identities to the critical infrastructure of our society, there is no way to protect data without self-learning AI and automation. For effective cybersecurity, we must let AI and automation handle time-consuming tasks such as data mining and log parsing, freeing cybersecurity teams to focus on the much higher-order tasks of threat identification and elimination.
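To make the division of labor concrete, here is a minimal, purely illustrative sketch of the kind of rote triage that automation takes off an analyst's plate: scanning logs and surfacing only the entries worth a human's attention. The log format, event names and threshold are hypothetical, not Fortinet's actual tooling.

```python
import re
from collections import Counter

# Hypothetical log format: "<timestamp> <source-ip> <event>"
LOGS = [
    "2017-09-01T10:00:01 203.0.113.5 LOGIN_FAIL",
    "2017-09-01T10:00:02 203.0.113.5 LOGIN_FAIL",
    "2017-09-01T10:00:03 198.51.100.7 LOGIN_OK",
    "2017-09-01T10:00:04 203.0.113.5 LOGIN_FAIL",
    "2017-09-01T10:00:05 203.0.113.5 LOGIN_FAIL",
]

LINE = re.compile(r"^(\S+) (\S+) (\S+)$")

def flag_suspicious(logs, threshold=3):
    """Count failed logins per source IP; return IPs at or above threshold."""
    fails = Counter()
    for line in logs:
        match = LINE.match(line)
        if not match:
            continue  # skip malformed lines rather than halting the sweep
        _, ip, event = match.groups()
        if event == "LOGIN_FAIL":
            fails[ip] += 1
    return [ip for ip, count in fails.items() if count >= threshold]

# The machine grinds through every line; the analyst sees only the flags.
print(flag_suspicious(LOGS))  # prints ['203.0.113.5']
```

Real systems replace the fixed threshold with learned models of normal behavior, but the division of labor is the same: the machine does the exhaustive, repetitive scan, and the human investigates what it surfaces.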
And yet, one of the gravest challenges our industry faces is a shortage of talent.
Our industry’s unemployment rate stands at 0%. In 2016, one million new cybersecurity jobs were created, and estimates project five to six million more over the next few years. In 2015, cybersecurity job postings increased 74%, and half of those postings went unfilled.
Across industries, 45% of organizations claim to experience a problematic shortage of cybersecurity skills. As a result, cybersecurity teams must race from one crisis or breach to the next, with little time for strategic planning or continued learning to keep up with threat sophistication.
These are certainly business challenges, and increasingly costly ones at that. The demand itself is driving an expensive bidding war for talent, and the cost of cybercrime is estimated to reach $2.1 trillion globally by next year. These are also national and global security risks, with everything from financial systems to healthcare to critical infrastructure in the crosshairs.
Automation and AI are not eliminating jobs; they are creating them, at an unprecedented rate, and high-paying, high-level, secure jobs at that. As data volumes continue to grow, demand will grow even greater.
We will never be able to fill these jobs without greater awareness of the need for them, earlier training in middle school and high school, and more outreach to veterans and college students, particularly women, who currently comprise just 14% of the cybersecurity workforce.
There are clear causes of wage stagnation and job loss: who disproportionately benefits from economic gains, the creation of wealth through capital management rather than goods and services, the funding and priorities of our educational system, an increasingly volatile financial system, and the impact of rapid globalization. These are all extraordinary challenges, and it is far easier to scapegoat technology than to address them. Blaming innovation will not solve real problems or prevent crises; it will only drive misunderstandings and clicks in an increasingly unsafe digital landscape.