The expert view: Using AI, automation and machine learning to tackle the cyber skills gap
5 November 2018
“We’re now in this new cyber frontier,” Verizon’s Ray Ottey told a Business Reporter Breakfast Briefing for senior business executives at London’s Langham hotel. He added: “We are dealing with the same problems, but the scale has increased. How do we use automation to help?”
He emphasised the need for collaboration among security professionals, pointing out that attackers collaborate all the time, while companies tend to keep their knowledge to themselves, hoping that it will be a competitive advantage.
This might one day extend as far as sharing algorithms, said James Hanlon, of Splunk. He said that, with a skills shortage affecting the security sector, companies need to do everything possible to ensure that they gain an advantage.
Skills shortage and cost pressures
Most attendees were more concerned about pressure to cut costs than the staff shortage. Those present acknowledged that it can be hard to find qualified staff – and the lack of experienced people means that they can command higher salaries – but most said that their firms wanted to reduce the headcount deployed on security, where possible.
One attendee said that his company wanted to make security “everyone’s responsibility”, rather than having a dedicated team.
Another delegate, from a major international bank, said that he was in the process of “onshoring and insourcing” – bringing a function that had previously been outsourced overseas back to the UK. The challenge, he said, was that he was expected to make the move “cost neutral”, which meant he would need automation to replace some functions. Hiring the same number of staff as the outsourcer had used would be too expensive.
For established companies the challenge can be to compete with start-ups, which are often able to spend less because they have no legacy systems to maintain and a simpler structure to manage.
Where automation can help
Some at the briefing had applied automation to security tasks but only in a very small way. Nobody considered their use of artificial intelligence (AI) or machine learning to be at an advanced stage.
The best opportunities lie in the area where this technology presently excels: pattern recognition. That enables use cases such as responding faster to phishing attacks or reducing triage time. Much of this works by training a machine-learning system to understand what ‘normal’ looks like, so it can spot abnormalities and raise the alarm.
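As a minimal sketch of that “learn what normal looks like” idea – hypothetical data and a hypothetical threshold, not a description of any vendor’s product – a baseline can be modelled statistically and large deviations flagged for a human to triage:

```python
from statistics import mean, stdev

def find_anomalies(baseline, observed, threshold=3.0):
    """Flag observations more than `threshold` standard deviations
    from the mean of the baseline ('normal') data."""
    mu = mean(baseline)
    sigma = stdev(baseline)
    return [x for x in observed if abs(x - mu) > threshold * sigma]

# Hypothetical example: hourly login counts for a service account.
normal_hours = [52, 48, 50, 47, 53, 49, 51, 50, 48, 52]
today = [51, 49, 50, 240, 47]  # 240 logins in one hour stands out
print(find_anomalies(normal_hours, today))  # -> [240]
```

Real security tools use far richer models than a simple standard-deviation test, but the principle is the same: the system raises the alarm on what deviates from its learned baseline, and an analyst decides what to do.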
Mr Ottey said that Verizon was about to roll out autonomous threat hunting, which would enable businesses to track emerging threats and ensure they have defences in place. Another attendee, from a major bank, said that his company had been using predictive analysis for several years as a tool for reducing fraud and financial crime.
Obstacles to security automation
There was some debate about the limits of AI and whether what we have now can be considered ‘intelligence’ at all. For some, true AI will only come when we have automated tools that can be proactive, for instance by identifying threats that have not previously been spotted. We are some way from being able to do that.
Everyone present was comfortable with AI that presents data and lets them make decisions, but several expressed concern about AI decision-making. If a ‘robot’ is making the decisions, then who takes responsibility when it goes wrong? Who is qualified to analyse its behaviour and make sure that it is working properly?
A more basic concern is that companies today cannot expect funding for experiments. Technology needs proven case studies before companies will commit to buying it and, some attendees said, security automation tools just don’t have enough of this yet.
There’s a problem here, however: machine learning, especially, needs data to be effective and it can’t gather data without more users. But more people will not use it unless it can be shown to be effective. One proposed solution is to adopt new technology in limited trials, which will allow both the supplier and the customer to gain evidence to show the tool’s effectiveness.
Finally, some of those present said that auditors can be an obstacle to automation because they don’t really understand the realities of the technology. Some of those who have had success with auditors suggested involving them in the process earlier, explaining what you intend to do and why, so that when an audit happens, more information is available.
This remains an area of technology that is moving very quickly, with many organisations struggling to keep abreast of the latest changes. “We’re not going to solve this today,” said one attendee, from the financial services industry, summing up her response to the briefing. “But it’s reassuring to hear that other people are struggling with the same problem and to know that I’m taking the right approach.”
For more information visit www.verizonenterprise.com
Shane Richmond