Welcome to this week’s edition of the teiss magazine.  

The numerous attacks on pharmaceutical companies developing vaccines for Covid-19 have brought the value of scientific and medical research into the limelight. This week we have two more stories on this for you, including an article on exosomes (no, we didn’t know what they are either). We also have articles on governance and some suggestions about security strategy, as well as stories about this week’s major leaks.

News from the front line

Cyber crime

You may not know this, but exosomes are membrane-bound extracellular vesicles that are produced in the endosomal compartment of most eukaryotic cells. And apparently, information about them is valuable. A US woman has been given 30 months for stealing and selling exosome data to China.

It isn’t just exosomes that have value when it comes to the life sciences. Sensitive medical images are being left in the open – making it easy for criminals to steal them and use them for fraud and extortion.

On ground that will be more familiar to most cyber professionals, Microsoft, Facebook and PayPal have been named the phishers’ favourite brands of 2020.

Governance and compliance

In what we hope will be a useful move forward, the UK Government has formed the UK Cyber Security Council to govern the cybersecurity sector. The teiss elves would love to hear your thoughts on this initiative.

And in a separate move, the US FTC has approved a final settlement with Zoom over data security concerns, forcing the company to implement a comprehensive information security programme.

Security strategy

Over half of CSOs and CISOs in the UK feel that the switch to remote working has rendered existing controls, systems, and applications outdated and ineffective in defending against today’s cyber-threats. We bring you plenty of statistics as well as advice on how to keep your organisation safe.

Of course, remote working needs to be secure. But it’s not just about technology. Get the basic organisational principles right and you will go a long way towards keeping your workforce safe.

Leaks and hacks

Our weekly round-up of some of the worst security lapses. In a particularly unpleasant attack, hackers infiltrated a Florida water treatment plant in an attempt to poison the water supply. Remember, it’s not just information that criminals want. It’s money, and they will try any way they can to get it. For example, a European fraud ring has defrauded US banks out of £10m by using shell companies. And, again proving the value of medical records, data stolen from US hospitals has been leaked on the dark web.

Good week for…

Privacy: Worried about the privacy implications of facial recognition technology? The BSIA has unveiled the industry’s first ethical use guide for AFR technology.

Bad week for...

FCA: It’s been a bad quarter, not just a bad week, for the FCA, which has been targeted by nearly 250,000 spam and malicious emails.

Cyber-defence in the UK & Ireland: protecting against an evolving threat landscape 

Organisations worldwide face a fast-evolving threat landscape. And those in the UK and Ireland are no exception. 

In the past year, the landscape has developed further. Businesses worldwide have had to rapidly adapt to the challenges of the pandemic, and they commonly did this by accelerating cloud deployments and adopting new software-as-a-service (SaaS) technologies to enable remote working and collaboration. These changes created new and more complex attack surfaces – and cyber-criminals took advantage of this. 

Recent Proofpoint research among CISOs and CSOs in the UK&I revealed that more than half (53 per cent) of organisations in the region reported at least one cyber-attack in 2020. Looking ahead, this pace isn’t set to slow. In fact, almost two-thirds (64 per cent) of CISOs/CSOs believe that their organisation is at risk of cyber-attacks in the next 12 months. 

Although IT leaders in the region are showing awareness of the level of threat they face, many need to think in new ways to enable their firms to be truly effective in protecting against these threats. 

Preparing for future threats 

Remote working is here to stay for the majority of organisations. However, there’s still some way to go for organisations to manage the new remote-working attack surface. 

Over half of CSOs and CISOs in the UK&I (54 per cent) feel that the switch to remote working has rendered existing controls, systems and applications outdated and ineffective in defending against today’s cyber-threats. Indeed, only around one fifth (22 per cent) strongly agree that their employees are well equipped to work remotely. 

Overall, a majority (64 per cent) consider that implementing a remote working policy across 2020 has left their business more vulnerable to cyber-threats. 

Despite the proven challenges brought by teleworking, ransomware remains the major threat keeping CISOs awake at night. Our research found that in the next 12 months, 46 per cent of CSOs and CISOs in the UK&I believe that ransomware, or other forms of extortion perpetrated by outsiders, will be the biggest cyber-security threat to their organisation. This was followed by cloud account compromise (39 per cent), insider threats (33 per cent), and phishing (30 per cent). 

While these predictions largely align with current trends, worryingly, less than a quarter (24 per cent) of CISOs and CSOs in the UK&I consider impersonation attacks and business email compromise (BEC) attacks as the potential biggest cyber-threats to their organisation in the next year. These financial fraud attacks are the most expensive cyber-threats globally – the FBI estimates losses at $26.5 billion over three years, and cyber-liability insurers say payments for BEC are greater than all other cyber-claims combined. These threats are not as high profile as ransomware, but it’s essential that IT leaders in the region correctly understand the relative risk levels. 

Human error and security awareness 

Where cyber-criminals once focused their attention on our networks and infrastructure, it is now increasingly our people who are coming under attack. Whether via malicious links, account compromise, or social engineering, threat actors have turned their attention to what, for many organisations, was expected to be the last line of defence.  

Unfortunately, despite “security awareness” programmes, for many firms this last line of defence is often poorly motivated and ill-prepared. Over half (55 per cent) of IT leaders in the UK&I believe that, despite all other security protections, it is human error and lack of cyber-security awareness that present the most significant risk to their organisations. 

Common employee behaviours likely to result in cyber-vulnerability include clicking on a malicious link or downloading a compromised file (43 per cent), followed by falling victim to phishing emails (39 per cent), intentional leaking of data (35 per cent), use of devices and applications (35 per cent), and mishandling of sensitive information (35 per cent). 

Despite these recognised concerns and awareness of potential employee mistakes, inadequate training programmes remain commonplace. The majority of UK&I CISOs and CSOs (72 per cent) admitted to training their employees on cyber-security best practices as infrequently as twice a year or less, with only 28 per cent running a programme three times a year or more. 

With so few touch points, it’s a struggle to create communication compelling enough for employees to internalise it and change their existing behaviour. Such programmes will likely push “awareness” but fail to reach the real goals of behavioural change and the creation of a supportive security culture – and it is only these two stages that can have a real impact on detecting and deterring such attacks. 

Putting people at the heart of your defence 

Irrespective of the means of attack, threat actors continue to take advantage of the human factor. 

While CSOs and CISOs across the UK&I clearly recognise the cyber-risks faced by employees and are prioritising their response and 2021 investments accordingly, there seem to be moments of disconnect in grasping the scale and importance of some vectors, such as the true threat of BEC attacks and the correlation between cloud account compromise and insider risk management. Both these threats are significant and growing, fuelled by the global shift to remote working. 

A people-centric strategy is a must for organisations. This starts by recognising that the majority of attacks target people, and that these commonly arrive via email. Identify your most vulnerable users and ensure they are not only well defended against the majority of threats but also equipped with the knowledge and the tools to defend your organisation. 

Along with these technical solutions and controls, it is essential that a comprehensive training programme sits at the heart of your cyber-defence. Training should be regular, comprehensive and adaptive, and cover a range of topics – ensure that staff understand the real significance of cyber-attacks and how these can have real consequences on their job and personal life. Only then will they develop the motivation to truly become part of the solution. 

by Andrew Rose, Resident CISO, EMEA, Proofpoint   

teissTalk: Protecting your Remote InfoSec Resources

Episode: Protecting your remote InfoSec resources: Strategies to identify and avoid Information Security professional burn-out during the day-to-day and during a crisis

Originally Aired: Tuesday 9th February 2021 at 16:00 (GMT)
This episode is now available to view on-demand

What are the things that worry you during the pandemic? And are there any upsides?

There are two worries. First there are the personal things – how the people in your team are surviving, how people cope when they have to be at home, how people are coping with the stress and how you manage that. And then there is the security aspect. Of course if you are used to working at home it’s easier, but if you are not then you need to be aware of things such as the neighbours listening in when you make a business call in the garden. And over time this can get more difficult as you get more relaxed: you need to ensure that people keep switched on about security.

Knowing who is accountable in these circumstances is difficult. It’s much harder to check on what people are doing, especially if you are a presenteeism-focussed company rather than an output-focussed company. But perhaps the pandemic has changed the way we view this and made more companies more focussed on outputs and less dependent on people coming into the office.

It’s important to remember that there are advantages to being in an office – such as bumping into people and sharing ideas. But we don’t have to be in the office all the time, especially if we put disciplines in place such as having core hours when everyone knows you will be available.

To what extent is stress a problem in the cyber security industry?

Research shows that 80% of CISOs feel stressed most of the time. And technology workers are 5x as likely to suffer from mental health problems as the general population. Of course cyber security and being a CISO is a stressful job. What it would be good to know is whether stress levels have risen during the pandemic.

Unfortunately, CISOs have very little control over what happens and when it happens. So you are constantly worried that the phone will ring. It’s enjoyable, challenging and exciting, of course, which is why so many CISOs love the job. The downside is that you are passionate, you care about the job: and when people don’t behave safely – and of course they may have other priorities – it affects you. So you need mechanisms to turn the stress off.

Stress often comes from having to prioritise. So for instance, anything that affects health or physical safety needs to come first. But you can’t have 5 top priorities!

Are resources a problem in cyber security?

There is an expectation issue. There are always limited resources. And an almost unlimited number of potential threats. If time and money are limited you need to prioritise. And that is hard. Especially when people expect you to do everything.

It isn’t always a budget problem though. It may be that we need to be more proactive, taking care of all security systems.

But whatever the cause, there is a need to be able to say “no”. So if you prioritise certain risks for management or elimination, then the other risks need to be accepted unless more resources or budget is allocated. Making it clear what you aren’t doing is therefore important: that needs to be communicated to senior managers so that if necessary more resources can be allocated.

Are security professionals too harsh on themselves and others?

Everyone makes mistakes. But if you have layered defences, then when something is missed or a system is misconfigured, a single mistake can be picked up elsewhere. So we shouldn’t blame individuals. Instead, we should plan for mistakes and, with defence in depth, enable security to be maintained even when things go wrong.

How has the pandemic affected security?

We need leaders to help us prioritise what must be done and what can be deferred. This is particularly important at the moment when there are new risks from more working from home. And this needs to be done in conjunction with an understanding that when people are at home they may have other things they need to do – they can’t work 24 hours a day.

Of course, there are advantages to working from home. You can work at your own pace and you may get fewer interruptions. But it can also be demotivating and you don’t have a chance to share ideas with colleagues and solve problems together.

The pandemic is in fact raising the issue of stress and burnout further up the agenda. If you are not on call, then getting a message at 9 at night shouldn’t mean that you have to respond. Managers need to know that. Also working from home isn’t the same for everyone. In a small flat, you may be stressed by the fact that you can hear other people who are on calls all the time. And there may not be space for a proper desk and monitor. Some people are comfortable. But many people are not.

Is the pandemic an argument for diversity?

Well, certainly it’s useful to have people who have different biorhythms. But people need to know when their colleagues like to work. Perhaps they don’t function well in meetings before 10 am. Or they don’t like starting new tasks off at 5 pm. Understanding how your colleagues are different, and how they like to work, is really important – especially when you can’t see them.

How can you tell when an increase in stress is becoming a problem?

Errors by people who don’t usually make mistakes, people behaving oddly, people who talk without any energy, people who seem withdrawn - things that you don’t see every day – these may all be signs of depression and stress. But, along with things like weight gain or paying less attention to personal appearance, they may also simply be a result of people reacting to the pandemic and lockdown.

This is why teams are important. Everyone can have an off day or feel that they can’t cope. But if you have a team that wants to work well together then you can manage that. Having people who work well as a team is more beneficial than having one or two experts who don’t want to support their colleagues.

So teams need to be encouraged and strengthened. But when people are working remotely this can be hard. Forced fun like online pub quizzes won’t get you anywhere and not everyone wants to join in with online team building: in that case, don’t force it – simply make sure you check in with them from time to time and make sure they are OK.

And it is easy to make people feel that they are being spied on. Instead, you need to build an open culture where leaders show their own struggles or their own domestic situations (such as having a child in the room during a conference call) so that other team members can feel less pressure to be perfect.

Overall the key thing is to understand that everyone is different and, especially in a time of stress like the lockdown, knowing what individual team members need is important. And there is one team member that you need to be very careful to remember and take care of – and that is yourself.

teissTalk: How can InfoSec leaders improve diversity and inclusion for their organisations?

Session: How can InfoSec leaders improve diversity and inclusion for their organisations?
Air Date: Tuesday 11th February 2021, 16:00 (GMT)
This episode is now available to view on-demand

Why does the cyber security industry need to become more diverse?

Recent advertisements for Cyber First have caused controversy, sadly. For instance, an advert suggesting a dancer could become a security analyst was criticised for seeming to suggest that the arts were dying in the pandemic. However, this was a misinterpretation. The message is really quite simple: you don’t need a computer science degree or experience with coding to have a career in cyber security.

There is still an urgent need for more diversity in the profession. Less than a quarter of people in the profession are female for example. And for complex problems like security there is a real need for diversity, for different ways of looking at problems, for different mindsets observing what is happening in the world – and what might be a threat.

It’s not just about widening the recruitment pool either. People need to understand the different requirements of different people, so that the profession appeals to everyone.

Having role models – successful women in cyber – is important. But another very practical solution is to change job adverts so they become more general, less based around technology skills, and support non-specialists with appropriate training.

Recent research has shown that about 20% of cyber security professionals are women, and that nearly a third of them drop out of the profession within 3 years.

What is the relationship between neurodiversity and cyber security?

Just because people are neurodiverse it doesn’t mean they will make good cybersecurity professionals. And indeed they may have many other talents. They shouldn’t be typecast.

They may have different needs. For instance, people with autism need extra employment support as otherwise they can suffer from burnout more quickly than usual. In addition, people with disabilities may struggle to get into employment because there are so many obstacles in their way.

Neurodiversity is about looking at how people’s brains differ. It’s important to avoid thinking of people as a diagnosis and instead think about what they can do and what their needs are. It’s difficult to cater for every individual of course – but you can start by asking people about their strengths and about how they would like to be interviewed – for instance giving people the option to have a written CV or a video CV depending on what they are comfortable with. Remember: under the Equality Act you are required to make reasonable adjustments. Of course not everyone is prepared to request different things. Being inclusive means asking people how you can help rather than waiting for them to request help.

How should neurodiverse people be treated by employers?

One useful way to think of this is to say that we are all neurodiverse because we all have different brains. Another thing is to think of people as being able to do a job despite differences rather than because of differences.

There is a temptation to encourage people to get a diagnosis e.g. of dyscalculia that acts as a defence of some sort. But this can be dangerous and it shouldn’t necessarily give people more leeway at work. Does someone in a wheelchair have greater rights than someone who is autistic for instance?

Neurodiversity can be hard to understand medically and people need to get away from thinking of the medical symptoms. For instance most people immediately think of autism when neurodiversity is mentioned, but Tourette’s is another, very different, form of diversity. We need to get away from thinking about a diagnosis and instead think about what people can do, what their individual needs are, and what reasonable adjustments you can make at work.

True diversity is the antithesis of identity politics. If we want to categorise people to make it easier to act around them, then how will that help the workplace? For instance you may have someone at work with autism who finds it hard to look you in the eye. And you might have a victim of an assault who has PTSD and is unable to look you in the eye. Does a person who has been categorised as having autism get more protection if there is friction between the two people, and if so why?

Is there a relationship between neurodiversity and cyber crime?

A study by Bath University and the NCA looked at whether there was a connection between cyber-crime and autism. They found an increased risk of committing cyber-crime among people with higher autism-like traits, but a decreased risk among people with a diagnosis of autism.

Significantly, this study proved a connection between autism-like traits and better digital skills. This is something industry needs to be aware of. There needs to be further research into the employment environment, the obstacles to neurodiverse people getting employed, and what industry can do about this.

How can neurodiverse people be brought into employment and kept in employment?

There are specialist recruiters who find neurodiverse people. And the tasks you put to people within an interview are also important – for instance it may not be appropriate to ask someone with autism to make a presentation.

Communication at work is also important. People with dyslexia will probably prefer a call rather than emails. And someone who doesn’t like to be touched might be wearing headphones: you shouldn’t tap them on the shoulder if you want to get their attention. People need to be aware of this sort of thing. Even the “socials” may be different – everyone going to the pub on Friday evening may be very excluding for someone with autism, or perhaps it might be that going to a different pub every week is the problem! There are no easy answers here. But it is important not to manipulate the whole workforce around a particular person’s preferences.

It is important to promote “positive disclosure” – an environment where employees feel comfortable disclosing medical conditions that have been diagnosed, or simply their needs, an environment where people can voice their issues. This gives you better leeway to make reasonable – reasonable – changes. Individuals have generally lived with a particular condition for years – so simply asking how you can help is a great step forward. And remember – if two people have the same condition, it doesn’t mean they have the same needs.

BSIA unveils industry’s first ethical use guide for AFR technology 

The British Security Industry Association (BSIA) has released a first-of-its-kind ethical and legal use guide for AFR (automated facial recognition), which lays down how, for which purposes, and under what conditions end-users can deploy the technology. 

BSIA, the trade association for the professional security industry in the UK, stresses that the aim of the ethical and legal use guide for AFR technology is to ensure that AFR (automated facial recognition) is used in such a manner that it does not cause harm or discriminate against any persons in either a public or private setting. 

“The use of AI is an exponentially growing part of daily life and we must ensure that all stakeholders are aware of the ethical and legal considerations of using these solutions. If not, this beneficial technology could be misused, leading to loss of trust and increased scepticism of the technology,” says Dave Wilkinson, Director of Technical Services at the BSIA and leader of the AFR working group. 

“This collaborative piece of work among industry experts has produced a guide with advice and recommendations on ethical and legal AFR usage, which will appeal to anyone in or out of the physical security industry. 

“We want to make sure the general public know that this ethical and legal guidance is out there for companies to follow. Compliance with the law is paramount when using this technology, and this guide will provide companies with the basis to demonstrate their commitment to complying with the ethical realities, consequences and impacts of using an AI/AFR solution.” 

In the absence of a global ethical framework to govern the use of AFR, the ethical and legal use guide has been framed as per the recommendations of OECD (the Organisation for Economic Cooperation and Development) which calls for the technology to be used ethically, transparently, in accordance with the rule of law, and with respect to human rights, diversity, and democratic values. 

According to the guide, AFR must not be used to identify individuals without obtaining their prior consent, AFR training data must be obtained lawfully, the database of images against which the AFR matches faces must be legally controlled as set out in the Data Protection Act 2018, and the use of AFR should be proportionate to its purpose. 

It also calls for users of AFR to ensure that privacy data is made available for subject access requests, and that all data collected is necessary, proportionate, and stored transparently and for no longer than necessary. Users of AFR must also maintain data protection policies, conduct Data Protection Impact Assessments, and nominate individuals or groups to take responsibility for the ethical and legal compliance and operation of the system. 

Considering that data privacy will take centre stage in any data collection through AFR technology, users of the technology must also ensure that they identify data controllers, identify measures to reduce risks, assess necessity and proportionality, describe the processing activity, utilise privacy masking, and confirm completion of DPIAs and record outcomes. 

The guide also stresses that users of AFR solutions should take special care over the storage and retention of private citizens’ sensitive data. They must consider how long the data is to be retained and how often it needs to be reviewed, ensure that cyber-security protections are in place to protect the data, and define the purpose of databases in line with ethical and legal requirements. 

Upcoming teissTalk Episodes

Coming up next on teissTalk: 

Measuring your organisation’s information security posture – which metrics help you evaluate your posture? 

Air Date: Thursday 18th February 2021, 10:00 (GMT) 

  • Overcoming challenges around communicating the guidelines for measuring and improving your information security posture 
  • Data retention, sanitisation and lifecycle – setting a clear structure to measure changes in your information security posture, internally and with vendors 
  • What are the challenges of measuring your changing information security posture over time?

Guests: 

Ben Aung, Executive Vice President & Global Chief Information Security Officer, Sage 

John Rouffas, Chief Information Security Officer, Pharos Security 

Craig McEwen, Chief Information Security Officer, Anglo American 

  

Recruitment and retention in information security: energising the talent market or causing a genuine cyber-skills gap?

Air Date: Tuesday 23rd February 2021, 16:00 (GMT)

  • Are some skills sets more transferable than others for cyber security careers?
  • Moving away from a blame culture to retain your best security staff
  • Do we have a misalignment of expectations, rather than a skills shortage?

Guests: 

Greg van der Gaast, Head of Information Security, The University of Salford

Bharat Thakrar, Director, Professional Services, Peak Cyber Institute

Nicky Keeley, Head of Cyber Security Oversight, Civil Aviation Authority

Thom Langford, Security Advocate, SentinelOne

AI: malicious uses and abuses

Air Date: Thursday 25th February 2021, 10:00 (GMT)

  • Social engineering at scale - how plausible is this scenario, and how can InfoSec Leaders prepare their colleagues?
  • Criminal Business Intelligence - how Machine Learning is improving the efficiency of malware-based organisations
  • How are cyber criminals using deep fake technology and how can InfoSec leaders protect their people and organisations?

Guests: 

Linus Neumann, Hacker and psychologist

Stephen Spick, Head of Information Security, Cyber Security and Compliance, SHL

Ed Williams, Director, Trustwave SpiderLabs EMEA

Be sure to add these dates to your diary! 
