CYBERSECURITY AND THE DUNNING-KRUGER EFFECT
by Tony Anscombe, Chief Security Evangelist, ESET
If you were to ask the average person about their cybersecurity hygiene and habits, I suspect you would get answers like these: “Yes, I have different passwords on my devices”; “Yes, I monitor what my kids do online”; “Yes, all my devices have up-to-date security software”; “Yes, I understand privacy.” But in recent months I have seen several surveys that give me reason to question whether the answers people give about their cybersecurity practices reflect reality, or instead a desire for what they know they should be doing but are not. Are they just embarrassed that they have not taken preventative steps? Or are they suffering from the Dunning-Kruger effect?
The Dunning-Kruger effect is a cognitive bias whereby people overestimate their abilities, failing to recognize both their own mistakes and lack of skill and the genuine skill and expertise of others. More simply put, they can’t recognize that they are not good at something. The effect also covers the more limited scenario in which people of high ability underestimate themselves.
I thought about this as I read the results of a survey conducted by the Family Online Safety Institute (FOSI), which found that 71% of parents are unsatisfied with parental control software and feel that current offerings do not meet their needs. As a parent and a professional in the cybersecurity industry, I find it highly improbable that 7 out of every 10 parents have even attempted to switch on parental controls, let alone found that the tools do not meet their requirements. This theory is partially backed up by the results of a 2018 survey by Internet Matters, which found that only 39% of parents set controls across their broadband or mobile networks and just 35% set controls on the devices their children use at home. While two years separate these surveys, it is highly unlikely that an additional 30-35% of parents started using parental controls of some type over that period.
Demonstrating the disconnect between ability and reality, or even a willingness or desire to take preventative steps, is tough to prove, but I feel certain the majority of cybersecurity professionals reading this will have a slight smile of agreement that users often say one thing yet do another. An annual report published by Proofpoint – 2020 State of the Phish – highlights the issue. It includes some striking statistics: only 49% of U.S. workers could correctly explain what phishing is, and 30% believed malware to be a type of hardware that boosts Wi-Fi signals. If you can’t identify and describe a risk, then you are unlikely to be taking steps to avoid it. Similarly, a recent survey by my company showed that 87% of people said they “feel secure” when shopping online, yet 45% said they had no, or only some, anti-virus software installed on their devices. And while there may not be any survey data showing that many Apple Mac users don’t think they can get a computer virus, I think deep down they know they can. Yet many continue to tempt fate by not installing any anti-malware software. A good question for any Mac user not running an anti-malware product on their device is, ‘Is your machine infected?’ If they answer no, ask them how they know (my guess is, they don’t).
Creating a secure and safe digital environment is only partially a matter of technology. Human behavior, unfortunately, accounts for many, if not the majority, of the cybersecurity incidents we read about in the media. How can we expect people to protect themselves, their families or their workplaces when they cannot even describe what a threat may be, or when they have the perception that they’re using protection, such as parental controls, when in reality they may not be?
Security in many aspects of life is an unconscious decision. From an early age we are taught that some things are potentially dangerous or could cause harm. I know to stop, look and listen at the side of the road before crossing, I know to lock my front door at night, and I know to wear a seat belt when I am in a car. We take the necessary precautions – for example, with driving and with swimming pools – due to the continual messaging and advice we’ve been given, but does the same happen when we think about the digital world? Or is it that dangers we can’t see are not real enough for us to take any action? People know there is a risk when they overshare personal information online, and that what they post or share online may live forever. Yet they overshare, right up to the moment of disaster, because they believe it will not happen to them (and then, shockingly, some even continue oversharing after the event). It would be like knowing the danger of crossing the road but crossing regardless of what stopping, looking and listening may have brought to your attention.
The digital world has been here for a while and is here to stay. We need to be ingraining the core safety and security messages in people from an early age, in the same way we teach crossing the road safely. Understanding and appreciating cybersecurity dangers will help create an environment where people make informed decisions, take action and better understand their own abilities.
About the Author
Tony Anscombe is the Chief Security Evangelist of ESET, an industry-leading IT security software and services company for businesses and consumers worldwide. With over 20 years of security industry experience, Anscombe is an established author, blogger and speaker on the current threat landscape, security technologies and products, data protection, privacy and trust, and Internet safety. His speaking portfolio includes industry conferences RSA, CTIA, MEF, GlobSEC and Child Internet Safety (CIS). He has been quoted in security, technology and business media, including BBC, CNN, the Guardian, the Times and USA Today, with broadcast appearances on Bloomberg, BBC, CTV, KRON and CBS. Anscombe has served on the boards of MEF and FOSI and held an executive position with the Anti-Malware Testing Standards Organization (AMTSO).