Is it fair?


(Michael Kraft) #1

Since this category description states the user is always the weakest link, I have to ask:

Is it fair to always blame the user? Do we not spend hundreds of thousands on security appliances? Here’s how I approach blame…

If I buy a car, and I turn the left turn signal on and it doesn’t work, who is to blame, the driver or the manufacturer? The manufacturer, of course. Now, if the driver turns on the left signal but then turns right, I blame the driver…

What I am trying to say is we need to truly determine what the root cause was: a security product that didn’t work, or a user who either didn’t know the correct thing to do or purposely bypassed it…

Vendors are passing the buck to an easy target by always blaming users…yes, users need help, but vendors need to take accountability in all this too…


(Mel Green) #2

Maybe a better description would be “low-hanging fruit” - in my dream world I would have a Splunk backend rolling through every netflow on the network and a fully staffed SOC to handle all the alerts.

I do have really strong security tools (good endpoint protection, SIEM, IDS, FIM, etc.), but at this point the next piece for my company is to stop the users from doing dumb stuff.

To use your words, the users need to be held accountable too.


(Cindy) #3

Blaming users just buys some initial time. That blame is often not even heard once the true root cause is sought out.


(Edwin Eekelaers) #4

In 95% or more of the cases brought to me, users are at fault… Either their ignorance or bad intent causes a lot of problems. If something happens once and doesn’t recur after I’ve had a word with them, then I’m happy. It’s the repeat offenders that frequently drive me nuts.


(Tegniq User) #5

It seems to me that more and more companies are becoming aware not just of security issues, but also of the human element brought in by staff and contractors. Boards seem to have a new favorite word, “cybersecurity”. We can throw all the cash we possibly can at cybersecurity, but without staff being held accountable when they are truly at fault, the current issue will remain.

Thus IMHO, everyone (board, managers, staff, security professionals, contractors) will need to adopt a culture of accountability, no matter the cause of the issue.


(Jack ) #6

Did the bulb burn out and need replacing? If so, the user is at fault. That is a case where we tend to put too much faith in the idea that the vendor will keep us safe and the user can do whatever they want. Yes, we have appliances and software to assist us, but this does not mean that we should act stupid.

In the above example, the driver could still get a ticket if seen by a police officer. It is the driver’s responsibility to ensure that the car operates properly, and they should have checked the turn signals prior to starting the trip.

You would not get identity theft protection, then advertise your name, address, and social security number to everyone knowing that someone else is protecting your identity, would you? I would not, because it is not smart. I have protection, but still safeguard my information like my life depends on it.


(Justme) #7

The biggest problem with any security product (firewall, AV, web filter, etc.) is that they are reactionary, usually only as good as their latest update or definition file. Users are a first line of defense and can be proactive against many threats. While you can’t always blame the end user, a lot of security does start with them being that front line.


(Matt Parkes) #8

It all boils down to RISK: how much risk are you willing to take by not protecting something which is sensitive or confidential? How well does the board understand or appreciate information or cyber security? Are adequate resources such as people, systems and time provided for adequate protection?

Yes, security systems are not foolproof and cannot provide 100% protection, and this is where we need to educate users on how to behave securely. So I believe it is a mix of both: being able to configure security systems to protect against vulnerabilities and insecurities, and users behaving in a secure manner. Ultimately, even the administrators who configure the security systems are human, so systems might not be configured as restrictively as they need to be, who knows.

This is why we have incident response and remediation, so that when a user or system fails to protect we know about it, find the cause, and put a solution in place. Users become a problem when it is found that they repeatedly offend; if education is proving ineffective and a system cannot be configured to mitigate the behaviour, then there may be no option but to go down the HR route.

In the analogy of the car manufacturer being at fault if the left indicator doesn’t work, I am not sure I agree, as it is the driver’s responsibility to ensure the vehicle is roadworthy before use; therefore a faulty indicator should be reported to the manufacturer, or at least a garage, before setting out.


(Stewart) #9

I would have to agree - the same happens in my experiences. It is usually the users’ ignorance that is the problem. I use the word ignorant not as a derogatory term, but a lack of knowledge and understanding. That is why we provide them with security training for onboarding, as well as annually. But there are some people that just don’t get why you don’t click on links or open attachments blindly, and that is why we always say “when in doubt, call us first”. We also send out monthly examples of security issues and what to look out for so they get a constant reminder. It’s not their job to know what I do, but it is their job to use extreme caution and follow our security policies when in the workplace.


(Annette) #10

I totally agree with your comments. Most often (not always but…) end users are the weakest link, at least at the company I work for. That’s why educating users on what to look for, what not to click on, etc… is a must in this day and age.


(Jason Ross) #11

“there are some people that just don’t get why you don’t click on links or open attachments blindly”

True, but this also implies that they are clicking “blindly”. Is that truly the case, or are they being tricked by a very well done phishing email? As phishing and ransomware delivery techniques improve (and like it or not, they definitely are…) it’s becoming more and more useless to simply tell people to “look for bad English”, etc. in the attack email - and also more difficult to just write these off as “you should have known better”.

A “simple” solution exists - disable HTML in email - either by processing/stripping it at the inbound mail gateway, or forcing client-side “read in text mode” (ideally both!). Easily said, tougher to execute sure, but more importantly it’s just about impossible to convince business to accept that solution.
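
For what it’s worth, here is a minimal sketch (Python, standard library only) of the gateway-side stripping idea, assuming the gateway can hand a content filter the raw message bytes. The function name and the way it would hook into the mail flow are my own assumptions, not something from this thread:

```python
# Hypothetical content-filter sketch: drop text/html parts from inbound mail,
# keeping any text/plain alternative so the recipient still gets a readable body.
# How this hooks into a real gateway (milter, Postfix content_filter, etc.)
# is left out - this only shows the stripping step itself.
from email import message_from_bytes, policy


def strip_html_parts(raw_message: bytes) -> bytes:
    msg = message_from_bytes(raw_message, policy=policy.default)

    if msg.is_multipart():
        for part in msg.walk():
            if part.is_multipart():
                # Rebuild each container, keeping only the non-HTML children.
                kept = [child for child in part.get_payload()
                        if child.get_content_type() != "text/html"]
                part.set_payload(kept)
    elif msg.get_content_type() == "text/html":
        # HTML-only message: replace the body with a plain-text notice.
        msg.set_content("This message was HTML-only and its body was removed "
                        "by the mail gateway. Ask the sender for a plain-text copy.")

    return msg.as_bytes()
```

A real deployment would also have to decide what to do with HTML-only multipart messages (where removing the text/html part leaves nothing to display) and with attachments, which is part of why it is easier said than executed.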

So… we’re left with “awareness training” followed up with victim shaming as our only choice.


(Annette) #12

Jason,

We don’t use SAT training to shame anyone. IT does not share with anyone who has or hasn’t passed a test, or who was tricked into clicking on a link or attachment in an email. We use it as a tool to educate our employees so that they have the knowledge of what to look for and how to avoid getting tricked or conned into doing something that may cause harm to them or the company. The majority of our employees want the training because they realize it isn’t only for work but also applies at home.

Besides, for most companies SAT is a requirement from insurance companies that sell cyber policies, as well as from credit card vendors that require companies to be PCI compliant.

Your solution may be simple, but for most of us that is not an option.


(Jason Ross) #13

Annette,
It sounds like you are doing training the right way - sadly I fear that your organization is an exception. In many of the orgs where I’ve seen this done, the training is often little more than fear-mongering followed by a mass phishing test that puts everyone on edge because they know that it is coming. As a result, everyone just sits there dreading all email for a week or so until the test is over. Individual outcomes may or may not be publicized, but regardless the result is typically an employee base that comes to hate the annual training they are forced to take, and dreads the possible shame (or employment repercussions) of clicking the wrong thing.

I guess maybe I’m grouping SAT with the phishing test that often follows it, which is perhaps unfair. Generally I find that the approach many organizations take tends to drive a wedge between the security team and the rest of the business - since it is often viewed in an "us vs. them" context, where “us” is “not IT/Security” and “them” is “IT/Security staff”. I think that’s counter-productive and divisive overall, and question whether activities that lead to that mentality truly benefit the security posture of the company.

I admittedly hadn’t considered the possible insurance requirements - mainly because I tend to dismiss “cyber insurance” as a worthless sham in general. Again, that’s likely unfair, and perhaps an unrealistic view of business, but as a pentester it’s hard for me to take anything with “cyber” in the name seriously, and “insurance” doubly so :slight_smile:

Jason


(Matt Parkes) #14

My organisation uses security best practice material, provided by me, during induction. Our training team constantly push back against the material I provide, saying it needs to be simpler, contain less information, and be reduced to as few pages as possible, because we will lose the attention of the new starters in light of all the other company info we are cramming them with. My latest material is more graphical and concise, almost like a series of flash cards, which double as a set of posters that can be pushed out regularly. Annually we do a basic multiple-choice pop quiz, but the training team capture the responses and simply have 1-2-1s with those that don’t meet the grade.

I never get to see how effective or not our training is. In addition, I constantly push out awareness material on our intranet on various topics such as ransomware, phishing, securing social media accounts, how to set up 2-step verification at home, and so on.

I love some of the ideas I have read from users on here, utilising things like free food as a reward and making training sessions fun. If my organisation’s powers that be would get on board with me, I could do stuff like this easily, and I absolutely think the culture would be far better.