A collection of phishing stories and lessons learned for use in presentations on phishing tests


(Scott Wright) #1

Note: I was looking for a category in which to post requests and responses about good phishing stories. But please let me know if there’s another category where this should be posted.

As security professionals, many of us on this forum are probably in a position to do presentations, whether it is internal training of staff on awareness and phishing concepts, executive briefings on phishing program justification or results, or at industry conferences. My feeling is that we should be doing this as often as we can to spread the word about the importance of phishing assessments as a tool in a security awareness program.

When it comes to live presentations, I always like to start a session, or sometimes a subsection of a session, with a good story. This week, I will be presenting at a security conference, and am trying to select a good opening story about phishing assessments.

So, while I have a few stories of my own that I could use as the opening hook (sorry for the pun), I came to the HackBusters forum to see if anyone had any really interesting stories about phishing assessments, and the things that can go wrong with them - with good descriptions of the reason they happened, and what the outcomes were.

I have one story that I can kick off this topic with, but it’s not so much about phishing, as using USB drops…

A Good Illustration of Why a Rules of Engagement Agreement is Important for Any Human Vulnerability or Social Engineering Security Test

A couple of years ago, I was doing USB drop tests within a small division of an organization that had about 5,000 employees. I had developed several different files to use as content; when any of them was opened, information would be logged about the device, the time, the IP address, and the filename that was clicked.
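Scott doesn't describe his actual tooling, but the "phone home" logging step above can be sketched as a minimal callback recorder. Everything here is hypothetical for illustration: the function name `format_beacon_entry`, the drive label, and the filename are all assumptions, and in practice these values would arrive as parameters of an HTTP request fired when the planted file is opened.

```python
from datetime import datetime, timezone

def format_beacon_entry(device_id, filename, client_ip, opened_at):
    """Build one log line for a 'file opened' beacon callback.

    In a real engagement these values would come from the HTTP
    request that the planted file triggers; here they are passed
    in directly so the formatting logic can be shown on its own.
    """
    ts = opened_at.strftime("%Y-%m-%dT%H:%M:%SZ")
    return f"{ts} device={device_id} ip={client_ip} file={filename}"

# Hypothetical example: a drive labelled USB-07 phones home
# when the bait spreadsheet is opened.
entry = format_beacon_entry(
    "USB-07", "budget.xlsx", "10.1.2.3",
    datetime(2024, 5, 1, 14, 30, 0, tzinfo=timezone.utc),
)
print(entry)
# → 2024-05-01T14:30:00Z device=USB-07 ip=10.1.2.3 file=budget.xlsx
```

Recording the device ID alongside the filename is what lets the tester tie each opened file back to the specific drop location, which is the evidence that makes the assessment report useful.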

I had briefed the IT Security manager to make sure he knew what the expected and possible actions would be when somebody found one of the drives. Found drives were supposed to be turned in to the Service Desk (who did not know the test was being done), and the Service Desk was supposed to report them to the head of IT Security, as part of their normal procedures.

One of the files had what looked like a confidential internal budget spreadsheet. When the device was turned in, the Service Desk staffer saw the file and decided that it looked like somebody was trying to “exfiltrate” confidential financial information from the organization on the stick. They panicked and took it to one of the top level executives for the entire organization, to show them what was happening. That executive had not been briefed on the testing either.

As a result, the level of concern escalated. Fortunately, the executive's first reaction was to start working down the chain of command until he got to the IT Security manager who had hired me to do the tests. There was a lot of explaining to do, and the manager was a little shocked and embarrassed that the incident took the path it did. Otherwise, no harm done. Imagine if the executive, for some reason, had contacted somebody outside the organization, like the media… [not likely, but possible, you just never know]

But I use this story to illustrate how it’s even more important to have a “Rules of Engagement” discussion and agreement with management, before deciding whom to brief about testing of human vulnerabilities, and what kind of content to use in tests (i.e. not too alarming).

In technical pen testing, the results of testing gone bad can usually be anticipated, and the chain of command can be informed. When testing human vulnerabilities, however, IT Security managers tend to want to keep everyone in the dark to get the purest results. Unfortunately, it can be VERY hard to predict what a human will do when faced with a situation they haven't seen before.

So, please feel free to comment on this story, and add your (anonymized) stories to this thread. I would really like to see a range of other good lessons learned that we can all use in presentations, to get people’s attention on this topic. If you are scared that divulging even an anonymized story may put somebody’s job at risk, you could send me (or anyone else) a private message, and we can post them with no attribution. If you can’t find my contact info on this site, please use my website contact form.


(Will Jeansonne) #2

Hi Scott,

Thanks for your post. You selected the correct category for sharing your ideas, opinions and issues regarding phishing. All we ask is that you try to keep regular posts as non-promotional as possible, to be fair to the others in the forum. That said, we look forward to your contributions and interaction with our community!

Will Jeansonne
Community Manager


(Warren White M.S. Cybersecurity) #3

That is a good story @streetsec. In most scenarios, people are likely to be either curious about or afraid of something that appears important. Social engineering requires many perspectives when planning an attack that will lead to the end user giving up confidential information. A USB drop, as Scott mentioned, can lead to no one opening the file, and to loose accusations, precisely because of how sensitive the material appears to be. I think phishing is an appropriate way to describe this type of attack, because the criminal throws out the lure to see if there are any bites.

One of the phishing examples I have seen was one where someone was approached online, in a chat room, and a relationship was formed. Some time passed, and the two were to be wed without ever having seen each other. The victim wired money for a plane ticket and a dress so they could meet and get married. Needless to say, after the money was sent, there was no word back from the other end. Phishing and social engineering can come in many forms, and there are many people out there who will still fall for these traps.