A few days before Christmas, the domain registrar and hosting service GoDaddy came under fire in the press and on social media for phish-testing their employees with a fake email that made it look like they were getting a US$650 bonus for the holidays. While the methodology can and should be improved in the future, the test itself raises some serious questions for corporate cybersecurity professionals and regular users alike.
Here’s What Happened:
GoDaddy’s cybersecurity team sent an email to a large number of employees which made it appear as though the company was providing a significant one-time bonus payment. The employee needed to click a link within the email and fill out a form including sensitive information before a certain date. This was, of course, a phishing test – the purposeful sending of an email designed to determine if users are susceptible to phishing attacks. Because of the offer of a significant bonus, and the timing of the email, many people took offense to the campaign, but a closer look shows both what GoDaddy did right and what they definitely did wrong.
What GoDaddy Did Wrong:
Based on screenshots of the email sent to local media stations, the email itself was crafted perfectly – far too perfectly for a phishing test. While threat actors are indeed getting better and better with their phishing campaigns, an email as perfect as the one used by GoDaddy would be overwhelmingly rare, and frankly quite difficult for even cybersecurity experts to detect. The email appears to have come from a valid GoDaddy domain email address; the language and tone were also correct for corporate communication of this type. The graphics and logos were exactly right. There was little, if anything, a user could leverage to detect that this was fake. To be clear, this kind of spear-phishing email can indeed occur, but it would be difficult to expect an average user to have the ability and the time to determine that the email was not legitimate.
It should be noted that as of the writing of this post, we don’t have information on whether the links in the email went to GoDaddy[.]com sites or whether the sender address resolved to a legitimate domain when a user hovered over it. These are common checks for phishing emails, and that information is critical to have before we can cry total foul over this campaign. The rest of the email, however, is picture-perfect, and not a fair test of users’ ability to detect fraudulent email activity.
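Those two checks – where a link actually points and whether its domain matches the company’s – can even be automated. As a minimal sketch (the email body, domain names, and helper functions below are made up for illustration and are not GoDaddy’s or Cymulate’s), a short script might extract every link from an email body and flag any that don’t belong to a trusted corporate domain:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkExtractor(HTMLParser):
    """Collect href targets from every <a> tag in an HTML email body."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def suspicious_links(html_body, trusted_domains):
    """Return links whose host is neither a trusted domain nor a subdomain of one."""
    parser = LinkExtractor()
    parser.feed(html_body)
    flagged = []
    for link in parser.links:
        host = (urlparse(link).hostname or "").lower()
        trusted = any(host == d or host.endswith("." + d) for d in trusted_domains)
        if not trusted:
            flagged.append(link)
    return flagged

# A lookalike domain ("godaddy.example-bonus.com") gets flagged;
# a real subdomain ("www.godaddy.com") would not.
body = '<p><a href="https://godaddy.example-bonus.com/form">Claim your bonus</a></p>'
print(suspicious_links(body, {"godaddy.com"}))
```

This is exactly the check users are taught to do by eye when they hover over a link – the point of a fair phishing test is that this check must be able to succeed.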
What GoDaddy Did Right:
For the most part, the backlash against GoDaddy has been about the timing of the email and the “bait” of the bonus being used so close to the holidays. The overwhelming response has been that this was a cruel trick more than it was a valid test of employee security. I personally disagree with this and think the test was properly timed and performed – except for the details listed above.
Threat actors are continually evolving their email attacks. In fact, this exact type of email has been successfully used in the past to trick users into divulging everything from sensitive corporate information to their login details and more. Using this technique is a valid and fair test – if and only if the test is passable. If the links in the email clearly did not resolve to known GoDaddy corporate domains, and the sender address did not show a real GoDaddy email domain when hovered over, then the test was a bit unfair but still a valid simulation of exactly the kind of attack users can expect to face. Yes, doing this at this time of year – and especially THIS year – is rough on users, but considering that threat actors will and do use this same technique to make their real attacks succeed, cybersecurity teams must simulate attacks like this to ensure the proper defense of the company.
Why it Matters:
On many occasions, I have heard “it’s not fair to expect low-ranking and part-time employees to be able to spot and avoid phishing attacks.” I’ve heard “they’re not technical,” “they don’t report to us,” and “they don’t have access to anything,” over and over again.
Here’s the problem with that line of thinking:
All users – from janitorial and reception staff to the CEO and Chair of the Board – are going to be attacked by phishing emails. Either broad-spectrum, un-targeted attacks or directed spear-phishing activity will occur, and can occur to anyone at a company no matter their roles or responsibilities. LinkedIn and many other sources of public information can provide an attacker with all the details they need to create such attacks, and public information about the company itself (its holiday schedules, fiscal year start/end dates, etc.) can allow any attacker to craft a phishing email that directly targets an ongoing or upcoming event. Security teams must, therefore, use the same mindset when creating phishing test emails – leveraging that same information to create those same kinds of attacks.
Employees can also create much more damage than their job responsibilities would initially indicate. A minimum-wage, part-time employee can still cripple an entire corporation by opening ransomware attack files downloaded via a link in an email. We cannot continue to think that just because someone doesn’t have direct access to company bank accounts, they cannot cause a massive loss of revenue.
So It Was Both Right, and Wrong:
Because of this, phishing tests like this one are necessary and valid. They must be conducted fairly, and limited to the same techniques attackers actually have ready access to. In this case, we here at Cymulate would have recommended that an external domain be used for any links, and that the sending email address be spoofed using standard From-address spoofing techniques. This would have allowed any user to take just a few seconds to confirm basic information and successfully realize the email was fake. Minor grammatical errors and/or slight inconsistencies in the logo or other graphics would also be recommended, depending on the likelihood that the company’s real logo could be duplicated precisely. This permits the cybersecurity team to effectively test every employee, but also gives employees a chance to detect the phish using methods that are taught to every employee – and that should be used for any and all emails, both company and personal, that they interact with.
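As a sketch of what that recommendation might look like in practice (the names, addresses, and domains below are invented for illustration, and the actual sending setup would depend on your mail infrastructure), Python’s standard email library can build a simulated phish whose visible From: header shows the corporate domain, while the link – the detail a careful user can check – points at an external test domain:

```python
from email.message import EmailMessage

# Hypothetical addresses for illustration only.
DISPLAY_FROM = "Holiday Bonus Team <hr@godaddy.com>"  # what the recipient sees
ENVELOPE_FROM = "simulations@phish-test.example"      # actual SMTP sender

def build_test_phish(recipient, link):
    """Build a simulated phishing email with a spoofed From: header,
    but with a link on an external, detectable test domain."""
    msg = EmailMessage()
    msg["From"] = DISPLAY_FROM  # spoofed display address
    msg["To"] = recipient
    msg["Subject"] = "Your one-time holiday bonus"
    msg.set_content(f"Claim your bonus before Friday: {link}")
    return msg

msg = build_test_phish("employee@godaddy.com",
                       "https://bonus.phish-test.example/form")
# To send: smtplib.SMTP(host).send_message(msg, from_addr=ENVELOPE_FROM)
print(msg["From"])
```

The design choice here is the point: the spoofed header makes the test realistic, while the external link domain keeps it passable – a user who hovers over the link gets a genuine chance to catch the fake.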
No employee should be exempted from real-life phishing testing, but every employee should have the training needed to catch the phish. Every employee should be required to show that they are willing to take simple precautions to defend the organization they work for, and every cybersecurity team should realize that even trained professionals can fall for perfectly-crafted, targeted attacks. We need to work together – both technical and non-technical teams – to help better defend the organization. That means testing like this has to happen, but employees must be able to succeed.
Finally, we in the cybersecurity world need to start offering as much of a carrot as we offer a stick with these simulation tests. Yes, employees who fail should be required to undergo training. Yes, employees who fail repeatedly may face disciplinary action (after all, they’re proving they do not care if the company is devastated by a cyber attack). That being said, users who do succeed should be celebrated! Perhaps a smaller reward should be provided to every employee who does not fall for the fake phish, or the names of all those who passed the test should be published for all to see on the company intranet. Those who pass repeatedly should be singled out for praise on company-wide calls and/or in other broad communications. We need to show users that – in addition to defending the company – their efforts in stopping these threats will be recognized and recognized well. This encourages more and more employees to try to find the fakes, and leads to better and better cybersecurity.
Cybersecurity validation is a give-and-take situation when it comes to users. We must hold them responsible for defending the organization in any way they are able – and spotting a phishing attempt that has telltale signs of fraud is one way they are able. We must also recognize that not everyone will be a cybersecurity expert, and build tests that they have a chance to pass. This test was not invalid or inappropriate. It is entirely possible it was unfair and should have been conducted differently, but that doesn’t change the cold, hard fact that this technique is used by threat actors, and it must be tested against to ensure the company – and its employees – remain safe.
If you’re concerned about how to conduct a phishing test that is both fair and realistic, reach out to the Cymulate Sales Team. Our platform includes a Phishing Awareness module that can help create realistic – but still detectable – phish emails so you can start highlighting when employees get it right and helping those who have trouble.
Try it for yourself with a 14-day free trial.