Image of Fletus Poston and quote.

AI, Automation, Oh My: Why Human-centric Design for the Modern Cybersecurity Leader Remains Essential to Business Resiliency

Blog Post

An Interview with Fletus Poston 

We sat down with Fletus Poston, Senior Manager of Security Operations at CrashPlan, to discuss why human-centric design is a critical consideration when building a modern cybersecurity program. The Q&A also covers how, by leveraging automation, AI, and other innovations, CISOs can increase business resiliency by removing roadblocks and focusing on challenges, recommendations, strategic planning assumptions, talent management, culture, and embedding security as an ideology.

 

Human-centric design must consider the very human issue of burnout. How prevalent is this in cybersecurity, and what impact does it have on attrition rates?

To use an analogy, we’re all doctors with niches. We all have basic knowledge and have been through fundamental training. If you’re in a technical role, you know routers and switches, the OSI model, and networking. Then you get into the layers of your specialty, and that’s where burnout can happen.

Workers often don’t know how to pivot, or they’re not coached to pivot. They’re in a role for too long and get frustrated because they don’t have a coach, mentor, or leader telling them, “Okay, let’s remove you from this situation for six months or a year. Then, if you want to come back, you can. If not, this may be your next career path.” Ultimately, a succession plan is a great way to help reduce burnout.

 

Why is insider threat management not a focus area for most organizations unless they are highly regulated?

It’s two things: culture and budget. I’m dealing with this right now. We are a small startup that trusts everyone, so when I talk with my leadership, the questions are: what does shadow IT, aka insider risk, look like? How do we control that?

A lot of it is under-addressed, or we haven’t put the manpower against it, because our current culture says I need to let you work, or we need to save money. That approach doesn’t block accidental insider threats, because this type of insider creates risk inadvertently, not deliberately.

To combat this with human-centric design, we have to restrict certain content types to limit surfing or what you can download and use. Additionally, we could limit how you purchase by forcing you to go through the procurement process instead of pulling out your company credit card. This requires users to justify the applications installed on their computers, or security is going to remove them. That’s why most organizations struggle to implement this properly unless it’s mandated for the sake of keeping a customer or contract. And even when they do it, they do it inconsistently or not at all!

 

How can digital risk protection services (DRPS) address the human element of human-centric design as an effective vector for malicious actors?

Addressing the human element means educating people on what is and is not appropriate. If they have questions, let them ask in a no-blame environment: no stick, all carrot. If you have a question, I can tell you yes or no. But I will always clarify why I’m allowing you to do it this time, even though I may not next time. Policies and procedures also help here.

The biggest thing is managing human risk. As we discussed, human-centric design is about training the individual to trust the training they’ve received. I always remind people; how do you do this at home? How do you know when to open the door? How do you know when to pay a bill? How do you know when to give information over the phone? Is it from your parents? Is it from your formal education? Is it from the school of hard knocks? Several of us have been burned, robbed, or stolen from. You may have lost your identity and learned to be more critical or skeptical. The same goes back to managing and spending resources on the human element.

 

How can the cybersecurity industry reduce process friction and improve user experience?

The biggest thing is open dialogue. In many organizations, security is the “no machine.” It’s about having that dialogue with champions or ambassadors in your program.

Find people in finance who are conduits between finance, security, and IT. Most of the time, those are going to be mid-tier. Once you get to directors and higher, it gets a little harder to separate, because now you’re listening for different reasons, and titles matter in some organizations more than others.

Gamification is another way to break the friction. Role-playing helps too. Do a demonstration during an all-hands meeting, have security come in, and do a hands-on role-play exercise where they play the end user. They report an issue and show you, “I’m just like you. I’m your peer.” I always just say, “My peer.” I don’t want to separate and use a generic term such as the end user. I don’t say the weakest link; instead, I say, “My peer fell for this. My colleague fell for this.” That reduces friction, too.

 

How can human error be used to indicate cybersecurity-process-related fatigue within an organization?

Fatigue-related error usually occurs with new hires or those nearing retirement. The new hire is likely untrained, stressed, and excited. The fatigue isn’t quite there yet; they just don’t know the process. So fatigue falls on the person doing the training, the security operations team, or their service desk peers, because a new hire might send 20 help desk tickets and five emails to the suspicious inbox every hour.

Error from those nearing retirement usually happens because they’re just coasting. They’re in the last five years of their career. They’re clocking in, doing their job, and clocking out. In that case, they fail to remember the training they just took: the last phishing simulation, the last social engineering exercise, or the last tabletop.

 

What steps should be taken to develop an insider risk management program with the support of senior leadership, human resources, and legal teams?

A Gartner study found that developing an insider risk management program requires dedicated involvement from the top down: senior leaders and executives. From there, executives can bring in HR, legal, and senior leaders to build a credible and well-rounded program. While implementing new tools and operations is necessary for building a robust and effective program, it’s vital that they’re deployed at a steady pace. Rapid deployment may overwhelm users and not allow them to fully grasp the new toolsets, leading to increased security threats.

 

How can artificial intelligence (AI) recommendation engines help augment human-centric design decision-making in cybersecurity?

AI assists in the automation of monotonous tasks, as you can feed it a repository. Take ChatGPT or Bard. I can give it all my knowledge articles. Then, users can ask, “Hey, how do I do this?” Instead of requiring users to call the help desk, it’ll spit out exactly what’s needed.
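As a rough illustration of the idea, here is a minimal sketch of retrieving the best-matching knowledge article for a user’s question. The article titles, text, and matching logic are invented for this example; a real assistant would use an AI model over the full repository rather than simple keyword overlap.

```python
# Hypothetical sketch: pick the knowledge article whose title best
# matches a user's question. Articles below are invented examples.
ARTICLES = {
    "Reset your VPN password": "Open the self-service portal and choose VPN.",
    "Report a phishing email": "Forward the message to the suspicious inbox.",
    "Request new software": "Submit a procurement ticket before purchasing.",
}

def answer(question: str) -> str:
    """Return the article whose title shares the most words with the question."""
    q_words = set(question.lower().split())
    best_title = max(
        ARTICLES,
        key=lambda title: len(q_words & set(title.lower().split())),
    )
    return f"{best_title}: {ARTICLES[best_title]}"

print(answer("How do I report a phishing email?"))
```

Swapping the keyword match for a language model with the same repository is what turns this from a lookup into the kind of self-service assistant described above.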

Moreover, AI is going to continue to evolve. Microsoft is rolling out Copilot AI across their products to help with business enablement. It enables you to be smarter and faster and to streamline your processes. GitHub, GitLab, and others are using it to immediately spot where your code is insecure and tell you: hey, before you move on, you need to fix this.

 

Why is it important to evaluate human factors alongside technology when designing and implementing cybersecurity controls?

Looking at the X/Y axis, technology investments continue on an upward trend. But we don’t spend the money on employees that we spend on technology. And that’s purely just looking at the cost. One isn’t better than the other. But what percent must be spent on humans if I spend a million a year on my tools?

That’s up to the company to figure out. This goes back to what I said before: I must put a percentage of my security budget on the human, not just the tools. Consider implementing gamification, raffles, rewards, and shoutouts. Additionally, develop a kudos system within the legal bounds of your organization and the country you operate in.

 

What precautions should be taken when discussing zero trust outside the security team to prevent misinterpretations that could damage employee trust in the security program?

Don’t use the term “zero trust” outside the security team. Again, it ties back to culture. Explain why security and IT are forcing you to authenticate into this application or asking you to log in. Or, if it’s passwordless, explain why I want biometrics or your YubiKey. You can also give just-in-time access. Yes, you had to wait, but locking your access down to a five-minute window reduces the threat vector by a huge percentage because you’re a domain admin.
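The just-in-time idea can be sketched as a privileged grant that expires on its own, so elevated rights exist only inside a short window. The class name, user, role, and the five-minute default here are illustrative, not any particular vendor’s API.

```python
# Hypothetical sketch of just-in-time access: a privileged grant
# that expires automatically after a short time-to-live.
import time

class JITGrant:
    def __init__(self, user: str, role: str, ttl_seconds: int = 300):
        self.user = user
        self.role = role
        # Record when the grant stops being valid (default: 5 minutes).
        self.expires_at = time.monotonic() + ttl_seconds

    def is_active(self) -> bool:
        """The grant is valid only until its time-to-live elapses."""
        return time.monotonic() < self.expires_at

# A domain admin elevated for a five-minute window.
grant = JITGrant("fposton", "domain-admin", ttl_seconds=300)
print(grant.is_active())
```

The design choice being dramatized: the user waits for an approval step up front, but the credential self-destructs, which is what shrinks the threat window.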

Teach the risk levels without labeling employees as risks. You’ll need to wordsmith with your marketing and HR teams and get them to write the comms up. But you don’t say “zero trust” because “zero” may be perceived negatively in people’s minds.

 

How do talent churn and burnout impact the effectiveness of a cybersecurity program, and what strategies can be implemented to mitigate these issues?

Gartner’s research shows that talent churn damages cybersecurity effectiveness in terms of culture and cost. Salaries for modern CISOs and security experts increased from 2019 to 2021, meaning churn raises recruitment costs while lowering productivity. Hiring new employees also costs up to 30% more than retaining talent.

Programs must be fully focused on compliance to help meet the challenges and stress that cybersecurity professionals face. Additionally, as stated earlier, executives must prioritize risk management, making it a core pillar of company culture.

 

What cultural shifts and changes in engagement rules can help embed security as an ideology in an enterprise and prioritize cybersecurity as a core component of the value proposition?

It’s essential to get executive support to establish a top-down philosophy concerning cybersecurity and present it as a human issue that treats organizational members as stakeholders.

Thus, what it comes down to is this: we have intellect, we are knowledgeable, but we also have emotions. Humans can be swayed by emotion, so organizations still need technical and administrative controls, but we also need to educate, trust, and rely on the human elements of cybersecurity.

The biggest thing is helping your peers understand their intrinsic value to the company and how you see them. I see you as a human, as a peer, as a value add. I want to make sure that when I do this, I do it in a manner that allows you to still do your job. If I can train you to see something and say something, to stop and assess, and to think before you click or before you act, those actions just saved us X dollars and X time, and potentially earned more profit, more bonus, more aptitude, and ultimately, a little more leniency.
