8 Steps to Better Security. Kim Crawley
Even if you aren't a CISO, these are valuable tips for when you design your company's cybersecurity program. It's always best to learn from others the easy way, rather than learn the hard way by making the same mistakes yourself.
The Psychological Phases of a Cybersecurity Professional
You will probably work with cybersecurity professionals at some point. I want to help you foster a strong security culture by teaching you what I've learned about how we think. Understanding this will go a long way toward hardening your organization's security.
When people start learning cybersecurity, they often believe that computer software, hardware, and networks can be made 100 percent secure. That's the first phase. “I must learn about everything that makes computers vulnerable, so those things can be completely remedied, and then there'll be no more security problems!” But as the first months and years of their studies progress, they learn that absolutely nothing can be made 100 percent secure.
The first problem is the complexity of computer systems. I love video games, so I'll use them as an example. Video games on Nintendo Entertainment System (NES) cartridges typically ranged from 128 to 384 kilobytes in size, with a few games, such as Kirby's Adventure, coming in at a relatively whopping 768 kilobytes. All of that code was written in assembly language, a low-level language that corresponds directly to the instructions the CPU executes. NES games could have a few bugs here and there, but they couldn't have many bugs and remain functional, because the games were programmed in a much simpler way. Plus, the fewer lines of code a program has, the fewer bugs it can contain and still run. Any programmer can tell you that.
As of this writing, most of the games I play these days are on my PS4 and Nintendo Switch, eighth-generation video game consoles. It's impossible to make these more technologically complex games in pure assembly language. Their developers use multiple programming languages; large media assets such as polygonal environments, sound, and video files; and sophisticated game engines such as Unreal Engine 4 and the Rockstar Advanced Game Engine. Eighth-generation games are often 20 gigabytes in size and frequently over 50 gigabytes. That's a lot more code than an NES game, and today's internet-connected video game consoles are constantly installing multigigabyte patches. The first stable version of a game is never the last.
Debugging today's complex video games is much harder work. The best developers know that even well-designed and well-maintained games will have at least hundreds of bugs. That challenge comes directly from their complexity.
The greater complexity of today's AAA video games parallels the complexity of today's corporate computer networks. The software, hardware, and design of networks are all much more complicated than they were in the 1980s and '90s. Many companies have hybrid networks where some of their servers are on premises while others are provided by a cloud service such as Amazon Web Services (AWS). The internet interfaces with their networks at many points as a functional necessity, but the internet is also the source of most cyber threats. Combine all that with trying to get your users and employees to behave in secure ways, and you'll understand why complexity is one of the main reasons nothing is 100 percent secure.
Then there's the compromise to be made between security and usability. The most secure computers in the world are airgapped: physically isolated from the internet, and often from any network at all, and locked down as much as technically possible. Their USB ports are disabled. The people using them have to go through layers of physical security, such as locked doors with fingerprint scanners. All of those security measures make airgapped computers difficult to use. That's why they're usually deployed only for handling highly sensitive data.
It would be a big problem if most or all of the computers in your network were airgapped. Most facets of your computer network must be usable. But you also don't want your computers to be too accessible, so you'll need to find the right compromise between security and usability. When you, as a cybersecurity professional, learn that some risks must be accepted for practical reasons, that reinforces the lesson that nothing can be made 100 percent secure and helps you remain vigilant. If you believe a computer system can be made completely secure, you may forget that security is a constant process, and any number of things can go wrong as a result. Understanding that security risks and vulnerabilities will always exist will improve the quality of your work.
Understanding that nothing is completely secure often leads to the next phase, overconfidence. Some of the most well-known cybersecurity professionals are still at this phase and may never evolve from it. These cybersecurity professionals often think, “Why are my users so foolish? They do all these foolish things. How can they be so ignorant? My silly users are the security problem! As for me, I'm a cybersecurity expert. So, my habits are perfect, and I could never fall for a social engineering attack. I'm too smart for that!” Admittedly, I didn't evolve past this phase until a few years ago. Here are the problems with this kind of thinking.
First, you're not going to improve security by treating users and employees as fools who are beneath your wisdom. As important as it is to teach people better security habits, well-designed computer security accepts the realities of human nature.
For example, the January 2021 attack on the US Capitol building happened a few weeks prior to this writing. Thousands of right-wing extremists, some of whom worked in law enforcement, tried to seize the Capitol because they believed that Joe Biden won the 2020 election illegitimately and that Donald Trump was cheated out of a re-election he deserved. Most of the attackers were armed and dangerous. When security professionals with access to the Capitol's computer system became aware of the attack, they sent an evacuation warning to the screens of many Capitol workers, something to the effect of “Get out now. This building is in danger!”
One of the insurgents took a photo of a computer screen with an evacuation warning and posted it to social media. I saw it on Twitter. Not only could the warning be seen, but sensitive information in a worker's email application was also visible.
A prominent person in our industry criticized the Capitol worker for being foolish enough to leave that sensitive data on their computer screen as they fled to safety. Other, more sensible and empathetic members of the cybersecurity community replied that the worker was likely afraid of being killed, and that their life was the most important thing to protect. If I were that Capitol worker, I also wouldn't have shut down my computer before running away. Would you? Probably not. Seconds could mean life or death in an active shooter incident.
The computer system could have been designed so that an evacuation warning also triggered the computers to automatically shut down or otherwise lock the user's session. The system could have protected the sensitive data on those screens without any user interaction, so workers could focus on their physical safety.
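To make that idea concrete, here's a minimal sketch of what such a safeguard might look like. Everything here is an assumption for illustration: the broadcast port, the message format, and the idea of a small agent running on each workstation are hypothetical, and the session lock uses the Windows LockWorkStation call as just one example of how a screen could be secured.

```python
import ctypes
import socket

EVACUATION_SIGNAL = b"EVACUATE"  # hypothetical message format
BROADCAST_PORT = 9999            # hypothetical internal port

def lock_session() -> None:
    """Lock the current session so the screen shows only the
    login prompt, hiding any open applications and email."""
    # Windows-specific; other platforms would need their own call.
    ctypes.windll.user32.LockWorkStation()

def listen_for_evacuation() -> None:
    """Wait for an evacuation broadcast on the local network,
    then lock the session before the user even has to react."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", BROADCAST_PORT))
        while True:
            message, _addr = sock.recvfrom(1024)
            if message.startswith(EVACUATION_SIGNAL):
                lock_session()

if __name__ == "__main__":
    listen_for_evacuation()
```

In a real deployment, the broadcast would also need to be authenticated, so that an attacker couldn't lock every screen in the building at will. The point is the design choice: the system, not the fleeing human, takes responsibility for the data.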
If you look down on your users, your attitude will be detrimental to fostering a strong security culture. A false sense of superiority is problematic in other ways too. Even cybersecurity professionals have bad habits. Until I started using password managers a few years ago, I used the same password for multiple online services. The problem with doing that is that if one of my passwords were breached, cyberattackers could engage in credential stuffing: trying the same password against my other accounts. People reuse passwords frequently, so credential stuffing attacks are often effective. I tackled my password reuse problem, but I likely have some other bad security habits too. If I overestimate my own security, I won't be sufficiently vigilant.
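As a rough illustration of why reuse is so dangerous, here's a short sketch that checks whether a password already appears in known breach data, using the Have I Been Pwned Pwned Passwords range API. That k-anonymity API is real and public; the function name and the sample password are my own illustrations. A password that turns up here is exactly what credential stuffing tools will try first.

```python
import hashlib
import urllib.request

def times_password_breached(password: str) -> int:
    """Check a password against the Pwned Passwords corpus via the
    k-anonymity range API: only the first five characters of the
    SHA-1 hash ever leave this machine, never the password itself."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as response:
        body = response.read().decode("utf-8")
    # Each response line is "HASH_SUFFIX:COUNT".
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    print(times_password_breached("password123"))  # a reused favorite
```

A password manager solves the underlying problem by generating a unique random password for each service, so a breach at one site can't be replayed anywhere else.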
About a year ago, I also learned that cybersecurity professionals are becoming more frequent phishing targets in advanced persistent threat cyberattacks. We also tend to overestimate our ability to spot social engineering, to our detriment.
Cybersecurity professionals must learn humility, both toward our users and about ourselves. That's an important thing to keep in mind in order to foster a strong security culture.