Dr. Josiah Dykstra
Director of Strategic Initiatives, Trail of Bits
Shooting yourself in the fortress
How we sabotage ourselves with misguided views on human behavior
Josiah Dykstra is a seasoned cybersecurity practitioner, researcher, author, and speaker. Dr. Dykstra is the Director of Strategic Initiatives at Trail of Bits. He previously served for 19 years as a senior technical leader at the National Security Agency (NSA), where he worked as a cyber operator and researcher, and he holds a Ph.D. in computer science. His research has examined stress in hacking, action bias in incident response, and the economics of when sharing threat intelligence is more work than it is worth.
Dr. Dykstra is a frequent author and speaker, including at Black Hat and RSA Conference. He received the CyberCorps® Scholarship for Service (SFS) fellowship and is one of six people in the SFS Hall of Fame. In 2017, he received the Presidential Early Career Award for Scientists and Engineers (PECASE) from then-President Barack Obama. Dr. Dykstra is a Fellow of the American Academy of Forensic Sciences and a Distinguished Member of the Association for Computing Machinery (ACM).
He is the author of numerous research papers, the book Essential Cybersecurity Science (O’Reilly Media, 2016), and co-author of Cybersecurity Myths and Misconceptions (Pearson, 2023).
The field of cybersecurity is deeply intertwined with human action since digital technology is created and utilized by humans, and cybersecurity threats often stem from human adversaries. Despite this inherent connection, common misconceptions about human behavior can significantly hinder cybersecurity research and practice. This talk delves into these misconceptions and their detrimental effects on cybersecurity efforts.
One prevalent misconception is that humans are rational decision-makers. This assumption often leads to poor design, such as systems that rely on complex authentication protocols or intricate risk assessments. Using real-world examples, we explore how human decision-making is often shaped by heuristics, biases, and emotions, making these systems less effective than intended.
We also discuss the misconception that humans are inherently vigilant and security-conscious. In reality, people often exhibit security fatigue or complacency, increasing their susceptibility to phishing attacks and other social engineering tactics.
We describe how the assumption that humans possess a high level of technical proficiency can lead to cybersecurity systems that are overly complex and difficult to use. This can result in users circumventing security measures or making mistakes that compromise the system’s integrity.
By addressing these misconceptions, cybersecurity researchers and practitioners can develop mitigation strategies that align with how humans actually behave. This includes designing systems that are intuitive, easy to use, and accommodating of human cognitive limitations. Additionally, promoting cybersecurity awareness and training can help users make more informed decisions and reduce their vulnerability to cyberattacks.
Understanding and addressing misconceptions about human behavior is crucial for enhancing cybersecurity research and practice. By acknowledging the complexities of human decision-making, we can create more effective security solutions that align with human capabilities and vulnerabilities. This approach will lead to a more resilient and secure digital ecosystem for all.