The 5×5—Forewarned is forearmed: Cybersecurity policy in 2024
Members of the Cyber Statecraft Initiative team discuss the regulatory requirements and emerging technology they are closely following in 2024, and forewarn of the year ahead.
Digital security and data protection. Conceptual illustration with an advanced technology digital display. (Source: healthitsecurity.com)
New year, new cyber policies. Or perhaps, more realistically, old cyber policies approached with a more nuanced understanding is the better way to introduce 2024!
We have the Cyber Incident Reporting for Critical Infrastructure Act (CIRCIA), a National Cybersecurity Strategy and its Implementation Plan, those Cybersecurity Risk Management, Strategy, Governance, and Incident Disclosure requirements from the SEC, and even a National Cyber Workforce and Education Strategy. The launch of ChatGPT in 2022 led to a myriad of generative AI-related legislative proposals, including the 117-page Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, and this year, we can expect these policies and regulations to come to fruition and shape the cybersecurity landscape.
So, for the first 5×5 edition of this year, we brought together some members of the Cyber Statecraft Initiative team to tell us which regulatory requirements and emerging technology they are closely following, and which dominant technology they think could give way to a better one in 2024. From disinformation and spyware to artificial intelligence and warfare, this edition forewarns of the year ahead.
1. What is the one emerging technology, industry, or sector that you think could most adversely impact the cybersecurity landscape and that you recommend governments proactively monitor in 2024?
Maia Hamin (she/her/hers), Associate Director, Cyber Statecraft Initiative, Digital Forensic Research Lab, Atlantic Council
“This one is tricky. I am in some ways tempted to say that emerging AI systems have the greatest potential for a ‘black swan’ disruption of the cyber landscape; AI systems could substantially disrupt or alter the offensive/defensive balance if they develop new capabilities that can be meaningfully harnessed for cybercrime or espionage. However, I don’t necessarily think that this scenario is the most likely to happen. More likely is something like a gnarly vulnerability of some kind in a widely deployed and used technology system that has a long-tailed remediation process, allowing bad actors to continue to exploit it long after its discovery. But then, what else is new?”
Stewart Scott (he/him/his), Associate Director, Cyber Statecraft Initiative, Digital Forensic Research Lab, Atlantic Council
“I worry about AI, but for reasons different from (and not conflicting with) those of my colleague, Maia. The amount of metaphorical oxygen AI discussions consume in the policy room is staggering, and I worry that other more concrete issues are neglected as a result. Not that AI does or doesn’t present risks and challenges for policy, but rather it seems to have struck some perfect blend of abstraction, novelty, and hype to consume policymakers. I’m not saying ignore it, but don’t put down all the other important work out there either.”
Jen Roberts (she/her/hers), Assistant Director, Cyber Statecraft Initiative, Digital Forensic Research Lab, Atlantic Council
“It is not necessarily an emerging technology, but I would say spyware. While some policy action was taken to regulate this space in 2023 with the executive order, joint statement, and PEGA committee findings, policy attention on the spyware market seems to be focused on vendor-specific action rather than looking at the marketplace as a whole, including who is investing in this type of technology.”
Alexander Beatty (he/him/his), Assistant Director, Cyber Statecraft Initiative, Digital Forensic Research Lab, Atlantic Council
“It’s hard to look at this and not immediately think of developments in artificial intelligence, but I think most of these claims are hysterical. 2024 is poised to have the highest ever number of democratic elections across the globe, so governments need to proactively monitor the systematic spread of disinformation online to ensure free and fair elections around the world.”
Emma Schroeder (she/her/hers), Associate Director, Cyber Statecraft Initiative, Digital Forensic Research Lab, Atlantic Council
“I believe that governments should take proactive steps to better understand the ways in which cyber operations have altered how warfare is conducted. At the tactical level, the ways in which the cyber and the kinetic can be melded on the battlefield are still undergoing a phase of dramatic evolution. On a wider scope, governments must seek to better understand how the cyber environment and cyber tools alter the types of actors engaged in conflict and the roles they play. Perhaps not since the ‘state’ was first conceptualized has warfare, on a global scale, been less in the hands of the state. The division between the civilian and the combatant that exists, whether in reality or just in theory, in the physical domains doesn’t quite exist in the same way in the cyber domain.”
2. Are there any regulatory changes or compliance requirements expected to significantly impact cybersecurity practices in 2024? What are some policy changes that you are most eagerly waiting for?
Maia Hamin
“There are a bunch of moves I’m watching in terms of how the federal government will update its own security practices and requirements and expand their applicability – from revised FedRAMP guidance, which governs government cloud security; to newly proposed updates to the Federal Acquisition Regulation that would require government vendors to maintain Software Bills of Materials; to an overdue requirement from a 2022 EO for government software vendors to comply with NIST’s Secure Software Development Framework. I’m curious to see how these different risk management frameworks and approaches will be implemented and whether they will improve the cybersecurity of the federal enterprise. This experience will provide useful information as the government considers whether and how to formulate software and cybersecurity requirements for broader swathes of industry.”
Stewart Scott
“I don’t know if they will fully land in 2024, but I’m excited to see how the Cyber Incident Reporting for Critical Infrastructure Act (CIRCIA) and the cybersecurity disclosure requirements of the US Securities and Exchange Commission (SEC) play out. Empirical data on cyber incidents at scale would be such a cool, useful asset for policymakers to work with.”
Jen Roberts
“The marketplace for offensive cyber capabilities (OCC) is an industry that the government should proactively monitor in 2024 and shape through further policy changes. As the OCC market continues to grow to meet demand and remains an affordable and attractive option for governments that do not have homegrown capabilities, efforts to shape this marketplace must be comprehensive.”
Alexander Beatty
“Unfortunately, news of aviation safety incidents (cyber or otherwise) has been prevalent in recent weeks. But with the FAA starting to increase pressure on carriers to comply with regulations introduced in 2023, giving them three years to do so, we are likely to see more US and international carriers conduct more organizational risk assessments, which will be a valuable first step in improving cybersecurity across the aviation industry.”
Emma Schroeder
“This may be a bit of a cheat answer, but after the release of the National Cybersecurity Strategy and its implementation plan in 2023, I am very much looking forward to following how the government actually works to implement and advance its strategic goals. In particular, the strategy had a strong theme of rebalancing responsibility for risk and security in cyberspace. I am eager to see how this effort to make the private sector take more responsibility in the cyber domain will proceed.”
3. What’s a long dominant technology which will start to fade, or has begun to already, in 2024? What does this mean for cyber policy?
Maia Hamin
“One challenge with cyber policy is that technologies don’t stop existing just because they ‘fade’ from current use – there are still important systems in active use that are written in FORTRAN or COBOL! It’s important that cyber policy conceives of a better world (for example, by pushing developers to use more memory-safe languages) while also acknowledging that we will continue to need to use and manage risks arising from ‘legacy’ languages and technologies in application stacks that will never be migrated.”
Stewart Scott
“Ooh, interesting question. I’m not sure! I assume that whatever technological change occurs, it will continue requiring cyber policy to evolve at a pace that it’s not ready to match. I think that’s a more interesting angle though—technologies gain and lose importance (and maybe more relevantly, the spotlight) all the time (vacuum tubes anyone?), but we’ve long struggled to design policy systems that account for this without being overly prescriptive or unhelpfully vague.”
Jen Roberts
“Passwords. In 2023, we saw some companies, like Microsoft, offer passwordless authentication. With the benefits a passwordless world offers in terms of risk reduction and cost efficiency, we can expect a wider push towards passwordless authentication in 2024, and those who can’t remember a password no matter how hard they try will rejoice!”
Alexander Beatty
“One can only hope that two-factor authentication (2FA) using SMS will begin to fade, which would mean an uptick in far safer and more resilient multi-factor authentication systems being implemented as industry standards. Will that actually happen, though? Plenty of organizations are already behind the times with their security standards, so it’s not unlikely that we’ll have to see some high-profile failures of SMS 2FA before there is any meaningful change. The Twitter/X removal of SMS 2FA for non-paying members may have, surprisingly, helped sound the death knell for this system.”
Emma Schroeder
“Not a particular technology, but rather a characteristic of technology that has been disappearing for some time: the understanding of how a specific technology functions. As technology becomes more advanced, fewer people will be able to understand how it functions, and while this is in some ways offset by an increased focus on usability, the rising barrier to understanding the ‘back end’ of the technology people are interacting with means that they might also not understand the risks that they are accepting. This means that the government needs to step in to help make those risks clearer to the population by requiring additional transparency from the companies that sell or provide these goods and services, and that the government must understand the contours of the digital landscape on which it and its citizens rely.”
4. With the watershed launch of ChatGPT, industry and the government have refocused their resources towards AI policy amid a flurry of commercialization. What are some of the cybersecurity and digital policy issues that might have taken a back seat in 2023 but should be reconsidered in 2024?
Maia Hamin
“I continue to think that the US government (and probably others, though I’m less qualified to say) will be hampered in certain key policy efforts so long as they cannot get digital identity right. Privacy-preserving digital identity solutions are desperately needed to solve a host of challenges, from how to digitally deliver benefits and government services to more thorny questions about whether and where to require (privacy-protective!) proof of identity on the internet. These challenges are likely to be intensified as AI content and bots proliferate on the internet. Existing commercial solutions generally have significant problems, chief among them that they rely heavily on the commercial data broker ecosystem. The US government should move this issue back to the policy forefront.”
Stewart Scott
“Every day, I wake up hoping that the Cyber Safety Review Board will decide to examine the SolarWinds incident, and every day my dreams are crushed before 9:00am ET. To be clear, this is purely selfish—I just want to know more about the incident because I am a nerd. More broadly though, I don’t think cybersecurity policy has particularly robust learning mechanisms built into it. It’s hard to know how effective policies were or how well or poorly things are going, let alone why. The amount of time spent speculating about how new technological capabilities—generative AI is hardly a new technology per se—will change the status quo is somewhat bewildering given we don’t know much about what the status quo is, at least with any serious rigor and empiricism. The cybersecurity issue that I think takes a back seat, as a result, is less a topic or technology than a frame. Cybersecurity policy would benefit massively from institutionalized learning mechanisms—reviews of major incidents, analysis of whether policy interventions achieved their desired outcomes, wide-ranging studies on security control efficacy, empirical surveys on cyber incident damages, etc.”
Jen Roberts
“Workforce readiness and capacity building need to remain at the forefront of the cyber policy agenda. Governments across the world face a workforce shortage that is not going away, but it can be minimized by attracting talent to cyber, especially individuals with international relations, political science, and legal interests, because ‘cyber’ doesn’t happen in a silo. The White House’s National Cyber Workforce and Education Strategy was a step in the right direction.”
Alexander Beatty
“The development of the cyber workforce, both in the US and around the world. With the launch of the National Cyber Workforce and Education Strategy in mid-2023, we saw workforce development, to follow the analogy, get out of the back seat and into the passenger seat. In 2024, we both hope and need to see the development of the cyber and digital workforce hop into the driver’s seat.”
Emma Schroeder
“Cyber policy, like most policy areas, often operates on a schedule of sprints and fixations in response to significant technological advancements (or the perception of significant technological advancements) and severe cyber incidents. This means that it can sometimes be difficult to maintain the consistency needed to improve cyber policy over time. There are many issues that either had their ’15 minutes’ and have since faded from attention or that have never really had their day in the sun and yet are incredibly important. I will, however, give the obvious answer that in 2024 perhaps no digital policy issue will be more important in the US than countering mis- and disinformation surrounding our elections.”
5. Let’s close this with some fiction. If you were to define the current cyber policy landscape through a movie or web series or storyline of a book, which one would you pick, and why?
Maia Hamin
“Comparing cyber to movies is hard. I watch a lot of fantasy and sci-fi movies. These movies usually have one big bad guy, and in the end, the hero and their buddies triumph over the big baddie. In cyber, there are a lot of different bad guys, so sometimes you may not know who you’re fighting. And you do win, but you also lose, and it’s a lot more obvious when you lose than when you win. Sometimes your buddies work with you, but sometimes they won’t really talk to you because they’re scared that they’ll get in trouble, even though all you really want is to win together. I think there must be a sports movie that is a better metaphor for this, but right now my brain is only giving me the 1982 movie, The Thing.”
Stewart Scott
“I would compare the current cyber policy landscape to the book/movie Moneyball, but only the part before all the statisticians start bending the ear of baseball management. Our general inability to answer, with any kind of empirical data, questions like ‘are cybersecurity outcomes better or worse than they were a year ago,’ ‘how impactful was this cyber policy intervention,’ or ‘which of these security controls is most effective given the cost of its implementation’ is, uh, not great.”
Jen Roberts
“Love, Death & Robots. A lot of fascination surrounding the future of AI, IoT, and new and developing technologies.”
Alexander Beatty
“The Girl Who Saved the King of Sweden by Jonas Jonasson – there’s a lot going on at the moment; there are so many interweaving policies, storylines, characters, and motivations, all with some pretty strong themes of mild, if not grave, peril (no spoilers!). In the end, some well-executed community building and understanding will help us avoid catastrophe.”
Emma Schroeder
“Difficult question. The book I’ve decided to go with, after much deliberation, is Piranesi by Susanna Clarke. The book follows a man named Piranesi (sort of) who lives in an ever-changing, seemingly infinite house. Piranesi spends his days documenting the movements and changes within the house, the layout of the statues, the patterns of the birds, the ebb and flow of the tides, trying to understand the labyrinthine world he lives in. I don’t want to give too much more away, because I believe the best way to go into this book is knowing almost nothing. So, check it out and let me know if you think this is an apt comparison.”
The Cyber Statecraft Initiative, part of the Atlantic Council Tech Programs, works at the nexus of geopolitics and cybersecurity to craft strategies to help shape the conduct of statecraft and to better inform and secure users of technology.