A Legal Duty to Secure Data?
Data breaches and cyberattacks are part of our daily existence. Consumers face an increased risk of identity theft following data breaches. Company websites can be vandalized or knocked offline by attackers. Confidential documents can be stolen and then used by competitors or distributed to the public. If an organization’s computers are locked by ransomware, day-to-day operations might cease entirely until the data is restored.
Anyone who operates a computer connected to the Internet is a potential target. Most countries have laws against cybercrime, but these criminal laws are often challenging to enforce. It is generally difficult to tie a specific person to a specific computer at the time of a specific crime, and then to obtain jurisdiction over that person.
The users and operators of computers and networks have instead adjusted their behaviors to protect against these threats as best they can. In so doing, social norms have evolved in favor of computer security practices, at least among businesses. Surveys tell us that company executives now consistently rate cybersecurity as a pressing concern. Have these developing cybersecurity norms grown strong enough to support legal liability for failing to adhere to them?
To answer that question, we examined tort law in the United States. Fitting cyber harms into existing law has been challenging from the beginning, and the common law of torts is one area where these inquiries take place. Is a distributed denial of service attack a trespass to chattel? Does an intrusion upon seclusion claim arise when one party duplicates and distributes someone’s social media post without permission? Do companies that collect sensitive customer data have a duty to secure that data?
It may be that the social norms in favor of cybersecurity are strong enough to support the creation of a complementary legal duty to secure third party data. If so, companies that store personal information would have a duty to keep that information on secure systems. It would probably be unwise to set that duty too high, because cyberattack methods are evolving faster than the defenses against them. A flexible standard like “reasonable security practices” would likely be appropriate. What is reasonable would likely depend on the target. Not every company that protects data has the resources for a comprehensive intrusion detection system, but they should all have a plan for dealing with this pervasive modern threat.
In the law, distinctions are often drawn between misfeasance and nonfeasance. Put simply, misfeasance is a wrongful action, while nonfeasance is inaction. In many legal systems, nonfeasance typically does not give rise to liability. The misfeasance-nonfeasance divide has been examined by many courts and legislatures around the world. In 1830, the U.S. Supreme Court considered whether the crew of a commercial shipping vessel had a duty to try to put out a fire on the ship before evacuating. In that case, the crew knew there was gunpowder in the cargo hold, and they did not want to risk being blown up, so they abandoned ship and the vessel was destroyed by the fire. The Court held that the crew had a duty to try to extinguish the fire, because doing so was an implied duty incident to navigating the vessel.
If cybersecurity is best characterized as an afterthought, then the failure to add cybersecurity measures is nonfeasance. On the other hand, if cybersecurity is central to the operation of networks, then the failure to install or maintain a reasonable cybersecurity plan could fairly be characterized as misfeasance. In the latter scenario, a legal duty to secure data can arise. Norms appear to be shifting toward a view of cybersecurity as a fundamental aspect of operating a network of computers. Borrowing the Supreme Court’s language, securing your systems becomes an implied duty incident to offering services that rely on those systems.
Data insecurity has created a need for some form of legal intervention. Part of that need stems from cognitive biases that interfere with how individuals assess risk, and poor risk assessment often contributes to poor disaster preparedness. People assume that a breach will not happen to them, or that if it does, it will not be as bad as the warnings suggest. We also explore the nature of data injuries. In the United States, courts tend to focus on economic harm as the measure of how much an individual was injured by a data breach. We argue that, while economic damages should be considered as well, the real harm from a data breach is an injury to privacy and autonomy. We examine these issues and several others in our recent paper, Liability for Data Injuries.
As cybersecurity becomes further entrenched in the norms of modern life, a legal duty to secure data will naturally arise. What might have been nonfeasance in 1985, before many efforts were made to connect every computer in the world, could now be misfeasance. Companies that handle customer data should keep this in mind. With the EU’s General Data Protection Regulation having recently gone into effect, data analytics firms and other data-driven businesses should be proactive in protecting customer data.
Jay P. Kesan and Carol M. Hayes, University of Illinois at Urbana-Champaign