A General Counsel's Guide to Overseeing and Improving Your Company’s Privacy and Security Program


Imagine that you are working as in-house or outside counsel for a business and you are acquiring hardware or software for an advanced technology system. What information do you need to help your company manage privacy practices and the company’s information security function? How do you know if your company is managing privacy and security effectively?

A. Importance of Data Protection Management

A business procuring advanced technologies faces strategic risks from choosing ineffective privacy and security strategies that lead to customer or public backlash. For products operating in the physical world, a compromise of the product could also endanger safety by causing an accident.

Failed internal procedures, such as procedures for maintaining a trustworthy workforce, may lead to operational risks such as breaches caused by insiders. Privacy and data breaches may trigger lawsuits and governmental investigations, resulting in investigative and defense costs, litigation costs, and the cost of settlements and fines. Organizations that sustain breaches face angry customers and damage to their reputations, resulting in the loss of customer and worker loyalty, further resulting in losses of revenue, profits, and ultimately shareholder/equity value.

Consequently, managing privacy and security effectively is crucial for the continued health of any business. Managers at businesses that fail to safeguard customer data may lose their jobs and may face personal legal, reputational, and business consequences.

B. Overview of Counsel’s Role

Attorneys play a crucial role in data protection management functions within businesses. First, they can review applicable data protection laws and requirements and counsel their clients to facilitate compliance. Second, they frequently participate in and assist with contract drafting and negotiation in connection with transactions that implicate data protection issues. Third, they handle potential liabilities and disputes relating to data protection. Fourth, they may lead investigations into data protection violations, incidents, accidents, or breaches. Finally, they help with data protection governance. For instance, they may establish data-protection management structures within businesses; develop and implement privacy and security programs; draft or edit privacy and security policies, procedures, guidelines, agreements, and training materials; and support audits and assessments leading to attestations and certifications, such as those under the EU-U.S. Privacy Shield program, the ISO 27001 security audit framework, and SOC reporting frameworks.

Attorneys must work together with other professionals to develop and implement data protection measures within a business involved with advanced technologies. The businesses that most effectively manage data protection make use of cross-functional teams of business line representatives, privacy professionals, security professionals, internal auditors, and risk managers to handle specific processes, projects, and issues. For businesses developing advanced technologies, cross-functional teams may work on new products or services and integrate privacy and security “by design” proactively during the development process, rather than waiting until the end of the process to weigh in on data protection issues. In any business, teams may be involved in the investigation and response to security incidents or breaches to determine the best response strategy and to implement it.

Since data protection attorneys will need to provide advice about mixed questions of fact, law, and technology, they should learn as much as they can about the advanced technologies developed or used by their business lines to provide products or services, the technologies used to secure personal information and information systems, and the technologies used to monitor, detect, and report potential violations. Talking with information technology, audit, and security professionals, reading background information about different advanced technologies and security controls, and attending continuing education programs are invaluable. The American Bar Association Section of Science & Technology Law’s E-Privacy Committee and Information Security Committee provide helpful learning and networking opportunities for attorneys new to data protection through publications, programs, listservs, meetings, and events. Attorneys new to data protection will find that a wealth of information is available to help them adjust to new data protection roles and responsibilities quickly.

C. Applicable Laws

Data protection attorneys need to understand the legal landscape of advanced technologies in order to promote compliance and mitigate legal risks. Businesses in the field of advanced technologies may be subject to laws that apply directly to their technologies, and they must also account for more general laws that cover them.

1. Laws Specifically Governing Advanced Technologies

A number of new laws bear on information governance regarding advanced technologies. Perhaps the prime example is California’s new connected device law, SB 327 and AB 1906, enacted on September 28, 2018, which will become effective on January 1, 2020. This new law covers Internet of Things devices and other connected devices. Under this law, manufacturers of “connected devices” must equip the devices with one or more security features. These features must be appropriate to the nature and function of the device. They must also be appropriate to the type of information collected, contained, or transmitted by the device. Finally, the security features must be designed to protect the device and stored information from unauthorized access, destruction, use, modification, or disclosure. A “connected device” is “any device, or other physical objects that are capable of connecting to the Internet, directly or indirectly, and that is assigned an Internet Protocol address or Bluetooth address.” Authentication mechanisms, such as passwords, are deemed reasonable if each device ships with a unique preprogrammed password or the device requires the user to generate a new means of authentication before granting access for the first time.
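To make the authentication requirement concrete, here is a minimal sketch in Python, using hypothetical names and thresholds, of first-use logic that refuses access until the factory default credential is replaced with a user-generated one. It is an illustration of the statutory concept, not a definitive implementation.

```python
import hashlib
import secrets

FACTORY_DEFAULT = "admin"  # hypothetical shipped default credential

def hash_credential(credential: str, salt: bytes) -> bytes:
    # Store only a salted hash of the credential, never the plaintext.
    return hashlib.pbkdf2_hmac("sha256", credential.encode(), salt, 200_000)

class DeviceAuth:
    """Illustrative first-use authentication logic for a connected device."""

    def __init__(self):
        self.salt = secrets.token_bytes(16)
        self.stored_hash = hash_credential(FACTORY_DEFAULT, self.salt)
        self.default_replaced = False

    def set_new_credential(self, new_credential: str) -> None:
        # Require the user to generate a new means of authentication
        # before access is granted for the first time.
        if len(new_credential) < 12 or new_credential == FACTORY_DEFAULT:
            raise ValueError("choose a longer, non-default credential")
        self.salt = secrets.token_bytes(16)
        self.stored_hash = hash_credential(new_credential, self.salt)
        self.default_replaced = True

    def authenticate(self, credential: str) -> bool:
        # Block all access until the default credential has been replaced.
        if not self.default_replaced:
            raise PermissionError("set a unique credential before first use")
        return secrets.compare_digest(
            self.stored_hash, hash_credential(credential, self.salt)
        )
```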

The law covers more than just Internet-connected devices in that it covers Bluetooth devices as well, which may include earphones and other computer accessories. On the other hand, the law may be under-inclusive because a direct or indirect connection to the Internet is necessary. Some devices may connect to private networks rather than the public Internet. The definition of “connected device” apparently excludes these devices, even though their security needs may be as great as those of Internet-connected devices.

California also enacted a new type of law, a “bot disclosure law.” This new law relates to the use of software bots (automated agents), especially ones that post content on social media to distort voting behavior. It also would apply to bots that generate fake reviews to pump up a business’s reputation. The law makes it unlawful to use a bot to communicate with another person online with the intent to mislead that person about the bot’s artificial identity for the purpose of knowingly deceiving the person about the content of the communication. It applies where the person using the bot is trying to incentivize a purchase or sale of goods or services in a commercial transaction or to influence voting. No liability attaches, however, if the person clearly and conspicuously discloses the existence of the bot.


Other laws regulate autonomous driving. Automated vehicles may be robots, may be connected to the Internet, and may receive or generate large amounts of data. California’s SB 1298 facilitates the operation of autonomous vehicles on California’s highways and the testing of those vehicles. In 2018, the California Department of Motor Vehicles adopted new regulations regarding autonomous vehicles. Under those regulations, manufacturers cannot place autonomous vehicles on public roads unless they provide the Department of Motor Vehicles “[a] certification that the autonomous vehicles meet appropriate and applicable current industry standards to help defend against, detect, and respond to cyber-attacks, unauthorized intrusions, or false vehicle control commands.” Most states now have autonomous vehicle laws, executive orders facilitating autonomous vehicles, or both. Manufacturers testing autonomous vehicles will need to comply with these laws and any data protection laws or regulations associated with them. Autonomous vehicle laws and truck platooning laws may not mention cybersecurity explicitly, but the process to prove safety sufficient to obtain a certification or other approval will likely include some showing of reasonable measures to prevent cyberattack.

Furthermore, privacy laws affect the use of drones with cameras and other surveillance technologies. For example, California has a law that makes a drone user liable for invasion of privacy for entering the land or airspace of another person without permission in order to capture video or audio, where the intrusion occurs in a manner offensive to a reasonable person. Other states have drone privacy laws as well.

Finally, businesses using advanced automated data processing technologies that have multinational operations, serve customers in foreign countries, monitor the behavior of individuals abroad, or process data for foreign businesses should analyze whether they have compliance requirements under international and foreign data protection laws. For instance, the European Union’s General Data Protection Regulation (GDPR) grants rights to individuals whose personal data is involved in automated data processing. Article 15 of GDPR gives individuals a right of access to information about personal data collected about them. Paragraph 1(h) of article 15 includes the right of the data subject to know about the existence of automated decision-making and “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.” Recital 71 refers to the data subject having a right to an explanation of a decision reached by automated means.

In addition to the right to an explanation, a data subject has a right of human intervention. Under GDPR article 22, a “data subject shall have the right not to be subject to a decision based solely on automated processing” producing “legal effects concerning him or her or similarly significantly affects him or her.” In other words, a data subject can opt out of automated data processing, with the implication that a human must make a manual decision. This blanket opt-out right does not exist if automated processing is necessary for entering into or performing a contract, applicable law authorizes the processing, or the data subject has explicitly consented. Nonetheless, in instances of processing for contractual purposes or based on consent, the data controller must still provide safeguards for data subjects, which at a minimum include “the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.”

For example, if a bank covered by GDPR turns down an applicant located in the European Economic Area for a loan based on a machine learning system used to score applicants, the applicant has a right to an explanation of how the system determined that he or she was not eligible for a loan. Moreover, under article 22, the data subject can demand that a bank official intervene, look at the results of the system, and listen to the data subject’s arguments contesting the decision. These provisions do not require the bank to change the results of the process, but they do give data subjects relief from machine-only automated decisions and a process to challenge them.
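As a rough illustration of how such safeguards might be operationalized, the sketch below (Python, with hypothetical names and a hypothetical scoring threshold) records a purely automated decision and, if the data subject contests it, logs the data subject’s point of view and routes the matter to a named human reviewer.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class LoanDecision:
    applicant_id: str
    score: float
    approved: bool
    automated: bool = True
    human_reviewer: Optional[str] = None
    applicant_comments: List[str] = field(default_factory=list)

def automated_decision(applicant_id: str, score: float, threshold: float = 0.6) -> LoanDecision:
    # Purely automated scoring decision (hypothetical approval threshold).
    return LoanDecision(applicant_id, score, approved=score >= threshold)

def contest_decision(decision: LoanDecision, comments: str, reviewer: str) -> LoanDecision:
    # Article 22(3)-style safeguard: record the applicant's point of view
    # and hand the decision to a named human reviewer for reconsideration.
    decision.applicant_comments.append(comments)
    decision.automated = False
    decision.human_reviewer = reviewer
    return decision

if __name__ == "__main__":
    d = automated_decision("applicant-123", score=0.41)
    d = contest_decision(d, "My income recently increased.", reviewer="loan.officer@bank.example")
    print(d)
```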

The difficulty with these laws is that many machine learning artificial intelligence systems are “black boxes.” It may be difficult for even experts to explain how a machine learning system came up with a decision. Businesses and academics are working on this problem of machine learning explainability in part to satisfy requirements in GDPR and future laws likely to follow.
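For simple linear scoring models, an explanation can be read directly from the model’s weights; the hypothetical sketch below ranks each feature’s contribution to an applicant’s score. More complex models, such as deep networks and large ensembles, generally require dedicated explanation techniques, which is where much of the current explainability work is focused.

```python
# Illustrative explanation for a simple linear credit-scoring model.
# Feature names, weights, and values are hypothetical.
import math

WEIGHTS = {"income": 1.8, "debt_ratio": -2.4, "late_payments": -1.1, "account_age": 0.6}
BIAS = -0.3

def score(features: dict) -> float:
    # Logistic score in [0, 1] computed from a weighted sum of normalized features.
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def explain(features: dict) -> list:
    # Rank each feature's contribution to the weighted sum, most influential first.
    contributions = {name: WEIGHTS[name] * value for name, value in features.items()}
    return sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)

applicant = {"income": 0.4, "debt_ratio": 0.7, "late_payments": 0.5, "account_age": 0.2}
print(f"score = {score(applicant):.2f}")
for name, contribution in explain(applicant):
    print(f"{name}: {contribution:+.2f}")
```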

2. General Laws

General data protection laws may apply to advanced technologies. This section contains some examples of general laws that may impose privacy or security requirements on businesses developing, selling, purchasing, or operating advanced technologies. Some of these laws apply to specific sectors rather than to particular technologies.

For instance, financial institutions purchasing IoT devices or using AI for processing customer nonpublic personal information must account for compliance with the Gramm-Leach-Bliley Act of 1999 (GLBA), the main piece of federal legislation governing financial institution privacy and security practices. The GLBA requires covered financial institutions to implement processes and procedures to ensure the security and confidentiality of consumer information, protect against anticipated threats or hazards to the security of customer records, and protect against unauthorized access to such records. In addition, the GLBA requires financial institutions to provide notice to consumers about their information practices and give consumers an opportunity to direct that their personal information not be shared with certain non-affiliated third parties. When financial institutions purchase or license advanced technologies, they must make sure they do not put nonpublic personal information at risk. For instance, banks should use secure transmission protocols between their systems and their automated teller machines to prevent interception and compromise of financial information.
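By way of illustration of the secure-transmission point, the following sketch uses Python’s standard ssl module to require TLS 1.2 or later with certificate and hostname verification for a connection to a hypothetical back-end gateway; the host name and port are placeholders, not a real endpoint, so running it as-is would simply fail to connect.

```python
import socket
import ssl

HOST = "atm-gateway.bank.example"  # hypothetical back-end host (placeholder)
PORT = 443

def open_secure_channel(host: str = HOST, port: int = PORT) -> ssl.SSLSocket:
    # Require TLS with certificate and hostname verification so that
    # transaction data cannot be intercepted or tampered with in transit.
    context = ssl.create_default_context()
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    context.check_hostname = True
    context.verify_mode = ssl.CERT_REQUIRED
    raw_sock = socket.create_connection((host, port), timeout=10)
    return context.wrap_socket(raw_sock, server_hostname=host)
```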

Likewise, healthcare providers and their business associates obtaining and operating surgical and service robots, machine learning and AI systems that process patient data, and operational AI systems will need to comply with the Health Insurance Portability and Accountability Act (HIPAA), the HITECH Act, and the regulations promulgated under them. Privacy notices will need to disclose what health information the business collects, how it uses that information, and to whom it will disclose the health information. The HIPAA Security Rule will require the business to implement reasonable and appropriate administrative, physical, and technical safeguards to secure protected health information created, received, maintained, or transmitted by the business. For example, a hospital operating service robots in its facility should have a policy to manage audio and video data recorded by the robots. It may seek to minimize the amount of protected health information recorded in the first place. Moreover, its policy should ensure that any protected health information recorded by the robots is secured and not shared with unauthorized parties.

Other federal agencies have jurisdiction to regulate or at least provide guidance about data protection practices for advanced technologies used in other sectors. For instance:

  • the Food and Drug Administration provides guidance for premarket submissions for and post-market management of cybersecurity issues;

  • public utilities commissions regulate privacy and security requirements for smart meters;

  • the Department of Energy’s programs promote security for Big Data from smart meters and sensors, as well as security requirements for critical power grid infrastructure and integrated distributed energy resources;

  • the Federal Communications Commission and the Department of Transportation oversee security protocols for connected vehicle communications; and

  • the Department of Defense provides cybersecurity guidance and policies that govern the procurement and operation of Internet of Things devices.

Aside from these sector-specific data protection laws, businesses selling or operating advanced technology systems also need to comply with general state breach notification and security laws. Beginning with California in 2003, states began requiring that businesses holding various categories of unsecured personal information about state residents notify those residents of security breaches that compromise their personal information. All states, the District of Columbia, Guam, Puerto Rico, and the U.S. Virgin Islands have breach notification laws. The personal information covered by breach notification laws includes Social Security numbers, driver’s license/state ID numbers, and financial account numbers in combination with a PIN, password, or other identifier facilitating use of or access to financial accounts.

A number of states go further and require businesses to take reasonable measures to protect the security of personal information about state residents. A prime example is California’s AB 1950. A business subject to federal or state law providing greater protection for personal information, however, is deemed in compliance with AB 1950. Other states have similar laws. Practitioners should also bear in mind the possible scope of federal preemption of these state laws, especially as Congress considers federal data protection and breach notification legislation.

Massachusetts, however, has a more detailed set of information security requirements. The Massachusetts Office of Consumer Affairs and Business Regulation issued regulations in 2008 to implement the Massachusetts security breach and data destruction law. Unlike the state security laws discussed in the previous paragraph, the Massachusetts regulations require a written information security program with specific security controls that businesses holding personal information about Massachusetts residents must implement.

Businesses using advanced technologies that receive, store, or transmit any of the covered data elements must comply with these state data protection and breach notification laws. Manufacturers selling or licensing these technologies will want to make sure their systems facilitate compliance by their customers. Customers may also negotiate agreements that place responsibility for compliance violations and data breaches on the manufacturers, free from the usual liability caps that vendors include in their agreements.

Likewise, businesses will need to account for the new California Consumer Privacy Act (CCPA) when it goes into effect in 2020, as well as any other state laws that follow CCPA. CCPA provides “consumers” (California residents) with certain individual rights, such as the right to know what personal information a business collects, uses, and discloses about them; the right to demand deletion of personal information; and the right to opt out of the sale of personal information. Businesses collecting personal information in connection with the sale or operation of advanced technologies will need to comply with CCPA once it becomes effective.

In addition, businesses should take into account laws against unfair and deceptive trade practices. Examples include the Federal Trade Commission Act section 5, California’s Unfair Competition Law, California’s False Advertising Law, and similar laws in other states. The Federal Trade Commission regularly brings enforcement actions against businesses failing to secure their advanced technology products. Manufacturers and sellers that misrepresent their privacy or security practices or fail to include reasonable security features in their products may face federal or state enforcement actions or private party class-action suits.

Finally, businesses may need to meet the requirements of GDPR and other foreign data protection laws. If they have customers from or operations in foreign countries or receive personal data from foreign countries, they should determine if they fall under these laws and how those laws affect them.

Stephen S. Wu is a shareholder with Silicon Valley Law Group in San Jose, California. He advises clients on a wide range of issues, including transactions, compliance, liability, security, and privacy matters regarding the latest technologies in areas such as robotics, artificial intelligence, automated transportation, the Internet of Things, and Big Data. He has authored or co-authored several books, book chapters, and articles and is a frequent speaker on advanced technology and data protection legal topics. 
