A significant legal distinction between DAOs and traditional companies is that DAOs do not by default impose fiduciary duties on their members. The traditional fiduciary duties include: the duty of care, the duty of loyalty, the duty of good faith, the duty of confidentiality, the duty of prudence, and the duty of disclosure. These fiduciary duties allow for legal action to be taken against the violating party. I will explain each of the aforementioned duties individually, although there is some overlap between them, and you can decide whether they might be a good fit for your DAO.
The duty of care requires that directors and officers (in the case of corporations), and members (in the case of LLCs), inform themselves prior to making a business decision of “all material information reasonably available to them.” Smith v. Van Gorkom, 488 A.2d 858 (Del. 1985). The stereotypical example here would be the executive who pays no attention to the market, industry, or products of a toothpaste company and yet shows up to the board meeting and irrationally advocates for the company to purchase a golf course he likes. While any reasonable person would see that the golf course does not align with the organization’s interests and may even bankrupt the company, the director persuades the board to pursue this opportunity. A shareholder may then pursue a claim against the board of directors for violation of the duty of care.
If a DAO were to adopt this duty, DAO members should define the scope of what might be considered “material information.” For example, it could be required that all DAO members read a particular newsletter and acknowledge receipt of it. While traditional companies are bound by caselaw that more specifically defines the facts that might violate this duty, the flexibility of DAOs allows members to determine how best, if at all, to adapt this duty to their organization.
The duty of loyalty is particularly important, as it requires directors, officers, and members to avoid self-dealing, that is, transactions that may present conflicts of interest. For example, if a member were to discover an opportunity in the organization’s line of business but, instead of informing the organization about it, pursued it personally, that would be a violation of the duty of loyalty. Likewise, forcing the sale of the organization’s assets at an artificially low price so as to benefit a particular member may also violate the duty of loyalty.
Note that violations of the duty of loyalty can be avoided by defining proper procedures for how members may pursue business opportunities in the organization’s line of business in an individual capacity, such as presenting the opportunity to the organization’s members and allowing them to vote on whether to pursue it. If the members elect not to pursue the opportunity, the individual member is then free to do so. The duty of loyalty is a good duty for DAOs to adopt if members are pursuing a collective goal whose fruits should be shared collectively.
The duty of good faith requires that directors, officers, and members of the organization advance the interests of the organization and do so without violating the law. In re The Walt Disney Co. Derivative Litig., 906 A.2d 27 (Del. 2006). This is a good duty for mission focused DAOs to impose on their members in order to keep the organization aligned with its mission and to reinforce legal boundaries on members, which can help to reinforce the position that the DAO operates for a lawful purpose and that any criminal conduct on behalf of an individual member is not an act on behalf of the DAO as an organization.
The Wyoming DAO LLC statute imposes no fiduciary duties on members other than good faith and fair dealing. If a DAO were to have an unchangeable mission that is hard to enforce because it is a broad pronouncement such as "To reverse the effects of the existential threat of climate change," then imposing the duty of good faith may help protect against members who seek to pursue opportunities that do not reasonably fulfill that mission. The extreme example would be mischievous members trying to co-opt a climate change DAO into investing in fossil fuels. A milder example might be simply pursuing tangential activities, such as investing in climate photography in order to mint the photos into NFTs, an act which may not reasonably promote the goal of the DAO (although there is an argument that these could raise investment funds to support the cause).
The duty of confidentiality requires that directors, officers, and members keep the organization’s information confidential and not disclose it for their own benefit. As the crypto community is a very open and vocal one, this duty may not be a good fit for DAOs. Moreover, the transparency benefit of DAOs would be hindered by a heightened duty of confidentiality. Nevertheless, some DAOs may prefer to operate with confidentiality, perhaps due to the sensitive nature of the information they hold (healthcare data, for example), and therefore it may be a good idea to impose this duty on members in certain contexts.
The duty of prudence typically applies to trusts and organizations that govern professionals of a particular skillset. It requires that those parties operate with the degree of care, skill, and caution that is reasonable for a person in that position. Trust administrators therefore must know about investing, and must invest the trust’s resources with prudence. A DAO governing an insurance fund for stuntmen could impose the duty of prudence to ensure that the stuntmen take the precautions typical of their stunts as a condition of obtaining relief in the event they are injured. This prevents reckless stuntmen from draining the fund with their injuries.
The duty of disclosure requires directors, officers and members to act with "complete candor" and disclose "all the facts and circumstances" relevant to their decisions in the organization. This is probably too much of a burden for most DAOs as it would constrain decision making by requiring a very high standard of disclosure. DAOs represent a nimble organizational form that can pass votes at the click of a button. Efficiency would be stifled if every decision were to be contested for lack of “complete candor.”
I hope that expanding on these fiduciary duties provides you with a better idea of which fiduciary duties may or may not be a good idea for your DAO. While imposing these duties might seem onerous and constrain the DAO from making more efficient decisions, remember to balance procedural efficiency with these fiduciary duties to incentivize good conduct and mission adherence. Please reach out if you have any further questions.
One of the standout consumer rights under the California Consumer Privacy Act (CCPA) is the right to delete. While in theory the right to delete is a powerful consumer protection measure, this right is not absolute. The CCPA defines several instances where a business or service provider is not required to delete user data. Specifically, a business or service provider is not required to comply with a consumer’s request to delete personal information if it is necessary for the business or service provider to maintain the consumer’s personal information to:
(1) Complete the transaction for which the personal information was collected, provide a good or service requested by the consumer, or reasonably anticipated within the context of a business’s ongoing business relationship with the consumer, or otherwise perform a contract between the business and the consumer.
(2) Detect security incidents, protect against malicious, deceptive, fraudulent, or illegal activity; or prosecute those responsible for that activity.
(3) Debug to identify and repair errors that impair existing intended functionality.
(4) Exercise free speech, ensure the right of another consumer to exercise his or her right of free speech, or exercise another right provided for by law.
(5) Comply with the California Electronic Communications Privacy Act pursuant to Chapter 3.6 (commencing with Section 1546) of Title 12 of Part 2 of the Penal Code.
(6) Engage in public or peer-reviewed scientific, historical, or statistical research in the public interest that adheres to all other applicable ethics and privacy laws, when the businesses’ deletion of the information is likely to render impossible or seriously impair the achievement of such research, if the consumer has provided informed consent.
(7) To enable solely internal uses that are reasonably aligned with the expectations of the consumer based on the consumer’s relationship with the business.
(8) Comply with a legal obligation.
(9) Otherwise use the consumer’s personal information, internally, in a lawful manner that is compatible with the context in which the consumer provided the information.
The first element likely allows for web and app functions such as shopping carts, and it may also permit some degree of targeted advertising, such as product recommendations based on previous purchases. Because this element explicitly allows for performance of a contract between organizations and consumers, organizations should write account creation contracts consistent with their data use practices. Doing so should allow for the retention of some consumer data as “necessary” for the business or service provider under the law. The second element likely allows for the retention of IP addresses, MAC addresses and other uniquely identifying information for the sake of network security. Similarly, the third element allows for the retention of data such as browser types, but also potentially things like activity logs for debugging purposes.
The sixth element pertains to research in the public interest. It sets a very high bar: not only must the consumer have provided informed consent, but retention is allowed only if deletion would render the research impossible or seriously impair it. The CCPA allows for the retention of data for public interest research only if it is:
(A) Compatible with the business purpose for which the personal information was collected.
(B) Subsequently pseudonymized and deidentified, or deidentified and in the aggregate, such that the information cannot reasonably identify, relate to, describe, be capable of being associated with, or be linked, directly or indirectly, to a particular consumer.
(C) Made subject to technical safeguards that prohibit reidentification of the consumer to whom the information may pertain.
(D) Subject to business processes that specifically prohibit reidentification of the information.
(E) Made subject to business processes to prevent inadvertent release of deidentified information.
(F) Protected from any reidentification attempts.
(G) Used solely for research purposes that are compatible with the context in which the personal information was collected.
(H) Not be used for any commercial purpose.
(I) Subjected by the business conducting the research to additional security controls that limit access to the research data to only those individuals in a business as are necessary to carry out the research purpose.
These elements are very restrictive and require extra attention to ensure adequate compliance measures are in place. Element A limits use of data to the “business purpose” for which it was collected and, working alongside element G, underscores that there must be a nexus between the research and the context in which the data was collected. The law defines “business purpose” as the “use of personal information for the business’ or a service provider’s operational purposes, or other notified purposes, provided that the use of personal information shall be reasonably necessary and proportionate to achieve the operational purpose for which the personal information was collected or processed or for another operational purpose that is compatible with the context in which the personal information was collected.” The “reasonably necessary and proportionate” constraint means businesses must comply with a request to delete data that does not have a strong nexus to the core operations of the business, such as GPS coordinates collected by a flashlight app.
Elements B, C, D, E, F, and I impose very strong safeguards on data anonymity and security practices. What constitutes anonymized or deidentified data is a raging debate among academics, as data scientists are able to reidentify data by finding unique variables in rich datasets that may appear on the surface to contain no common forms of personal information. The CCPA’s definition of a deidentified dataset does not settle this debate. The CCPA defines “deidentified” as “[i]nformation that cannot reasonably identify, relate to, describe, be capable of being associated with, or be linked, directly or indirectly, to a particular consumer.” The “reasonably” language here is sure to be debated, as what is reasonably deidentified to a lawyer looking through a dataset is far different from what is reasonably deidentified to a data scientist familiar with reidentification methods.
The other aforementioned data anonymity and security elements provide the real teeth of the law. Several focus on ensuring that there are robust processes to prevent the data from being reidentified after it is deidentified. Broad language such as “protected from any reidentification attempts” and “business processes that specifically prohibit reidentification” forces businesses to show that they have systems and processes to stop data scientists who are keen on linking big datasets, and thereby to avoid exposing the organization to liability. Data dumps are often done carelessly, combining as much data as possible to spot trends, patterns, and anomalies that may prove valuable. General counsels and compliance officers need to become more involved in the process of data analysis to ensure the proper safeguards are in place and adhered to. Because there is no specific standard for deidentified data and security practices, businesses should make every attempt to create these systems in good faith and consistent with the latest data science practices, differential privacy among them.
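As one concrete flavor of those latest data science practices, here is a minimal sketch of the Laplace mechanism, the textbook differential-privacy technique: instead of releasing an exact count from a dataset, release the count plus noise calibrated to a privacy parameter epsilon. This is an illustration of the idea only, not a substitute for a vetted differential-privacy library:

```python
import math
import random

def laplace_noisy_count(true_count: int, epsilon: float) -> float:
    """Release a count with Laplace noise of scale 1/epsilon
    (query sensitivity 1). Smaller epsilon means stronger privacy
    and noisier output. Illustrative sketch only."""
    scale = 1.0 / epsilon
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    # Inverse-CDF sample from the Laplace distribution
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

Because each released statistic is perturbed, an analyst attempting to link the output back to an individual record faces quantifiable uncertainty, which is the property the "prevent reidentification" elements are reaching for.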
Finally, the most important compliance check for the "right to delete" is whether your organization is collecting a wide swath of data unrelated to the "business purpose" and context in which the data is collected. The CCPA is littered with context-specific constraints on data use. Because penalties are enforceable on a per-violation basis, casting a wide net and collecting data that cannot reasonably be tied to consumer expectations could expose companies to significant financial liability. As a result, it is wise to review what data is collected, when it is collected, and how it is collected (passively or with user input, for example) to ensure that the collection practices are reasonable within the context. Otherwise, prepare your engineers to segment and tag data which may be subject to delete requests.
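The segment-and-tag advice above can be sketched concretely: record, for each stored field, which statutory deletion exception (if any) justifies retaining it, and honor a delete request by removing everything untagged. The field names and exception labels below are hypothetical illustrations for an imagined service, not a compliance standard:

```python
# Map each retained field to the CCPA deletion exception relied upon.
# These mappings are illustrative assumptions, not legal conclusions.
RETENTION_EXCEPTIONS = {
    "order_history": "complete_transaction",  # exception (1)
    "ip_address": "security",                 # exception (2)
    "crash_logs": "debugging",                # exception (3)
}

def handle_delete_request(record: dict) -> dict:
    """Honor a right-to-delete request: keep only fields for which a
    documented statutory exception exists; delete everything else."""
    return {field: value for field, value in record.items()
            if field in RETENTION_EXCEPTIONS}
```

A record containing order history, an IP address, and an email address would come back with the email address deleted, since no exception has been documented for it; the point of the design is that retention decisions are made once, in the mapping, rather than ad hoc per request.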
The California Consumer Privacy Act (CCPA) will go into effect on January 1, 2020. The first question you should ask is whether it applies to your organization. This article will outline what types of businesses are regulated by the CCPA.
The CCPA applies to organizations which:
(1) Earn annual gross revenues greater than $25 million;
(2) Buy, receive, or sell personal information of more than 50,000 consumers, households or devices for commercial purposes; or
(3) Derive 50 percent or more of their annual revenues from selling consumers’ personal information. If your business meets at least one of those criteria, it is likely subject to the CCPA.
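The three threshold tests are mechanical enough to express as a simple screen. Below is a minimal Python sketch; the figures are those of the CCPA as originally enacted (later amendments may change them), and the function name and parameters are illustrative assumptions rather than statutory terms:

```python
def ccpa_applies(annual_gross_revenue: float,
                 consumers_households_devices: int,
                 share_of_revenue_from_selling_pi: float) -> bool:
    """Rough CCPA applicability screen. Meeting any one of the three
    statutory criteria is enough; illustrative only, not legal advice."""
    return (
        annual_gross_revenue > 25_000_000            # criterion (1)
        or consumers_households_devices > 50_000     # criterion (2)
        or share_of_revenue_from_selling_pi >= 0.50  # criterion (3)
    )
```

Note that the criteria are disjunctive: a small firm with modest revenue that buys data on 60,000 devices would still likely fall within the statute's reach.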
The next key inquiry is whether your business collects personal information. Personal information is not exhaustively defined under the CCPA the way it is under some other privacy laws. The CCPA instead defines personal information broadly as “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” The law excludes publicly available information, such as data made available to the public from federal, state, or local governments, subject to some important caveats. Nonetheless, there is some guidance as to what type of data constitutes personal information. The law lists the following as examples of personal information:
(A) Identifiers such as a real name, alias, postal address, unique personal identifier, online identifier, Internet Protocol address, email address, account name, social security number, driver’s license number, passport number, or other similar identifiers.
(B) Any categories of personal information described in subdivision (e) of Section 1798.80 (this cross-references the definition of personal information in California’s customer records statute, which covers items such as signatures, physical characteristics, insurance policy numbers, and financial, medical, and health insurance information).
(C) Characteristics of protected classifications under California or federal law.
(D) Commercial information, including records of personal property, products or services purchased, obtained, or considered, or other purchasing or consuming histories or tendencies.
(E) Biometric information.
(F) Internet or other electronic network activity information, including, but not limited to, browsing history, search history, and information regarding a consumer’s interaction with an Internet Web site, application, or advertisement.
(G) Geolocation data.
(H) Audio, electronic, visual, thermal, olfactory, or similar information.
(I) Professional or employment-related information.
(J) Education information, defined as information that is not publicly available personally identifiable information as defined in the Family Educational Rights and Privacy Act (20 U.S.C. section 1232g, 34 C.F.R. Part 99).
(K) Inferences drawn from any of the information identified in this subdivision to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.
Unfortunately, because the law does not constrain the definition of personal information to the categories listed above, it is important that you drill down on precisely what information your organization is collecting and determine whether it is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Moreover, because element K includes “inferences drawn from any of the information identified in this subdivision,” you should also be on notice if your organization holds data profiles and analyses based on personal information, even in the absence of the raw data itself.
If your company is subject to the CCPA as outlined in this article, examine your data collection, use, and sharing practices immediately. The penalties under the CCPA are steep. Damages can range from $100 to $750 per consumer, per incident. Actual damages are also recoverable, and intentional violations can lead to penalties of $7,500 per violation. The law also leaves discretion for injunctive relief and “any other relief the court deems proper.” It is therefore incumbent upon you to ensure your business complies with the CCPA before it goes into effect next year.
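A back-of-the-envelope exposure estimate follows directly from those figures. The sketch below assumes statutory damages of $100 to $750 per consumer per incident, and uses the $7,500 intentional-violation penalty as the upper bound in the intentional case; actual exposure depends on how courts count violations and on any actual damages proved:

```python
def ccpa_exposure_range(consumers_affected: int, incidents: int,
                        intentional: bool = False) -> tuple:
    """Rough low/high range of CCPA monetary exposure.
    Illustrative arithmetic only, not a damages model."""
    low_per, high_per = 100, 750
    if intentional:
        high_per = 7_500  # penalty ceiling for intentional violations
    violations = consumers_affected * incidents
    return violations * low_per, violations * high_per
```

A single incident touching 1,000 consumers thus spans $100,000 to $750,000 in statutory damages alone, which illustrates why per-violation liability makes over-collection so risky.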
The Health Insurance Portability and Accountability Act (HIPAA, referred to here to include its amendments and modifications) limits access, use, and disclosure of sensitive Protected Health Information (PHI). The two key questions to determine whether HIPAA might apply to your organization are: (1) Is your organization a Covered Entity? (2) Does your organization deal in PHI? This article will address how to approach these questions.
1. Is your organization a Covered Entity?
HIPAA applies to Covered Entities, an important term which includes health plans, health care clearinghouses, and certain health care providers, such as those that transmit health information electronically in connection with certain financial and administrative transactions (for example, most hospitals). In contrast, HIPAA does not apply to many research organizations that handle PHI. Allow me to clarify which researchers may be subject to HIPAA.
Researchers are not Covered Entities unless they are also health care providers and engage in covered electronic transactions. Covered electronic transactions are those which involve the transmission of information between two parties to carry out financial or administrative activities related to health care, and contain the data points outlined in Section 2 below. HIPAA does not directly regulate researchers who are engaged in research within organizations that are not Covered Entities even if they gather, generate, access, and share PHI. For instance, a company that sponsors its own health research, or creates or maintains health information databases, is not a Covered Entity. An example of this would be a company like Fitbit, which creates wearable sensors.
Nonetheless, there are two key categories of researchers who are subject to HIPAA: (1) Covered Entity employees; and (2) researchers who use Covered Entity supplied data. In the first instance, the rule is simple: researchers who are employees of a Covered Entity are subject to HIPAA regulations.
The second instance may be more difficult to spot in big data sets. The rule is that if a Covered Entity supplies a researcher with PHI data, the data is subject to HIPAA. Furthermore, take note that research repositories and associated data are regulated under HHS and the Food and Drug Administration’s (FDA) Protection of Human Subjects Regulations. Researchers in medical and health-related disciplines frequently rely on access to many sources of PHI, such as medical records, epidemiological databases, disease registries, hospital discharge records, and government compilations of vital and health statistics. Clinical researchers often access medical information from patient charts and tissue and data repositories and create PHI in connection with an experimental intervention. Because this data may come from a Covered Entity, it may be subject to HIPAA. In conclusion, check to see if any of your data came from a Covered Entity.
2. Does your organization store or use PHI?
There are 18 key PHI data points. Covered Entities are granted “safe harbor” if all 18 of these identifiers are removed and the Covered Entity does not have actual knowledge of a way to use the remaining information, alone or in combination with other information, to identify the subject. The 18 PHI data points are as follows:
1. Names.
2. All geographic subdivisions smaller than a state, including street address, city, county, precinct, ZIP Code, and their equivalent geographical codes, except for the initial three digits of a ZIP Code if, according to the current publicly available data from the Bureau of the Census:
a. The geographic unit formed by combining all ZIP Codes with the same three initial digits contains more than 20,000 people.
b. The initial three digits of a ZIP Code for all such geographic units containing 20,000 or fewer people are changed to 000.
3. All elements of dates (except year) for dates directly related to an individual, including birth date, admission date, discharge date, date of death; and all ages over 89 and all elements of dates (including year) indicative of such age, except that such ages and elements may be aggregated into a single category of age 90 or older.
4. Telephone numbers.
5. Facsimile numbers.
6. Electronic mail addresses.
7. Social security numbers.
8. Medical record numbers.
9. Health plan beneficiary numbers.
10. Account numbers.
11. Certificate/license numbers.
12. Vehicle identifiers and serial numbers, including license plate numbers.
13. Device identifiers and serial numbers.
14. Web universal resource locators (URLs).
15. Internet protocol (IP) address numbers.
16. Biometric identifiers, including fingerprints and voiceprints.
17. Full-face photographic images and any comparable images.
18. Any other unique identifying number, characteristic, or code, unless otherwise permitted by the Privacy Rule for re-identification.
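Two of the identifiers above lend themselves to mechanical handling. The sketch below implements the ZIP Code rule (identifier 2) and the age aggregation rule (identifier 3); the set of restricted three-digit ZIP prefixes shown is the commonly cited list derived from 2000 Census data and must be refreshed against the current publicly available Census figures before use:

```python
# Three-digit ZIP prefixes whose combined area covers 20,000 or fewer
# people, per the commonly cited 2000 Census-derived list; verify
# against current Census data before relying on it.
RESTRICTED_ZIP3 = {
    "036", "059", "063", "102", "203", "556", "692", "790", "821",
    "823", "830", "831", "878", "879", "884", "890", "893",
}

def safe_harbor_zip(zip_code: str) -> str:
    """Reduce a ZIP Code to its first three digits, or to '000' when
    that prefix covers 20,000 or fewer people (identifier 2)."""
    prefix = zip_code[:3]
    return "000" if prefix in RESTRICTED_ZIP3 else prefix

def safe_harbor_age(age: int):
    """Aggregate all ages over 89 into a single '90 or older'
    category (identifier 3)."""
    return "90+" if age > 89 else age
```

Rules like these are only part of the safe harbor: the remaining identifiers must also be removed, and the no-actual-knowledge condition still applies to whatever data remains.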
Please contact me if you have any questions and stand by for part two on HIPAA compliance.
Disclaimer: This blog is not intended to provide legal advice or my legal opinion. Any legal references or citations mentioned in these articles may be out-of-date. It is your responsibility to speak with an attorney before relying on any information included in these articles. Should you need a legal opinion on any topic discussed in this blog, please do not hesitate to contact me.