
Rules, Procedure, Comments

All opinions of the Ethics Committee are predicated upon the North Carolina Rules of Professional Conduct. Any interested person or group may submit a written comment – including comments in support of or against the proposed opinion – or request to be heard concerning a proposed opinion. The Ethics Committee welcomes and encourages the submission of comments, and all comments are considered by the committee at the next quarterly meeting. Any comment or request should be directed to the Ethics Committee at ethicscomments@ncbar.gov no later than March 30, 2024.

Council Actions

At its meeting on January 19, 2024, the State Bar Council adopted the ethics opinion summarized below:

2023 Formal Ethics Opinion 4
Use of a Lawyer’s Trade Name for Keyword Advertisements in an Internet Search Engine

Opinion rules that the intentional selection of another lawyer’s unique firm trade name in a keyword advertisement campaign is prohibited, but that prohibition does not apply when the trade name is also a common search term.

Ethics Committee Actions

At its meeting on January 18, 2024, the Ethics Committee considered a total of six inquiries, including the opinion noted above. Four inquiries were sent or returned to subcommittee for further study, including an inquiry addressing a lawyer’s ability to obligate a client’s estate to pay the lawyer for any time spent defending the lawyer’s work in drafting and executing the client’s will and an inquiry exploring a lawyer’s duty of confidentiality when inheriting confidential client information. Additionally, in October 2023 the Ethics Committee published Proposed 2023 Formal Ethics Opinion 3, Installation of Third Party’s Self-Service Kiosk in Lawyer’s Office and Inclusion of Lawyer in Third Party’s Advertising Efforts; based on comments received during publication, the committee voted to return the inquiry to subcommittee for further study. The committee also approved the publication of one new proposed formal ethics opinion on a lawyer’s use of artificial intelligence in a law practice, which appears below.

Proposed 2024 Formal Ethics Opinion 1
Use of Artificial Intelligence in a Law Practice

January 18, 2024

Proposed opinion discusses a lawyer’s professional responsibility when using artificial intelligence in a law practice.

Editor’s Note: There is a growing number of helpful resources on understanding artificial intelligence and the technology’s interaction with the legal profession. The resources referenced in this opinion are not exhaustive but are intended to serve as a starting point for a lawyer’s understanding of the topic. Over time, this editor’s note may be updated as additional resources are published that staff concludes would be beneficial to lawyers.

Background

“Artificial intelligence” (hereinafter, “AI”) is a broad and evolving term encompassing myriad programs and processes with myriad capabilities. While a single definition of AI is not yet settled (and likely impossible), for the purposes of this opinion, the term “AI” refers to “a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations or decisions influencing real or virtual environments.” Nat’l Artificial Intelligence Initiative Act of 2020, Div. E, sec. 5002(3) (2021). Put another, over-simplified way, AI is the use of computer science and extensive data sets to enable problem solving or decision-making, often through the implementation of sophisticated algorithms. AI encompasses, but is not limited to, both extractive and generative AI,1 natural language processing, large language models, and any number of machine learning processes.2 Examples of law-related AI programs range from online electronic legal research and case management software to e-discovery tools and programs that draft legal documents (e.g., a trial brief, will, etc.) based upon the lawyer’s input of information that may or may not be client-specific.

Most lawyers have likely used some form of AI when practicing law, even if they did not realize it (e.g., widely used online legal research subscription services utilize a type of extractive AI, or a program that “extracts” information relevant to the user’s inquiry from a large set of existing data upon which the program has been trained). Within the year preceding the date of this opinion, generative AI programs that create products in response to a user’s request based upon a large set of existing data upon which the program has been trained (e.g., ChatGPT) have grown in capability and popularity, generating both positive and negative reactions regarding the integration of these technological breakthroughs into the legal profession.3 It is unquestioned that AI can be used in the practice of law to increase efficiency and consistency in the provision of legal services. However, AI and its work product can be inaccurate or unreliable, despite an appearance of reliability, when used in the provision of legal services.4

Inquiry #1:

Considering the advantages and disadvantages of using AI in the provision of legal services, is a lawyer permitted to use AI in a law practice?

Opinion #1:

Yes, provided the lawyer uses any AI program, tool, or resource competently, securely to protect client confidentiality, and with proper supervision when relying upon or implementing the AI’s work product in the provision of legal services.

On the spectrum of law practice resources, AI falls somewhere between programs, tools, and processes readily used in law practice today (e.g., case management systems, trust account management programs, electronic legal research, etc.) and nonlawyer support staff (e.g., paralegals, summer associates, IT professionals, etc.). Nothing in the Rules of Professional Conduct specifically addresses, let alone prohibits, a lawyer’s use of AI in a law practice. However, should a lawyer choose to employ AI in a practice, the lawyer must do so competently and securely, and the lawyer must exercise independent judgment in supervising the use of such processes.

Rule 1.1 prohibits lawyers from “handl[ing] a legal matter that the lawyer knows or should know he or she is not competent to handle[,]” and goes on to note that “[c]ompetent representation requires the legal knowledge, skill, thoroughness, and preparation reasonably necessary for the representation.” Comment 8 to Rule 1.1 recognizes the reality of advancements in technology impacting a lawyer’s practice, and states that part of a lawyer’s duty of competency is to “keep abreast of changes in the law and its practice, including the benefits and risks associated with the technology relevant to the lawyer’s practice[.]” Rule 1.6(c) requires a lawyer to “make reasonable efforts to prevent the inadvertent or unauthorized disclosure of, or unauthorized access to, information relating to the representation of a client.” Rule 5.3 requires a lawyer to “make reasonable efforts to ensure that the firm or organization has in effect measures giving reasonable assurance that the nonlawyer’s conduct is compatible with the professional obligations of the lawyer[,]” and further requires that “a lawyer having direct supervisory authority over the nonlawyer shall make reasonable efforts to ensure that the nonlawyer's conduct is compatible with the professional obligations of the lawyer[.]” Rules 5.3(a) and (b). The requirements articulated in Rule 5.3 apply to nonlawyer assistants within a law firm as well as those outside of a law firm that are engaged to provide assistance in the lawyer’s provision of legal services to clients, such as third-party software companies. See 2011 FEO 6 (“Although a lawyer may use nonlawyers outside of the firm to assist in rendering legal services to clients, Rule 5.3(a) requires the lawyer to make reasonable efforts to ensure that the services are provided in a manner that is compatible with the professional obligations of the lawyer.”).

A lawyer may use AI in a variety of manners in connection with a law practice, and it is a lawyer’s responsibility to exercise independent professional judgment in determining how (or if) to use the product of an AI tool in furtherance of the representation of a client. From discovery and document review to legal research, drafting contracts, and aggregating/analyzing data trends, the possibilities for employing AI in a law practice are increasingly present and constantly evolving. A lawyer’s decision to use and rely upon AI to assist in the representation of a client is generally hers alone and depends upon a number of factors, including the impact of such services, the cost of such services, and the reliability of the processes.5 This opinion does not attempt to dictate when and how AI is appropriate for a law practice.

Should a lawyer decide to employ AI in the representation of a client, however, the lawyer is fully responsible for the use and impact of AI in the client’s case. The lawyer must use the AI tool in a way that meets the competency standard set out in Rule 1.1. As with other software, a lawyer employing an AI tool must educate herself on the benefits and risks associated with the tool, as well as the impact of using the tool on the client’s case. Educational efforts include, but are not limited to, reviewing current and relevant resources on AI broadly and on the specific program intended for use during the provision of legal services. A lawyer who inputs confidential client information into an AI tool must take steps to ensure the information remains secure and protected from unauthorized access or inadvertent disclosure per Rule 1.6(c). Additionally, a lawyer utilizing an outside third-party company’s AI program or service must make reasonable efforts to ensure that the program or service used is compatible with the lawyer’s responsibilities under the Rules of Professional Conduct pursuant to Rule 5.3. Whether the lawyer is reviewing the results of a legal research program, a keyword search of emails for production during discovery, proposed reconciliations of the lawyer’s trust account prepared by a long-time assistant, or a risk analysis of potential borrowers for a lender-client produced by an AI process, the lawyer is individually responsible for reviewing, evaluating, and ultimately relying upon the work produced by someone—or something—other than the lawyer.

Inquiry #2:

May a lawyer provide or input a client’s documents, data, or other information to a third-party company’s AI program for assistance in the provision of legal services?

Opinion #2:

Yes, provided the lawyer has satisfied herself that the third-party company’s AI program is sufficiently secure and complies with the lawyer’s obligations to ensure any client information will not be inadvertently disclosed or accessed by unauthorized individuals pursuant to Rule 1.6(c).

At the outset, the Ethics Committee does not opine on whether the information shared with an AI tool violates the attorney-client privilege, as the issue is a legal question and outside the scope of the Rules of Professional Conduct. A lawyer should research and resolve any question on privilege prior to engaging with a third-party company’s AI program for use in the provision of legal services to a client, particularly if client-specific information will be provided to the AI program.

This inquiry is akin to any lawyer providing confidential information to a third-party software program (practice management, cloud storage, etc.), on which the Ethics Committee has previously opined. As noted above, a lawyer has an obligation to “make reasonable efforts to prevent the inadvertent or unauthorized disclosure of, or unauthorized access to, information relating to the representation of a client.” Rule 1.6(c). What constitutes “reasonable efforts” will vary depending on the circumstances related to the practice and representation, as well as a variety of factors including the sensitivity of the information and the cost or benefit of employing additional security measures to protect the information. Rule 1.6, cmt. [19]. Ultimately, “[a] lawyer must take steps to minimize the risk that confidential client information will be disclosed to other clients or to third parties” when using technology to handle, communicate, analyze, or otherwise interact with confidential client information. 2008 FEO 5; see also 2005 FEO 10; 2011 FEO 6.

The Ethics Committee in 2011 FEO 6 recognized that employing a third-party company’s services/technology with regard to confidential client information requires a lawyer to exercise reasonable care when selecting a vendor. The opinion states:

[W]hile the duty of confidentiality applies to lawyers who choose to use technology to communicate, this obligation does not require that a lawyer use only infallibly secure methods of communication. Rather, the lawyer must use reasonable care to select a mode of communication that, in light of the circumstances, will best protect confidential client information and the lawyer must advise affected parties if there is reason to believe that the chosen communications technology presents an unreasonable risk to confidentiality.... A lawyer must fulfill the duties to protect confidential client information and to safeguard client files by applying the same diligence and competency to manage the risks of [technology] that the lawyer is required to apply when representing clients.

2011 FEO 6 (internal citations omitted). To guide lawyers in exercising reasonable care, the opinion discusses a sample of considerations for evaluating whether a particular third-party company’s services are compatible with the lawyer’s professional responsibility, including:

• The experience, reputation, and stability of the company;
• Whether the terms of service include an agreement on how the company will handle confidential client information, including security measures employed by the company to safeguard information provided by the lawyer; and
• Whether the terms of service clarify how information provided to the company will be retrieved by the lawyer or otherwise safely destroyed if not retrieved should the company go out of business, change ownership, or if services are terminated.

2011 FEO 6; see Rule 5.3. A proposed ethics opinion from the Florida Bar on a lawyer’s use of AI adds that lawyers should “[d]etermine whether the provider retains information submitted by the lawyer before and after the discontinuation of services or asserts proprietary rights to the information” when determining whether a third-party company’s technological services are compatible with the lawyer’s duty of confidentiality. See Florida Bar Proposed Advisory Opinion 24-1 (published Nov. 13, 2023).

Furthermore, this duty of reasonable care continues beyond initial selection of a service, program, or tool and extends throughout the lawyer’s use of the service. A lawyer should continuously educate herself on the selected technology and developments thereto—both individually and by “consult[ing] periodically with professionals competent in the area of online security”—and make necessary adjustments (including abandonment, if necessary) when discoveries are made that call into question services previously thought to be secure. 2011 FEO 6.

The aforementioned considerations—including the consideration regarding ownership of information articulated by the Florida Bar opinion—are equally applicable to a lawyer’s selection and use of a third-party company’s AI service/program. Just as with any third-party service, a lawyer has a duty under Rule 5.3 to make reasonable efforts to ensure the third-party AI program or service is compatible with the lawyer’s professional responsibility, particularly with regard to the lawyer’s duty of confidentiality pursuant to Rule 1.6. Importantly, some current AI programs are publicly available to all consumers/users, and these programs by their nature retain, and train themselves on, the information provided by any user of the program. Lawyers should educate themselves on the nature of any publicly available AI program intended to be used in the provision of legal services, with particular focus on whether the AI program will retain and subsequently use the information provided by the user. Generally, and as of the date of this opinion, lawyers should avoid inputting client-specific information into publicly available AI resources.

Inquiry #3:

If a firm has an AI software tool initially developed by a third party but then uses the tool in-house on firm-owned servers and related infrastructure, does that change the data security analysis in Opinion #2?

Opinion #3:

No. The lawyer remains responsible for keeping the information secure pursuant to Rule 1.6(c) regardless of the program’s location. While an in-house program may seem more secure because the program is maintained and run on local servers, those servers may be more vulnerable to attack because a lawyer acting independently may not be able to match the security features typically employed by larger companies whose reputations are built in part on security and customer service. A lawyer who plans to independently store client information should consult an information technology/cybersecurity expert about steps needed to adequately protect the information stored on local servers.

Relatedly, AI programs developed for use in-house or by a particular law practice may also be derivatives of a single, publicly available AI program; as such, some of these customized programs may continue to send information inputted into the firm-specific program back to the central program for additional use or training. Again, prior to using such a program, a lawyer must educate herself on the nuances and operation of the program to ensure client information will remain protected in accordance with the lawyer’s professional responsibility. The list of considerations found in Opinion #2 offers a starting point for questions to explore when identifying, evaluating, and selecting a vendor.

Inquiry #4:

If a lawyer signs a pleading based on information generated by AI, do the lawyer’s ethical obligations differ from the traditional obligations and expectations placed on lawyers signing pleadings prepared without AI involvement?

Opinion #4:

No. A lawyer may not abrogate her responsibilities under the Rules of Professional Conduct by relying upon AI. Per Rule 3.1, a lawyer is prohibited from bringing or defending “a proceeding, or assert[ing] or controvert[ing] an issue therein, unless there is a basis in law and fact for doing so that is not frivolous[.]” A lawyer’s signature on a pleading also certifies the lawyer’s good faith belief as to the factual and legal assertions therein. See N.C. R. Civ. Pro. 11 (“The signature of an attorney...constitutes a certificate by him that he has read the pleading, motion, or other paper; that to the best of his knowledge, information, and belief formed after reasonable inquiry it is well grounded in fact and is warranted by existing law or a good faith argument for the extension, modification, or reversal of existing law, and that it is not interposed for any improper purpose, such as to harass or to cause unnecessary delay or needless increase in the cost of litigation.”). If the lawyer employs AI in her practice and adopts the tool’s product as her own, the lawyer is professionally responsible for the use of the tool’s product. See Opinion #1.

Inquiry #5:

If a lawyer uses AI to assist in the representation of a client, is the lawyer under any obligation to inform the client that the lawyer has used AI in furtherance of the representation or legal services provided?

Opinion #5:

The answer to this question depends on the type of technology used, the intended product from the technology, and the level of reliance placed upon the technology/technology’s product. Ultimately, the attorney/firm will need to evaluate each case and each client individually. Rule 1.4(b) requires an attorney to explain a matter to her client “to the extent reasonably necessary to permit the client to make informed decisions regarding the representation.” Generally, a lawyer need not inform her client that she is using an AI tool to complete ordinary tasks, such as conducting legal research or generic case/practice management. However, if a lawyer delegates substantive tasks in furtherance of the representation to an AI tool, the lawyer’s use of the tool is akin to outsourcing legal work to a nonlawyer, for which the client’s advance informed consent is required. See 2007 FEO 12. Additionally, if the decision to use or not use an AI tool in the case requires the client’s input with regard to fees, the lawyer must inform and seek input from the client.

Inquiry #6:

Lawyer has an estate planning practice and bills at the rate of $300 per hour. Lawyer has integrated an AI program into the provision of legal services, resulting in increased efficiency and work output. For example, Lawyer previously spent approximately three hours drafting standard estate planning documents for a client; with the use of AI, Lawyer now spends only one hour preparing those same documents for a client. May Lawyer bill the client for the three hours of work that the prepared estate documents represent?

Opinion #6:

No, Lawyer may not bill a client for three hours of work when only one hour of work was actually performed. A lawyer’s billing practices must be accurate, honest, and not clearly excessive. Rules 7.1, 8.4(c), and 1.5(a); see also 2022 FEO 4. If the use of AI in Lawyer’s practice results in greater efficiencies in providing legal services, Lawyer may enjoy the benefit of those new efficiencies by completing more work for more clients; Lawyer may not inaccurately bill a client based upon the “time-value” the end product would have represented had Lawyer not used AI when providing the legal services.
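As a simple illustration using only the figures stated in the inquiry: Lawyer may bill 1 hour × $300 per hour = $300 for the documents prepared with AI assistance; billing the three hours the work previously required (3 × $300 = $900) would overstate the time actually expended by $600.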

Rather than billing on an hourly basis, Lawyer may consider billing clients a flat fee for the drafting of documents—even when using AI to assist in drafting—provided the flat fee charged is not clearly excessive and the client consents to the billing structure. See 2022 FEO 4.

Relatedly, Lawyer may also bill a client for actual expenses incurred when employing AI in furtherance of a client’s legal services, provided the expenses charged are accurate, not clearly excessive, and the client consents to the charge, preferably in writing. See Rule 1.5(b). Lawyer may not bill a general “administrative fee” for the use of AI during the representation of a client; rather, any cost charged to a client based on Lawyer’s use of AI must be specifically identified and directly related to the legal services provided to the client during the representation. For example, if Lawyer has generally incorporated AI into her law practice for the purpose of case management or drafting assistance upon which Lawyer may or may not rely when providing legal services to all clients, Lawyer may not bill clients a generic administrative fee to offset the costs Lawyer incurs related to her use of AI. However, if Lawyer employs AI on a limited basis for a single client to assist in the provision of legal services, Lawyer may charge those expenses to the client provided the expenses are accurate, not clearly excessive, and the client consents to the expense and charge, preferably in writing.

Endnotes
1. For a better understanding of the differences between extractive and generative AI, see Jake Nelson, Combining Extractive and Generative AI for New Possibilities, LexisNexis (June 6, 2023), lexisnexis.com/community/insights/legal/b/thought-leadership/posts/combining-extractive-and-generative-ai-for-new-possibilities (last visited January 10, 2024).

2. For an overview of the state of AI as of the date of this opinion, see What is Artificial Intelligence (AI)?, IBM, ibm.com/topics/artificial-intelligence (last visited January 10, 2024). For information on how AI relates to the legal profession, see AI Terms for Legal Professionals: Understanding What Powers Legal Tech, LexisNexis (March 20, 2023), lexisnexis.com/community/insights/legal/b/thought-leadership/posts/ai-terms-for-legal-professionals-understanding-what-powers-legal-tech (last visited January 10, 2024).

3. John Villasenor, How AI Will Revolutionize the Practice of Law, Brookings Institution (March 20, 2023), brookings.edu/articles/how-ai-will-revolutionize-the-practice-of-law/ (last visited January 10, 2024); Steve Lohr, AI is Coming for Lawyers Again, New York Times (April 10, 2023), nytimes.com/2023/04/10/technology/ai-is-coming-for-lawyers-again.html (last visited January 10, 2024).

4. Larry Neumeister, Lawyers Blame ChatGPT for Tricking Them Into Citing Bogus Case Law, AP News (June 8, 2023), apnews.com/article/artificial-intelligence-chatgpt-courts-e15023d7e6fdf4f099aa122437dbb59b (last visited January 10, 2024).

5. In certain circumstances a lawyer may need to consult a client about employing AI in the provision of legal services to that client; see Opinion #5, below.

The Ethics Committee welcomes feedback on the proposed opinion; feedback should be sent to ethicscomments@ncbar.gov.
