Community of lawyers.
Common purpose.
Shared goals.
Robert C. Dortch, Jr. | Sellers Hinshaw Ayers Dortch & Lyons PA | Charlotte
Camille Stell
Camille Stell is President of Lawyers Mutual Consulting and Services, offering succession planning, business development coaching, keynote presentations and more. Continue this conversation by contacting Camille at camille@lawyersmutualconsulting.com or 800.662.8843.

In this issue, we’re excited to spotlight an innovative initiative designed to help lawyers build sustainable practices while expanding access to justice. I sat down with Mark Atkinson, founder and Executive Director of the Incubator for Legal Practice and Innovation (ILPI), to talk about his career, the mission of ILPI, and the new cohort launching this fall with a two-day kickoff on September 18–19.
Q: Mark, tell us about your early legal career. What first sparked your interest in access to justice work?
I attended law school later in life and—like many lawyers—I went hoping to make a difference, but quickly realized how many barriers exist, not just for attorneys but for the public trying to access legal help. If you are wealthy or a big business, you can afford whatever legal help you need. If you are low income, you get some help through legal aid (but rarely enough!). That leaves a big gap of unmet legal needs among those of low and middle income. Those unmet needs are especially acute in rural areas and underserved communities. Seeing this reality instilled in me a strong drive to explore new models that make legal help more accessible and more affordable, while also allowing lawyers to make a living.
Q: And that led you to found ILPI. Can you tell us what it is and what inspired you to launch it?
ILPI is a legal incubator. Our mission is simple: help entrepreneurial attorneys start financially sustainable practices that improve access to justice. Too often, lawyers—especially new ones—aren’t equipped with the business skills needed to launch a solo practice. At the same time, so many communities are “legal deserts,” with few or no lawyers available. ILPI is designed to bridge that gap. We provide training, resources, and a supportive community to help attorneys serve those communities while also building a viable business.
For some additional history, the legal incubator movement was started by Fred Rooney over 20 years ago at City University of New York Law School. Fred is a saint of a man and has helped launch incubators in the USA and overseas. In my view, every law school and major urban area should have a legal incubator!
Q: What kind of support can attorneys expect if they join ILPI?
Our programming is remote, accessible by Zoom, and starts with a two-day kickoff session. From there, participants can join regular “Lunch-n-Learns” with experts, vendors, and practitioners. We focus on the business side of launching and running a law firm—things law school often overlooks. We provide access to tools like Clio, Lexis+, and CLE providers such as PLI and NBI-SEMS.
But maybe most importantly, we offer community and encouragement. It can be lonely starting a firm. ILPI ensures you’re not doing it alone—you’re part of a cohort that shares your values and your mission.
Q: Who is the incubator designed for?
ILPI is for anyone who wants to serve their community, do good and do well, and build a practice that expands access to justice. Maybe you just passed the Bar—or maybe you’ve been practicing for 10 or 20 years and are finally ready to go out on your own. ILPI is for attorneys who want to return home to a legal desert, who want to serve the underserved, and who want to make a proper living doing it.
Q: What’s next for ILPI? How can interested attorneys get involved?
Our next cohort launches September 18–19, 2025, with a two-day kickoff. Joining ILPI is a simple three-step process:
- Apply – Fill out the short online form.
- Chat – Schedule a call with me to make sure it’s a good fit.
- Agree – Participation requires commitment, but it’s worth it.
We keep the cost low—just $79 a month—because our goal is to help attorneys get started, not hold them back. If you’re interested, reach out to me directly at mark@innovationlegal.org or call 919.630.1143.
Final Thoughts
The Incubator for Legal Practice and Innovation is more than just a training program—it’s a movement to reimagine the delivery of legal services and empower the next generation of lawyers. ILPI has many alums who would love to tell you how the program helped them get their practice off the ground. If you’re looking to launch your practice with purpose, this could be your next step.

Artificial Intelligence (AI) is no longer a futuristic concept for the legal industry—it’s here, it’s evolving rapidly, and it’s already transforming how law firms operate. From drafting documents and conducting legal research to automating administrative tasks, AI is enabling new levels of efficiency and insight. That’s why now is the time for law firms to adopt a clear and thoughtful AI use policy.
Why a Law Firm Needs an AI Use Policy
1. Client Confidentiality Is Non-Negotiable
AI tools can pose a risk of exposing sensitive client information. Without strict internal guidelines, attorneys or staff might unknowingly input confidential data into tools that store or use it for model training. An AI policy helps safeguard privileged communications and ensure compliance with ethical obligations.
2. Professional Responsibility and Ethics
The American Bar Association and many state bars have issued guidance highlighting the ethical risks of using AI without proper oversight. Rule 1.1 of the ABA Model Rules of Professional Conduct requires lawyers to maintain technological competence. That includes knowing how AI tools work—and understanding their limitations. A policy can ensure firm-wide adherence to these standards. North Carolina has gone further, adopting 2024 Formal Ethics Opinion 1, “Use of Artificial Intelligence in a Law Practice,” which sets out in detail a lawyer’s ethical responsibilities concerning AI.
3. Risk Management and Liability
Misuse or over-reliance on AI can result in errors in legal analysis, missed deadlines, or even malpractice claims. With an AI use policy in place, firms create a structured approach to mitigate risk, establish accountability, and prevent misuse.
4. Operational Consistency
As more lawyers experiment with tools like ChatGPT, Harvey, or AI-powered document automation platforms, inconsistency can creep in. A firmwide policy helps standardize the adoption and application of these tools, ensuring alignment with firm goals and client expectations.
Best Practices for Creating an AI Use Policy
1. Define Acceptable Use
Specify which tools are approved for use and in what contexts. For example, generative AI may be acceptable for internal brainstorming or template drafting, but not for court filings or final client communications without thorough review.
2. Mandate Human Oversight
Require that all outputs generated by AI—whether contracts, emails, or pleadings—be reviewed by a qualified attorney before use. The policy should reinforce that AI is an assistant, not a replacement.
3. Protect Confidentiality and Data
Make it clear that confidential or personally identifiable information should never be entered into open AI platforms unless proper privacy safeguards are in place. Work with IT professionals to ensure secure tools and data handling, which will almost always require a paid subscription service.
4. Educate and Train
Provide regular training on how to use AI tools responsibly, keep staff informed of emerging risks, and promote technological competence as a firm value.
5. Include a Monitoring and Review Process
Technology moves fast. Your policy should be a living document, reviewed regularly and updated as new tools, regulations, and risks emerge.
Resources to Get Started
Legal tech leaders like Clio have created valuable toolkits and sample policies to help firms get started. Clio’s AI Resource Center includes customizable templates, best practice guides, and safety checklists tailored to law firms.
By leveraging these resources and implementing a thoughtful AI use policy, your firm can benefit from the advantages of AI—without sacrificing ethics, compliance, or client trust.

Artificial Intelligence (AI) continues to revolutionize how legal professionals conduct research, analyze data, and produce documents. For solo and small law firms in particular, AI seems to promise efficiency and affordability. But the convenience of free AI tools can come at a steep ethical and professional cost—especially when dealing with confidential client information such as medical records, police reports, or financial disclosures.
This article highlights the specific risks of free AI platforms, summarizes the North Carolina State Bar ethics opinion on AI, and offers practical tips for safely integrating AI into your practice.
What the NC State Bar Says About AI Use
In 2024 Formal Ethics Opinion 1, the North Carolina State Bar made clear: lawyers are permitted to use AI in their practices—but only if they do so competently, securely, and with proper supervision. Key takeaways include:
- AI must not compromise client confidentiality: Lawyers are required under Rule 1.6(c) to make reasonable efforts to prevent unauthorized access to client information. That includes ensuring the AI platform won’t use client data to train its model.
- Competence includes technological understanding: Per Rule 1.1, lawyers must stay abreast of technological changes—including understanding the benefits and risks of AI tools used in legal practice.
- You can’t outsource responsibility: Whether work is done by a paralegal, summer associate, or AI program, the lawyer must independently review and take full responsibility for all outputs used in client matters.
- Client consent may be required: If a lawyer uses AI in substantive legal tasks (e.g., drafting estate plans, contracts, or pleadings), and especially if the tool affects fees or billing, the client must be informed under Rule 1.4.
- Avoid free public AI tools for client-specific data: The opinion cautions lawyers against inputting sensitive information into publicly accessible AI platforms that may store, retain, or use submitted data for model training.
High-Risk Areas for Confidentiality Breaches
Some legal practice areas are especially vulnerable when using unsecured AI tools:
- Personal Injury & Medical Malpractice: Summarizing client medical records in free tools may violate HIPAA.
- Family Law: Custody evaluations, financial affidavits, and therapy notes must remain confidential.
- Criminal Defense: Sharing police reports or client statements in AI platforms could undermine privilege.
- Elder Law & Estate Planning: Drafting wills or powers of attorney through AI requires safeguards against data leakage.
Red Flags in Free or Consumer-Grade AI Tools
Ask the following before using any AI platform:
- Is the platform trained on user-uploaded data?
- Are there binding terms about confidentiality and data retention?
- Is data encrypted at rest and in transit?
- Can you delete client data from the system permanently?
- Is this platform created for legal use—or for general consumers?
If the answer is uncertain, do not use that platform with client information.
Best Practices: How to Use AI Responsibly in Your Firm
1. Choose the Right AI Platform
Pick vendors who design tools specifically for legal professionals and who offer:
- Written confidentiality and non-disclosure agreements
- Data residency and encryption
- Opt-out provisions for model training
- Clear client data ownership and deletion policies
Evaluate vendors by reviewing their company reputation, data security protocols, and contract provisions for how data is stored and destroyed if services end.
2. Vet Every AI Tool Like a Vendor
Think of AI platforms like any third-party contractor. You are obligated to ensure they operate in a way that’s compatible with your ethical duties. This includes:
- Periodic audits
- Security consultations
- Ongoing training for your team
3. Maintain a “Human in the Loop”
Inquiry 4 of the AI opinion asks, “if a lawyer signs a pleading based on information generated from AI, is there variation from traditional or existing ethical obligations and expectations placed on lawyers signing pleadings absent AI involvement?” The opinion’s response is no: “a lawyer may not abrogate their responsibilities under the Rules of Professional Conduct by relying on AI.”
You must always review the AI’s output; AI hallucinations (fabricated or inaccurate content) remain a real risk. Use the right AI tool for the job. For example, don’t trust ChatGPT or Microsoft Copilot for legal citations; instead, rely on legal research platforms that have incorporated AI into their products for brief writing and legal research.
4. Be Transparent About Fees
The State Bar prohibits a lawyer from billing a client for three hours of work when only one hour of work was completed. While AI will provide efficiencies, the lawyer’s billing must be accurate, honest, and not clearly excessive. Most law firms will need to discuss how to bill for the efficiencies that result from using AI tools. Flat fees or itemized charges for AI usage are allowed, provided they are transparent and not excessive.
5. Avoid Inputting Confidential Information Into Free Tools
Unless the tool offers specific enterprise safeguards or is configured for legal compliance, don’t risk uploading client files. This includes tools like ChatGPT, unless using a business or enterprise version that allows full control over data privacy.
Bottom Line: Ethics Over Efficiency
AI can support faster research, more consistent drafting, and improved client service. But convenience should never come at the cost of compliance—or client trust. By choosing tools carefully, training staff, and maintaining independent professional judgment, even solo and small firm lawyers can benefit from AI while staying fully within ethical bounds.

We’re talking with Pegeen Turner, owner of Legal Cloud Technology, and now founder of Law Firm Defender. Her company specializes in managed IT services tailored for law firms — a topic many solos and small firms want to better understand.
Q: Pegeen, thanks for joining us! Let’s start with the basics — what is Law Firm Defender, and what inspired you to start it?
A: Thank you! Law Firm Defender is a managed IT services company built specifically for law firms. After more than 20 years working with lawyers on their technology, I saw a clear need for reliable, proactive IT support — especially for smaller firms that don’t have in-house IT teams. I wanted to create a law-firm-specific service that not only solved problems but prevented them from happening in the first place. I also found that many IT companies say they work with law firms, but they don’t know how law firms operate or the software they use. We know law firms and their software. We understand that downtime is costly for a small firm, and we try to be as responsive as possible.
Q: “Managed IT services” is a term we hear more and more, but not everyone knows what it actually means. Can you explain it in plain English?
A: Absolutely. Managed IT services means outsourcing your technology management to a company that handles it all for you — like having a full-time IT department, but without hiring one. We monitor your systems 24/7, update your software and security tools, back up your data, troubleshoot problems, and help you plan for future needs. It’s a monthly service model, so it’s predictable, scalable, and proactive.
Q: How is that different from just calling a tech person when something breaks?
A: Great question. That’s called “break/fix” support — and it’s reactive. You’re putting out fires, often at the worst possible time. Managed services, on the other hand, are proactive. We’re watching your systems in real time, applying security patches, making sure backups are running, and often fixing issues before you even notice. It’s like the difference between going to the ER and having a primary care physician who helps you stay healthy.
As Law Firm Defender has worked with firms over the last year, we have found that smaller firms need flexibility. The true MSP model doesn’t work for every firm. Some firms want comprehensive service, while others need security services to protect them but don’t want to pay a monthly fee for a help desk they don’t need or want. We offer that flexibility. When we onboard a new law firm client, we talk about their needs and put systems in place that will work best for them and their budget.
Q: What kinds of things do you take off a lawyer’s plate when you manage their IT?
A: We handle everything from setting up secure email and document management systems to managing antivirus, firewall protection, and device encryption. We make sure your systems are up to date and provide recommendations to keep your systems and data safe and secure. We run regular security audits, provide cybersecurity awareness training, and make sure your software is compliant with industry and state bar standards. We also help with transitions — like when you are starting a new firm, hiring, moving to the cloud, or updating software. We’re your law firm IT partner, not just a vendor who doesn’t know your business.
Q: A lot of solo and small firm lawyers are concerned about cybersecurity. How does Law Firm Defender help there?
A: That’s one of our biggest focus areas. Law firms are prime targets for cyberattacks, and small firms are often the most vulnerable. We build layers of defense — including endpoint protection, multi-factor authentication, encrypted communications, and backup/recovery plans. We also help lawyers understand what ethical obligations they have around technology and data protection, and how to meet them.
Q: What’s the ideal time for a lawyer to bring in a company like yours?
A: If you’re relying on DIY tech or a friend who “knows computers,” it’s time. Ideally, before you have a data breach, ransomware attack, or major tech failure. But even if you’ve already experienced one of those, we can help you rebuild smarter. We work with solo attorneys just starting out as well as small firms with 5–10 lawyers. It’s never too early to take IT seriously.
Also, if you are not sure what you are getting out of your current MSP, I am happy to chat! Often, the firms who come to us for advice don’t understand what they are getting from their current MSP. They pay a lot of money and are not sure what level of response and service they should receive. They assume that because they are small, it’s acceptable for their IT folks to take a while to get back to them. But time is money for law firms! That is not how your IT service should work.
Q: What’s one thing you’d want every small firm lawyer to know about managing their IT?
A: You don’t have to do it alone — and honestly, you shouldn’t. Technology is now a core part of how law is practiced. Having professional IT support isn’t a luxury — it’s part of running a secure, ethical, and efficient law practice.
Q: How can someone learn more about working with Law Firm Defender?
A: Visit www.lawfirmdefender.com or reach out to me directly. We offer consultations to assess your current setup and help you figure out what level of support makes sense for your firm. We’re here to make technology work for you — so you can focus on your clients.

Artificial Intelligence is rapidly reshaping legal work—offering powerful tools for research, drafting, contract review, and case management. But with great power comes great responsibility: you must understand both the promise and the risks to remain ethical, competent, and compliant.
Why You Should Read 2024 Formal Ethics Opinion 1
On November 1, 2024, the North Carolina State Bar issued 2024 Formal Ethics Opinion 1, “Use of Artificial Intelligence in a Law Practice.” It provides clear guidance on:
- Maintaining competence under Rule 1.1, including understanding AI’s strengths, invisible biases, and “hallucinations”
- Protecting client confidentiality under Rule 1.6—only use AI platforms with strong security, encryption, and clear data handling protocols
- Supervising the use of AI under Rules 5.1 and 5.3—you must oversee how associates and non-lawyers use such tools
- Billing reasonably under Rule 1.5—don’t bill clients for time that AI saved you; ensure your fees reflect actual work
Stay Informed—Follow Trusted Tech & Legal Tech Sources
Tech is evolving daily. It is up to each of us to stay informed. We encourage following reputable blogs and news outlets:
- North Carolina’s own coverage through news reports in NC Lawyers Weekly, bar association newsletter articles, and the NC State Bar publications.
- Bob Ambrogi’s blog, LawSites, was among the first legal blogs. He offers robust coverage of breaking news in legal technology, including AI’s legal and ethical concerns: hallucinations, confidentiality risks, supervision, billing, and more.
- Jordan Furlong is a lawyer, thought leader, and strategist who writes about the best new ideas and strategies for transforming the legal world through his blog on Substack, available through a free subscription.
- Clio, the cloud-based practice management software company, provides excellent information whether or not you subscribe to their software. Their resources include blog posts, e-books, white papers, surveys, and sample AI policies for your law firm.
Be Wary of “Free” AI Tools & Misuse
Free, consumer-grade AI tools often come with hidden costs and ethical traps:
- Hallucinations: AI-generated text may look convincing—but contain fake cases or citations. UK courts recently flagged the citing of nonexistent cases as a “risk to justice.” Use appropriate legal research platforms—all of the major ones now offer their own proprietary AI tools—rather than asking ChatGPT or Microsoft Copilot for help. Always verify case law and statutes.
- Breach of confidentiality: Entering client data into a public AI provider like ChatGPT could expose sensitive information or violate privilege unless proper safeguards are in place.
- Hidden terms of service: Many free AI tools retain your input for training or allow third-party data access without your realizing it. Vet terms of service thoroughly, and never use free tools with client information.
- Security gaps (“shadow AI”): When team members use unauthorized AI tools, they may bypass cybersecurity protocols. Establish approved platforms and clear firm policies. The Clio website has sample AI policies for law firms to implement.
Practical Tips for Ethical AI Adoption
- Read and understand 2024 FEO 1 thoroughly. Include the opinion alongside your AI use policies.
- Maintain tech competence—attend CLEs and workshops, and read tech blogs. Consider obtaining an AI certification; many top universities offer online certificate programs, as do LinkedIn Learning and MasterClass.
- Vet all AI tools: look for encryption, data usage transparency, secure deletion, and jurisdictional compliance. If you aren’t sure how to do this, work with a technology consultant such as Pegeen Turner with Legal Cloud Technology or reach out to Catherine Sanders Reach with the NC Bar Association Center for Practice Management.
- Track and supervise usage: know who uses what AI, how often, and for what.
- Update client agreements: include AI disclosures, consent language, and how efficiency savings will be passed on.
- Verify outputs: always human-review AI-generated research, memos, or filings.
Recommended Reading & Resources
- NC Formal Ethics Opinion 2024 FEO 1—your ethical compass.
- One Legal blog, “AI legal issues: Risks and considerations” – in-depth on hallucinations and confidentiality
- ABA Business Law Today’s “The Art of Mitigating Risk in the Brave New World of AI” – risk-focused strategies
- Clio AI Resources
Final Takeaway
AI can elevate your practice—especially as a solo or small firm. Just be sure you’re:
- Ethically grounded (thanks to 2024 FEO 1),
- Technologically informed and cautious,
- Transparent with clients—and
- Vigilant in verifying AI’s outputs.
Approach AI as a trusted assistant—not an autopilot—and you’ll tap into efficiency with confidence.