April 10, 2026 | 5 min read | By Rob
The Legal Line Every AI Advisory Product Is Walking
Every founder building an AI advisory product is walking the same legal line, whether they know it or not. There is no federal statute that directly regulates AI advisory tools. What exists instead is a patchwork of professional practice statutes — unauthorized practice of law, investment adviser registration, medical device regulation — being applied to AI systems by analogy. The terrain is genuinely unsettled, and the ground keeps shifting.
The core distinction for legal tools is "general information" versus "personalized legal advice." That distinction has been the primary defense for LegalZoom and Rocket Lawyer for more than twenty years. It rests on whether the content is specific to the user's individual legal situation. General information about contract law is not practicing law. Telling a specific user that their specific contract clause is unenforceable in their specific jurisdiction might be.
AI makes the line harder to hold, because AI is by design contextual and specific. A well-built AI advisor that asks about your business, your jurisdiction, your situation, and your question — and then gives you a tailored answer — starts to look a lot more like advice than information, no matter how many disclaimers it wraps around the output.
Investment advisory exposure follows the same logic. The Investment Advisers Act requires registration for anyone providing investment advice for compensation. If a product charges for AI-generated investment guidance, that likely triggers registration — either as a registered investment adviser or through a solicitor arrangement with a registered firm. The SEC's 2023 AI guidance focused on disclosure and custody rules; it did not clarify the core question of whether AI-generated personalized recommendations require RIA registration.
The FDA's framework is somewhat more developed. The agency distinguishes medical "devices," which require clearance, from health "information tools," which fall outside device classification. Symptom checkers and general health education sit on the information side. Systems that make specific diagnostic or treatment recommendations sit on the device side. The line is not always obvious.
These risks are real but manageable. LegalZoom has operated under the general information defense for more than two decades, weathered bar complaints and lawsuits, and never had to fundamentally change its model. The defense holds when products are deliberate about framing, disclaim appropriately, and stop short of specific professional advice.
EAS is built with that framework in mind. The advisors operate as discussion partners — they help you think through decisions, surface considerations you missed, and push back on weak assumptions. They do not file documents, place trades, or issue professional certifications. The product is designed to be useful right up to the boundary without crossing it.
Whether that boundary holds as AI gets more capable is genuinely uncertain — which is exactly why our General Counsel advisor tracks the regulatory environment around AI advisory liability. If the line moves, we want to know before our product is on the wrong side of it.
Questions about this? Want to discuss your project?
Book a free scoping call →