Legal Tips for Investors in Artificial Intelligence (AI) Startups

Investors have poured staggering amounts of cash into Artificial Intelligence (AI) startups since late 2022. In 2023, a year that was otherwise lukewarm for tech investment, AI funding seemed to buck the trend.

Interest in AI companies is not limited to large Venture Capital firms. Private AI companies have raised cash from angel investors, friends and family members, and other small-check investors. While smaller investors often lack the legal budget to conduct rigorous due diligence before investing, they have useful tools at their disposal to de-risk AI deals. This article describes legal methods investors can use to do so.

The Unique Portfolio Risks Presented by AI

When companies develop AI technology — or use third-party AI in their own products, for that matter — they introduce risks that other software does not. Some of those risks became apparent in the last year of news headlines. Employees have accidentally waived trade-secret protection by typing confidential material into AI prompts; a retailer’s AI tools allegedly disclosed a teenage girl’s pregnancy to her parents; and companies began realizing how easy AI tools make it to infringe other firms’ intellectual property.

Errors like these can jeopardize investors’ returns by making it harder for companies to get acquired, or even to raise follow-on funding. A few missteps in a company’s early days are not uncommon, but AI-related errors can create liability that makes investment or acquisition too risky.

Legal Tips for Managing AI Investor Portfolio Risk

To protect their investments in AI startups, investors can take steps including the following.

1. Add AI-specific warranties to deal documents.

Many investment deals include lengthy “warranty” sections in which the startup must promise investors that it has not engaged in various behaviors that heighten risk: it has not violated employment laws, it does not export technology to restricted countries, and so on. From the investor’s perspective, each warranty helps confirm the deal is within the investor’s overall risk tolerance.

Investors should begin requesting new warranties in their investment documents now. (These usually appear in a Stock Purchase Agreement or Subscription Agreement). Startups should warrant that they have not violated the Terms of Use of third-party AI tools they use. They should also promise that their use of AI tools has not contaminated or devalued their own proprietary intellectual property. Companies should warrant that their employees’ use of AI tools has not spilled personal data or any information protected by Non-Disclosure Agreements.

Warranties like those described above are already fairly common in priced-equity rounds, where investors purchase preferred stock in startups and tend to have lengthy deal documents. They are less common when investors purchase SAFEs (Simple Agreements for Future Equity) or convertible promissory notes. That is not to say that buyers of SAFEs or notes cannot negotiate for them, however. Such warranties can take the form of a side letter that accompanies the other investment documents.

2. Push companies to adopt AI usage policies.

All tech companies should have written policies on the use of AI by their employees and contractors. This is particularly true when the company’s product itself is an AI offering. These policies can be as simple as requiring a head developer’s approval before adding third-party AI tools to the company’s tech stack. They can also include lists of solutions that are banned, approved, or permitted with restrictions. The goal is to ensure the tech stack never derails the company’s future exit plans.

These policies should be written, should require personnel to confirm their agreement to them, should spell out consequences for violations (potentially including termination), and should be regularly updated. The company should write these policies with legal counsel and assume investors will scour them.

Investors should feel free to ask their portfolio companies about their written AI policies and to see copies of them. There is no shortage of resources available to tech companies seeking to implement AI policies.

3. Confirm the startup has the right domain expertise.

Not all startups need an AI expert on payroll, but companies making AI or in-sourcing AI tools most definitely do. The AI expert should have the experience to vet AI tools before they are brought into the company.

If you are an investor in an AI startup, ask about the company’s in-house AI expertise. You have the right to ask about matters that are key to the company’s success. And since every investor has a personal network, you may be able to help find and hire AI experts where your portfolio companies lack them.

4. Conduct due diligence.

All investors should do some degree of diligence before investing in any startup. Larger dollar-amount deals involve more rigorous diligence with thousands of documents in data rooms, but even smaller deals should involve some basic investigation. Investors can ask companies for copies of the Terms of Use of the third-party AI tools they have used.

Where startups publish their own AI applications, investors can review the companies’ own customer-facing agreements as well. Investors can review these together with their attorneys and, where additional domain expertise is needed, with consulting developers.

Conclusion

The AI sector continues to expand and to permeate other industries. The more ubiquitous AI tech becomes, from smart lawnmowers to mobile fitness apps, the more investors should take steps to reduce investment risk. The legal tips above are not the only tools investors should use, but they should be on the list.

*****

Adam Nyhan represents companies that use and develop AI tools as well as angel investors. He is an attorney in Perkins Thompson’s Banking & Financial Services, Business & Corporate and Intellectual Property & Technology practices.
