Education Tech | May 9, 2023

Generative AI opportunities and risks for impact investing in education

Rahul Bhide and Ainsley Campbell
Guest Authors

OpenAI’s chatbot ChatGPT took just two months to reach 100 million users. As impact investors, we’re excited about how generative AI might contribute to deeper social impact – in education in particular.

Educators’ responses to ChatGPT have been mixed and heated, ranging from a desire to ban it to an interest in understanding its potential role in driving student outcomes and success. Education-focused investors have tended to be more optimistic, both about the potential of generative AI and about the prospects for limiting its negative impacts on education.

Reach Capital, an Impact Engine portfolio fund and co-investor, is excited about AI and has mapped potential impacts, including content creation, personalization of content, and as a creative tool. Anne Lee Skates of a16z predicts that “ChatGPT is hardly the end of education” and can enhance “students’ abilities to think critically and expand their soft skills.”

Applying our impact lens to the technology, the two places we’d look for AI’s impact in education are increased access and improved educational outcomes.

But before we go deeper, here’s what ChatGPT has to say.

Prompt: Can generative AI improve access to education?

Response: Yes, generative AI has the potential to improve access to education by creating more personalized and engaging learning experiences. For example, AI-powered tutoring systems can provide real-time feedback and adjustments to educational content based on a student’s progress and needs. 

Additionally, generative AI can be used to generate educational content, such as text or multimedia, that is tailored to a specific learner’s level, background and learning style. This can help make education more accessible for students with diverse backgrounds and learning needs. 

However, it’s important to note that AI-powered education is still in its early stages, and its impact will likely depend on how it is implemented and used in practice.

Not a bad start. But as ChatGPT points out, implementation is critical. 

Engaging students

We think the clearest case for AI’s impact is in making education and content more engaging. 

Making learning more active and engaging has been shown to improve outcomes. Generative AI could be used to customize lessons and content, as other investors and entrepreneurs have written, or to assist teachers with lesson planning and support materials, which we believe can lead to improved educational outcomes.

We’ve also seen a number of companies in the XR space that are building education-related and up-skilling content. Generative AI could be used to deepen and customize lesson materials and simulations.

GPT-4’s ability to work with images can certainly help. Real-time feedback on assignments and writing could also assist with learning.

Generative AI may change how teachers choose their content and lessons beyond simply creating engaging material. ChatGPT could encourage educators to place more emphasis on work such as group projects, verbal presentations, or debates over written deliverables. Some educators argue for allowing the tool in the classroom, since it will prepare students to use it in their careers, much like spreadsheets or calculators. Of course, there are concerns about dishonest use and plagiarism, such as having the tool write whole papers, but that is ultimately a question of academic honesty and of what disclosures students should provide about how they use AI tools.

Another impact angle we think about is the intersection of content creation and access. Can generative AI enable more frequent or more cost-effective content creation? It’s certainly possible. 

Meagan Loyst of Gen Z VC used generative AI to write and illustrate a children’s book in two hours. She had to tweak the content and revise the prompts, so it wasn’t an entirely hands-off process. A textbook, where accuracy and contextualization is critical, may be significantly harder.

Given market dynamics in education textbook approvals and purchasing, and the broader textbook publishing industry, AI may not bring prices down in the near or even mid-term. While we don’t see cost as a major impact lever, there is potential for more frequent updates and real-time, customized content generation to play a supporting role for teachers and textbook publishers.

Considering guardrails

When thinking about a new space or model, it’s important to examine the impact risks or potential dilution of impact, as well as what guardrails could be put in place.

There are concerns about the use of ChatGPT and generative AI, particularly in the hands of students. Perhaps the most discussed concern is also the most obvious: how will educators know whether and how students have used ChatGPT in their assignments? At this point, no tools can reliably distinguish ChatGPT-produced text from human writing.

While this is a concerning issue, it is not completely unlike the academic integrity issues that arose from the introduction of calculators, spreadsheets, or spell check. It may influence the types of assignments that teachers include in grading, and may ultimately change how cheating and academic dishonesty are defined. 

There are also concerns about the potential inaccuracy of ChatGPT and other generative AI tools. ChatGPT can answer similar questions inconsistently or just plain incorrectly. That’s because ChatGPT (and potentially other models) is trained on a wide variety of sources, including potentially incorrect information. 

As a cautionary example, upskilling platform Workera uses generative AI to create questions for skills assessments but has human moderators vet them before the questions go live, a sign that generative AI still has some distance to go before the inaccuracy risk is addressed.

Companies seeking to sell AI tools and AI-based content to school districts will need to prove that issues with accuracy have been addressed in order to achieve scale. 

Other generative AI solutions could be built on internal data sources, which might give them a higher degree of accuracy. In a non-education context, Seek.AI uses generative AI to help non-technical business users access and analyze internal company databases.

Another concern is inconsistent or inequitable adoption and deployment of generative AI-based solutions. If schools and districts with larger budgets and/or in higher-income areas are able to deploy generative AI solutions, particularly those that offer real-time feedback and more customized learning, then higher-income and privileged learners will be immersed in environments with faster feedback loops, potentially accelerating their learning more than lower-income and vulnerable students.

Additionally, if generative AI becomes part of the workforce toolkit, familiarization could benefit certain workers more than others. 

In both cases, the effect would be a widened achievement gap. 

But these are concerns for edtech solutions more broadly, and today, with dedicated funding buckets in place to increase edtech adoption in public schools, the risk is less pressing. It is still one we would watch closely.

Impact opportunities?

What would Impact Engine need to see to consider an investment in an education-focused startup using generative AI? As with any of our investments, we would need to be able to identify the positive impact being created and be able to define metrics that measure it. For increased access, that could be the share of neuro-diverse students engaging. In the case of real-time feedback, that could be the share of students that did not previously have access to this kind of service. For improved educational outcomes, that could be increased academic achievement, increased participation, and higher engagement (including time spent engaging with customized versus non-customized content). 

Student engagement had been a concern since before the pandemic. Only 47% of grade 5-12 students said they were ‘engaged with school’ in 2018; that figure rose to 50% during the pandemic. We’ve since seen the impact of that and other educational headwinds: reading and math performance dropped to levels not seen in 20 years, with high-poverty schools experiencing greater achievement losses.

Generative AI could have a role to play in reversing those trends and improving educational access and outcomes. Koalluh, an AI reading platform, helps teachers engage children through targeted, student-driven reading practice. Children can create their own hyper-personalized stories with interactive generative AI. Their solution enables increased reading engagement, which can lead to better reading outcomes. The company is planning to develop audio listening, for children who may not be able to engage with written material. Having the ability to tailor books to the specific reader, such as an English language learner or language development student, would also significantly increase education access. 

We have identified other considerations as well. One is whether AI tools are teacher-facing or student-facing. Schools may limit student-facing generative AI (and some already have), but may be more open to teacher-facing tools, such as those for lesson planning and support material creation.

Another: How will generative AI solutions integrate with Microsoft and Google education products? Will solutions built on ChatGPT or Google’s Bard have an easier time selling within those companies’ ecosystems? ChatGPT and other OpenAI models are already being integrated into Microsoft’s Bing and Office suite, for example through Microsoft Copilot.

Also: What truly constitutes a defensible moat? After GPT-3, a large number of startups launched generative AI solutions in education. With the GPT-4 release, new education partnerships were announced. Khan Academy’s Khanmigo is a virtual tutor for students and a classroom assistant for teachers. Duolingo’s Max offers a role play feature, an AI conversation partner, and an “explain my answer” feature, which explains language rules after a user has made a mistake. These OpenAI partnerships with large incumbents likely forced many of the startups building in the space to rethink their business models. We are perhaps moving toward a world where the use of generative AI is commoditized and defensibility looks more like distribution and/or partnership advantages.

We’re always keen to learn what interesting and impactful things people are building in the space. Feel free to reach out!


Rahul Bhide is a senior associate and economic opportunity lead at Impact Engine. Ainsley Campbell is an MBA intern. Impact Engine is an institutional investor managing venture capital and private equity strategies that drive positive impact in the areas of economic opportunity, environmental sustainability, and health equity.