In November of 2022, a Silicon Valley company launched an invention that could complete students’ homework for them. Within months of its launch, OpenAI’s ChatGPT-3.5 was available to millions of students, and since early 2023 anyone with internet service has been able to access the next generation, GPT-4, through Microsoft’s Bing, for free. ChatGPT and other emerging models like it are a form of generative AI, and their widespread availability poses new challenges and opportunities for schools. The response from educators falls along a spectrum: enthusiasm and optimism at one end; fatigue, bitterness, and pessimism at the other; and, most commonly, a mix of positive and negative attitudes. The optimists, the pessimists, and the ambivalent all agree that the sudden, widespread availability of generative AI has been a jolt that has left them scrambling to adapt for the past year.
I saw an email and the title was “Have AI Write Your Next English Paper” and the caption was “Buckle up, here it comes!” I think my emotional reaction was curiosity, but I can tell you in my district, the range of emotions does span from dread all the way to excitement.
AI hasn’t provided ANY benefits to me as an English teacher. And it hasn’t forced me to change much. It’s just another thing I have to deal with. One of too many.
Generative artificial intelligence takes in prompts in many forms, often natural human language, and creates new content such as images, texts, or audio. This “new” content is generated from statistical patterns found in vast datasets, resulting in images, text, and media that replicate features found in the dataset. While generative AI is not new, powerful applications like GPT-4 have in the past year become widely accessible to users without computer programming knowledge. A handful of educators have been thinking about AI’s impact on education for years. The advent of widely available, super-powerful generative AI and its capacity to help students complete assignments has forced many more educators to pay close attention to it, whether they want to or not.
In education, AI optimists envision a future in which students develop their creativity and higher-order thinking, supported by AI, with a deliberate balance of new and traditional classroom practices, freeing teachers to focus on personal interactions. AI pessimists see a future in which many students use AI to bypass cognition or to cheat, while teachers waste time and energy on cat-and-mouse games or digital surveillance. Pessimists also imagine a scenario in which generative AI is over-adopted, particularly in resource-constrained schools, reducing the teacher workforce and widening existing divides.
Generative AI is an “arrival” technology. Unlike laptop computers, its presence in schools is not the result of a policy of adoption. Like smartphones, students are using generative AI on school assignments regardless of whether the schools encourage or forbid it. In many respects, the stakes of “arrival” technologies in schools are much higher than the stakes of technologies brought in by school leadership. If schools did not address tablets around 2013, they may have missed opportunities, but they were not harmed. Arrival technologies, on the other hand, can harm school environments if not effectively managed. And more affluent students have an advantage since they are more likely to have higher-quality access to AI, as well as have access to additional staff managing the application of the technologies.
As a class of technologies, generative AI has a “jagged technological frontier”2: AI can be shockingly good and then comically bad at very similar tasks. It will take time to determine how AI can be useful in the scholarly disciplines, in the working world, and in the civic sphere. While generative AI may become a tool educators can deploy toward better outcomes, adapting to it will be challenging. Without the needed time and resources, educators will struggle to adapt, and the pessimistic predictions will be more likely to come true.
Here at MIT, we exist in a technology-rich environment that prizes testing and exploration. Our institution has expertise in AI and is actively shaping its future. To that expertise, we also add our extensive experience working with educators and students. We are aware that this is a particularly challenging time for schools, emerging from a devastating and contentious pandemic, with historic levels of burnout and turnover in school personnel.3
As we consider generative AI in schools, we are guided by constructionism, the idea that people learn best through experiences and through making and sharing things. Constructionism prizes making connections to prior knowledge and building understanding through coaching rather than lecturing or drilling skills. It also values the sharing of knowledge and artifacts between peers.
In this article, we will provide a glimpse of what’s happening in schools right now, contextualize experimentation with generative AI, explore key concerns around the implementation in schools, and provide a roadmap for adapting to generative AI.
We recommend that schools pursue considered, limited experimentation without making undue pedagogical or financial commitments, that schools facilitate access to AI with thoughtful guardrails, that educators consider what productive thought students should engage in, and that industry, researchers, and policymakers work together to support educators and students as they adapt to this disruptive technology.
The path will become more evident as generative AI develops. In the meantime, schools should build familiarity with generative AI, explore pedagogical possibilities, and address immediate challenges around assessment and academic integrity.
Educators are well-acquainted with technology hype cycles. In the past twenty years, they have weathered the rise of smartboards, 1:1 (one device per student), “freemium” apps, and virtual reality. While individual teachers have done amazing things with recent tech, none of these technologies has lived up to the wide-scale “disruptive” promises made by their proponents, and educators know that simply implementing technology does not guarantee better learning outcomes.
At this moment, educators are unprecedentedly beleaguered. The COVID-19 pandemic, the conflicts over opening school buildings, the challenges of teaching remotely, and the conflict over “divisive speech laws” have combined to lead to historic levels of burnout and turnover.4 Teachers who might otherwise energetically experiment with generative AI technology are holding on by their fingernails, with little energy for experimentation. For many, generative AI affords another unwelcome chore next to too many parents’ emails to answer, too many papers to grade, and too many memos to read.
We need time. Teachers are going to need time to experiment with AI and learn how they can leverage it in their practice. We all need time to talk about how we can get the most out of it.
We talked with teachers and administrators to learn how schools are responding now. Each school is unique; each is grappling with its own set of challenges and opportunities in terms of resources, student population, regulatory environment, and more. Schools’ approaches range from inaction and outright bans to informal, sporadic experimentation and sanctioned experimentation. One example of a thoughtful approach to generative AI comes from our neighbors in Westwood, Massachusetts.
In December of 2022, Steve Ouellette, the director of technology at Westwood, read about ChatGPT on an email list. The post included a video showing the chatbot writing an essay about the play A Raisin in the Sun. Ouellette says, “The sub caption was ‘Buckle up, here it comes.’”
Westwood is a well-resourced school district with a track record of experimenting with technology. Ouellette urged the superintendent to send a memo that weekend, notifying the faculty about a tool that could potentially complete students’ assignments, allowing them to bypass productive thought, or, in other words, to “cheat.” Ouellette enjoys experimenting, so he asked ChatGPT to draft the memo. Its first draft was dry and officious, so he asked it to rewrite the memo with a joke about France winning the World Cup final, “and the way it did it was magnificent.”
ChatGPT could be used to create personalized lesson plans or homework assignments, provide instant feedback on student writing, or even help generate entire lesson plans for teachers. But that’s not all. ChatGPT could also be used to help students learn other languages, such as Spanish or French (which, by the way, I think will win the 2022 World Cup). Imagine being able to have a conversation with ChatGPT in French and receiving instant corrections and feedback on your pronunciation and grammar.
Teachers, particularly English teachers, immediately expressed concerns about academic integrity. English teachers rely heavily on writing assignments, and they saw the introduction of ChatGPT as a potentially devastating development. Ouellette says, “I’ve referred to the English department as ‘ground zero.’ It’s unfortunate, because their first experience with it was just intensely negative. I don’t think that they’ve recovered from that.”
Ouellette invited an expert on generative AI to present at a meeting with department heads. Then in the summer of 2023, the district convened an AI working group to create recommendations for moving forward. Students were not included in the working group, but Ouellette did speak with students, many of whom expressed concerns about how AI might affect the college application process. Training opportunities were added for the 2023–2024 school year. The working group also created a resource for staff called an “AI Idea Bank,” which offered suggestions of how teachers could use ChatGPT.
Westwood blocked the ChatGPT service through grade 7 because its terms of service do not allow children younger than 13 to use it. (Children 13–18 require parental consent.) The service is open for grades 8 and beyond. In November 2023, Westwood hosted a panel where students and community members could voice concerns and hear insights from AI experts.
We think Westwood’s example is a useful model for open communication, participation from different stakeholders, and an informed approach to adapting to generative AI. According to our scan of teachers, Westwood has done more to help teachers and students adapt to AI technology than most districts. But Ouellette says the district has been scrambling to adapt, and that he does not think they have done enough: “We’ve done a superficial effort at professional development, because we just haven’t had a lot of time.”
According to a Forbes Advisor survey,5 60 percent of surveyed teachers have experimented with integrating the technology into their teaching. Here are a few approaches that we found worth considering:
Curriculum generation and personalization. Teachers described creating new homework prompts, instructions for lab experiments, and visual aids. Teachers are using generative AI to add references to pop culture or classroom inside jokes to instructional materials. One teacher described using generative AI to revise assignments, tailoring the difficulty level for students with varying reading levels:
If I wrote XY assignment for them, I have students who are high achievers and love challenges. But on the other side I have other students that are struggling with reading. I can ask the AI to challenge [the high achievers] or to modify it, so it will be easier for my students to understand.
Rapid prototyping and starting points. Generative AI can quickly generate a wide variety of content, even though quality can vary. One teacher asked her students to generate a creative work (a poem, song, story, etc.) about the water cycle. She encouraged her students to use AI to create a first draft of a song. The students then rewrote the song, changing some of the lyrics, and recorded their own performance. Teachers report success when they encourage students to use the AI output as a starting phase in their project, and then critique, edit, or build upon that output.
Tutoring/study buddy. Steve Ouellette described a science teacher who used generative AI to prompt his students who were conducting an experiment. “The prompt was, ‘You are my lab assistant. You need to provide me with the steps to do this lab, one step at a time. And wait for me to tell you that I’m ready to move on.’” The teacher observed and helped the students as needed, while monitoring for bad outputs from the AI. Ouellette observed that this approach could be helpful for students with executive function challenges.
Computer science teacher Chad McGowan described how he incorporated AI as an assistant when he assigned his students the challenge of writing software:
The goal is that they learn how to break down their problems so that they can ask focused questions into [ChatGPT], so they can feed a chunk of code that they’ve written into it and say, “Why isn’t this working?” And the responses that they are getting are going to make more sense as opposed to “Write my entire program for me.”
Engaging students/community building. Some teachers report that they have strengthened their relationships with students while experimenting with creative prompts for AI.
I had one student who just asked ChatGPT to write her own obituary. She wasn’t being morbid. I think she wanted to know about herself. I think that students want to know that they would be missed. They want to know what people think of them. They’re yearning for this emotional kind of narrative.
Many of these examples highlight one of the most important reasons to play with AI: it is fun! These tools sometimes generate content that is funny, weird, and surprising. They bring a spark of the unknown into the routines of school, and that playful spirit has a place alongside the search for efficiency and critical perspectives.
Cheating vs. “bypassing productive thinking.” There are valid concerns regarding how generative AI can be used to write essays, solve math problems, write code, and more. However, we want to reframe the question from “How will students use generative AI to cheat?” to “How does technology bypass productive thinking?” Generative AI is not the first technology that can be used to bypass learning; over recent decades, educators have confronted calculators, book summaries, homework websites, and internet searches. They all created initial consternation, and in most cases, educators adapted and sometimes benefitted from including these new tools. Outright banning has not worked well with calculators or internet searches, and likely will not work well with AI. Some educators may point to the use of anti-plagiarism software to detect cheating, but these solutions are often not sophisticated enough or not updated frequently enough to keep up with generative AI technology, and they carry the significant cost of false positives.6
Data collection. All stakeholders need to be aware of how “free” generative AI platforms may be collecting data and ensure that sensitive information is not shared. Since each generative AI tool has its own collection methods and policies, educators should stay informed on how to evaluate these tools.
Garbage in, garbage out. Because generative AI models are trained on existing online data, they may reinforce harmful stereotypes. A recent Bloomberg study7 to understand bias in generative AI found that the majority of images generated to represent people with high-paying jobs such as “CEO,” “politician,” “lawyer,” and so on had lighter skin and were men, while low-paying jobs such as “fast-food worker” and “social worker” were commonly people with darker skin. Educators might inadvertently promote stereotypes like these or send discouraging messages about career paths to students.
Students using generative AI for academic support may also face the challenge of filtering information produced by the models. Although the outputs from the models may look believable, they are not always accurate. The inner workings of the models are complex and opaque. Students will need to learn tools for verifying accuracy and recognizing bad outputs.
Expense and equitable access. There are monetary costs to using certain generative AI products and services. Students who can afford a personal device may have an advantage over students with fewer resources. And students who attend schools with better-prepared teachers, who have benefitted from more planning time and professional development, will have an advantage over students whose teachers have not had those opportunities.
Computer science teacher Chad McGowan told us his district initially blocked ChatGPT at the server level. He argued for them to unblock it, since the wealthier students would have access at home, using tablets and computers. The poorer students only had their school-issued Chromebooks, and he feared that they would fall behind in learning about the strengths and weaknesses of AI. The district agreed with him and unblocked the service.
Concerns beyond education. Other authors have noted concerns about the energy consumed by generative AI applications, the potential to fuel baseless conspiracy theories, proliferations of misinformation and deep fakes, and the threat to sectors of professional work. Those are discussed more in some of the sources we link to in our appendix.
We have to have a heart for kids. We have to embrace what our kids are really, really excited about. And that’s the different technologies that are going to lead to great jobs for them. And so we have to stay abreast of what’s going on.
I think our purpose is to help them become the best human beings and contribute to a better world. And so if we’re just talking about things like academic integrity, then we’re not doing our job to prepare them for a future where the pace of change is just getting faster.
Most of the products marketed at you are going to not be useful. Anybody who tells you they have a groundbreaking tool… I don’t think they’re groundbreaking. Everybody says they’re changing the game. The game doesn’t change.
In the past few months, companies, institutions, and researchers have been filling in the gaps with reports, policy recommendations, curricula suggestions, and training videos. Almost every EdTech company is racing to launch products around generative AI. The flurry of guidance both helps and confounds, as it can be difficult to glean the most useful information. With that flurry of activity as a backdrop, it is important to focus on what has not changed in education.
Education is a social endeavor. For many students, the primary motivation for learning is social: maintaining relationships and earning the esteem of teachers. For any new learning technology, a productive question to ask is “what human-to-human interaction does this technology support?”
Good teaching will still be needed. Students will still need teachers to guide, support, help, share valuable content, give feedback, and more. Young people will always need the mentorship of caring adults in their lives.
Foundational skills, such as problem-solving, critical thinking, and knowledge in the disciplines, still matter. In fact, they may matter even more now. More information is at our fingertips than ever before, but we will need good evaluation and decision-making skills to make use of it.
Students will still need to learn how to make and create. Our world runs on creativity, from science and engineering to history and the humanities. It is crucial to foster the next generation of artists, scientists, entrepreneurs, lawyers, and leaders. We need students to be creators, not just consumers, especially as technology evolves rapidly.
Reviewing and revising what and how we teach is beneficial. The rise of new technology can be a great opportunity to look at curricula critically. Are graduates gaining the skills they need for further studies, work, or civic life? Are they on a path to be able to take jobs available in the community? Do curricula reflect the values of key stakeholders?
Not everything must change immediately. Some educators may be excited about experimenting right now, while others may not. That is okay; schools can be thoughtful and measured in adapting to generative AI. One English teacher expressed that he did not have time to experiment or monitor his students for cheating, so he simply has students write essays in class in order to practice critical skills. This is a reasonable response to the moment, but we hope teachers will find the time and support they need to explore the strengths of generative AI.
Question: What activities spark productive thinking that enhances learning? Educators should consider what kind of thinking they want their students to do. This question can be applied at big strategic levels, and it can also be applied to a specific assignment with the following question: What would happen if the student asked the AI to do the assignment? In some cases, it is possible that evaluating, critiquing, or building upon the AI’s response will engage that useful thinking, indicating a good assignment in the age of generative AI. But perhaps the AI will allow the student to bypass useful thinking. In that case, the teacher can monitor the assignment in class, keeping AI unavailable, quiz students after completing the assignments, or think of another assignment. (We know. Easier said than done.)
Question: How is AI changing a discipline or subject? AI is already changing the fields of software development and graphic design. Teachers of those disciplines will likely want to teach students to integrate AI into their projects immediately. For social studies, history, and English language arts, AI cannot yet produce high-quality work. It might be able to overcome “initiation fatigue” or “writer’s block” to help provide inspiration, but in these fields, the purpose of student writing is to refine reasoning, evidence, and argumentation. When used creatively, AI can play a role in supporting or inspiring student thought and writing, without doing the writing for them.
Question: What repetitive tasks can AI support? Teachers, on average, work more than fifty hours per week, but less than half of that time is spent interacting with students. The rest of their work time is filled with countless small tasks, many of which might be automated with AI.8 Initial reports from teachers and administrators suggest that generative AI might help draft memos, guidelines, rubrics, or emails. The technology cannot yet (and may never) generate a final draft, and educators will need time to learn to use it efficiently.
ChatGPT is really helpful with what I call “initiation fatigue,” where I’m just trying to get a kick… I need to provide a letter of recommendation about an employee, and here are some qualities that make this person really good. It will spit back an output, and I will rewrite 75 percent of it, but it gives me just a start.
Support teacher and student exploration outside class. It is worth making time to play with AI tools, to learn their capabilities and weaknesses. Greg Schwanbeck, a physics and astronomy teacher at Westwood, asked AI to generate a song in the style of Taylor Swift that explained Newton’s Three Laws of Motion. He says the output was not great, but it was a fun experiment that helped him understand the strengths and weaknesses of the tool.
One fun part of new technologies is being able to share the experience of being a novice with our students. They are as likely to invent and discover useful and creative approaches as adults.
Be skeptical of AI-powered products and tools. So far, despite claims from some techno-enthusiasts,9 we have not seen AI tutors, chatbots, or other helpers with the potential to “change the game” of education, though companies like Khan Academy are working to put better guardrails on their products. Most education will still happen in classrooms and be driven by the social expectations of teachers’ and students’ relationships. We recommend caution in adopting products, especially expensive options, until more research about efficacy emerges.
However, some of those tools will likely have some utility, and assigning or encouraging faculty and students to experiment with new tools is a great idea and can help schools evaluate which tools might best fit their needs and culture.
Recognize that adapting to generative AI will be iterative. As teachers experiment and learn how to effectively deploy, balance, and limit use of this technology, the technology will change. The handheld calculator, the scientific calculator, and the graphing calculator appeared at roughly ten-year intervals,10 which allowed educators to adapt to each in turn. New generations of more powerful AI may appear on much shorter time frames, such that educators might still be struggling to adapt to one, when another appears.
We do not know how quickly this technology will develop. The growth of generative AI technology does not correspond to a predictable rate or pattern as with Moore’s Law.11 Some AI experts think that it is possible we will soon develop very powerful AI technology, sometimes referred to as AGI (artificial general intelligence). Such AGI might supplant human work in many spheres, transforming human social life in ways that are hard to imagine. Other experts think such developments will not happen, or are many decades or centuries away. It is useful to consider that there will likely be future developments in the technology, but we do not know enough about the future to design curricula or strategies around those predictions.
Put coherent policies in place. School policies can help teachers and students recognize boundaries and key principles around AI ethics, authorship, academic integrity, equity, and access, while also highlighting areas where exploration is encouraged. We cannot expect teachers to be able to conduct and lead all this experimentation in their own free time. A signature challenge that education faces, in a moment of postpandemic exhaustion, is finding the resources to pay for teachers’ time to conduct this important reflection, experimentation, and refinement of their practice.
Schools will not be able to move forward effectively without systematic support. History tells us that the schools with the most resources—with reasonable classroom sizes, with enough substitutes, with clean and safe buildings, with technologists and instructional coaches, with professional development budgets—will adapt to disruptive technologies most effectively. Some teachers and schools in less-resourced districts will do brilliant work, but on the whole, wealthier schools in wealthier communities will have the greater advantage when it comes to realizing the benefits of generative AI. Without additional support and investment, less-resourced schools in poorer neighborhoods are more likely to encounter the detrimental aspects of AI without the same benefits realized by wealthier schools, thus widening the disparity in educational experiences.
School districts have some resources to deploy, but many are strained. They will need curricula, guidelines, and advice, but also financial resources to support professional development, new devices, technical support, adequate staffing, and an adequate substitute pool. (Many teachers’ planning periods are routinely “stolen” as they are compelled to substitute for sick or absent colleagues.) Teachers will need more planning time to thoughtfully adapt. We need to make sure access, training, and curriculum are made available to increase equity. Who can pitch in? Education technology companies? AI companies? Governments? NGOs? This requires a team effort.
There are many papers and resources with direct recommendations to policymakers (included in our appendix). To those, we add:
MIT is only one of several excellent research institutions standing by to help adapt and realize the potential benefits of generative AI. How can you help coordinate our efforts and incorporate our research into policy decisions?
How can you provide extra support for schools to adapt to this disruptive technology? That includes both financial resources as well as guidance, curricula, and professional development opportunities.
How can you encourage or require the artificial intelligence sector to support educational goals and to consider the impact of new technology on schools, students, and teachers?
Technology companies should include educators and researchers in planning conversations. Sam Altman, co-founder of OpenAI, has suggested it would have been helpful to communicate more with educators before releasing generations of ChatGPT. The educators we surveyed all agree.
You have unleashed this tool that can do remarkable things. I think it’s exciting and interesting, but we are looking for guidance: What are the skills we need to teach our kids to use this stuff in ways that help rather than hinder?
It is well and good for us to provide advice for educators, parents, and students, but we need to remember that they are in the midst of a daunting challenge, and admonitions without support may not land well. Advice and insight are not enough. Educators will need help sifting through the vast quantity of recommendations we have collectively generated. They are going to need curricular and financial resources. We researchers need to find ways to work together so that our insights can be amplified and simplified, and reach the people who need them most.
Computer science teacher Chad McGowan told us about how he advised the yearbook club as they designed the cover, with prototyping from AI. The students came up with four prompts for DALL-E, a generative AI visual art tool, which generated images based on their prompts. The students then discussed the AI’s creations. “Each person was able to weigh in with ‘This is what I was thinking,’ or ‘I like the background on this one’… just looking at different elements and thinking about how it did or didn’t align,” McGowan says.
The students used critical thinking skills to evaluate the AI’s output, and then designed the cover without using DALL-E. We like how the assignment balanced the use of generative AI with “traditional” design skills, and how the AI augmented, but did not replace, human creativity.
There will be millions of experiments like this one in the coming years, as teachers and students everywhere explore the possibilities of generative AI, looking for an effective balance that still challenges students to think productively. History tells us our teachers are dedicated and flexible professionals who will adapt effectively to evolving conditions when they have what they need.
We remain hopeful that generative AI will be a net gain for educators, students, and society. We know that the task of adapting to its widespread availability will be difficult. We hope educators will find the support, guidance, and time needed to adapt successfully.
“AI Machines Have Beaten Moore’s Law Over the Last Decade, Say Computer Scientists.” Discover, February 21, 2022. https://www.discovermagazine.com/technology/ai-machines-have-beaten-moores-law-over-the-last-decade-say-computer
Bryant, Jake, Christine Heitz, Saurabh Sanghvi, and Dilip Wagle. “How Artificial Intelligence Will Impact K-12 Teachers.” McKinsey & Company, January 2020. https://www.mckinsey.com/~/media/McKinsey/Industries/Social%20Sector/Our%20Insights/How%20artificial%20intelligence%20will%20impact%20K%2012%20teachers/How-artificial-intelligence-will-impact-K-12-teachers.pdf.
Dell’Acqua, Fabrizio, Edward McFowland, Ethan Mollick, Hila Lifshitz-Assaf, Katherine C. Kellogg, Saran Rajendran, Lisa J. Krayer, François Candelon, and Karim R. Lakhani. “Navigating the Jagged Technological Frontier: Field Experimental Evidence of the Effects of AI on Knowledge Worker Productivity and Quality.” Harvard Business School Technology & Operations Mgt. Unit Working Paper No. 24-013. Preprint, submitted September 18, 2023. https://doi.org/10.2139/ssrn.4573321.
Hamilton, Ilana. “Artificial Intelligence in Education: Teachers’ Opinions on AI in the Classroom.” Forbes Advisor. Last modified December 5, 2023. https://www.forbes.com/advisor/education/artificial-intelligence-in-school.
Heaven, Will Douglas. “GPT-4 Is Bigger and Better than ChatGPT—but OpenAI Won’t Say Why.” MIT Technology Review, March 14, 2023. https://www.technologyreview.com/2023/03/14/1069823/gpt-4-is-bigger-and-better-chatgpt-openai.
Kraft, Matthew A., and Melissa Arnold Lyon. “The Rise and Fall of the Teaching Profession: Prestige, Interest, Preparation, and Satisfaction over the Last Half Century.” EdWorkingPapers, November 28, 2022. https://www.edworkingpapers.com/ai22-679.
McQuate, Sarah. “Q&A: UW Researcher Discusses Just How Much Energy ChatGPT Uses.” University of Washington News. Last modified August 2, 2023. https://www.washington.edu/news/2023/07/27/how-much-energy-does-chatgpt-use.
Nicoletti, Leonardo, and Dina Bass. “Humans Are Biased. Generative AI Is Even Worse.” Bloomberg, June 9, 2023. https://www.bloomberg.com/graphics/2023-generative-ai-bias.
Oravec, Jo Ann. “Artificial Intelligence Implications for Academic Cheating: Expanding the Dimensions of Responsible Human-AI Collaboration with ChatGPT and Bard.” Journal of Interactive Learning Research 34, no. 2 (2023): 213–37. https://www.academia.edu/105260068/Artificial_Intelligence_Implications_for_Academic_Cheating_Expanding_the_Dimensions_of_Responsible_Human_AI_Collaboration_with_ChatGPT_and_Bard.
Reich, Justin. Failure to Disrupt: Why Technology Alone Can’t Transform Education. Cambridge, MA: Harvard University Press, 2020.
Singer, Natasha. “Will Chatbots Teach Your Children?” New York Times, January 11, 2024. https://www.nytimes.com/2024/01/11/technology/ai-chatbots-khan-education-tutoring.html.
UNESCO. “Guidance for Generative AI in Education and Research.” UNESCO Digital Library. Accessed December 6, 2023. https://www.unesco.org/en/articles/guidance-generative-ai-education-and-research.
Valentine, Nick. “The History of the Calculator.” Calculator Site. Last modified August 26, 2022. https://www.thecalculatorsite.com/articles/units/history-of-the-calculator.php.