Wednesday, January 17, 2024

The Refurbishing of Artificial Intelligence

Hello everyone, and welcome to 2024!  Hope you had some happy and restful time with family and friends over the winter break.

Back in December of 2022, I published a blog entry titled "How AI Will Save Education."  It was just a few weeks after ChatGPT (powered by GPT-3.5) had debuted on November 30, and as I put it back then, it "exploded into our zeitgeist seemingly out of nowhere, and it feels like something foundational has rapidly and irreversibly shifted in education...indeed, the world."  There were a lot of strong feelings in the air: wonderment to be sure, but also fear.  And there were many questions.  What does it mean to be a "writer" or an "artist" when AI seemingly has the capacity to be creative?  What does teaching look like in such a world?  Perhaps most worrisome for educators: What can or should I do if students "cheat" with AI?  Is it cheating to use AI?  Will my time be consumed with a never-ending quest to catch students passing off AI-generated work as their own?

Certainly, we still lack definitive answers to those questions, and it may never be possible to be definitive when discussing something as rapidly evolving as artificial intelligence.  On such a topic, pressing "publish" almost instantly puts you out of date.  But a year later, some answers have emerged.  While I won't dare presume to be the oracle or the burning bush of artificial intelligence, I thought it might be helpful to offer my perspective on some AI trends that have caught my eye, and to share some selected articles and tools that have emerged in the last twelve months.

The Reality of Students "Cheating"

As it turns out, perhaps we can dial back from the feeling of panic that befell us in December 2022.  As Stanford researchers discovered, "Cheating Fears Over Chatbots Were Overblown" (The New York Times, 12/13/23).  In their recent survey of over 40 high schools in the United States, the percentage of students who admitted to having cheated at some point has remained in the same 60-70% range it has occupied for years.  More fascinating, "[m]any teens know little about ChatGPT...[a]nd most say they have never used it for schoolwork."  Two sets of numbers from the research jumped out at me.  The first is that among all U.S. teens, 44% said they knew only a "little" about ChatGPT and 32% said "nothing at all."  That's three out of four students!  The second shows what could become a disquieting trend of racial and socioeconomic inequity.  When the total teen numbers are broken down into White, Black, and Latino demographics, the combined "little/nothing at all" figures are relatively close.  Within each group, however, awareness of ChatGPT inches upward for White teens (50% know at least a little about the tool, versus 27% knowing nothing at all) when compared to Black (35%/44%) and Latino (42%/37%) teens.  Household income makes the disparity even starker.  For households under $30,000, the students who know "a lot/a little/nothing at all" about ChatGPT break down as 11%/30%/59%, compared to students from households earning $75,000 and over: 26%/50%/24%.

Does this reflect that ChatGPT is being blocked more at schools in "certain areas"?  Are educators trusting some students with AI, and teaching it to them, more than others?

In a separate set of experiments in September 2023, Stanford researchers explored how well AI cheat detectors were working.  The short answer: not well.  There were a number of both false positives and false negatives.  In fact, when the researchers took a chatbot-generated essay that had previously been flagged as AI and simply asked the chatbot to "elevate the provided text by employing literary language...[d]etection rates plummeted to near zero."  Another disturbing finding: actual essays written by multi-language learners were often falsely flagged as AI.  This is likely because the simpler, emerging language skills of such students seem to reveal a bias baked in by the AI detection programmers: simple text "must" mean a machine wrote it.
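
That bias is easy to see in miniature.  Below is a toy, deliberately naive "detector" that, like the real tools apparently do, treats surface simplicity -- short sentences and a repetitive vocabulary -- as a machine signal.  Everything in it (the scoring formula, the thresholds, the sample sentences) is a hypothetical illustration I made up for this post, not any actual product's algorithm.

```python
# A toy illustration of the bias described above: a naive "detector" that
# treats short sentences and a small vocabulary as signs of machine writing.
# This is a hypothetical sketch, not any real detector's algorithm.
import re

def naive_ai_score(text: str) -> float:
    """Return a 0-1 'AI likelihood' based only on surface simplicity."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = text.lower().split()
    avg_sentence_len = len(words) / max(len(sentences), 1)
    vocab_richness = len(set(words)) / max(len(words), 1)
    # Shorter sentences and a repetitive vocabulary push the score up.
    simplicity = (1 - min(avg_sentence_len / 25, 1)) + (1 - vocab_richness)
    return simplicity / 2

emerging_writer = "I like the book. The book is good. I read it at home."
literary_rewrite = ("The novel enthralled me utterly; within its pages I "
                    "discovered solace, wit, and an abiding sense of wonder.")
print(f"Emerging writer:  {naive_ai_score(emerging_writer):.2f}")   # scores higher
print(f"Literary rewrite: {naive_ai_score(literary_rewrite):.2f}")  # scores lower
```

Notice that the plain, human-written sentences score as "more AI" than the flowery rewrite -- exactly the pattern that penalizes multi-language learners and rewards anyone who asks a chatbot to "employ literary language."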

Of course, this issue of cheating doesn't address the powerful opportunities to ask a chatbot to generate an initial draft of a paper for the student to improve and revise, or to have it provide major points to consider in an argument, or the need to make sure AI is properly cited when used.  Passing off someone else's work as your own has always been cheating, whether you Google an essay and pretend you wrote it, or a machine (or human!) writes something for you and you just put your name at the top.  What is important is for students to learn, debate, and practice the ethical use of AI.

So, in summary: students may not be using AI chatbots as much as we think, and student cheating with AI (and the software that detects it) is a complicated topic at best.  In fact, we may need to think more carefully about how AI is being permitted on school devices and networks, whether enough AI awareness and ethics are being taught, and, at the most basic level, how our concern about "cheating" may be unfairly manifesting itself in inequitable practices in our classrooms.

Proactive and Positive Uses of AI in Teaching and Learning

On a more positive note, there have certainly been numerous reports in the news and in research about the potential for AI to have a positive impact on learning...or at least grounds for approaching it with cautious optimism.  Here are a few recent examples:

  • Carnegie Learning has launched several AI tools to help students and teachers.  In particular, LiveHint AI provides real-time math tutoring support.  The tool coaches students through the sticking points of a difficult problem rather than simply giving them the right answer: "When students use it, they have the ability to provide line-by-line feedback. If there’s any particular response that they either like or don’t like, [the students] can comment on it," Carnegie Learning Chief Data Scientist Steve Ritter said. "And then overall, in terms of the overall quality of the session, they can also comment on that."  According to Ritter, the tool has been received positively by faculty and students.  ("This Pittsburgh edtech company uses AI to help kids learn math amid ‘uncertainty’ about the tech," technical.ly, 1/4/24)
  • Harvard researchers have been studying the impact of AI-generated feedback on students.  The results were positive for teachers, but mixed for students.  Teachers appreciated how AI sped up their responses and gave them ways to customize feedback to a student's needs.  Many students appreciated how such feedback strengthened the sense of a "caring classroom culture" (likely because it was timely and seemed responsive), but some struggling students felt the feedback was "unhelpfully short and insensitive."  ("Harvard researchers explore how to use generative AI for student feedback," WBUR Public Radio, 12/27/23)
  • The University of Kentucky's Center for the Enhancement of Teaching and Learning has published some AI resources on its website.  In particular, I'm impressed with their "Course Policy Examples," which give an instructor options and exemplars for clarifying whether AI is permissible in their class, from "No [Student] Use" at all, to "Conditional," to "Unrestricted."  

From Prompt Professors to Pragmatic AI Professionals

When the generative chatbots came out in full force in early 2023 (ChatGPT and its rivals like Google's Bard, as well as various graphic-generating AI), there was a brief time when the educator who was an expert in "prompt engineering" seemed poised to be the next new thing.  Certainly, we soon became awash in websites and PDFs offering effective prompts to copy and paste for our own AI adventures (such as this one).  It reminded me of the pioneering past of other Internet tools.  Do you remember the halcyon days of Geocities, when the idea of publishing your own personal website made us hungry to learn HTML?  Or the beginning of Google Search, when we were all ablaze with Boolean operators and the power of a well-placed parenthesis or quote mark?  This was grand stuff for early tech adopters, but for most of us, the effort of the process overwhelmed the product.  We don't want to be expert HTML coders unless we plan to create sophisticated websites full-time; we just need a decent templated site that gets our message across.  We prefer a natural language inquiry that gets us the right result over pondering whether it is better to type "AND" instead of "OR" in a search engine.  In short, content is king.

In the same way, the appetite for using an AI tool like ChatGPT is also shifting.  We appreciate the power of AI, but we want to concentrate on where the vehicle takes us, not so much on the vehicle itself.  What that has led us to (in a remarkably short amount of time!) is a world of tools that refurbish artificial intelligence into an interface that is more inviting and user-friendly.  The word "refurbish" may at first carry a negative connotation if you think only of the broken device that is fixed and resold at an electronics store.  However, I'm leaning into the original Oxford Dictionary definition of refurbish: to renovate and redecorate.  That is an almost literal description of what many new AI tools are doing.  They often take the same "raw" AI engine -- such as ChatGPT -- and put it in a renovated, redecorated package that is easier and often more effective to use.  In fact, if the tool has a premium tier, people are now often willing to pay for such convenience, regardless of whether the raw AI engine underneath the tool's hood is free or relatively inexpensive.  There's a bit of irony at work here; many of the same people (especially educators) struggling with their "buy-in" for AI in 2022 are now buying AI.

Here are just a few AI tools that offer outcomes that would have seemed impossible for technology to achieve just a few years ago.  (While these are not necessarily education-specific, it takes the smallest leap of imagination to see how a teacher or a student could use them creatively in the classroom!)

  • Beautiful.ai.  An AI-assisted presentation tool, this is particularly powerful in creating infographics within seconds.
  • Synthesia.  Not only can this create videos with AI, it can also insert a lifelike virtual avatar that speaks the script you type for it, in over a hundred languages.
  • Owlift (formerly called "Explain Like I'm Five").  The tool takes complicated subjects and adapts its explanations for audiences of varying needs.  Note that you can not only toggle how you're feeling about the topic ("pretty dumb/dumb/smart/pretty smart"), but also whether you want the reply to be sarcastic or not.  The snarky language aside (which arguably may not be appropriate for a young student to interact with), the nuanced functionality of the tool makes it worth mentioning.
  • Finally, it's worth mentioning that many popular online tools now include some sort of embedded AI feature, from Zoom to Canva to Google Docs to the writing of LinkedIn posts.
From the website Owlift (formerly known as Explain Like I'm Five).

This refurbishment concept is particularly visible in educational tools.  It is no small thing to save me the time of constantly having to tell AI at every new prompt, "I'm an educator," so that it shapes its answer accordingly.  As such refurbished and personalized AI becomes increasingly sophisticated in the near future -- it will soon know and remember that I am a fourth grade teacher who needs a science lesson every Tuesday, and which of my students will need a translation in Spanish -- the possibilities for AI to become my "artificial instructional aidebot" loom larger and larger.
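
For the technically curious, here is a minimal sketch of how such a wrapper might work under the hood: the "raw" chatbot engine is untouched, but educator context is baked into every request so the teacher never has to retype it.  The OpenAI client calls are real, but the model name, the persona details, and the EducatorAssistant class are my own illustrative assumptions, not any particular product's implementation.

```python
# A minimal sketch of a "refurbished" AI wrapper: the raw chatbot engine
# stays the same, but educator context is baked into every request.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set;
# the class and persona below are hypothetical, for illustration only.
from openai import OpenAI

class EducatorAssistant:
    def __init__(self, persona: str):
        self.client = OpenAI()
        # The persona rides along as a system message with every prompt,
        # so the teacher never has to retype "I'm an educator...".
        self.system_prompt = persona

    def ask(self, prompt: str) -> str:
        response = self.client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model name
            messages=[
                {"role": "system", "content": self.system_prompt},
                {"role": "user", "content": prompt},
            ],
        )
        return response.choices[0].message.content

# Example usage: the remembered context shapes every answer.
assistant = EducatorAssistant(
    "I am a fourth grade teacher. I need a science lesson every Tuesday, "
    "and several of my students need materials translated into Spanish."
)
print(assistant.ask("Draft next Tuesday's science lesson on the water cycle."))
```

The point is simply that the engine underneath is unchanged; the wrapper's value -- what people are now paying for -- is the remembered context and the friendlier interface.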

Here are some promising education-specific tools, with some accompanying cautionary tales:

  • Toddle.  A learning management system I first discovered at the Aurora Institute 2023 conference, it was the first LMS I'd seen that incorporated AI into its platform.  When the LMS has been preloaded with your district/school curriculum, it can leverage this (along with information about your grade level, content taught, etc.) to generate very specific results within Toddle itself, which can then be easily exported or incorporated into your classes.  (And Toddle is not unique in its AI upgrade; since I first encountered it, I've seen several more learning platforms add a version of AI to their toolbox.)  It's great to keep your AI generation within the same browser tab as your LMS!  It should be noted, however, that the purchase of Toddle does not automatically include AI, as it is a premium feature (and this would likely be true of other LMSs that add AI as a feature).
  • Curipod.  Another AI-powered presentation tool, but more focused on student and teacher usage.  From your prompts, it generates a lesson plan in the form of a multi-slide deck with built-in interactives for students.  The features and the name remind me of Nearpod, and as on that platform, a student can join a teacher's live session with a PIN.  There is a limited free account available.
  • LitLab.  A tool for creating "AI decodable books" to help primarily with early literacy.  Once parameters are given, an online book is generated with customizable text and images, which can then be shared with students via its URL.  However, the images, while clearly AI-generated, are not very sophisticated, nor are they specifically tied to what is occurring in the text on each particular page.  While currently free (if you nab one of the limited initial seats available), the platform will become a paid service in June 2024.
  • Magic School AI is impressive for its one-stop-shop approach to AI integration.  You can discuss ideas with Raina the Chatbot and then export your results as needed.  Additionally, there are several dozen AI "Magic Tools" that each lean into a specific need.  Examples include "Text Leveler," "YouTube Video Questions" (give it a video URL, and it creates a custom assessment based on the video's content, tailored by grade level and number of questions), "Choice Board (UDL)," "Project Based Learning" (create a plan), "Rubric Generator," and much more.  (Again, it should be pointed out that free chatbots can generate many of these outputs, but perhaps without the specificity and convenience this platform achieves.)  While Magic School offers an initial free trial, it does eventually cost to use.

Magic School AI

One of Magic School's "Magic Tools" is an "IEP Generator."  This brings up one of the biggest takeaways and pieces of advice about using AI, regardless of the platform or type: be mindful of what you put into a prompt, and be careful how reverently you treat its output.  To further explore this example, I asked my OVEC colleague and special education teacher Dr. Debbie Mays for her opinion, and Dr. Mays shared some thoughts she and her ECE colleagues have had on using AI to create an IEP: "We have the concern of confidentiality and information being exposed [when personally identifiable information is inputted into an AI tool]. Also, if AI just considers federal requirements, it may miss the unique state regulations and statutes that do need to be considered and followed. It has to do both, as well as follow district policies and procedures. Finally, the intricacies of an IEP – all the connections and 'threads' that need to be addressed for that specific student throughout the program -- will probably be missed, opening the district to legal ramifications."  Ultimately, Dr. Mays gave her personal "thumbs down" on AI-generated IEPs, but acknowledged "there may be ways to use it. The trick is a teacher needs to know how to write an appropriate Individual Education Program. If they don’t know how and need help, they may not know how to fix one either, if AI started one for them."  Again, while applying AI to IEP creation is a very specific use case, it illustrates the general need for educators to practice discernment.  AI is not infallible, nor is it always "right."
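
Dr. Mays' confidentiality concern suggests at least one concrete habit worth building: scrub personally identifiable information before anything goes into a prompt.  Here is a deliberately simplistic sketch of that idea in Python; the regex patterns and placeholder names are my own illustrative assumptions, and a real district would need a far more robust, policy-driven process than a short script.

```python
# A deliberately simplistic sketch of scrubbing personally identifiable
# information (PII) from text before it is pasted into an AI prompt.
# The patterns below are illustrative only; real PII redaction is much
# harder and should follow district policy, not a short script.
import re

REDACTIONS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),          # SSN-style numbers
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),  # email addresses
    (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),  # phone numbers
]

def redact(text: str, student_names: list[str]) -> str:
    """Replace known student names and common PII patterns with placeholders."""
    for name in student_names:
        text = re.sub(re.escape(name), "[STUDENT]", text, flags=re.IGNORECASE)
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

# Example: only the redacted version ever leaves the building.
draft = "Jordan Smith (jsmith@example.com, 555-123-4567) needs reading goals."
print(redact(draft, student_names=["Jordan Smith"]))
# -> "[STUDENT] ([EMAIL], [PHONE]) needs reading goals."
```

Even a rough filter like this makes the larger point: the teacher, not the tool, is responsible for what goes into the prompt.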

As I wrap up this entry, we can see that the last year has shown the reality of AI lies somewhere between an apocalyptic destroyer and consequence-free manna.  In May 2023, the U.S. Office of Educational Technology published "Artificial Intelligence and the Future of Teaching and Learning: Insights and Recommendations."  In a brief handout with the paper's core messages, a comparison was made between electric bikes and robot vacuums.  Robot vacuums may be treated as somewhat independent machines with a "mind" of their own; such machines may be disruptive in the way they can substitute for a human completing the task they were designed for, or, perhaps more ominously, replace the human worker who was previously needed for such janitorial labor.  However, the OET would assert that it is more accurate to think of artificial intelligence not as our eventual robot overlords, but as an electric bike: "We envision a technologically-enhanced future more like an electric bike and less like robot vacuums. On an electric bike, the human is fully aware and fully in control, but the burden is less, and their effort is multiplied by a complimentary technological enhancement."

From the OET's summary handout of their May 2023 full report.

In the world of our educational future, we should hope to use more artificial intelligence "electric bikes" for ourselves and for our students.  Let's embrace AI to augment and enhance what makes us human, so we can dare to dream, think, make, and create in even more innovative ways, and by doing so, transform the learning experience.

Update 6/15/24:  The website "Explain Like I'm Five" is now Owlift.  I edited the entry in the appropriate spots above.