AI, Schools, and the Labubu Problem

Ever wonder why word processing remains controversial in schools or why the Internet is more accessible at McDonald’s? Why is the technological bounty embraced by the rest of society so elusive in schools? The explanation lies in decisions made long ago by a system reluctant to change.

Schools and school systems often make technology investments unlike any other purchases. Stakeholders are rarely consulted, even though different subjects, teachers, and students likely have different needs. It would be imprudent to trust math teachers with requisitioning sports equipment, and the football coach is in no position to select the best bassoon. Yet decisions about hardware and software adoption are often made in a top-down, one-size-fits-all fashion. Today, making precedent-setting decisions about Artificial Intelligence implementation is not only premature but plagued by what I call “The Labubu Problem.”

When confronted with a groundbreaking new technology, schools and districts often act in haste to standardize. Historically, the alacrity and overconfidence with which such decisions were made left educators looking foolish or resulted in lost potential for students. Now, with the advent of AI, schools and districts are poised to repeat an error that may have grave consequences and profoundly missed opportunities for learners. In short, while viewing Educational Technology as a purchasing decision has always been wrong, the consequences in the age of Artificial Intelligence will be counterproductive at best and perhaps catastrophic.

Sloppy shopping

The desire to standardize on hardware or software in schools is understandable. It makes life easier for adults, can lower barriers to adoption, save money, and might even expedite progress. There are, however, countless examples of how a race to be first or to keep a lid on emerging technology created more headaches than it solved. This is especially true for children, who deserve computers and software capable of doing everything their teachers expect and a whole lot more than our adult reptilian brains may ever imagine. The greatest ROI (return on investment) is in using hardware and software to amplify human potential and increase opportunities for several years. The “best” technology has a low threshold and no ceiling.

So, alas, schools bought iPads for little kids. “Little kids – little computers,” the logic said. It didn’t seem to matter that young children need greater processing power, or keyboards, or that the iPad is fundamentally a consumption device when learning is a constructive process. Giving young children a TV and notetaking advice was the last thing they needed. Even if the objective was to save money, the addition of often buggy, precarious, and uncharged keyboards, cases, and other accessories drove iPad costs above those of much more powerful and flexible laptops. When iPads failed to earn their stripes, “edtech” took a reputational hit.

Next, schools were so desperate to be pals with Google and control what children did “on their devices” that they purchased Chromebooks at least a decade before “the cloud” was robust or reliable enough to be dependable. Even today, educators ask for advice on video editing with a Chromebook, when storage, bandwidth, and processing power limitations make this unnecessarily cumbersome if not impossible.

When someone argues that you can edit video or compose music on an iPad or Chromebook, I challenge them to go first.

A distant memory

Word processing is a killer app. That is indisputable. For forty years, it has been widely adopted (although slowly in schools) by nearly every computer user on earth. In the early 1980s, educators saw the value of word processing as a tool for student writing, editing, and expression. To meet this need, Bank Street College decided to produce what became one of the first word processing software packages for children.

Primitive technical constraints of low-resolution monochrome displays required word processing to be limited to 40 or 80 characters per line of text, but the folks at Bank Street had an even nuttier idea. Looking at the new technological revolution of word processing through the lens of typing and handwriting instruction, it was decided that The Bank Street Writer needed BIG letters for little kids. Therefore, users were invited to write in a system allowing only sixteen characters per line. It didn’t seem to matter that there are few combinations of English words that come in 16-character units (with spaces) or that we were no longer handwriting or using a typewriter – word processing is different. 16 characters is the length of two or three short words. The folly of unusably big letters was sold as a feature to eager adults who wished to portray the illusion of modernity while treating children like children.

Hubris and arrogance repeating themselves

When the World Wide Web exploded into the public consciousness, schools needed to respond. Aside from the typical impulse to control, limit, censor, and monitor how children would use the Web, commercial ventures found ways to sell denatured versions of the Internet to schools. There were even companies who printed out web sites and sold the static printouts to eager education customers seeking the illusion of the information superhighway while riding tricycles.

Commercial interests cannot succeed without willing customers. So, all sorts of dumbed-down Internet products flooded the market and the ones promising the least power at the highest price experienced success.

However, making dopey purchases with other people’s money was a misdemeanor. Some educators were so high on their own supply, new-fangled expertise, and self-professed vision that they were about to make a series of immensely bad choices. Their logic went something like, “This Internet thing is really great. You know what would make it even better? If we, a bunch of former teachers, make a better one. Yeah, yeah! We can make it do everything – teach kids, take attendance, talk to parents, deliver content, be used for research, schedule school buses, and make a mean espresso.”

Groups of educators spun themselves up with their enthusiasm and escalating self-concept without anyone puncturing their bubble to ask, “Do you really think this is a good idea?” In the case of Australia’s Ultranet, a system which crashed upon launch, three education officials were criminally charged.

In mid-2024, the Superintendent of the Los Angeles Unified School District was so desperate to be embraced as a leader on all things AI that he handed $6 million to a company to develop a chatbot specifically for LAUSD. Of course this turned out to be a handful of magic beans, the company folded, and its CEO was charged. LAUSD also holds the dubious distinction of being embroiled in an expensive iPad debacle a decade earlier.

AI is different

Artificial Intelligence isn’t a brand or a specific app; it is a form of computing that takes many forms, some invisible. To most people today, artificial intelligence means generative AI chatbots such as ChatGPT, Claude, or Gemini. If for whatever reason you feel that this technology has a role to play in schools, I urge you to wait before standardizing on one brand and restricting members of your community from using the others. Under no circumstances should you purchase an expensive prechewed K-12 simulacrum of the popular tools used by the rest of society. These systems often focus on teacher chores that should be eliminated, rather than automated. AI producing sub plans? Really?

Schools will pay a premium for make-believe school AI that will never be anywhere near as good as the products with multibillion dollar development budgets. Students, teachers, and taxpayers have a right to better stewardship of limited funds.

The wisest may even assume the more radical stance of letting a thousand flowers bloom, allowing many chatbots to co-exist in your school or district. You should even pay for that software when necessary, especially when it is beneficial for learning. Generative AI tools, and whatever future computational technologies may emerge, are in their infancy. Such software will continue to improve and expand its capabilities at an astonishing rate. It is irresponsible to play Emperor Nostradamus and proclaim that one and only one chatbot may rule your school.

Putting AI to work

I recently spent more than a week trying to design a WordPress plugin to perform a task for which I may be the only customer. The process was a lot more frustrating than it should have been.

[Note: A brief technical description follows. You should be able to understand enough of the content to make sense of the resulting conclusion. At the very least, this personal anecdote reveals that I actually used AI to learn and create something I would be incapable of producing without its assistance.]

WordPress is a popular open-source platform on which a lot of web sites and blogs are built. It’s easy to use it for simple things, but as soon as you wish to employ a bit more design panache or produce some interactivity, you may find yourself in over your head. The good news is that WordPress is extensible. So, lots of small and large developers have designed themes for changing the look of your site and plugins that add functionality that fewer users need. If you want your web site to do anything else, you better possess a mastery of Python, JavaScript, Ajax, JSON files, CSS, and a whole lotta other stuff above my paygrade, even though I began programming computers fifty years ago.

I had a sense that AI chatbots could serve as my apprentice and help me create the WordPress plugin I needed to produce a daily blog post, sharable via social media, that would randomly pull quotes of interest to educators from the text, audio, and video archives of Dr. Seymour Papert I assembled online at The Daily Papert. My confidence was bolstered when ChatGPT assured me that it could create the software I wished to design and would teach me how to deploy it. Yippee, I thought!

ChatGPT started cranking out all sorts of code. Some of it I could understand enough to make simple changes. I could also use AI to assemble the HTML links for all of the research sources stored in the archives and then ask it to format that data the way the sources.json file expected. Along the way, I learned a fair bit and began to understand how all the various pieces of code would interact to achieve my desired result. I was at least 90% of the way to success for what felt like an eternity.
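To make the moving parts concrete: the heart of the plugin is simple, pick one random entry from a sources.json archive and format it as a shareable post. The real plugin runs as PHP inside WordPress, but the core logic can be sketched in a few lines of Python. The field names, file layout, and URLs below are my own guesses for illustration, not the actual Daily Papert data format.

```python
import json
import random

# Hypothetical sources.json layout: a list of archive entries, each with a
# quote, the medium it came from (text/audio/video), and a source URL.
SAMPLE_SOURCES = """
[
  {"quote": "You can't think about thinking without thinking about thinking about something.",
   "medium": "text", "url": "https://example.com/archive/1"},
  {"quote": "The role of the teacher is to create the conditions for invention.",
   "medium": "video", "url": "https://example.com/archive/2"}
]
"""

def pick_daily_quote(sources_json, rng=random):
    """Choose one random archive entry and format it as a post body."""
    entries = json.loads(sources_json)
    entry = rng.choice(entries)
    return f"\u201c{entry['quote']}\u201d \u2014 Seymour Papert ({entry['medium']}): {entry['url']}"

print(pick_daily_quote(SAMPLE_SOURCES))
```

A daily scheduler (in WordPress, a cron hook) would call something like `pick_daily_quote` once per day and publish the result, which is essentially what I asked the chatbots to build around this kernel.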

Along the way, each bug I caught in the chatbot’s work product was greeted by solicitous apologies, and even more disappointing performances. The chatbot would quiet quit on me, summarize details that needed to be completed, and only do part of the job while pretending it was finished. Suddenly I felt like the parent of a cyber-slacker teen. At one point, ChatGPT literally suggested that I take a break from the project.

After about a week, I was about to give up on ChatGPT and try using Claude to design my plugin software. Folks online were quick to tell me that Claude was vastly superior for “vibecoding” than ChatGPT. (“Everyone knows that, stupid” was implicit in their advice.) Claude wanted a credit card from me to pay for the API calls required by the intelligent research tool I was designing. Since this is a labor of love, I was less than enthusiastic about handing over my credit card since I could end up with a giant financial mess on my hands – in any number of ways.

Ken Kahn, author of The Learner’s Apprentice: AI and the Amplification of Human Creativity, (Kahn, 2025) suggested I use Google’s Gemini since a certain number of API calls were free. While it seemed like Gemini may have created useful code, I was incapable of successfully navigating the Google web site to establish the necessary commerce/code handshake.

So, back to Claude, credit card in-hand, and guess what? Claude generated the software I needed and it worked right away! I suggested some cosmetic tweaks and it changed the code accordingly. That next version of the plugin worked too. I then asked Claude if it could create two other ways of using the body of audio, video, and text data in the archives – no sweat.

The Labubu problem

So?

Choosing one generative AI environment for a school or district is like Labubu collecting. If you don’t know what a Labubu is, consult with a six-year-old, check your school’s lost-and-found, or watch the NSFW South Park clip below.

AI chatbots and Labubus share an important characteristic. They’re both blind-box purchases. (Not all Labubus are sold in blind boxes, but the most desirable ones are.) When you purchase a Labubu monster or subscribe to a generative AI service, you don’t know exactly what you’re going to get.

The major generative AI players, OpenAI, Anthropic, and Google, pretend that their chatbots all possess the same magical powers and functionality, but actual users know differently. Choosing a chatbot is not like buying a Coke or Pepsi. The tools are different, even if those differences are opaque.

Whenever I reported on my struggles creating the working plugin, I was met by vitriol from online know-it-all trolls quick to announce that I was obviously using the wrong system, even if they had no idea whatsoever what I was trying to accomplish.

Implications

  • Before any one generative AI system becomes the de facto “Swiss Army Knife” standard, if that ever occurs, there will be constant jockeying for pole position with new functionality and “intelligence” added constantly. The software literally changes continuously.
  • It is impossible to predict which tool is “best” and that status may change overnight.
  • The chatbots remain unreliable and unpredictable. Something they “knew” how to do yesterday may now seem impossible.
  • Free versions of the generative AI software limit the number of queries a user may make over a period of time. Even those constraints are mysterious; you are just unceremoniously cut off for some number of hours. Therefore, you may need to use multiple chatbots even to perform common operations.
  • Educators are poorly suited to predict what learners might wish to do with generative AI and the possibilities for knowledge construction, problem solving, and personal expression are seemingly infinite.
  • Too many educators mistake “AI” for the latest encyclopedia dispensing answers to test questions and generating five-paragraph essays about toe fungus. This narrow image of Artificial Intelligence is not only ignorant but reinforces the weakest parts of schooling while squandering the potential to empower children to learn and create in ways previously unimaginable.

In short, you may need several chatbots in your toolbox to solve problems or get work done. Standardizing on a single AI system, especially in the technology’s infancy, is likely to be a costly mistake in terms of time, treasure, user satisfaction, and missed learning opportunities.

Rather than shopping for the latest Wikipedia or worse, banning it, schools should consider systems like Wolfram Notebook Assistant + LLM Kit, a world of computational power underneath a chat interface that allows even young children to produce complex models, solve problems, explore the frontiers of STEM, and be mathematicians rather than being taught math. Such uses of AI have barely registered on the Richter Scale of AI in education gasbaggery.

This is a time for research and development, mucking about, and learning about learning with AI – and with children. Check your ego at the door, welcome serendipity, and enjoy the learning adventures supercharged by AI.


Extra treats for reading this far


References:

Anderson, J. J. (1983). Bank Street Writer. Creative Computing, 9(6), 33. https://www.atarimagazines.com/creative/v9n6/33_Bank_Street_writer.php

CBS Los Angeles. (2014, December 2). FBI seizes LAUSD documents linked to iPad rollout. CBS News Los Angeles. https://www.cbsnews.com/losangeles/news/fbi-seizes-lausd-documents-linked-to-ipad-rollout

EdSurge. (2014, February 7). LA Unified officials cannot access iPad curriculum. EdSurge. https://www.edsurge.com/news/2014-02-07-la-unified-officials-cannot-access-ipad-curriculum

Gomez, J. (2025, October 9). Pop Mart Labubu blind box: Big into energy. USA Today. https://www.usatoday.com/story/money/2025/10/09/pop-mart-labubu-blind-box-big-into-energy/86597189007/

Kahn, K. (2025). The Learner’s Apprentice: AI and the Amplification of Human Creativity. Constructing Modern Knowledge Press.

Lapowsky, I. (2015, May 8). What schools must learn from LA’s iPad debacle. Wired. https://www.wired.com/2015/05/los-angeles-edtech

Molnar, M. (2017, February 23). Feds drop investigation into Los Angeles district over $1 billion iPad purchase. Education Week Market Brief. https://marketbrief.edweek.org/regulation-policy/feds-drop-investigation-into-los-angeles-district-over-1-billion-ipad-purchase/2017/02

Newcombe, T. (2015, May 14). What went wrong with L.A. Unified’s iPad program? Government Technology. https://www.govtech.com/education/what-went-wrong-with-la-unifieds-ipad-program.html

Papert, S. (1972a). Teaching children thinking. Programmed Learning and Educational Technology, 9(5), 245–255.

Papert, S. (1972b). Teaching children to be mathematicians versus teaching about mathematics. International Journal of Mathematical Education in Science and Technology, 3(3), 249–262.

Papert, S. (1984). Computer as mudpie. In D. Peterson (Ed.), Intelligent schoolhouse: Readings on computers and learning. Reston Publishing Company.

Papert, S. (1985). Different visions of Logo. Computers in the Schools, 2(2–3), 3–8.

Papert, S. (2000). What’s the big idea? Toward a pedagogical theory of idea power. IBM Systems Journal, 39(3–4), 720–729.

Papert, S. (2002). Hard fun. Bangor Daily News.

Papert, S., & Franz, G. (1988). Computer as material: Messing about with time. Teachers College Record, 89(3).

Papert, S., & Solomon, C. (1971). Twenty things to do with a computer. (Artificial Intelligence Memo #248). MIT Artificial Intelligence Laboratory.

Solomon, C. (1986). Computer environments for children: a reflection on theories of learning and education. MIT Press.

Taylor, R. (1980). The Computer in the School: Tutor, Tool, Tutee. Teachers College Press.

Wikipedia contributors. (2025). Bank Street Writer. https://en.wikipedia.org/wiki/Bank_Street_Writer
