Homework and AI

If you are a long-time podcast listener, you will recognize PJ Vogt, and if you miss Reply All, you might want to take a look at Search Engine. I find that, just as before, just about every episode has something to think about that is often very relevant to what is happening in the world. Being back on the teaching horse and, as always, wearing my tech-curious hat, this episode hit square in the brainpan.

When AI roared onto the scene in November-ish ’22 I remember thinking that I should get some ideas down, but things were moving so fast that by the time I had thought things through, the entire landscape had changed. I got one post in, a year later, that essentially parroted “AI is a calculator.” Moving into ’25 I was thinking along the same lines, but as I listened to this episode of Search Engine, the view of how students are thinking of using it was quite revelatory. So was the comment made by Weinstein and Tench: “Homework should be interesting, meaningful, and joyful.”

“You know, when ChatGPT arrived, there were a lot of people analogizing it to calculators. And this is true in some senses, but it’s not true in some very important ways. The most important way is that the labor of you struggling to do long division at your kitchen table and what a calculator does when it calculates is identical.

The exact same thing is happening between a calculator or a human doing calculation. ChatGPT is not using a process that is the same as what humans do when they write. ChatGPT is generating syntax on the basis of weighted probabilities, and that is not what humans do when they write.

When humans write, we are using syntax in order to try to capture an idea or notion or image or whatever we’re trying to get on the page. Writing is thinking, writing is feeling (my emphasis), writing is communicating. All of these things happen as humans write, and none of them happen when ChatGPT writes.

This is meaningful, it must be meaningful. If it isn’t, then we can just pack it in and give up on writing.”

From Search Engine: Playboi Farti and his AI Homework Machine, Feb 14, 2025
https://podcasts.apple.com/ca/podcast/search-engine/id1614253637?i=1000692332213&r=2208

So how do we make sure that homework is “all that” or, in my case, that assignments are “all that”? We need to get that feeling element in. If the student only supervises what the AI produces, they may, as likely as not, be detached from the final outcome; but if they direct it, then we might be on to something. Creating assignments that require students to convey, through their prompts, the vision they have for the final product would be a step toward that feeling. Seeing that product appear, and having had to struggle for it, would suggest meaning.

I won’t presume the teen viewpoint (my resident teen will likely tell me that I’m all wrong anyway :O), but from the course that I’m helping to deliver now, these ideas feel a little forced. How do I make data visualization emotional? When you are learning the basics of a tool, outside of getting past some user errors, where does the meaning come in? I’m not sure, having only just finished the podcast, but some early thoughts… have students generate:

  • questions from a given data set
  • images to complement data
  • forecasts from given data (a rough sketch of this one follows below)
  • connections to other information
  • ways to spin the results in a positive or negative manner
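
Taking the forecasts idea as an example, here is a rough sketch of the kind of artifact a student might direct an AI toward and then have to wrestle with. The data, the library choices (Python with NumPy and Matplotlib), and the framing are all my own assumptions, not anything from the episode or the course.

  import numpy as np
  import matplotlib.pyplot as plt

  # Hypothetical data set: twelve months of counts a student might have collected.
  months = np.arange(1, 13)
  counts = np.array([12, 15, 14, 18, 21, 19, 24, 26, 25, 29, 31, 34])

  # Fit a simple linear trend and extend it six months past the data.
  slope, intercept = np.polyfit(months, counts, 1)
  future = np.arange(1, 19)
  forecast = slope * future + intercept

  plt.scatter(months, counts, label="observed")
  plt.plot(future, forecast, linestyle="--", label="linear forecast")
  plt.xlabel("month")
  plt.ylabel("count")
  plt.legend()
  plt.title("Is this the story you meant to tell?")
  plt.show()

The struggle, and hopefully the meaning, would be in deciding whether that trend line actually says what the student wanted it to say.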

Thoughts for now…

