My mother gave me Kathleen Flinn’s The Kitchen Counter Cooking School: How a Few Simple Lessons Transformed Nine Culinary Novices into Fearless Home Cooks because she knows I’m a fan of food and cooking memoirs, and indeed I gobbled this book right up.
For once, the subtitle of a nonfiction book is an accurate reflection of what’s inside: the book is about Flinn’s decision to give cooking lessons to nine volunteers who are reliant on ultra-processed packaged foods but want to be able to cook for themselves and their families. Over the course of just a few lessons, they learn how to cut apart a whole chicken, tell when meat is done, measure ingredients for baking, season their food to taste, and just generally cook dinner without having to slavishly follow a recipe.
A couple things struck me as I was reading. The first is that, culturally, there seems to be an assumption in America (perhaps other places?) that most skills are hard to learn, when in fact the basics are often quite easy. I found this myself when I began learning the dulcimer: I have the musical talent of a potato, but nonetheless within the month I could play a recognizable rendition of “Amazing Grace.” Similarly, a novice cook probably won’t jump straight to baked Alaska, but you can teach an eight-year-old to turn out passable chocolate chip cookies.
The other realization, which builds on the first, is that a lot of industries are built on convincing people that it would be sooooo hard to learn the skills to do the thing for themselves. The car industry has gone one better by making cars you literally cannot repair at home, which the food and garment industries can’t manage, but they do rely on the idea that cooking and garment repair are sooooo difficult and time-consuming and hard to learn.
I think AI is creating a similar dynamic with writing and art generally, where people come to see skills that are really pretty simple as completely beyond their grasp. I’m in a class to train advisors, and we had an assignment to write an advising philosophy, and the instructions went from “No AI” to “Well, maybe AI to brainstorm” to “Well, actually I ChatGPTed the whole philosophy! But I edited it myself, so that’s fine, right?” And indeed it was fine, apparently. I’m not sure, at this point, what wouldn’t be fine. ChatGPTing the thing and then not editing, one fondly imagines, but maybe that would have been okay too. Once you let the camel’s nose into the tent, the whole camel is coming in.