I’ve become deeply involved with a new hobby — home automation. I’m talking about the installation and scripting of “smart” appliances, switches, and stuff. Do you remember how the “Internet of Things” was going to change the world about 5 years ago?
I spent 30 minutes yesterday walking around with a ladder trying to figure out which “smart” smoke detector was chirping. With the pace of technology, in a few years they’ll probably have that down to 10 minutes. I think we’re about 5 years away from this being very exciting stuff for the average homeowner. Let’s call it “Matthew’s Rule” — 10 years from the “What Is This” feature in Wired until it’s truly ready for primetime.
Learning a new technology means spending a lot of time reading message boards and chat rooms. I came across a really interesting post I wanted to tell you about.
A poster described their experience using ChatGPT to code an iOS app to help them run their smart home. They claimed that the AI had created the application completely from scratch!
This stopped me in my tracks. It defies my expectations of what AI is currently capable of. I keep a close eye on every subject that would become increasingly embarrassing if I turned out to be wrong about it, so I can go back and delete all the ill-advised posts when the time inevitably comes.
The author explained that they’d gotten ChatGPT to develop an iOS app for them. Without any assistance from a software engineer! They said that ChatGPT had completely replaced the role of the engineer in the process of getting this done.
This person claimed they’d never built an application before, except working in concert with a Software Development Engineer at work. That wouldn’t be necessary anymore, they said, as they beheld the power of ChatGPT to create something new and useful out of whole cloth.
What would you say… you do here?
I thought I smelled bullshit, but it could have been cow, or maybe horse. I was concerned that the writer might be giving a bit too much credit to ChatGPT, and perhaps not enough to their engineering colleagues at work. I continued reading.
The product manager started out by supplying the AI with a prompt. They told ChatGPT that they had no programming ability and asked the AI to create a new app. They described the platform they wanted it to run on, and the overall function of the app they wanted built.
From here this very patient and optimistic person described a fascinating sequence of events. The AI replied with a first attempt that didn’t quite work. Through some Googling, the poster deduced the problem. They rephrased the prompt to ChatGPT, and got back a result that corrected the error. But it still wasn’t quite right.
This process was repeated over many iterations in the course of about a week. Enter a prompt, get some output from ChatGPT that wasn’t quite right, do some Googling, then adjust the prompt. And repeat.
Eventually this process yielded a useful result, without any input from a software development engineer. So they claimed.
But of course they did have an engineer. The PM, bless their heart, became a software engineer. They were engineering ChatGPT and programming the home automation app!
Please don’t tell them this, but PMs can in fact learn to code using powerful yet unpredictable software like ChatGPT. I have never heard of a Product Manager learning to code without realizing it, but I’m sure this one wasn’t the first.
Does it seem like this was too easy? Learning to code is SO easy, my friend. There are many memes about it. There’s a “learn to code” program for every category of ambitious person on the planet: children, babies, the incarcerated, incarcerated babies, you name it!
ChatGPT did not stand in for the role of a programmer. Real programmers do not work by giving untested code to a product manager, expecting them to… um, stand by a moment. There’s a voice in my earpiece, and it’s saying that this is in fact exactly what real programmers do, especially Python programmers. OK, my mistake! (Just kidding, love you Python programmers!)
ChatGPT did not stand in for a programmer; it stood in for the role of Google. When someone is learning to program, that’s what they do! They Google the thing they’re trying to do, they copy and paste the code into their program, and they make a couple of adjustments. When it doesn’t work, they go to IRC, I mean Discord, and try to find somebody to look at the code and tell them what’s wrong with it.
As a matter of fact, Google plays this role for people learning virtually every task. And AI will improve that process dramatically for many applications. Today’s AI extends the capabilities of a search engine by making it easier to formulate queries that are better understood, producing better-structured, more meaningful results that accelerate you toward completing your task.
ChatGPT may have become a better tool than Google for this process, in that it combined the capabilities of Google and the person on Discord. But does it deserve credit for replacing the software engineer?
There’s a human creator writing the prompts. But if they could not have completed the task without AI, is AI the rightful creator of the product?
I Made This
The pronouncement “This is a Wes Anderson Film” is what is known as a “possessory credit.” It’s used when creative recognition for a film is primarily attributed to an individual who is deemed to have made the most important artistic contribution. This is usually the director or writer.
In the career coaching world, a similar idea is encapsulated in the slogan “I, Not We.” When talking about your experiences, give yourself a kind of possessory credit for projects you made a meaningful contribution to. You communicate your ownership of and accountability for a project by saying “I made a decision to redesign the landing page in an effort to increase conversion” rather than “We made a decision…”
Your intent is not to take credit away from the team, or to claim credit you don’t deserve. Everybody knows projects are done by more than one person. The purpose is to say and show that you’re accountable for the results.
If you use Microsoft Word to write a cover letter, and you use a tool like Grammarly to complete some sentences and correct your grammar, you wouldn’t say Microsoft Word wrote the cover letter.
You might use Photoshop to remove the background from an image and composite it with another. You created an artistic collage that tells the story of how you helped a lost dog. Did Photoshop design the poster? Of course not.
Even if you didn’t use any generative software capabilities at all, your computer is (in a sense) 100% responsible for making the document. Yet we do not say that the laptop wrote your resume or report.
Computers rarely deserve the possessory credit for their creations, and generative AI almost never does — at least not for the ChatGPT examples we’re being supplied every day.
In a perfect world, credit for generative output would be properly attributed to the creators whose work (improperly, unfairly, and I hope someday illegally) was appropriated by the AI’s model.
Here are my proposed rules for attributing credit:
1. Until the deficiency in AI’s ability to credit its sources is fixed, do it manually. If you get output that likely could not have been created without training on earlier work by an individual, that person should be credited when possible. “I used DALL-E in the creation of this image, and it was surely influenced by the work of Jessica Hische.”
2. It’s not necessary to credit your tools, AI or not. People are frequently turning up in meetings with “I got this from ChatGPT!” as though that’s something to be proud of. It’s like saying “Sorry I’m late, I couldn’t find my own car so I had to steal one.”
3. If you use ChatGPT to write a cover letter — which I should not need to tell you that you absolutely should not do, and yet I do need to tell someone this almost every day — just say “I wrote this cover letter,” even if you did not touch a word. See Rule #2.
The attribution of creativity, agency, and autonomy to today’s AI is the result of a mass misunderstanding of how the technology works, what it does, and what it’s good for. What it’s best at is telling us things that the internet already knows in an extremely rapid and convenient way. For this, its creators deserve the credit for making tools that have the potential to dramatically accelerate our work, improve our execution, and automate repetitive tasks.
There’s simply no need to anthropomorphize the AI in order to give it credit, especially when that credit rightfully belongs to content creators who have, as yet, largely been ignored. It is not harmless to gloss over the difference between a creator and their tools. Doing so diminishes the value of the former, and trivializes the real potential of the latter.