tl;dr—I think genAI and cultural norms will create a demand for authenticity of work, and we’ll need to find new ways to show “proof of work.”
Today’s goal is to share some thoughts I’ve been pondering about the future of work with genAI.
There’s a moment coming that some creators dread and others celebrate: the great “leveling of the playing field” brought on by the newfound powers of generative AI.
Need ideas for a book? A song? A game? A product? A new business venture? We’re seeing evidence that generative AI is proving to be more creative than humans (see this Wharton School experiment).
Not only can the latest genAI models come up with ideas, they can also do most of the writing, proofreading, drawing, and composing, and it won’t be long before they can do most of the coding for you.
I don’t think genAI will fully replace humans for another decade, but I believe that in the next few years it can get humans 80% of the way there with a decent-quality start. I think this also means the cost of creating content will go down, and a lot of it will start to seem similar and homogenized, because it’ll be easier to follow AI suggestions than to do the work yourself.
I think that authentically human-made content will become rarer and more expensive. It’s like handcrafted furniture vs the machined stuff you get from IKEA. IKEA is more popular, but if you want something really custom, you’ll have to pay for it.
Why does this matter?
This will create a growing problem in the realm of education and work experience: how can you trust whether someone is truly doing the work, and whether they actually have learned expertise or are just using a bunch of AI tools as a “crutch”?
There are some interesting philosophical questions to answer here, like:
- Does it really matter if someone actually did the work versus using a tool to do it? Does it matter if you dig a hole with your own hands vs using an excavator? Does it matter if you painted artwork yourself vs hired someone else to do it? Does it matter if you wrote something yourself vs copied another person’s work? What is the difference that makes one example feel acceptable and another feel like cheating?
- Similarly, does it matter if someone truly has the expertise and knowledge ingrained in their brain, versus leveraging external tools and resources? Does it matter if you memorized the events of the US Civil War, vs looking them up? Does it matter if you did the deep data analysis yourself, vs leveraging an analysis by another company or team? Does it matter if you actually spent 10 years writing backend software, if someone fresh out of college could write similar code with some Googling, StackOverflow answers, and GitHub Copilot? Why do these things matter?
One might say that people without real experience are more prone to catastrophic mistakes, which may have business consequences. Another might say that work requiring true intellect, skill, and creativity is special — almost like intellectual property — and that it feels wrong to copy it without permission or credit. Some might say that by relying on external tools and resources, you never truly develop deep expertise in something and are less capable as a person. And some would completely disagree with all of those statements.
I don’t think there’s a right answer to these questions; they strike at some ingrained cultural norms around hard work, doing your time, tenure and experience, and how we value effort. That isn’t something that changes easily, and it will take generations for the public perspective to shift.
If we assume for now that people will continue to value “authentic human-effort,” one thing that I think will become important is HOW you show and prove authenticity in your work.
Showing proof of work
What does this mean? Let’s imagine there’s a famous painting, and two different people each claim to be the original artist. How would you prove who is right? Maybe there’s a way to trace the materials? Maybe the artist would have earlier sketches that led up to the final piece? Maybe the artist would have other evidence or witnesses to the in-progress creation of the piece? Maybe there’s an official stamp or mark on the original that certified it was theirs?
Now think about how this changes for digital images… My 2c: the current techniques for proving a piece of digital art is your creation are oddly similar: show your Photoshop file with all the in-progress layers, point to the original post you made with the artwork, or mark it with a signature or watermark. The latest evolution in this may be the NFT blockchain approach, and blockchain has its own interesting (and more literal) notion of “proof of work.”
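As an aside, blockchain-style proof of work is computational: you search for a nonce whose hash meets a difficulty target, so the result is costly to produce but cheap for anyone to verify. A minimal Hashcash-style sketch (purely illustrative — the function names and the toy difficulty are my own, not any real chain’s protocol):

```python
import hashlib

def find_proof_of_work(data: str, difficulty: int = 4) -> int:
    """Search for a nonce so that sha256(data + nonce) starts with
    `difficulty` hex zeros: expensive to find, trivial to check."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(data: str, nonce: int, difficulty: int = 4) -> bool:
    """Verification is a single hash -- the asymmetry that makes it 'proof'."""
    digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = find_proof_of_work("my digital artwork, v1")
print(verify("my digital artwork, v1", nonce))
```

Note the contrast with artistic proof of work: here the “effort” is anonymous CPU time, which proves cost was paid but says nothing about who paid it or how creative they were.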
There are now services like the Amazon bookstore and online art communities that require disclosure if you’ve used AI tools to create your work. Let’s be real: most people are probably NOT bothering to disclose this unless the content is so obviously generated that they’d get called out or banned for it.
Some major companies have banded together to agree on some form of “fingerprinting or watermarking” for content generated by their services. This helps, but it only works if the majority of generative services follow the standard.
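The core idea behind these provenance schemes is tamper-evident metadata attached to the content: if either the content or the tag changes, verification fails. Here’s a minimal sketch of that idea using an HMAC as a stand-in signature — an assumption for illustration only; real standards like C2PA use public-key signatures and much richer manifests, and `SERVICE_KEY` is a hypothetical per-service secret:

```python
import hashlib
import hmac

# Hypothetical: each generative service would hold its own signing key.
SERVICE_KEY = b"hypothetical-generator-secret"

def sign_content(content: bytes) -> str:
    """Produce a provenance tag over the raw content bytes."""
    return hmac.new(SERVICE_KEY, content, hashlib.sha256).hexdigest()

def check_content(content: bytes, tag: str) -> bool:
    """Recompute the tag; any edit to the content breaks the match."""
    return hmac.compare_digest(sign_content(content), tag)

image = b"...generated pixels..."
tag = sign_content(image)
print(check_content(image, tag))         # untouched content verifies
print(check_content(image + b"x", tag))  # any modification fails
```

This also shows why adoption matters: a missing tag proves nothing, since un-tagged content could come from any service that never signed in the first place.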
The approach I’m seeing in some online art communities is the old-fashioned “show your work early and often” — artists post early sketches of their pieces and show how they progress to the final product. In a way, it’s a proof of work that mostly works because it’s time-consuming to reproduce all the in-between versions and show how they came together. It’s not foolproof, but it is a proxy for snapshots of the time and effort invested.
It’s going to be interesting to see whether this becomes a more prevalent way of showing proof of work, whether new techniques get developed, or whether a cultural shift happens and people simply stop caring.
Proving my work… at work
Bringing this back to a work context, I thought about what I do as a product manager. When I make product decisions or document product requirements, I have enough street cred now that people rarely ask me to prove my work. However, there was a time when I was fresh and unknown, and people would ask why I decided on XYZ or why a requirement was a P0.
The way I prove my work is to show the data points from our analysis that backed the decision, or the user research report from last week that documented the user needs and pain points. Again, it involves showing evidence of the steps I took earlier to arrive at the final outcome. If I think about it, it’s the time, these artifacts of early work, and my peer witnesses that come together as my “proof of work.”
I wonder if this will still be the way to prove my work in the future? I wonder if this paradigm could scale to other areas facing a “proof of work” crisis (text, music, video)? I wonder what happens if we can no longer reliably prove our work at all.
I’m not sure, but I see hope in the fact that there are people who trust me even without proof of my work. Human trust, for better or worse, ultimately holds the world together.