'Work' is whatever AI can't do
Reaction to Sarah Tavel's blog post on LLMs + Work
Welcome to the 107 people who have joined us since last week! If you aren’t subscribed, join 3,563 AI folks. View this post online.
Sarah Tavel wrote a great blog post on selling 'work', not software. Sell outcomes, not seats. Her main argument is that "LLMs create an opportunity for startups to look beyond this way of thinking and discover surface area that previously was out of bounds for selling software given the required GTM and pricing limitations of software." She goes on to say, "...rather than sell software to improve an end-user's productivity, founders should consider what it would look like to sell the work itself."

I was asked my thoughts on this essay and found I slightly disagreed with Sarah's framing. "Work" is simply what computers can't do, and this definition changes with every new technology (not just LLMs). If you replace "work" with "things humans need to do," then work has been getting delivered by technology forever. With LLMs, however, we are witnessing a step-level change in what technology can do, and as a result, in what "work" actually is.

At one point, "work" was adding, subtracting, and multiplying numbers (see the famous "everyone in this room is now a cell on a spreadsheet" image). But then Excel came along and delivered this "work" to customers. Now the "work" is managing spreadsheets, not arithmetic. So what was once work is no longer work; the definition evolved. Work transitioned up the value chain to something else, something technology couldn't do yet.

For a recent example, it used to be someone's job to sit in meetings and take notes, track next steps, and write summaries. Now many LLM-powered products give that to us for free. For a notetaker, the work is transitioning from taking notes to doing something with those notes.

Automation will always run up the value ladder until it gets to a rung it can't climb. The spot where automation can't go is where humans need to step in, and where the current definition of "work" sits. But here is the fun part: the top of the ladder grows too. Humans only have so much energy and ability; they can't climb far past where automation stops. But once a new rung is automated, humans can exchange the energy they spent on a lower-level task for a higher-level one. It used to be that "work" was "create a financial report," but now it's "model out 5 scenarios of the next 3 years." Still, that's not the end. The latter will be automated too (if you're an optimist), and it will eventually no longer be "work." Something else will take its place.

This is evident in Sarah's "demand package" example from her post:

"Take EvenUp as an example (who I have no doubt will dominate their vertical). If you are a personal injury lawyer, a work product you create on behalf of a plaintiff is called a demand package. Essentially the demand package is a summary of the case, the medical costs of the injury (including lost wages), and then a recommendation on the settlement value from the defendant's insurance company. Law firms have stretched lawyers, paralegals, or outsourced groups writing these documents. If you were still in the mindset of selling software, you could imagine a software offering for personal injury law firms, sold on a per-seat basis, that uses AI to help people in the firm create a demand package (imagine a builder where you drop in the medical records during one step of the process). But then EvenUp would have been stuck in the paradigm of selling software – selling a 10% productivity improvement instead of 95%. Instead, EvenUp had the foresight to sell the work product itself: the demand package."

Yes, the package was made (which is great), so creating it is no longer "work," but now the "work" moves on to a higher-level task: client services and getting more customers (what tech can't do yet).

Sometimes the technology jump is so large that a material percentage of the "work" someone did gets automated. The company may then decide it is not worth transitioning those employees to a different type of work and instead lets them go, like what happened with Duolingo.

So yes, LLMs are able to deliver us "work," but not for long, because whatever "work" is will change.