The Crayon Still Has to Be Picked Up
I remember my kids sitting at the table with blank sheets of paper in front of them. Sometimes they knew exactly what to draw. Sometimes they just sat there, crayon in hand, not quite sure where to start. That image has been coming back to me lately, watching how our industry talks about tools and the future of digital product creation.
There is a version of this conversation where designers, product leads, and developers become optional, or collapse into a single role. Tools like Claude Code or Cursor take a description and produce working software — interface, logic, and implementation all in one. The input boxes accept natural language. Specific skill sets, the ones product experts spent years developing, start to look like an expensive detour.
The Tool Landscape Is Shifting
How we build digital products has always been shaped by the available tools. Design moved from Photoshop to Sketch to Figma. Development shifted from manual implementation to component systems and SaaS integrations. Each shift changed how practitioners worked, and how quickly they could work.
But with tools like Cursor, Claude Code, or Lovable, something more fundamental changed. For the first time, you don't need to master either discipline to ship something working. Describe what you want, and the tool figures out the rest — even PMs can do that now. The enthusiasm is understandable.
Before the Prompt Exists
There is something that enthusiasm about AI tools tends to overlook. Every creation tool, at the moment you open it, confronts you with an empty screen. This is not a minor friction.
Research on creative cognition consistently shows that the hardest moment in any creative process is not execution — it is the transition from no direction to first direction. Herbert Simon described this as the "ill-structured problem": the kind where you don't yet know enough to know what you're solving. More options, more capability, more output speed — none of that makes this transition easier. It makes it harder, because you can now generate a lot of work in the wrong direction very quickly.
Most people using generative AI today are still operating in a narrow band: generating images, building quick demos, producing prototypes that will never see production. That is not a criticism — it reflects how early we are in understanding these tools. But it also illustrates the underlying pattern. When direction is unclear, the output looks plausible and goes nowhere. The tool is not the problem. The absence of upstream thinking is.
Figma templates, AI-generated starting points, vibe tools that produce working screens from a sentence — these lower the barrier to starting. They don't change the underlying problem.
Because before any tool is opened, something else has to happen. Curiosity about a specific user problem. Dissatisfaction with how something works. An idea that a different approach is worth pursuing. That judgment — shaped by domain knowledge, user empathy, and experience — is what sets everything else in motion. Even the most capable AI waits for a prompt. It does not initiate. It does not decide what matters. Someone has to make that call.
What Actually Changes
The tools will keep evolving, and some roles will narrow. The solo designer who spent most of their time producing screens will find that work automated faster than they expected — probably within two to three years at current trajectory. The same applies to developers whose primary contribution is translating a spec into working code.
What does not automate is the role that sits upstream. The person who facilitates the conversation between business goals and user needs. Who synthesises research into a direction. Who frames the problem clearly enough that the tool can execute it well. That role might not carry its current job title in five years. But the function will exist, and it will matter more, not less — because the cost of prompting in the wrong direction just went up significantly.
The question is not whether your current job title survives. It is whether you are developing the judgment that the new version of this work requires. That judgment grows from the same places it always did: working closely with users, understanding organisational constraints, sitting with ambiguous problems long enough to frame them well. The tools don't replace that practice; they make it more consequential.
The crayon still has to be picked up by someone. What matters is knowing what you want to draw before you reach for it.