From prompts to geometry: Using Claude with Rhino3D

Apr 10, 2025 · AI Practice, Design, Tools

TL;DR: I used Claude to model a cabin in Rhino3D using only a photo and a prompt, and it somewhat worked. This experiment revealed a shift: from learning complex software to mastering clear, descriptive language. In an AI-driven design future, our words may become our most powerful modelling tool.

A few weeks ago, I set out to test a simple but increasingly relevant idea: how much of a design task could I complete without touching traditional modelling software?

My goal was to build an AI agent that could handle spatial design tasks. Not just text generation or image generation, but the kind of 3D modelling work that normally requires technical fluency in tools like Rhino3D or Grasshopper. Instead of clicking, scripting, or modelling by hand, I wanted to see what would happen if I just described what I wanted and let an AI do the rest.

So I wired up Claude, Anthropic’s large language model, to Rhino3D using something called the Model Context Protocol (MCP). MCP is a standardised, open protocol that connects AI models to external data sources and tools, letting them act on other software rather than just generate text from their training data.
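
To make that concrete, here is a minimal sketch of what the wiring can look like, written with the official MCP Python SDK. It is not my exact setup: the server name, the port, and the idea of a small listener inside Rhino that executes whatever Python it receives are all assumptions for illustration.

```python
# A minimal sketch of the wiring, not my exact setup. It assumes the official
# MCP Python SDK (FastMCP) plus a hypothetical TCP listener running inside
# Rhino that executes the Python it receives and replies with the result.
import socket

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("rhino-bridge")  # hypothetical server name

RHINO_HOST, RHINO_PORT = "127.0.0.1", 54321  # hypothetical listener address


@mcp.tool()
def run_rhino_script(script: str) -> str:
    """Send a Python snippet to Rhino and return whatever Rhino reports back."""
    with socket.create_connection((RHINO_HOST, RHINO_PORT)) as conn:
        conn.sendall(script.encode("utf-8"))
        conn.shutdown(socket.SHUT_WR)  # tell the listener the script is complete
        return conn.recv(65536).decode("utf-8")


if __name__ == "__main__":
    mcp.run()  # exposes the tool to Claude over stdio
```

Once a server like this is registered with Claude, the model can discover the run_rhino_script tool on its own and call it with the geometry code it wants to run. The protocol handles the discovery and the calls; Rhino handles the drawing.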

The design outcome?
It was clunky.
It was weird.

But it was one of the most thought-provoking design experiments I’ve done in a while.

From image to abstraction

I started with a prompt and an image: a photo of a small cabin. Nothing too complex, just a straightforward request: “Model a simplified 3D version of this cabin in Rhino3D using primitives and basic geometry.”

I was expecting Claude to get confused or stall. After all, even experienced designers have to squint and guess at proportions when reverse-engineering from an image. But instead, it began building, not with perfect precision, but with a kind of abstract intuition.

In Rhino3D, a few simple shapes began to appear: a blocky base, a pitched roof, openings in the right places. It wasn’t the cabin, but it was unmistakably a cabin. Claude had translated the image into form using a kind of high-level architectural reasoning. It saw the sloping roof, the base volume, the fenestration, and responded with geometry that matched the vibe.
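
For context, the instructions it sent through to Rhino looked roughly like the sketch below. This is my reconstruction rather than Claude’s verbatim output, and every dimension is a placeholder, but it captures the flavour: a box for the base, an extruded gable profile for the roof, a thin box standing in for the door.

```python
# A reconstruction of the kind of script the agent produced, not its actual
# output; the dimensions are placeholder guesses. rhinoscriptsyntax is only
# available inside Rhino's own Python environment.
import rhinoscriptsyntax as rs

W, D, H, RIDGE = 6.0, 4.0, 2.7, 4.2  # width, depth, eaves height, ridge height

# Blocky base volume: AddBox takes the eight corners, bottom face first.
base = rs.AddBox([
    (0, 0, 0), (W, 0, 0), (W, D, 0), (0, D, 0),
    (0, 0, H), (W, 0, H), (W, D, H), (0, D, H),
])

# Pitched roof: a closed gable profile on one end wall, extruded along the depth.
gable = rs.AddPolyline([(0, 0, H), (W, 0, H), (W / 2, 0, RIDGE), (0, 0, H)])
roof = rs.ExtrudeCurveStraight(gable, (0, 0, 0), (0, D, 0))

# An opening placeholder roughly where the door reads in the photo.
door = rs.AddBox([
    (W / 2 - 0.5, -0.05, 0), (W / 2 + 0.5, -0.05, 0),
    (W / 2 + 0.5, 0.05, 0), (W / 2 - 0.5, 0.05, 0),
    (W / 2 - 0.5, -0.05, 2.1), (W / 2 + 0.5, -0.05, 2.1),
    (W / 2 + 0.5, 0.05, 2.1), (W / 2 - 0.5, 0.05, 2.1),
])
```

Nothing in it is precise, but as a first pass it is a surprisingly legible translation of “cabin” into primitives.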

That was the moment something clicked for me.

This wasn’t just about technical performance. It was about language as a design medium.

From scripts to sentences: The reversal of digital design

Over the past two decades, designers have invested deeply in learning the tools of computation. Rhino. Grasshopper. Processing. Python. C#. We became fluent in technical systems because we believed that code would expand our creative range.

And it did. Programming made us more powerful. It let us produce complex geometries, test systems, and explore design space at scale. But this experiment made me realise that we’re now entering a new phase:
Technical expertise is less about knowing how to operate a tool,
and more about knowing how to talk about design with precision.

With AI models like Claude acting as co-creators or agents, the key question becomes, “Can you describe what you want clearly enough that an AI can model it?”
In a way, we’re coming full circle. Before parametric modelling and visual scripting, design was about conversation, diagrams, sketches. Now we’re designing with prompts, returning to language not as a documentation tool, but as an active part of the modelling process.

And that’s no small shift. It repositions the role of designers from tool operators to design narrators: people who can frame ideas, set constraints, and guide form-making through dialogue.

Implications for design practice and education

This has big implications for how we teach and practise design.

For students:

  • Prompt engineering might soon be as important as learning Rhino or Grasshopper.
  • Expressive clarity and conceptual thinking will shape how well you can use AI tools.

For educators:

  • It’s time to rethink “digital skills” — not just teaching tools, but teaching a language for design.
  • Assignments could shift toward describing design problems and iterating through conversational workflows.

For professionals:

  • Delegation to AI agents could change team dynamics, workflows, and what we consider “design work”.
  • Technical depth might matter less in the long run than the ability to orchestrate tools through intent.

This doesn’t mean code is dead. Far from it. But it does suggest that the next wave of design fluency will be verbal, not visual. That the most powerful designers might be those who can articulate design intent with precision, empathy, and creativity.

Hello! I'm Linus, an academic researching cognition, behaviour and technologies in design. I am currently writing about AI in Design, academia, and life.