AI-native vs AI-bolted-on: a category distinction for 3D tools
Pillar · editorial
# What "AI-bolted-on" looks like
- An "AI" button appears on the canvas — clicking it opens a sub-modal that produces an asset and asks where to place it.
- Plugins for desktop DCCs, like Blender GPT, emit code that breaks when the host app updates.
- The AI is opt-in; the rest of the tool exists without it.
# What "AI-native" means
- The scene graph and the AI are co-designed. Tool calls map 1:1 to scene mutations.
- Prompt is a first-class input alongside mouse and keyboard.
- The system improves as base models improve: you can swap Cerebras for Claude for GPT-5 with a config change.
- Without the AI, the tool still works — but the AI is the primary mode, not an extra.
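The 1:1 mapping between tool calls and scene mutations can be sketched roughly like this. All names here (`SceneNode`, `ToolCall`, `applyToolCall`) are hypothetical illustrations, not Yugma's actual API:

```typescript
// Hypothetical sketch: each AI tool call is exactly one scene-graph mutation.
type SceneNode = { id: string; kind: string; props: Record<string, unknown> };

// A discriminated union of tool calls, shaped so the model can emit them directly.
type ToolCall =
  | { tool: "addNode"; node: SceneNode }
  | { tool: "setProp"; id: string; key: string; value: unknown }
  | { tool: "removeNode"; id: string };

// The scene graph is a flat map; applying a tool call is one mutation, no glue layer.
function applyToolCall(scene: Map<string, SceneNode>, call: ToolCall): void {
  switch (call.tool) {
    case "addNode":
      scene.set(call.node.id, call.node);
      break;
    case "setProp": {
      const node = scene.get(call.id);
      if (node) node.props[call.key] = call.value;
      break;
    }
    case "removeNode":
      scene.delete(call.id);
      break;
  }
}
```

The point of the sketch is the co-design: because the scene graph is shaped for mutation by tool calls, there is no translation layer for a host-app update to break.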
# Why the distinction matters
Bolted-on AI tends to lag. Every Blender update risks breaking community plugins. Every API change in a 3D editor risks breaking the AI plugin layer. AI-native systems, by contrast, ride the model wave: when a better LLM ships, the entire system gets better without code changes.
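"Better without code changes" is ultimately an architecture claim: the model sits behind a thin, provider-agnostic interface, so upgrading is a data edit, not a rewrite. A minimal sketch, with all provider and model names as illustrative placeholders rather than any product's real config:

```typescript
// Illustrative sketch: which model powers the editor is config data, not code.
// Provider and model names below are placeholders, not a real product's values.
interface ModelConfig {
  provider: string; // e.g. "cerebras" | "anthropic" | "openai"
  model: string;    // provider-specific model id
}

const registry: Record<string, ModelConfig> = {
  cerebras: { provider: "cerebras", model: "llama-3.3-70b" },
  claude: { provider: "anthropic", model: "claude-sonnet" },
  gpt5: { provider: "openai", model: "gpt-5" },
};

// Swapping the system's brain is a one-line change to which entry is active.
function selectModel(name: string): ModelConfig {
  const cfg = registry[name];
  if (!cfg) throw new Error(`unknown model config: ${name}`);
  return cfg;
}
```

A bolted-on plugin, by contrast, typically hard-codes its integration against one host-app version and one model vendor, so both kinds of upstream change can break it.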
The same dynamic played out in code editors: Cursor was AI-native, while AI extensions grafted onto VS Code were bolted-on. Both had a moment; AI-native won.
# FAQ
Is Spline AI-native or AI-bolted-on?
Bolted-on, by their own positioning. The AI features were added to the existing Spline editor; the editor works fine without them.
Is Yugma AI-native?
Yes — the scene graph is shaped for tool-call mutation, and the primary input is the prompt. You can use Yugma without the AI, but the editor is designed AI-first.
Will bolted-on AI catch up?
Some will, especially if they re-architect. Most won't, because re-architecting an existing tool around a new primary input mode is a hard reset.