I want to be clear about something from the start: I am not anti-AI.
I use AI tools daily. They have made parts of my workflow faster, helped me think through problems differently, and opened up creative possibilities I would not have explored otherwise. The technology is remarkable, and I believe it will continue to produce genuine value across almost every industry.
This is not a Luddite manifesto. This is a letter from someone who has spent decades building an aesthetic practice, who sees the power of AI clearly, and who is asking a very specific question:
Why should the years I spent developing my visual voice be available for free to any company that decides to scrape the internet?
The thing about style
When people talk about AI art scraping, the conversation tends to focus on individual images. Someone's painting shows up in a training dataset. A photographer's portfolio is indexed by a crawler. An illustrator's work appears in the output of a generative model.
These are real problems. But they are not the deepest problem.
The deepest problem is style.
An aesthetic style is not a single image. It is the accumulated result of every creative decision you have ever made. The colour palettes you gravitated towards in your twenties. The compositional habits you developed through thousands of hours of practice. The techniques you learned from mentors, refined through failure, and eventually made your own.
Your style is your creative fingerprint. It is what makes clients hire you specifically, not just any artist who works in a similar medium. It is the reason someone can scroll past a hundred images and stop at yours. It is, in the most literal sense, your professional identity.
When an AI model trains on your work, it does not just copy your images. It learns the statistical patterns underlying your aesthetic choices. It absorbs your style — the thing that took you twenty years to build — and makes it available to anyone with a text prompt.
This happens in seconds. The scraper does not care that those colour choices reflect a decade of experimentation. The training pipeline does not know that your compositional instinct was shaped by years of studying classical painting. The model does not understand that the thing it has just decomposed into weights and parameters represents someone's life's work.
What I am not arguing
I am not arguing that AI should not exist. I am not arguing that generative models have no legitimate use. I am not arguing that technology should stop progressing.
I am arguing that artists should have control over whether their aesthetic is used as training material. That is it. That is the entire position.
If a company wants to train a model on my work, there should be a conversation. There should be consent. There should be compensation that reflects the value of what they are taking.
This is not a radical position. It is how every other form of intellectual property works. Musicians license their recordings. Writers control their publishing rights. Software developers choose open-source or proprietary licensing for their code. In each case, the creator decides how their work is used and on what terms.
Visual artists deserve the same agency.
The consent gap
The current system has no mechanism for this. Crawlers do not ask permission. Training datasets are assembled in bulk, with no individual notification or opt-in process. The legal frameworks are years behind the technology. And the practical tools available to artists — robots.txt, Do Not Train flags, opt-out registries — are trivially easy to ignore.
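The weakness of robots.txt is structural: it is a request, not a barrier. A short sketch using Python's standard-library parser makes the point. The robots.txt content and the `GPTBot` user-agent here are just illustrative; any crawler that simply never runs the check fetches the page regardless.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical artist's robots.txt forbidding an AI training crawler.
rules = """
User-agent: GPTBot
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A well-behaved crawler asks first and gets "no":
print(rp.can_fetch("GPTBot", "https://example.com/portfolio/"))  # False

# But nothing in the protocol enforces that call. A scraper that skips
# it downloads the portfolio anyway; the file is purely advisory.
```

That advisory nature is the whole consent gap in miniature: the artist can state terms, but compliance is voluntary.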
The result is a consent gap. Artists share their work online because that is how the modern creative economy works. Portfolios, social media, client galleries — visibility is not optional. But sharing work publicly was never intended as a blanket license for commercial extraction.
When I post a painting to my portfolio, I am showing potential clients what I can do. I am not consenting to have my aesthetic decomposed and used to train a product that competes with me. These are fundamentally different acts, and the technology industry's conflation of them is not an accident. It is a business model.
Why I built Art Vault
I built Art Vault because I wanted a tool that matched the seriousness of the problem.
The consumer tools that existed — and I used them, and I appreciated them — were a starting point. But they were designed as individual shields, not as institutional infrastructure. They relied on single techniques that were eventually bypassed. They had no update mechanism to respond when they were defeated.
Art Vault takes a different approach. Three layers of adversarial protection, updated quarterly. Cloud-based processing for consistent quality. A service model that funds the ongoing research needed to stay ahead of bypass techniques.
Your art looks exactly the same after protection. The colours, the composition, the texture — all visually identical. But the statistical fingerprint that AI models extract for style mimicry has been scrambled. The model chokes on it. Your aesthetic is no longer legible to the machine.
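The actual protection layers are the product, and I am not going to sketch those here. But the constraint they operate under, that every pixel change stays below a perceptual budget, is easy to illustrate. This toy snippet uses plain random noise, not the adversarial optimisation Art Vault performs, purely to show what "visually identical" means in numbers: no channel of any pixel moves more than about 2/255.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.uniform(0.0, 1.0, (256, 256, 3))  # stand-in for an artwork

epsilon = 2 / 255  # per-channel budget, below the threshold of perception
perturbation = rng.uniform(-epsilon, epsilon, image.shape)
protected = np.clip(image + perturbation, 0.0, 1.0)

# Every pixel stays within the imperceptibility budget.
max_change = float(np.abs(protected - image).max())
print(max_change <= epsilon)  # True
```

Real style-cloaking replaces the random noise with a perturbation optimised against feature extractors, so the same invisible budget produces the largest possible disruption to what a model learns.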
That is all I wanted to build. Not a weapon against AI. Not a political statement. Just a practical tool that gives artists the one thing the current system denies them: the ability to share their work without involuntarily surrendering their creative identity.
The principle
If an aesthetic is to be scraped, it should be at the artist's discretion, and for a fee they agree to. Not taken arbitrarily, not justified after the fact, not hand-waved away as the cost of progress.
Decades of craft should not be available for free extraction because you had the audacity to share your work online.
The control should be with the artist. Full stop.
This is what Expression Labs exists to make possible. Not by fighting against AI — which is here to stay, and which I will continue to use — but by building the infrastructure that ensures creative professionals maintain sovereignty over the work that defines them.
Your aesthetic is not a dataset. It is the visible evidence of a life spent creating.
It is worth protecting.