Our Seed Round of Funding to Build the Creator Consent Layer for AI
Sharing the Spawning roadmap
As covered in TechCrunch today, we’re delighted to announce that Spawning has raised $3 million in a seed round led by early-stage investor True Ventures. Other investors include Seed Club Ventures, Abhay Parasnis, Charles Songhurst, Balaji Srinivasan, Jacob.eth, and Noise DAO.
At Spawning, we're dedicated to creating tools that manage artist creations and digital identity within the rapidly evolving AI landscape. We envision a future where data collection is consensual — benefiting AI development and the people whose work it impacts. Our seed funding allows us to continue developing IP standards for the AI era and empowering creators worldwide. And this year, we’re establishing robust opt-out and opt-in standards as a public good.
What are opt-out and opt-in standards?
With “opt-outs,” we’re making it possible for creators to remove their work from datasets that were used to train AI without their consent. With “opt-ins,” we’re making it possible for creators to explicitly license their work for training AI models.
Our Vision
Spawning is optimistic that creators will thrive in an era of Generative AI. As a first step, we feel artists need to have control over whether and how they participate. With HaveIBeenTrained, we made opting out of AI systems possible at scale. In March, we delivered 80 million image opt-outs to StabilityAI ahead of their training of Stable Diffusion v3. Less than two months later, we have exceeded 1 billion opt-outs.
How we’ll use the funds
In the coming months, we will make it effortless for AI model trainers to honor opt-out requests, and we’ll streamline the opt-out process for creators. We’ll also offer more services to organizations seeking to protect the work of their artists — building upon the support we already provide to a growing list of content partners such as Artstation and Shutterstock.
Our robust consent standard will ensure that no matter where creators post their work, their wishes for how it interacts with the AI economy will be respected.
Opt-Out Roadmap
Domain Opt-Outs
In March, we enabled domain opt-outs, which allow creators and content partners to quickly opt out content from entire websites. As of today, we’ve opted out over 30K domains.
Spawning API
Today, we’ve released our API and an accompanying open-source Python package to make respecting creator requests simple for AI model trainers. This release also decouples our service from the LAION-5B dataset, so we can preemptively opt out creative work of all types, including images, text, audio/video, and more. As of today’s announcement, our opt-out list has grown to over one billion items of creative work!
Our API will aggregate opt-out requests made on our partners’ platforms, ensuring that requests made anywhere are honored everywhere, all while preserving creator privacy.
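To make the intended workflow concrete, here is a minimal sketch of how a model trainer might check a training manifest against an opt-out list before training. The endpoint URL, request shape, and function names below are illustrative assumptions only, not the actual Spawning API or Python package interface; they show the general pattern of filtering data by consent status.

```python
# Illustrative sketch only: the endpoint and response format are hypothetical,
# not the real Spawning API. The pattern shown is "check every URL in a
# training manifest against an opt-out service, keep only permitted items."
import requests

OPT_OUT_ENDPOINT = "https://api.example.com/opt-out/check"  # hypothetical URL
BATCH_SIZE = 1000


def filter_opted_out(urls: list[str]) -> list[str]:
    """Return only the URLs whose creators have not opted out of AI training."""
    allowed = []
    for start in range(0, len(urls), BATCH_SIZE):
        batch = urls[start:start + BATCH_SIZE]
        # Hypothetical request shape: POST a batch of URLs, receive a
        # per-URL boolean indicating whether training is permitted.
        response = requests.post(OPT_OUT_ENDPOINT, json={"urls": batch}, timeout=30)
        response.raise_for_status()
        permitted_flags = response.json()["permitted"]
        allowed.extend(url for url, ok in zip(batch, permitted_flags) if ok)
    return allowed


if __name__ == "__main__":
    manifest = [
        "https://example.com/image1.png",
        "https://example.com/image2.png",
    ]
    print(filter_opted_out(manifest))
```

In practice, a trainer would run this kind of filter once over a dataset manifest before any images are downloaded, so opted-out work never enters the training pipeline.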
Duplicate Detection
In May, we’ll release exact-duplicate detection, which will match opted-out images with copies that we find across the web, and will automatically opt those out of training so creators don’t have to play whack-a-mole with new datasets.
In June, we’ll release near-duplicate detection, which will notify visual artists when we find likely copies of their work. Picture a single, simple interface that presents a visual artist with cropped, compressed, and slightly modified copies of their opted-out images from all over the web.
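One common way to picture exact-duplicate detection is content fingerprinting: hash the bytes of each opted-out file, then check new dataset entries against those fingerprints. The sketch below is a generic illustration under that assumption, not a description of Spawning’s pipeline.

```python
# Generic sketch of exact-duplicate detection via content hashing.
# Hashing lets copies be matched against an opt-out list without
# storing or redistributing the images themselves.
import hashlib


def content_hash(image_bytes: bytes) -> str:
    """Return a stable fingerprint of a file's exact bytes."""
    return hashlib.sha256(image_bytes).hexdigest()


# Fingerprints of images whose creators have opted out (placeholder data).
opted_out_hashes = {
    content_hash(b"...bytes of an opted-out image..."),
}


def is_exact_duplicate(candidate_bytes: bytes) -> bool:
    """True if the candidate file is byte-identical to an opted-out image."""
    return content_hash(candidate_bytes) in opted_out_hashes
```

Byte-level hashing only catches identical copies; cropping, recompression, or watermarking changes the bytes entirely, which is why near-duplicate detection, as described above for June, needs a different approach such as perceptual hashing or embedding similarity.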
Chrome Browser Extension
Also in June, we’ll release a browser extension for Chrome. This extension will allow creators to preemptively opt out their work posted anywhere on the web.
Caption Search
Finally, in July, we’ll add caption search. Our current search options use Clip Retrieval to find fuzzy matches between text and images, and URL search to find content hosted on specific websites. Caption search, which lets visual artists search image descriptions directly, is the last piece of the opt-out puzzle.
Beyond Opt-Outs
We’re building the consent layer for generative AI. We imagine a world where creators have control over how they appear in AI models, and view a consent layer as a necessary step to encourage creators to experiment with making a living in the AI economy.
Moving forward, we’ll continue to support our opt-out infrastructure indefinitely while expanding existing partnerships and building new ones to ensure the widest possible adoption of, and respect for, creator requests.
With an opt-out standard secured, we’ll shift our focus to opt-ins and the many opportunities we believe they will unlock for creators.
We’re excited to start this next chapter, and we look forward to sharing more this fall!