On March 19, 2026, Pearl Abyss launched Crimson Desert, its highly anticipated $70 AAA open-world RPG. Within 24 hours, players discovered AI-generated artwork scattered throughout the game — distorted portraits fusing humans and horses, medieval paintings with obvious AI artifacts. The developer issued a public apology, added AI disclosure tags to Steam, and began patching out the offending assets. But this is becoming a pattern.
The “Placeholder Excuse” Is Running Thin
Pearl Abyss claimed the AI assets were “experimental tools” for early iteration that were “always intended to be replaced before shipping.” This is the exact same excuse Clair Obscur: Expedition 33 used when it lost its Indie Game Awards wins in December 2025 after AI assets were discovered.
Moreover, game developers responded to the Crimson Desert controversy by sharing what real placeholder art looks like: MS Paint nonsense, meme images, bright pink squares. The message: real placeholders are intentionally terrible to ensure they get replaced. AI-generated images look polished enough to ship, which is exactly the problem.
This reveals a process failure, not an honest mistake. If AI art can slip into a $70 AAA game’s final build, the studio’s QA standards are broken. As Techdirt puts it: “If you ship it, you own it — ‘placeholder’ doesn’t excuse shipping AI slop.”
The Industry Disconnect: 97% Adoption vs 85% Player Rejection
97% of game developers now use AI-assisted tools for some form of asset creation, and 50% of studios actively deploy AI in their production pipelines. Meanwhile, 85% of gamers hold negative attitudes toward AI in games. That fundamental tension is now exploding into public controversies.
AAA budgets now run $80-300M+ with development times of 5-7 years. Studios are turning to AI for texture upscaling, retopology, and set dressing to manage ballooning costs. But a critical distinction is being lost: AI-assisted workflows (where humans refine AI output) versus AI-generated content that ships to players. Studios are crossing that line, and players are noticing.
Economic pressures don’t excuse shipping substandard work. The issue isn’t AI tools during development — it’s what ships to players. Studios need to draw a bright line: AI can accelerate the process, but human artists must own the final output that players see.
Transparency Is Non-Negotiable
Steam updated its AI disclosure policy in January 2026, requiring developers to disclose pre-generated AI content that “ships with the game files.” Crimson Desert had no AI disclosure until players discovered the assets — then Pearl Abyss added the Steam tag post-controversy.
Pearl Abyss admitted: “We should have clearly disclosed our use of AI.” But the disclosure came only after the studio was caught, not upfront. This is the transparency problem in a nutshell — studios are using AI and hoping players won’t notice rather than being transparent from the start.
Transparency is the dividing line between acceptable and unacceptable AI use. Players can understand AI-assisted development if disclosed honestly. What they won’t tolerate is being deceived — which is exactly how post-facto disclosure feels.
Premium Prices Demand Premium Quality
When players pay $70 for a AAA game, they expect premium quality. AI placeholder art in the final build violates that social contract. Studios can’t have it both ways — charging premium prices while cutting corners with AI assets that were “forgotten” in the shipping build.
Crimson Desert launched to “Mixed” Steam reviews, with AI art controversy contributing to player dissatisfaction. The reputational damage from shipping AI assets may cost more than hiring artists to create proper assets would have. Short-term savings aren’t worth long-term trust damage.
Here’s where we draw the line: Premium games deserve premium quality, period. AI can be part of the development pipeline, but what ships to players must be human-crafted or at minimum human-refined to professional standards. “Placeholder” is not a valid excuse for shipping work to paying customers.
The Path Forward
The US Copyright Office has ruled that purely AI-generated works cannot be copyrighted, creating massive IP risk for AAA studios. This legal reality is forcing hybrid workflows in which human artists must significantly modify AI output to maintain copyrightability. Studios are adopting a two-tier approach: AI-generated scale (terrain, vegetation, generic NPCs) paired with human-crafted hero assets (main characters, signature pieces).
The industry faces a choice: transparent AI use with human oversight, or continued controversies that erode player trust. Steam’s disclosure requirements are tightening. Award organizations are taking hard stances against undisclosed AI use. And players are getting better at spotting AI artifacts.
The solution isn’t rejecting AI tools — it’s maintaining quality standards and transparency. AI for scale, humans for craft. Disclosure upfront, not after being caught. Process controls that prevent AI from shipping unrefined. The technology isn’t the problem. The shortcuts are.