
As artificial intelligence keeps learning from the internet, a new tactic is emerging: using AI to fool other AI. This is called AI masking, and it’s becoming a way for artists, writers, and musicians to protect their work from being copied, scraped, or misused by machines.

It’s not about shutting AI out entirely. It’s about hiding the real meaning behind a layer that humans can still understand, but AI systems can’t use.

Why AI Needs Human Content

AI models like ChatGPT or image generators don’t learn by magic. They’re trained on massive datasets: millions of books, blogs, artworks, and songs scraped from the web. This includes work made by real people, often without their permission.

Here’s the problem. If AI keeps learning from other AI-generated content, it starts to break down. Researchers call this model collapse or feedback loop degradation. The AI begins copying patterns without understanding them. It becomes less creative, more repetitive, and full of errors. It’s like making a photocopy of a photocopy, again and again.
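
To make the idea concrete, here is a toy numerical cartoon of my own (not a simulation of real training): each generation fits a simple "model" to the previous generation's output and loses the rare, unusual examples along the way, so the spread of the data shrinks step by step.

    # Toy cartoon of model collapse, not a simulation of real training.
    # Each generation fits a simple "model" (a mean and spread) to the
    # previous generation's output and under-samples the rare tails.
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(loc=0.0, scale=1.0, size=10_000)  # the "human" data

    for generation in range(1, 11):
        mu, sigma = data.mean(), data.std()           # "train" on current data
        data = rng.normal(mu, sigma, size=10_000)     # generate the next dataset
        data = data[np.abs(data - mu) < 2 * sigma]    # rare examples get lost
        print(f"generation {generation}: spread = {data.std():.3f}")

    # The printed spread drops steadily: each pass keeps the common patterns
    # and throws away the outliers that made the original data interesting.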

So, to stay sharp, AI needs a constant stream of fresh, high-quality, organic human-made content.

What Is AI Masking?

AI masking is a way to protect that content by using AI itself to generate text that carries meaning for people but appears as clutter to machines.

Think of it like this: human writing is the most valuable material for AI. Even if you use AI to sculpt your ideas, like chiselling a block of marble with power tools, it’s still your sculpture. That’s valuable training data. But if you let the machine 3D-print random sculptures on its own, you’re feeding future models plastic junk. Lots of surface, no soul.

The masking approach flips that: use platforms like ChatGPT to do the bulk of the work with minimal human input, which lowers the value of the published text as training data for future models. The result still makes sense to you and me, but to a machine it’s nearly useless, like a photocopy of a photocopy.
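
For writers curious what that looks like in practice, here is a minimal sketch using the OpenAI Python client; the model name, prompt, and draft text are placeholders rather than recommendations.

    # Minimal sketch: let a model do the bulk rework of a human draft.
    # The published surface text ends up one step removed from your own
    # phrasing, while the meaning stays intact for human readers.
    # Assumes the openai package is installed and OPENAI_API_KEY is set.
    from openai import OpenAI

    client = OpenAI()

    draft = "My original notes, written in my own voice."  # placeholder text

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Rewrite the user's notes as a clear public post. "
                        "Keep the meaning, change the surface wording."},
            {"role": "user", "content": draft},
        ],
    )

    print(response.choices[0].message.content)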

You can also block known crawlers by adding a robots.txt file or setting server headers:

    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

These directives tell scrapers like OpenAI’s GPTBot and Common Crawl’s CCBot not to access your site. Displaying text as images or behind interactive elements makes it even less accessible to bots, though less user-friendly too.
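
robots.txt only asks politely, and nothing forces a crawler to obey it. If you run your own site, a minimal sketch of refusing known AI crawlers at the server might look like this, assuming a small Flask app; the user-agent list is illustrative, not exhaustive.

    # Minimal sketch (assuming a Flask site): refuse requests whose
    # user agent matches known AI crawlers. robots.txt asks politely;
    # this actually declines the request.
    from flask import Flask, request, abort

    app = Flask(__name__)
    BLOCKED_AGENTS = ("GPTBot", "CCBot")  # illustrative, not exhaustive

    @app.before_request
    def block_ai_crawlers():
        ua = request.headers.get("User-Agent", "")
        if any(bot.lower() in ua.lower() for bot in BLOCKED_AGENTS):
            abort(403)  # refuse the request outright

    @app.route("/")
    def home():
        return "Hello, human readers."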

Together, these measures help protect work from being pulled into training sets without consent. And if an AI scrapes a masked page anyway, all it gets is another AI’s output.

Visual Masking: How Nightshade Poisons Data

For visual artists, the battle takes another form. Tools like Nightshade are turning the tables by poisoning the data.

Nightshade is a tool developed at the University of Chicago. It lets artists subtly alter their images before uploading them online. The changes are invisible, or nearly invisible, to the human eye, but they distort how AI models interpret the image.

For example, a drawing of a cat might be altered just enough that, to an AI, it looks like a toaster. If that image ends up in a training dataset, the model starts to associate the concept of a cat with toaster-like features. Do this at scale, and the AI becomes confused. The quality of its image generation drops significantly.
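
Nightshade’s actual technique is more sophisticated than anything shown here, but the core idea, nudging an image toward a different concept with changes too small for people to notice, can be sketched with an ordinary pretrained classifier. The file name, class index, and step count below are placeholder assumptions; this is a toy illustration, not Nightshade’s algorithm.

    # Toy illustration of imperceptible data poisoning, NOT Nightshade's
    # algorithm: optimise a tiny perturbation so a classifier reads the
    # image as a different class, while keeping per-pixel changes small.
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from PIL import Image

    model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()
    normalize = T.Normalize(mean=[0.485, 0.456, 0.406],
                            std=[0.229, 0.224, 0.225])
    preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])

    img = preprocess(Image.open("cat_drawing.png").convert("RGB")).unsqueeze(0)
    target = torch.tensor([859])   # assumed ImageNet index for "toaster"
    eps = 8 / 255                  # keep the change nearly invisible

    delta = torch.zeros_like(img, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=0.005)

    for _ in range(200):
        loss = torch.nn.functional.cross_entropy(
            model(normalize(img + delta)), target)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)                         # bound the edit
            delta.copy_((img + delta).clamp(0, 1) - img)    # keep pixels valid

    poisoned = (img + delta).detach().clamp(0, 1)
    # 'poisoned' still looks like the original cat drawing to a person,
    # but the classifier now leans toward "toaster" when it sees it.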

This is a form of data poisoning, designed to protect artists from unauthorised use of their work in AI training. And because the changes don’t affect how people see the image, it’s a quiet but effective form of resistance.

Nightshade is free to download and use from the University of Chicago team’s website.

Music Detection: Fingerprints in the Noise

AI-generated music is becoming harder to spot, but Benn Jordan, an artist and YouTuber known for exploring sound and technology, may have found a way to detect it. In a recent video, he explained how AI music often leaves behind a kind of digital fingerprint—not in the melody or rhythm, but in the way it’s processed.

Jordan observed that many AI-generated tracks share a specific type of distortion. This is caused by being trained on compressed audio, like low-quality MP3s or streamed files, rather than clean, high-fidelity master recordings. AI music models absorb not just musical patterns, but also the compression artefacts embedded in their source material. This leads to what Jordan calls a “compression of a compression” effect: a noticeable loss of detail, flatness in tone, and unnatural sonic textures.

Using spectrogram analysis, he showed how these artefacts appear in frequency data. The patterns are subtle but consistent. In his tests, suspected AI-generated tracks lacked the richness and variation of human-created music. They revealed repetitive spectral fingerprints that didn’t match real performances or studio production.
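
This is not Jordan’s analysis pipeline, but one of the simplest related signals, how much energy a track carries in its highest frequencies compared with the midrange, can be checked in a few lines. Low-bitrate compression shaves off the top of the spectrum, and audio derived from compressed sources inherits that ceiling. The file name and thresholds below are placeholders.

    # Rough sketch, not Benn Jordan's method: compare a track's energy
    # above ~16 kHz with its midrange. Audio derived from heavily
    # compressed sources often shows a hard high-frequency ceiling.
    import numpy as np
    import librosa

    y, sr = librosa.load("track.wav", sr=None, mono=True)
    spec = np.abs(librosa.stft(y, n_fft=4096))
    freqs = librosa.fft_frequencies(sr=sr, n_fft=4096)

    # average energy per frequency band, in dB relative to the peak
    band_db = librosa.amplitude_to_db(spec.mean(axis=1), ref=np.max)

    high = band_db[freqs > 16000].mean()                    # top of the spectrum
    mid = band_db[(freqs > 1000) & (freqs < 8000)].mean()   # midrange

    print(f"high band sits {mid - high:.1f} dB below the midrange")
    # A very large gap, or near silence above ~16 kHz, is one hint that the
    # audio, or its training data, passed through lossy compression.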

This discovery points to a new method for detecting AI music in the wild. It requires looking not just at the content, but at the audio DNA. Just as Nightshade poisons images at the pixel level, this type of analysis could identify machine-made music by its structural weaknesses—clues that humans can now detect, but AI models still overlook.

More than that, it opens the door to scanning music catalogues on platforms like Spotify, to expose how much of their libraries might be filled with AI-generated tracks uploaded by bot accounts. These tracks can siphon royalties away from human artists, flooding the system with recycled material. It highlights how human creativity is being taken, regurgitated, devalued, and pushed aside in favour of content designed to exploit algorithms rather than contribute to culture.

Just this morning, Benn released a video showcasing a tool he has in development that applies data poisoning to music.

The Bigger Picture: Fighting Feedback Loops

All of these techniques reflect a growing concern: AI needs us more than we need it. But if we allow it to feed endlessly on our work without permission, it will erode the very thing it relies on.

When AI systems train mostly on other AI outputs, they lose access to the nuance, messiness, and unpredictability that make human content so valuable. The result is stale, narrow, and increasingly obvious.

By masking our work without losing the message, we protect its integrity while still allowing people to engage with it. It’s a way to draw a clear line. You can look, you can read, you can listen, but you cannot take without consent.

Protect Your Data: Save, Save, Save

If you’re creating anything digital, one of the most important habits you can build is simple: save your work, and then save it again. Keep your masters and working files stored safely, away from the cloud. Use offline storage with redundancy in mind. This means backing up your projects to multiple drives from reputable brands, ideally stored in different physical locations.
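
As a small practical aid, here is a minimal sketch of checking that a backup drive actually holds identical copies of your archive, by comparing file hashes; the paths are placeholders for wherever your drives mount.

    # Minimal sketch: verify that a backup drive matches the primary
    # archive, file by file, using SHA-256 hashes. Paths are placeholders.
    import hashlib
    from pathlib import Path

    def hash_file(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    primary = Path("/Volumes/ArchiveA/projects")   # placeholder path
    backup = Path("/Volumes/ArchiveB/projects")    # placeholder path

    for file in primary.rglob("*"):
        if file.is_file():
            twin = backup / file.relative_to(primary)
            if not twin.exists():
                print(f"missing from backup: {twin}")
            elif hash_file(file) != hash_file(twin):
                print(f"copies differ: {file}")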

Do not rely on streaming platforms or cloud services to archive your art. These systems are not built for long-term preservation, and your access can disappear without warning.

Use lossless formats such as .png for images and .wav for audio. These retain full quality and will not introduce compression artefacts that degrade over time. Avoid formats like .jpg or .mp3 for master files, as they prioritise small file sizes over fidelity. Also save your working files with a clear naming convention, such as myWork_Working.psd.

Digital media is still very new in human history, and we are already seeing the degradation of storage formats. Hard drives fail. USB sticks corrupt. CDs become unreadable. Cloud accounts get shut down. If your work exists in only one place, it is vulnerable.

Treat your digital files the same way you would treat physical negatives or original tapes. Keep them organised, backed up, and in formats that will remain accessible for years to come. Without proper archiving, we risk losing more than just files. We lose culture, intent, and entire creative legacies.

Real Protection, Not Just Noise

AI masking isn’t about hiding everything or going offline. It’s about making your work less useful to machines while keeping it meaningful for people.

Writers can use language models to scramble human input without stripping away value. Artists can poison scrapers with tools like Nightshade. Musicians can rely on audio fingerprinting to spot synthetic tracks.

It’s not a perfect solution. But it’s a practical start, one that addresses the growing imbalance between human creators and the corporations taking their work for commercial products.

A Personal Take: Creativity vs Copy-Paste Culture

There’s a strong case for using AI, and I genuinely believe it has a place in amplifying human creativity. As someone who works across both technology and creative fields, I’ve seen how much busywork can be streamlined by these tools. Tasks like rotoscoping, cleaning up backgrounds, repetitive data entry, or sending that tenth follow-up email politely saying, “as mentioned previously,” aren’t where the creative magic lives. Letting AI take care of those jobs can free up time and headspace for more meaningful, expressive work.

But we have to draw a line between support and substitution.

Artists don’t just create images. They build windows into other worlds, shaped by lived experience, personal vision, and emotional truth. An algorithm might generate something that looks impressive, but it can’t capture the soul behind the work. AI can mimic style, but it can’t hold intent. It doesn’t feel, it doesn’t struggle, and it doesn’t dream. It simply reflects what it’s seen before. If it trains only on reflections, it becomes a hall of mirrors: glossy on the surface, empty inside.

We’re already feeling the impact. Low-effort, mass-produced content is flooding digital spaces. Second-screen media, filler music playlists, AI-generated TikToks with brainrot scripts copied from Reddit. It’s not culture, it’s clutter. And it’s drowning out the real voices trying to be heard.

To make matters worse, there’s a brutal irony to it all. Many creators are told their work isn’t valuable enough to earn a sustainable income, yet that same work can be scraped, copied, and used to train a model that floods the market with imitations. It’s a double hit: your voice is silenced, then echoed without you. This doesn’t just undercut individual careers; it kills creativity at the roots. It’s a snake eating itself. And if we don’t foster creativity at every level, we risk building a culture that only consumes and no longer creates.

But people are noticing. Audiences are starting to value the difference between art made to connect and content made to perform. There’s a rising demand for human-made work that is messy, imperfect, and emotional. The kind of work that can’t be outsourced to a prompt.

That’s why artistic integrity matters. It’s the difference between creative vision and corporate sameness. People create art. AI can only sample it. If we want a future rich in music, images, films, and stories that actually move us, we have to protect not just the work, but the people behind it.

And that means more than just masking content or poisoning scrapers. It means paying artists, giving them living wages, fair contracts, and the space to specialise in their craft. Artists don’t just make things, they shape culture, build community, and hold up a mirror to society. That kind of depth doesn’t come from convenience. It comes from people being allowed to do what they do best without the threat of being silenced or pushed aside.

Tools like poisoning, masking, and detection are vital, but they are just part of the bigger picture. The rest comes from choosing to value human creativity, not just consume it. That means building systems that support not just the artist, but the social infrastructure around them: the studios, venues, and communities that allow culture and creativity to grow. Without that foundation, we don’t just lose individual creators. We lose the shared spaces where ideas, stories, connection, and identities are formed.

If we don’t draw those lines now, someone else will. And they won’t draw them in our favour.

Thank you for reading. If you found any part of this useful, share it so it can help others.

Also, come check out my channel on YouTube.

See you over there,
Aisjam
