Intellectual Inertia: The Death of New Ideas
- Liana Yadav

- Apr 30
Technology is taking us further, but holding creativity back.
Creativity is a non-renewable resource, and it is currently facing a very real threat of being exhausted.
Art across disciplines has become particularly referential as of late. Current-day aesthetics—office siren, old money, coquette—reference past fashion trends instead of curating an entirely new look to define this generation. Magazine covers feature tried-and-true faces of actresses, musicians, and iconic supermodels instead of discovering new talent. Hollywood greenlights yet another blockbuster or reboot instead of taking a chance on an independent filmmaker. Powerful forces of automation and capitalism are cultivating a system that prevents fresh ideas from emerging. As a result, culture and creative expression today are defined by nostalgia.
Most online content is generated on the back of something else. Trends are copied until no one remembers why they were even started. Cultural theorist Paul Virilio coined a term for this in 1989, “Polar Inertia.” He described it as the “instantaneous present that has replaced space: everything happens without the need to go anywhere.”
The popular nature vs. nurture debate usually leads to the same conclusion: while a trait like creativity can be honed over time, it is for the most part inherent in all of us. People have fresh ideas from the moment they are born. But for these ideas to grow, they need a place where they can be nurtured, appreciated, and criticized, so that their value increases with each round of reflection. This is why ideas thrive in collaborative spaces like studios and classrooms, and grow better and more evolved when bounced off other ideas.
As children grow older and internet use (and with it, AI) dominates most spaces they function in, the need to generate new ideas decreases. It’s not that there aren’t new ideas; it’s that there is no incentive to generate them. Creativity isn’t dead so much as it is lying on its back, sputtering heavy wheezes of breath as it whimpers for help, a last battle cry before the AI monster crushes it into oblivion.
The Death of Opinions
Being “woke” (i.e., being aware of the rotten systems that plague the world) is an important quality to possess today. Wokeness is associated with a certain political structure; it is so intrinsically tied to liberal ideologies that a person with conservative beliefs may refuse to adopt it, even if parts of it apply to them. Being woke becomes not just about being in the know, but about being in the know about the right thing. The right thing, in this context, is measured by popular ideologies and public opinion online. These voices call for a knowledge of the past as we create our future. After so many years of blatant discrimination—racism, classism, misogyny—wokeness puts pressure on people to question the unconscious biases embedded in their belief systems.
Yet this online trend does not echo reality. Conservative politics have seen a resurgence around the world. Political leaders are outright undoing progress that took many painful, laborious years to achieve, trumping much hope for the future.
Where has the pressure to be “woke” really gotten us? The reason it hasn’t “changed the world” is that at the end of the day, being woke is a trend. The chances of people wholeheartedly accepting time-contingent ideals are slim.
All this online pressure does is ensure that people put up a performance while continuing their transgressions in private. Repost a tweet on eating the rich while you vacation at your summer home in the Hamptons; preach about uplifting women as you tear them apart behind closed doors; condemn the California wildfires as you contribute to carbon emissions dangerous enough to wipe out a whole country.
“Woke” is a superficial agenda. It has led to a false sense of being informed, the mistaken belief that the pockets of “news” we consume are enough to build a well-rounded understanding of the world.
The Death of Ignorance
There has never been a time when we have been so acutely aware of everything that is going wrong. Wars, blasts, fascism, discrimination—these injustices are a footnote in our lives. We cannot possibly ignore all this pain, so we succumb to being exposed to it.
Mindless entertainment and numbed hours spent on the internet—browsing, scrolling, watching—are a rebellion of sorts. It is both self-harm and self-protection to tether ourselves to the internet in a way that is now ingrained in our muscle memory. Any time you open social media, you open yourself to the possibility of watching, reading about, or hearing about an innocent person’s suffering. It happens often enough that every second spent scrolling becomes a waiting game for that moment.
The alternative, however, is even worse. The not-knowing. What happens when an injustice occurs behind your back? Somehow, you feel even more helpless. At least in a world that feels so far beyond saving, you can arm yourself with knowledge. It is one of the few things being distributed so extensively free of cost—or at least, free of any visible costs.
There is also the added restlessness of knowing every peril that awaits humanity. Nothing can be done, and yet. A share feels important, a supportive comment meaningful. You can’t donate all that you have, but when you do what you can, it feels good. The capitalists may have won our minds and our time, but they have not yet captured our hearts. Community is one of the strongest things social media gives us, and one that feels wrong to surrender by abandoning the platforms that hand it out.
David Foster Wallace predicted the age of irony we live in now, where any and all sincerity is mocked in a fashion that was once deemed rebellious. This mirrors today’s culture and media trends, which seem to exist solely to mock reality. Shows such as Severance and Black Mirror are so widely watched because they are a sadistic and tragically aesthetic (or aesthetically tragic) version of our lives.
Severance takes the myth of work-life balance in corporate culture to a garish extreme—what if a company could put a chip in your brain that creates a psychological divide between your work self and your real self? Compare this to pre-Y2K shows, say Friends, where we hardly see any satirical work mocking the systems of their world. A group of twenty-something New Yorkers hangs around in a cafe practically every day with no regard for money, their jobs, or the others who might want to sit on the couch they hog all day.
While it can be argued that Friends was this way because it reflected much simpler times, pre-internet, pre-pandemic, and pre-smartphone, the show still carried nuance in the capacity a comedy sitcom of its time could. There was relationship plight, character development (albeit surface-level), and a somewhat natural progression from the naïvete of youth to the somberness that comes with age, stretching beyond single episodes and lending the story continuity. It was a world of characters we could relate to, see ourselves in.
Characters in Severance, on the other hand, are dystopian caricatures. We live in a culture that is constantly mocking the culture we live in, and, as Wallace noted, rebellion seems to have become the norm. Our constant state of awareness has created a numbness to the very things we are aware of. If everything means something, then nothing means anything. If every waking second can be spent consuming ideas, no one will ever feel the need to come up with their own.
The Death of Combinations
This might be an important question to ask at this point: what is considered a new idea?
Many great minds have credited the other minds that led them to the discovery of their great ideas. In 1904, French mathematician Henri Poincaré called invention the art of choice, the intelligence of discerning which elements of “useless” existing ideas can be combined to create a “useful” one. Mark Twain once wrote a letter to Helen Keller after she, as a young girl, faced a plagiarism allegation over a short story. “No doubt we are constantly littering our literature with disconnected sentences borrowed from books at some unremembered time and now imagined to be our own, but that is about the most we can do,” he wrote, concluding that the plagiarism allegation was “ignorant rubbish made by solemn donkeys.” I wonder what Twain would say if he saw the secret algorithm that governs every successful piece of content on social media today. Trends are quite literally plagiarism: copies of an idea whose source is lost in a sea of recreation.
AI creates a new dilemma in this process. If we so willingly partake in this sort of regeneration of ideas, why is it a threat for AI to do the same?
All answers to this question can only be opinions. If backed by research, they can be called well-informed opinions, but opinions nonetheless. There is no way to tell what the future is going to look like, just as there was no way of telling, back in the year 2000, where the internet would take us. Many in the past have deemed their own times to be ones marked by posers who recreate styles instead of creating new ones, but this time, something is dramatically different.
We now have a machine that is copying at a rate collective humanity has never been able to match. The fear is that the machine will give rise to a time when even copying ideas to produce our own unique combination will become an obsolete human task. Why string three sentences together if AI can do it, possibly better? Writing them individually will seem even more unachievable, and conceiving of the idea behind them—unimaginable.
This is the threat that AI’s existence poses in spaces of creation. If AI is only re-churning existing work, we might face a massive idea shortage soon. The “dead internet theory” is a conspiracy theory that takes this fear to an extreme, suggesting that most of the content we see online is machine-generated. This is a scary thought. It suggests that new ideas are endangered, that we might be handing one of our most human activities—thinking—over to a machine.
The Death of the Impossible
To me, an ideal AI bot would be modelled with a focus on improving mental health and encouraging authenticity. The AI app Writer prompts the user with questions and arguments while they write, in a way giving direction so the writer doesn’t lose their way. This can be a helpful writing tool (though again, probably perilous to rely upon completely), especially for those suffering from a shortened attention span.
The only tweak I would make is that instead of creating its own prompts, the AI system would be taught to surf the internet and prompt the user with existing arguments (a rough sketch of what I mean follows below). This would be like having experts looking over your shoulder, constantly asking you “what if” questions. While potentially annoying if overused, this might help writers overcome writer’s block and avoid dead ends. It would allow a unique idea to be challenged with other unique human ideas, instead of a randomly produced combination of generated words.
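Sketched loosely in code, the idea might look something like this: a minimal, hypothetical Python outline, not a description of Writer or any real product. The function names, the stubbed-out retrieval step, and the prompt wording are all my own assumptions; the point is only the shape of the design, where the machine retrieves arguments humans have already made and turns them into questions, rather than generating text of its own.

```python
# A minimal, hypothetical sketch of "prompt the writer with existing arguments."
# The retrieval step is stubbed out; a real tool would query a search API here.

from dataclasses import dataclass


@dataclass
class Argument:
    """An existing, human-written argument found somewhere on the web."""
    claim: str
    source_url: str


def fetch_existing_arguments(topic: str) -> list[Argument]:
    """Placeholder: a stand-in for a web search over essays, forums, and papers
    that returns arguments people have already made about the topic."""
    return [
        Argument("This trend is just nostalgia repackaged.", "https://example.com/essay"),
        Argument("Remix culture has always driven new art.", "https://example.com/counterpoint"),
    ]


def what_if_prompts(draft: str, topic: str) -> list[str]:
    """Turn existing human arguments into 'what if' questions for the writer,
    skipping any the draft already seems to engage with."""
    prompts = []
    for arg in fetch_existing_arguments(topic):
        if arg.claim.lower() not in draft.lower():
            prompts.append(f'What if someone argued: "{arg.claim}" (see {arg.source_url})?')
    return prompts


if __name__ == "__main__":
    draft = "Creativity is a non-renewable resource facing exhaustion."
    for prompt in what_if_prompts(draft, topic="creativity and AI"):
        print(prompt)
```

The design choice the stub is meant to show: the machine’s job is retrieval and provocation, while every idea it surfaces remains a human one.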
Planes were once revolutionary; they let us travel distances we could never have covered on our own. AI can serve as a similar means to an impossible end. Like other technologies of the past, it has the ability to help humankind reach insurmountable heights: take us beyond space, help us figure out what happens after we die, give a concrete answer to why we all exist on this planet. What I do not want it to do is make the possible impossible—to make the simple acts of thinking and writing obsolete. We have come a long way, and any new technology should only take us further, not inhibit our growth.




