I got bored and watched seasons 1-2 of The Walking Dead.
Somehow the studio (big corporate media) managed to avoid paying George A. Romero (independent film-maker) for his ideas.
This also gave them license to purge all the sharp satire and social commentary of the Romero films, leaving only the cannibal zombies, which serve as a backdrop for "human" soap opera. Every ten minutes or so, a domestic argument is interrupted by a knife or bullet entering a zombie's brain cavity.
This TV show about a cannibalism-causing virus ran for nine years before the current pandemic, and is very popular. It's almost as if our corporate overlords were preparing us for something. As if on cue, when Covid was announced in 2020, people began stocking up on toilet paper.
Watching zombie apocalypse theatre supposedly prepares us for a harsh world run by gangs. I see it more as an indoctrination process, one leading toward a future like the one Philip K. Dick envisioned in his novel The Penultimate Truth (1964).
The book opens with people living desperate lives in deep underground shelters while an atomic war rages above. Rationing is severe, and rumors circulate about terrible viruses spreading through the population. Each day, TV screens show images of violence and devastation outside the shelters as the war grinds on across the Earth's surface.
After a few convincing chapters of this, Dick reveals that it's all fake. The world outside the shelters is green and beautiful, and all the land has been divided into park-like "demesnes" owned by a few lucky rich people. A small industry of advertising men keeps up the horror-show illusion for the benefit of the poor slobs underground.
Something like The Walking Dead belongs in that genre. Entertainment for the locked-down rubes, to be consumed while their betters enjoy the good life in gated estates and on private islands.