Is there a culture war in the US right now? There is absolutely a culture war. I place the blame squarely at the feet of the DEMOCRATIC PARTY.
At some point, some Democratic strategist had a bright idea. It made so much sense! If the party moved to the left, what would that gain them? They were already the party of the left, and every vote on that side of the spectrum was effectively won. But what if they moved to the right? The left had nowhere else to go; the logic of winner-takes-all makes the emergence of a third-party challenger unlikely. But suddenly, a whole universe of "reasonable Republicans" would be within reach of their ballot boxes. They could only win!
The Republicans might have responded by moving to meet them in the center. But they were cannier than that; they understood their side better: the right is not really about ideals, unless tribal fealty to power counts as an ideal. So they moved to the right, and took their whole party with them. The Democrats didn't gain shit. But, being morons, they chased after the Republicans, left their whole base behind, and became the party of nothing. The Republicans, happy to oblige, moved themselves ever further right, straight into the abyss.
Now we have a left which is abandoned, inchoate, powerless, bewildered, and despairing, and a right, still delirious with its monumental upset in 2016, which has evolved into a full-blown fascist clown cult. Media algorithms have further segregated the two sides to the point where they no longer share a common frame of reference; they walk the same earth but live in entirely separate worlds. They are entirely separate cultures, and entirely opposed. In such a situation, the "culture" war may plausibly be a prelude to plain war.