Why do some white Americans always have this irrational fear of a "race war"?

Like, bruh, y'all hold all the power in America. The judicial, financial, and housing systems/institutions are all set up for you.

Nothing's happening to y'all.
Every time black people point out the wrongs done by whites, some whites try to diminish it by calling it "race baiting" or "racial riling up," as if black people were some kind of animals who can't control themselves when agitated.
Anyway, most of what Dems are saying is true.
