
COMMENTARY

How Democrats won the West | John A. Tures

By John A. Tures | January 2, 2023

Since 1992, Democrats have flipped the region away from Republican control. The shift began with the end of the Cold War and was reinforced by a Pacific Coast economic recession, the anti-racism demonstrations and violence in Los Angeles, and the region's increasing diversity.