Something is wrong with American journalism. Long before "fake news" became the calling card of the Right, Americans had lost faith in their news media. But lately, the feeling that something is off has become impossible to ignore. That's because the majority of our mainstream news is no longer just liberal; it's woke. Today's newsrooms are propagating radical ideas that were fringe as recently as a decade ago, including antiracism, intersectionality, open borders, and critical race theory. How did this come