
Who Told Nikki Haley And Ron DeSantis That America ‘Has Never Been A Racist Country?’

Haley and DeSantis insist that centuries of racist acts don’t necessarily mean America is or ever has been racist. There are potential explanations for their conclusions.
In a Fox News Channel interview, Republican presidential candidate Nikki Haley confidently proclaimed that America “has never been a racist country.” She went on to say, “I faced racism when I was growing up, but I can tell you today is a lot better than it was then.” Florida Governor and presidential hopeful Ron DeSantis was subsequently asked in a CNN town hall whether he agreed with Haley. “The U.S. is not a racist country, and we’ve overcome things in our history,” was his response. Neither Haley nor DeSantis disclosed the data sources on which their declarations were based. With whom, specifically, have they spoken about this? What are the racial demographics of their informants?
“America has always had racism, but America has never been a racist country,” Haley campaign spokesperson AnnMarie Graham-Barnes said in a statement. “The liberal media always fails to get that distinction.” Beyond the critique of journalists, there was no acknowledgement that Americans who have experienced racism, both historically and in the present, often have a very different appraisal of our nation. Instead, both Haley and DeSantis determined that centuries of racist acts don’t necessarily mean America is or ever has been racist. There are a few potential explanations for how they reached such conclusions.
Despite DeSantis’ war on so-called woke education in Florida, there’s no evidence to confirm that K-12 teachers (nearly 80% of whom are white) are aiming to convince schoolchildren that America is racist.