Furthermore, what part of the country are you living in that leads you to believe we're less racist than other countries?! Racism has defined this country ever since it was created.
Seriously, I'm curious: what part of the country do you live in? It must be sheltered from most American realities.
Your point was that we’re better because we talk about it.
All over the country, legislatures are banning books and curricula that so much as mention racism. These aren't isolated incidents, either.