There is no doubt that the United States has a history of racism, most notably the enslavement of and discrimination against Black people. Huge progress has been made over the years to ensure that nobody is discriminated against because of the colour of their skin, but many still argue that the US is a racist country and that racism is so deeply ingrained in people that it cannot be changed.
Whether or not this is true is debatable, which leaves the question: does the United States have a racism problem?
- Have you ever been to the US?
- If yes, did you experience or witness any kind of racism?
- Why do you think some people still believe the US to be a racist country?
- Why do you think others don’t?
- Is a person born racist, or is racism something they learn?
- Do you think the US could do more to prevent racism?
- Are there any news events you can think of that would support the argument that the US is a racist country?
- Racism noun
- Racist noun/adjective
- Discriminate verb
- Slavery noun
- Ingrained adjective
- Discrimination noun