Racism is, and has long been, a problem in the United States. It has been interwoven into the nation's history and foundation through slavery and the oppression of minorities, though this in no way justifies or excuses it. Racism, as defined by bartleby.com, is "the belief that some races are inherently superior (physically, intellectually, or culturally) to others and therefore have a right to dominate them." In the United States, racism, particularly by white Americans against Black Americans, has created profound racial tension and conflict in virtually every aspect of American society.
…