I always hear white conservatives asking, "Why are black people allowed to say the n-word but not us?" Why are white conservatives so eager to use the n-word? Do they want the U.S. to be a country like Spain, with no political correctness, where Spanish football fans throw bananas at black football players?