I was working out with a Black friend of mine over the weekend, and as we hit the weights we started discussing race relations in our country these days. We are both about the same age (45), and we were reminiscing about when we were in high school back in the '80s, when it seemed like everybody from all races got along a whole lot better than they do nowadays. Sure, there were racial dustups every once in a while; there are bound to be some from time to time in a country where slavery played such a big part in our early culture. For the most part, though, we both agreed that things have gotten considerably worse, especially in the last few years, and we were trying to figure out why that was.
As we talked, we both agreed that President Obama has something to do with it. His thought was that Obama's election brought America's racism out of the closet, and I guess I have to agree with him to a certain extent on that. I mean, a Black man in the Oval Office has to ruffle a racist's feathers like nothing else, doesn't it? I can picture old Bubba with his rebel flag tee shirt and a pot belly spitting out his Budweiser as Obama took the oath of office. That being said, I feel like a lot of good people who are not racists have been labeled as such by the left because they despise Obama's political persuasions. I think my friend is right to a certain extent, but I personally feel like President Obama has made race relations in this country worse by taking every opportunity to fan the flames of any situation involving race. Whether it was inserting himself into the Trayvon Martin case, offering up his criticism of the Cambridge police in the Skip Gates ordeal, or siding with the illegal immigrants in Arizona, Obama has cast himself as a racial instigator. My friend seemed like he agreed with most of what I was saying, but you could tell there was only so much "trash" he was going to talk about the president, and I understood that.
My question to you all is this: are we better or worse off as a nation since President Obama was elected? My friend and I both agreed that things seem much worse racially in the United States over the last few years. Do you think that is a result of Obama's election, and if so, why? At some point we are going to have to come back together as a nation, or we will cease to exist. We need to figure out what the real problems are and deal with them, or things are going to get much worse. What, if anything, can we do to get race relations back on the right track in this country?