Everyone must be wondering these days: what's wrong with this country? I can put my finger on the problem immediately. Our media. I'm not talking just about the news, but TV programs, commercials, comedy shows, movies, kids' shows; everything is skillfully geared toward making us think the worst of the country we live in. The root of the problem is a prevailing belief that America is a bad country. Having lived in several other countries, I can say this isn't true. I've lived in better and much worse. Much, much worse. But you wouldn't know that if you had never left the confines of our shores.

Everything we feel about America is a byproduct of our media sources, and they have a huge impact on our national mood. I seriously doubt there is anyone right now who is happy with the direction this country is going. Watching the Republican primary, we see candidates making up lies about each other and tearing each other down. We see a president we can't trust, who supports anarchy in the streets and seems to think the Constitution is just something to be ignored.

Just as an example, look at this year's Oscar nominations. How many uplifting pictures got any recognition? One that is purported to be uplifting, "The Artist," pulled in only $20 million. Another, "Moneyball," tells the story of how a penny-pinching owner forced a general manager to come up with a new way of picking winning players without paying big salaries for them. The favorite this year appears to be "The Help," a story about racism long past that, for some reason, Hollywood feels we need constant reminders of.

Every once in a while I see something uplifting, something positive, but it's drowned out by all the negativity that constantly surrounds us. The only folks who seem to be enjoying this are folks who aren't really good people. I don't expect much of a discussion about this, because unless the media is talking about something, nobody else will.