Thursday, October 23, 2014

The Culture War is Lost

I wonder if we, the decent and moral people, have lost the United States.  I say this because when I look around the media, nobody seems to think the way a sane, moral person would.  On top of that, we are seeing every refuge attacked by Leftists, as if they were performing a mop-up operation on what's left of non-Leftist culture.

Some people would argue that as long as we can vote, we haven't lost yet.  But culture is more than just whom we vote into power.  If anything, voting only offers a choice between dramatic change for the worse and maintaining the status quo, which is already pretty bad.

I've heard it said that while the communists lost their major countries in the last century, they won the propaganda war in the West.  This is fairly obvious from the many communist policies being promoted and pushed throughout the mainstream media, and from its systematic attacks on outsiders.

To top that off, we have basically lost whole areas of the Southwest to Hispanic immigrants (mostly Mexican).  At any other point in history, this would have been considered an invasion.  There are many radical groups, both in that region and across the country, calling for the return of those areas to Mexico.

I think the United States is lost at this point.  The Federal government is too big for ordinary citizens to monitor properly, which defeats the whole point of representative democracy in the first place.  The culture has by and large shifted away from the teachings of Christianity and toward subjective morality, which is really just immorality at the end of the day.

The mainstream entertainment industry seems bent on pushing deviant sexual behavior as normative.  While it can be hilarious in certain contexts, the fact that people believe 25% of this country is LGBT, when the reality is less than 3%, tells us the true state of the culture war.

Meanwhile, the education system continues to foster lunacy such as raising kids genderless, and focuses more on things like putting condoms on bananas than on balancing a checkbook.  We find that teachers are more concerned with self-esteem (thanks, Nathaniel Branden, you asshole) than with actually teaching students how to learn.

I suppose I could go on.  The fact is, I think we have lost the culture war here in the United States; it is over, for the most part.  At this point, the only real way to defend ourselves is to break away from the rest of the country and form a nation of our own.

Unfortunately, that option brings with it the pain of war and death.  But when you're backed into a corner, there is little other recourse if you wish to fight back.