Saturday, July 3, 2010

America was never a Christian Nation


For the last 50 years there has been a continual movement to eradicate anything perceived as "Christian" from America through the courts, control of the media, and what is taught in our schools and universities. Even though these groups and organizations say, "America was never a Christian nation," they resort to passing laws and rewriting history to stop what they claim never existed. They bring up slavery and the conquest of the American Indians as proof. But what a government does and what the people believe are two different things. When the American government joined forces with Stalin in WWII, it sought the help of evil instead of relying on God for victory. An oppressive government can overthrow the will of the people; if you don't believe so, just ask the people of Russia and Germany. The founding of America and its Constitution was something the people of the world had looked for but rarely found.

Was America a Christian nation? When I was growing up in the 1950s, I believed so. My parents and grandparents believed so, and I'm sure the generations before them did too. I'm also sure that each generation has seen a 'falling away' from God, that is, from the Truth and the Righteousness of Jesus Christ as Lord and Savior. I know I have. I have seen the moral fabric of America completely shredded; the same was said by the older generation of my youth. People today cannot even begin to understand the 'Faith of our Fathers'. So I do believe America was founded on Christianity. A thousand lies cannot change the Truth. Is America a Christian nation today? Paganism rules!
