Religion in the United States

When it comes to religion in the United States, we still have the freedom to worship as we choose. The government allows us to practice our religion without persecution, which makes this the greatest country in the world. This nation was founded on Christian principles, and Christianity is still the leading religion in this country. However, in recent years, the number of born-again Christians has been declining. In fact, every generation includes significantly fewer Christians than the generation before it. At the same time, the Muslim population in this country is growing rapidly, and atheism is becoming more popular every day. Here are a few things to consider about how things are shaking out in the US.

If you are reading this website, chances are you are a Christian, or you are at least interested in Christianity. Because of this, I'm going to give you a bit of a challenge. You need to get out and start spreading the Gospel. We need to save as many people as we can in the time that we still have on this planet.

This country was founded by Christian men who wanted to include God in everything we do. If you don't believe that, just take a look at the motto printed on much of our currency: "In God We Trust." Yet much of what the government does today tries to eliminate God from our country. It is no surprise that our country has been headed in the wrong direction for a long time. Is this a coincidence, or is this what happens when you try to distance yourself from God?

I believe that this country needs a spiritual reawakening. We need to step up and start evangelizing again. Even though Christians are still the majority now, at the current pace they will be in the minority a few years down the road. Do everything that you can to win people to Christ. If this country refuses to turn back to God, we will most likely be in for a very tough road ahead.