Has America lost her sense of evil? Did she ever have one?

Yes, by and large, America has lost her sense of evil. Of course, there is a remnant that abhors what has happened in our country, but the majority have turned away from God’s Word to secularism and New Age ideas. Too many today call black white, and white black.

Once, this nation, which was truly founded on biblical principles, did know right from wrong. The problem is that there can be very little moral integrity or righteousness without godliness. When people turn their backs on God and His Word (since His Word is needed to know Him), they inevitably become immoral, as explained in Romans 1:18f. We have taken the Ten Commandments out of the schools, and in most cases, even out of the courtrooms. People are more concerned about financial prosperity, and about an administration that gives it to them, than they are about the moral integrity and honesty of their president. There is plenty of historical evidence to show this country was founded on Christian principles, and at one time it did have a great sense of what was evil.