Christians, why is it important to "save" America from the path it's heading down?

RedFoxAce

New member

Isn't America - and the world, really - approaching the End Times? And doesn't Scripture warn that everything will basically head further down the path of destruction before the Return of Christ?


If so, why should Christians be concerned about America's moral decay, its enormous debt, its social problems, crime, flaws in government, and so on?



Why is America worth saving? Isn't the day rapidly coming when America and everything else will become irrelevant or be destroyed?