zoooooomttt
Just a thought: why do people assume that once you become a Christian you are a better person, and why does the general public in the US tend to trust Christians more than members of any other religion? Also, why is it that when you donate to the poor or help someone, many people automatically assume you must be a Christian?