Snugglepuss1
New member
- Feb 12, 2009
Why do Christians promote the idea that their religion is the "right" religion?
What I mean is: treating people as if they're wrong for not sharing the same beliefs, and forcing those beliefs on others.