Quote:
Originally posted by Abyss
True, the demise of religion can be nothing but a good thing though,
Y'all don't see it happening already? At least in America.
Ever since the rise of Protestantism, religion has slowly been losing its grip. When Christianity was first put into play, Constantine instituted it as a STATE religion. Then people wanted religious freedom, so the US was formed; the early settlers were Puritans. Sexuality evolved. Women and Blacks were oppressed with scripture, and they were later granted rights. Now gay people are struggling, but they will eventually win.
The way I see it, religion is just a placeholder for actual knowledge. A lot of its teachings were written as a way of understanding the world. The better we come to understand the world, the less use we have for religion.