I'm not sure, tbh. On the one hand, religion teaches people to love and be kind, but it seems like people mostly use it to promote hate. When I see people committing acts of terrorism, or gay kids being kicked out, I can't help but think maybe it's time society outgrew religion.
Right? People always put Christianity on blast with this argument, and it's clear they don't have one iota of knowledge of Biblical history.
I think I have a clear understanding of your religion. Everyone picks out the morals and ethics that please them, regardless of what your god tells you to do. Then, when you "sin" and want forgiveness, you run to your nearest chapel or church, get down on your knees, and beg for forgiveness in front of thin air.