In this country we have freedom of religion. Unfortunately, once you proclaim your religion, you lose your freedoms! It seems that people think Christians shouldn't meddle in politics. And what right does a Christian have in big business? Shouldn't Christians spend all their time in church talking about Heaven and God and "spiritual" stuff like that?! Politics should be left up to "worldly" people who are too smart to fall for all those religious fairy tales. And business, well... aren't Christians supposed to be poor and sad? Isn't their reward in Heaven? Well then, leave all the gold and silver and those nasty greenbacks to us sinners! And what do they think they're doing nosing around in public education? They should worry about Sunday School and let us take care of "real" education!
The "separation of church and state" has become "separate all the church-goers from state matters and keep those Christians' noses out of public affairs"! Why does my becoming a Christian mean I must stop being an American? And why are all other "religions" tolerated (even esteemed) while Christianity is mocked and hated?
I am an American AND a Christian! I will be heard! I will exercise my rights and fight for my freedoms until the day I die! I don't ask for favoritism, but I will not accept second-class citizenship! I will vote my faith and my morals! I will speak up for absolute truth and moral responsibility! I will stand in the gap for the weak and defenseless! I will not be forced into "political correctness"; I will call sin SIN! Get used to it!