Do you think the US should be a Christian nation?

Most US adults believe America’s founders intended the country to be a Christian nation

Yes: The US should be guided by Christian values.

No: Religion should be kept out of political matters.
