This episode responds to an email from a regular listener, which included a question about the mixture of Christian theology and American nationalism.
Some Christians claim that the USA is a “Christian nation” and that we ought to structure our government with this ideal in mind. Is this true? Are we a Christian nation? If not, should we seek to make the USA an officially “Christian” country?