April 21, 2010
No pun intended.
When conservatives talk about America being a “Christian nation,” where exactly are they getting that from? I’d like to see them back that up with any text from the Constitution or even the Declaration of Independence. In other words, anything actually establishing the U.S. as a “Christian nation.”