The United States of America is not a Christian nation, and probably never will be.
By "Christian nation", I don't mean "composed of Christians", because as long as the Constitution remains in effect, the government can't prohibit other religions from existing or even flourishing within the nation's borders. Rather, I mean "following the teachings of Christ" — for despite some politicians' fervent avowals that they're good Christians, they don't (and can't) apply the teachings of their religion to the offices they hold.
The most obvious example, of course, is the President. Despite his claims to be a Christian (and his statement that Jesus was the philosopher he admires most), George W. Bush did not in fact "turn the other cheek" when America was attacked, as Jesus taught. Rather, he ignored that teaching (as well as the commandment "Thou shalt not kill") in order to strike back at those who attacked us (showing that, to Dubya, man's desire for retribution takes precedence over the teachings of Christ).
Indeed, Bush could not have done otherwise and still remained President; had he told the nation that he was choosing, as a Christian, to set an example by turning the other cheek, he would have been impeached by angry citizens demanding retribution for the attacks of September Eleventh.
As long as America practices capital punishment, builds lethal weapons, and trains its armies to kill, it is not — and cannot legitimately claim to be — a Christian nation, despite the religious right's insistence that we are. (Then again, the religious right has always comprised those "Christians" who pick and choose which of Christ's teachings they want to follow, and ignore the ones they find inconvenient.)