Many of us have asked the above question. We have heard that our culture here in the United States is close to becoming a "Post-Christian" culture, much like what Europe has experienced over the past several decades. While this may be something worth lamenting, it may also be something that the Church and Christians desperately need. Check out this article by Tim Keller as a starting point on this topic.

I'd love to hear your thoughts as well, so feel free to post some comments...
