Something I’ve been pondering for a while is this: Is the culture war over? And did we lose it?
I part company with those who seek to Christianize the culture as though this in itself were a noble goal. It seems to me that this would, in effect, merely make our culture a ‘whitewashed tomb.’ More important than the culture are the people within it, their states of mind, and their eternal fates. Nonetheless, people are strongly influenced by the culture at large, whether or not they know it or admit it. An unfriendly culture will make it harder for people to receive the Gospel.
I believe that. To an extent. I note, however, that the Christian Church itself exploded into existence within a culture that was not yet Christian, for the simple reason that there was not yet a pervasive Christianity to have Christianized it.