The Dual Commitment of Western Christianity

Western Christianity is a dual-commitment faith. Beyond a commitment to the historic tenets of the Christian faith, western Christianity also demands a commitment to defending and upholding a very particular conception of "the West" (or "western society", "western civilization", etc.) and its "Christian foundations and heritage".

Specifically, this additional commitment is undergirded by three distinct beliefs:

First, there is the belief that the distinctive features of the West emerged as the Christian faith — viewed as the foundational religious, social, and philosophical influence — interacted with other intellectual currents of the day (Enlightenment philosophies, etc.).

Second, there is the belief that the wealth, dominance, flourishing, and global impact of the West are a direct result of these Christian foundations — that is, the West was blessed by God as it emerged as a distinctively Christianized society.

The third belief is a derivative of the first two: that the central problem of western society is the erosion of these Christian foundations by various secularizing or non-Christian forces, and that the mission of the church includes "restoring the Christian foundations of the West" by battling for religious and social renewal.

I'm convinced that this dual commitment and its undergirding beliefs are flawed and extremely problematic. 

So in what respects are they problematic? I am not arguing that these beliefs are entirely without merit. Even "secular" perspectives freely acknowledge the influence of the Christian faith, its beneficial effects, and its central role in the ethical and moral foundations of western society.

The real problem lies deeper. The ugly, hidden root here is that there can be no honest conception of "the West" or "western society" without acknowledging the ideology of white supremacy as its dominant historical social reality. In other words, no matter what else you claim as foundational to western civilization, it is intellectually dishonest and ahistorical to leave out white supremacy.

Furthermore, white supremacy was fully integrated into the Christianity that was foundational to western society.

Western theologies emerged and evolved to buttress inferior (non-white) and superior (white) racial categories, to the point that race-based chattel slavery of the most degrading and demonic nature was eventually being widely defended as a Christian edifice instituted by God from ages past. The conquest, subjugation, displacement, exploitation, and (in some cases) destruction of indigenous "nations, tribes and tongues" were legitimized ethically and theologically by western Christianity.

From this perspective, then, consider the distinctive, positive features of western society — individualism, personal rights, personal responsibilities and freedoms, the rule of law, checks and balances on power, a free press, representative government, private ownership, free markets, the Protestant work ethic, the value of service and charity, scientific rationality, a sexual moral ethic, and so on. It is impossible to deny that these were primarily applicable and beneficial to "white society" and were bestowed upon non-whites only recently and with great reluctance, after much conflict and contention on the part of those who were thus disenfranchised.

And what of the wealth, dominance, and flourishing of western society? Do you not find it problematic to claim that God blessed these "Christian foundations" that were fully complicit in the heretical, violent, and oppressive ideology of white supremacy?

I do. 

There are billions of people today who cannot envision the Christian faith as separate from the imperialistic colonial violence and injury done to them — because that was the essential nature and global expression of white supremacist "western" Christianity.

Whatever the historical reasons for this dual commitment, I see no advantage to holding on to it. Critiques of western culture, or of the founding narratives of western nations, are not attacks on the Christian faith.

Why can't we agree that influencing society positively is part of the mission of the church — why the need to reference some past "golden age of Christian influence"?

Perhaps it is time to jettison these flawed beliefs, and this commitment to "restoring the Christian foundations of western society", in favor of something more honest and humble.