01 July 2014

Editorial: In Fact, the West Is Rising

Statue of Liberty (Image: Wiki Commons)

By Robert Dujarric

What some view as decline is in fact the West's ability to win over large swaths of the globe to its ideals.

Every day we hear that Western hegemony, which began with the Portuguese rounding the Cape of Good Hope in 1488, is ending. Some rejoice, others mourn, but all are writing the West's obituary. The West, however, is in a period of expansion, not retrenchment.
The West has no clear demarcation line, nor is its ideology set in stone. Policies such as institutionalized racism, which were perfectly acceptable in the West until well into the mid-20th century, now condemn their practitioners to pariah status in the Western community, as happened to apartheid South Africa starting in the 1970s.
Today, the West comprises two groups. One is made up of European nations and their overseas offshoots, which share the same socio-political order and roots. The other comprises countries, primarily in East Asia, that have adopted the Western liberal model, with Japan and South Korea being the biggest examples.
Has the West declined? The answer depends on the baseline. In the late Victorian Era, except for Japan, which was still tiny economically and militarily, the world consisted mainly of Western nations, their colonies, and declining non-European empires. Since then, the West has lost some of its relative power.
Looking at the more recent past, however, the evidence since the demise of the Soviet Union points in the other direction.

Read the full story at The Diplomat