Westcentrism


English


Alternative forms


Etymology


West + -centrism

Noun


Westcentrism (uncountable)

  1. The practice of viewing the world from a Western perspective, with an implied belief, either consciously or subconsciously, in the preeminence of Western culture.

Meronyms


Translations
