‘The evil west’ is a term for the narrative claiming that Western culture is inherently oppressive. According to this narrative, the West has a self-proclaimed sense of civilisational superiority attained through the glorification of its own scientific, cultural and religious achievements. In academia, the concept of ‘the evil west’ is used to portray the West's racial and imperialistic practices as uniquely evil compared to those of the rest of the world, and the West as uniquely responsible for global inequalities.