The United States and European nations generally take better care of their citizens and give them more of a say in their government than any other part of the world does. The West values human rights and protects freedom of speech and religion. Women can own property and start businesses, which is still unheard of in many parts of the world. Citizens can openly criticize their governments without fear of being thrown in jail and tortured. Why is the West so different from the rest of the world in this regard? You can talk about Western imperialism all day, but the truth is that no other cultures have this kind of respect for human rights and freedom.