Definition of West

  1. West Proper noun The Western world; the regions, primarily situated in the Western Hemisphere, whose culture is derived from Europe.
  2. West Proper noun The Western bloc; the countries of Western Europe.
  3. West Proper noun The Western United States in the 19th-century era of territorial expansion; the Wild West.
  4. West Proper noun The western states of the United States.
  5. West Proper noun The western part of any region.
  6. West Proper noun The European Union; a primarily economic and political bloc of 27 member states spanning Western to Eastern Europe.
  7. West Proper noun A surname for a newcomer from the west, or for someone who lived to the west of a village.