Definitions: West
proper noun (Wikipedia: The_West)
the West
- The Western world; the regions, primarily situated in the Western Hemisphere, whose culture is derived from Europe.
- The Western bloc; the countries of Western Europe.
- The Western United States in the 19th-century era of territorial expansion; the Wild West.
- The western states of the United States.
- The western part of any region.
west
noun
- One of the four principal compass points, specifically 270°, conventionally directed to the left on maps; the direction of the setting sun at an equinox.
adjective
- Situated or lying in or toward the west; westward.
- (meteorology) Of wind: from the west.
- Of or pertaining to the west; western.
- From the West; occidental.
adverb
- Towards the west; westwards.
Etymology: Old English west