Noun Definition

Western United States

1. Definition: the region of the United States lying to the west of the Mississippi River

Related Noun(s): west

Category: Places