Noun Definition
West Coast
1. Definition: the western seaboard of the United States from Washington to southern California
Category: Places