Wild West

noun

Definition of Wild West

: the western U.S. in its frontier period characterized by roughness and lawlessness

Other Words from Wild West

Wild West adjective

First Known Use of Wild West

1844, in the meaning defined above
