the Wild West
noun
Definition of the Wild West
: the western United States in the past when there were many cowboys, outlaws, etc.
stories about the Wild West
—often used before another noun
Wild West stories
a Wild West show