Abstract: The American West is an evocative term that conjures up images of cowboys and Indians, covered wagons, sheriffs and outlaws, and endless prairies, as well as contemporary images ranging from national parks to the oil, aerospace, and film industries. In addition, the West encompasses not ...