English
Proper noun
the Southern United States
- An expansive region encompassing the southeastern and south-central parts of the United States.
- Synonyms: American South, Dixie, the South
Usage notes
- The term Southern United States is defined more by shared culture and history than by strict geography. Although located in the extreme south of the United States, southern California, New Mexico, and Arizona are not considered part of it. Conversely, Virginia and West Virginia, though located in the middle of the East Coast, are considered part of it.