The Civil War ended over 150 years ago. The Confederacy lost fighting in support of slavery, one of the most heinous crimes ever perpetrated by humanity.
The South rose and lost, and America is unquestionably a better place because of it.
Vestiges of the war remain in the South in the form of monuments, holidays, and state buildings named after notable Confederates. But over the past few years, lawmakers have begun taking strides to remove the Confederacy from public view.