Was American Conservatism ever much more than hand-wringing and declaring "this is evil," while at the same time taking great pleasure in vicious cruelty toward, if not outright elimination of, vulnerable groups? I'm speaking of American Conservatism itself, not any particular political party.
I'm not much of a "good" vs. "evil" person, but I will say that, IMNSHO, American Conservatism has always engaged in a disproportionate amount of evil against fellow humans, not to mention animals.