Mike Essig
1 min read · Jul 18, 2017


America certainly became an official empire after the Spanish-American War, although the tendency was there well before.

But you forget, Svetlana Voreskova, that we are touchy about the term “empire.” Empire connotes “bad guys” and we know we are always the “good guys.” Being one of the good guys is part of our American birthright, like air-conditioning.

Still, if it looks like a duck, quacks like a duck, etc. :)
