Lately, I’ve been seeing signs and banners around the area that read “Trump 2024: Bring America Back.” I have some questions.
Back to when? To the 1950s, an era of widespread racism, sexism, and social conformity? Or maybe the 1890s, the age of the robber barons, when working people endured low wages, few benefits, and no Social Security, while a handful of the super-rich did as they pleased and stayed above the law?
Back to what? To a whitewashed, sanitized “Ozzie and Harriet” society that never existed? Or to a right-wing authoritarian state?
And where did they get the idea that America is theirs to take? When did it become their exclusive property?
Many of us believe it is better to lead America forward, toward greater racial, gender, and class equality, environmental responsibility, and recognition of the dignity and worth of every human being. How sad that so many Americans want to go in the opposite direction.