Work to improve life for Americans
I hear a good number of strong-voiced politicians using the phrase, “what the American people want.” Was there a poll taken I wasn’t aware of? The muck of political bullying, complaining without real solutions and cheering for failure while our American tax dollars burn away on nothing has me dismayed. Shouldn’t the keyword in American life be “better”? Shouldn’t our goal be to become better as a society, as a unified country? When did it become the American way to allow our own citizens to suffer without intervention because of cost, while billions upon billions are poured into places we have no real connection to outside of war?
Health care for Americans is now the politicians’ battle, not because of any true virtue on their part, but only to promote themselves. Where were their strong voices when the same big businesses that are yanking insurance plans tossed workers’ pensions out the door? Will real American societal interests ever improve? Our coffers are being looted. Have the rats overtaken the ship? I really hope not. What I want is to move forward.