Recently a list was put together of restaurants to avoid unless you want to support Republican ideals. On one hand, I actually think this is a more valid form of expression or protest than sitting on Wall Street. For those interested, here is the list of places to avoid if you do not wish to support some implied political position.
However, I wonder whether anything is not political anymore. Where we buy our food somehow carries a political implication. There are even campaigns about which home improvement store to buy your nails and hammers at, based on the political opinions of company leadership or even statements made by the companies themselves.
Is this a good thing or a bad thing? Should companies get involved politically? Does it really mean anything when they take a certain position, or is that just a marketing ploy?