We may have been sold the idea of the "American dream," but there are many aspects of life in the U.S.A. that make it seem more like the American nightmare. As much as there is to love about American society (like street hot dogs! And the possibility of becoming extremely famous for basically no reason!), there are a lot of things that just don't make any sense at all. Can we fix these things? Or should we just keep tweeting about them until someone in charge notices and does something about it?
1.) Alcohol is very legal while other drugs are not.