I went to an art festival today called Barefoot in the Park, in beautiful downtown Duluth, GA. There were lots of people there, but I saw only two other people who were barefoot, both artists/exhibitors. One of them called out to me to let me know that he was barefoot too! Still, I was seriously disappointed. The weather was awesome, and the name of the event was an open invitation to go barefoot. What is wrong with people in America? Why are they so hung up about shoes?