August 15, 2011

Work ethic has been defined as a “belief in the moral benefit and importance of work and its inherent ability to strengthen character.” For many of us, a strong work ethic was instilled from an early age. If you were part of our family, you washed cars, swept sidewalks, watered plants, and answered phones before being given any serious responsibilities. We had to work for it the old-fashioned way.