What is the point of college?
Is it a high-end babysitting service? Is it a rubber stamp of approval that you pay for and work toward? Is it there to give one a chance to develop as a person in every sense, not just intellectually?
Should degrees be peddled like products at a bazaar, with professors acting as ringmasters? Is college a service industry? Should colleges teach society's values, or let students shape their own?
Can one degree be held above another if we are going to tell each other to "do what you love"?

Think about these questions. Don't post right away.
Read the following article.