America is home to the world’s most expensive and prestigious private universities, but even public colleges are becoming unaffordable.
Unlike the United States, many other countries treat higher education as a right rather than a privilege and offer a college education for free.
What is the true purpose of a college degree? Is it to help you get a job, or is it to gain knowledge? Are there cheaper and more efficient ways of preparing someone for life and work? Some argue that the college experience in America has become a costly form of prolonged adolescence rather than a place to acquire both practical and humanistic knowledge. Is it worth the cost?
Do you consider a college degree necessary for access to "the good life," or do you think that higher education is going the way of the dinosaurs?
Watch this video to get an idea of where this discussion might lead us.