University studies improve our knowledge and skills and open up career opportunities, but is a university degree really necessary to get a better-paying, high-skilled job? What do you think, guys? Let me know your opinions in the comments.