I'm at a sixth form college in the UK, and I take Sociology as one of my subjects. We conducted a social survey the other day about relationships and careers. The results showed that the most important thing to women now is getting a good career and making a life for themselves, whereas years ago it was getting married and having a family. I think I must have been born in the wrong era, because all I really want out of life is to be with someone and have a family. After all, once the career and everything is behind you, what do you have left? Okay, if the career worked out you might have money, but you'll also be retired. So in the end, isn't it the family and the love around you that you ultimately need? Not trying to start arguments, just trying to see a different point of view. Someone agree with me, I was the anomaly in the survey lol