I'm at a sixth form college in the UK, and I take Sociology as one of my subjects. We conducted a social survey the other day about relationships and careers. The results showed that the most important thing to women now is getting a good career and making a life for themselves, whereas years ago it was getting married and having a family. I think I must have been born in the wrong era, because all I really want out of life is to be with someone and have a family. After all, once the career and everything is over, what do you have left? OK, if the career worked out you might have money, but you'll have retired by then. So in the end, isn't it the family and love around you that you ultimately need? Not trying to start arguments, just trying to see a different point of view. Someone agree with me; I was the anomaly in the survey lol
I want to have a stable career, but if I were given the choice between a career and a family, I'd choose a family. Oh, don't get me wrong, I want to make my mark on the world, but I'd rather have children and a husband. I've been told I was born 50 years too late lol
things we've learned from the movies -- when they're alone, all foreigners prefer to speak English to each other....