Elizabeth - posted on 11/16/2012 (59 moms have responded)
I'm trying to gather information from moms of all backgrounds and experiences for independent research. Despite being an American myself, after reading "The Conflict" last year I became fascinated by American phenomena, such as the obsession our culture seems to have with women staying at home, working, breastfeeding, and the like. I have found that I can walk down the same street and be told by three different people three different things about whether my baby is properly dressed, too warm, or too cold. Everything is pretty subjective in this country.

So, moms (and mums in the UK), do you feel that getting your college degree before becoming a mother was a waste of time, resources, and energy? What is your experience with employment after becoming a new mom? For example: Did you find it impossible to take a job because of childcare costs? Did you have no one to watch the baby? Were you frustrated at not being paid what you thought you were worth based on your degree? Do you feel that employers still aren't treating women fairly, preferring to hire men over women, and is this the same in small versus large businesses?

And for those who didn't have a degree before having a baby, have you gotten one since? If you have ALREADY gotten a degree AFTER having a baby, have you found that you can now earn a "livable wage" for your geographical location, or are the same pay rates available to people without a degree? Do you think going back for your degree was truly worth it?
PLEASE share your experiences, frustrations and thoughts!