Moonika - posted on 11/20/2009 ( 12 moms have responded )
Do all parents tell their children that Santa Claus brings the Christmas presents, or do some say the presents come from people? Just wondering about other families and their traditions. I had Santa when I was small, but I don't think I gained anything by it. Some say I'm taking the magic out of my child's life, but it doesn't feel right to lie to her. I want her to be happy knowing her parents work hard to provide her with a good life, and to appreciate money and family. She will still get a nice dinner, a Christmas tree with all the decorations, and all the presents — the presents will just come from her family, not from some made-up character from the North Pole. Even though I am an atheist, I think Christmas is meant to be a religious holiday, not a time to show off your financial abilities or get yourself into a hole with debt, so for me Christmas is about family. What are your opinions on all that?