Ok, seriously, what dude still tells a woman she needs to smile more? My wife has had it happen twice this week at work. She finds it incredibly condescending and annoying. I've never been told I need to smile more. Is this common, ladies, and do you hate it too?