A conversation in the lunch room one day got me thinking. One of the girls was talking about her son’s swimming lessons, and somehow it came around to the fact that I never learnt how to swim properly. She looked at me in shock and said, “But you have to learn how to swim! What happens when you go to the beach with your friends?”
Well, I don’t go to the beach with my friends. I’m not really a beachy type of person. Even when I had a boyfriend who loved going bodyboarding as soon as the sun was up, I was quite happy to stay on the sand.
Then that got me thinking about why she thought I had to learn how to swim. There are plenty of things society thinks you should learn, and I don’t understand why. Unless you’re inclined towards something, or do reasonably well at it, why do others insist that you have to learn it? I mean, I’ve gone through over thirty years of life and never learning to swim hasn’t affected anything. I don’t feel that I’ve missed out, or wished that... I don’t know, that I was more athletic or something.
Is there anything that you’ve come across that people expected you to have learnt? Or expect you to learn?