Quote:
Originally Posted by lucyrulesok
Is it more important to accept yourself for who you are, or to try and change the things about yourself that you don't like?
Recently I've been feeling pretty down about myself (like every other boardie seems to be too - I thought summer was supposed to be a happy time?) and I don't know whether I should just work on being happy with the way I am, or try and change my life (this would have to be a pretty radical change).
When and how do you know that your life is 'wrong' or 'bad' as opposed to just having a lack of confidence in yourself and the things you do?
Sorry, just rambling a bit here, but it would be good to get some other opinions - it's really hard to look at your own life objectively.
If you feel down about certain aspects of yourself, be they physical, mental, social, etc., it's probably better to go to the root cause of why they make you feel dissatisfied in the first place.
If there's something physical or mental about yourself that chips away at your confidence, perhaps it's worth checking whether the dissatisfaction comes from the way you want to be perceived by others, rather than the way others actually perceive you.
You get people who spend far too much time worrying about what their friends and family say or think about them, without even stopping to consider that those criticisms might come from people who hide their own lack of confidence by being all the more vociferous in criticising others.