Thursday, December 10, 2009

Is it human nature for people to want to tell others how to live their lives?

What do you think?
Only certain human beings.

Myself, I never had any desire to. If you ask my opinions, I might venture a platitude or two. But I can barely figure out how to live my own life. I have no business telling other people how to live theirs.

My spouse is the same way. I could never live with a person who thought he should tell me how to live my life.
People learn as they go through life. They go through bad and good times. They want to guide you so your life will be all good. The more someone loves you, the more they speak up, so your life will be better than theirs. I think it is human nature to insist on learning things the hard way! You especially can't talk to those who need it most... teenagers! They are too excited about becoming adults; they want to slam through it all and spread their wings and fly. They do not see the predators on the ground if they try to fly before the adult feathers come in. We all know what happens to them. Teenagers only see blue skies.
I feel like people are mostly good and think that if their lives are good they want you to have a good life too. It is human nature to take things as you understand them, so maybe if you don't understand how someone else lives their life, you can't understand how they can be happy. So by telling them what makes you happy, you are just wanting them to be happy.

Here's a great quote:

"Advice is a form of nostalgia. Dispensing it is like fishing the past from the disposal, wiping it off, painting over the ugly parts, and recycling it for more than it's worth." ~ Everybody's Free to Wear Sunscreen by Baz Luhrmann
Of course it is human nature to do that. We're fear-based animals. I agree with an above statement that we feel we have to be 'right.' The object is to surpass these animalistic features and move on to an evolved nature, which, in the end, will help us tame our human nature; then we will be 'right' because there will be no fear, bigotry, and so on. The answer to your question at this point in time is yes, I'm sad to say.
No. It is Divine intervention that makes some people want to warn, not tell, others how to live their lives. We just know something they don't know and give a damn about them.
Social Nature, so probably
For Democrats, it certainly is.
no, human nature cares only about itself.
Yeah probably. Like when moms tell their kids what to do. :] It's normal and it will happen forever!!! ^__^
I think it tends to be human nature to think that you are always right.
Are you telling me to think about this question? Well, I refuse. Stop telling me how to live my life!!
I don't know, but it is really annoying.
