Truth About Lies

Would it really be beneficial to tell the truth, or will it eventually end everything? What would you do when you come face to face with the bitter reality?


Is telling the truth always the right thing to do? What would you do when you come across something you were never meant to find?

Something your peers or your partner have been hiding from you, for your own good?

Something that might shake up your life if you came to know about it?

What if your peers or partner feel that you are not ready to experience reality in its rawest form?

What would you do when you come face to face with the bitter reality? Do you just shrug it off and stay in denial, clinging to the hope that it was all a nightmare you can forget?

Or do you figure out a way to confront the ugly truth?

Truth About Lies

For ages now, people (read: partners and friends) have wrestled with their inner selves over whether to break the news or keep quiet about it. We all face this moment at least once in our lives, when we ask ourselves:

Would it really be beneficial to tell the truth, or will it eventually end everything?

When we are honest, we peel back layers of guilt in front of others; layers we would rather keep hidden. Once exposed, we become weak and vulnerable before the ones we care about. We all know that by opening up, we will have to endure the line of fire that is their reaction to our honesty. So are we better off with our pasts buried and our twisted confession notes tucked under our beds, or is it better to let it all out and feel good about life for once? Is it better to die once and for all (emotionally) than to hold on to the things we don’t like, things that act as a slow poison and kill us every second we breathe for the one we love? Or should we take a stand and demand to be told the truth, the whole truth, and nothing but the truth? Should the white lie cease to exist?

We all know love is blind. We all, in one way or another, accept that every relationship we enter is based on blind faith. Living things have a tendency to deliberately lose their senses when it comes to relationships. Many a time, things hurt us deeply, but we purposely forget them just to hang on to the relationships we so dearly love and care about, ignoring the reality that exists…but is that all we ignore? Deep down, we do have a feeling it’s not best for us, don’t we? Accepting reality as something it is not: is that the sign of a weak bond or a strong one? Is forgiving and moving on, just to stay with the ones you love, best for everyone? What are these bonds made of anyway? Trust? Honesty? Mutual understanding? Love? Care? Do relationships really stand on these pillars?

I would love to know what you think!

Let's Engage In Healthy Talk