What happens to a culture when the women are told they all have to look and act like a select few people who have achieved fame by selling themselves to the general public? What happens to a culture when the men live under the shadow of image and the fear of not measuring up, so they hide behind fantasy and video games filled with empty heroism? What happens to a culture when this behavior is accepted and encouraged? What would you do if you woke up and found that this is the world in which you live?
2 comments:
So true. Our culture cannot bear the weight of its mindsets for very much longer. People are urged to mutilate themselves mentally, physically, emotionally, and spiritually for the sake of achieving something unnatural and unfulfilling, and then convinced to take a certain kind of drug to counter the repercussions of their lifestyle.
I happened across a commercial for heartburn relief. In this commercial a woman was eating all kinds of junk. Upon feeling heartburn she pops a tablet and instantaneously feels better. But what we fail to see is that heartburn, headaches, nausea, etcetera are our body's way of telling us that something is wrong with what we are consuming or doing. Though not always true, most of the time our body doesn't need more medicine; it just needs a change of lifestyle on our part.
In our culture we fail to see that violent homes, broken hearts, loneliness, and depression are all results of something being out of whack. Crazy. Help us Lord.
The question is, what can we do to be world changers? Where do we start? Are we willing to be different? Let's roll!