I'm not bothered whether any of this is true. Most of it gave me a good laugh, and that's what counts :p
But I can still imagine that at least some of it is true; it all depends on the environment you live in. If your parents had told you it was okay to have sex with animals, and all the people around you acted accordingly - why shouldn't it seem normal to you then?
Why should our "modern" societies have the right to judge such things, which may be part of another culture or society...
Have you seen the movie "In China They Eat Dogs" (I Kina Spiser de Hunde)?