I watch shows like Survivor, and the lying and deceit that takes place really does shock me. I know it is only a game, but these people blatantly lie straight to someone's face. They shake hands on a deal and have no intention of keeping it. I don't think I would be able to do that.
What does this sort of behavior teach our children?
So true. I used to love watching it, but it made me feel strong antagonism towards the players, so I never watch it anymore. What is good about such a show? All it teaches is hypocrisy and being prepared to do anything for money. It is poisonous, in my opinion.