They don't tell you how to feel or what to think. They just show you a series of events, or even just images. What you see and how you interpret it is a reflection of your own understanding of life and your own mind. True, things get dramatized or highlighted. And even the fact that certain images or subject matter get shown in the first place tells you something. So maybe movies aren't just inkblots. Are there even any truths to be found in movies? I don't know.