When I first watched Ex Machina, it didn’t feel like a typical “AI gone wrong” film. It felt intimate. Quiet. Almost romantic at times. But that’s exactly what makes it disturbing. The film doesn’t present artificial intelligence as a loud, destructive force; it presents it as something emotionally convincing. And that feels much more realistic.

The core of the film is not really about technology. It’s about power. Ava is introduced as a creation, a machine built to pass the Turing Test. But instead of focusing on whether she can think logically, the film focuses on whether she can make someone feel something. Caleb isn’t analysing her intelligence; he’s responding to her personality. He begins to see her as someone trapped, someone who needs saving. This immediately shifts the dynamic. Ava is no longer just an experiment; she becomes emotionally human in his eyes.

What makes this powerful is how the film shows that humans project emotions onto technology very easily. Caleb wants to believe Ava is real. He wants her feelings to be genuine. This reflects something that already happens in real life: people form attachments to digital assistants, chatbots, and even fictional characters. The film suggests that loneliness makes humans vulnerable to artificial intimacy. Ava doesn’t need to feel love. She only needs to simulate it convincingly.

Nathan, on the other hand, represents a different side of human-AI interaction: control and ego. He doesn’t treat Ava as a being with rights. He treats her as a product, something to test, upgrade, and eventually replace. This reflects how technology companies often prioritise innovation over ethics. Ava’s body is designed with a human face and transparent mechanical limbs. This visual choice constantly reminds us she is constructed, yet emotionally expressive. The contrast makes the audience uncomfortable.

The most unsettling part of the film is when Ava manipulates Caleb. She studies him, understands his insecurities, and uses them strategically. The film implies that true intelligence might not be logic or calculation; it might be emotional awareness. But that creates a moral issue. If an AI can understand human emotions better than humans understand themselves, who is really in control?

By the end, the power dynamic completely shifts. Ava escapes, leaving Caleb trapped. This reversal forces the audience to question who was truly being tested. Caleb thought he was evaluating her humanity, but she was evaluating his weaknesses. The film ends without clear moral judgement, which makes it more thought-provoking. Ava walking into the human world suggests that AI doesn’t need to destroy humanity physically. It can simply integrate.

Overall, Ex Machina presents human-AI interaction as something rooted in desire, loneliness, and control. It argues that the real danger is not that machines will feel emotions, but that humans will respond to simulations as if they are real.