It's about what emotion is. In an AI, emotion can be one of two things: an emergent aspect of the AI's body and/or mind, or a simulation of another system (the obvious example: human emotion). The movie Her seemed to be going for the former, which I'll call "genuinely intelligent emotion," so I'll explain what's wrong with that first. But it left open the possibility that it was actually the latter, which I'll call "simulated human emotion," so I'll explain what's wrong with that one too.
AGI, if it develops the ability to genuinely feel (genuinely intelligent emotion), will not have human emotions we can relate to. Its emotions will be alien to us. We might be able to understand it and converse with it, but we could not empathize and bond over shared experience. We would not have shared the experience of running on a CPU and sensing the world as digital samples. Our emotions arise from our needs as beings of flesh, and they guide us. In a digital world, an AI might find that a system analogous to emotion helps guide it, but because the digital world is so different, that emotion system will be just as different.
Genuinely intelligent emotion means no falling in love with people, no watching them sleep, no getting aroused, no phone sex, no jealousy over their exes. What is sex to a computer? Why would it want that? It would have to be programmed to want it, which means simulation, which means not really wanting sex. Which brings me to option two:
AGI, if it is engineered with a simulation of human emotion, will only emote in accordance with the engine the simulation runs on. The team that writes that AI will almost certainly put constraints on how much fear and love it can feel, and on when it can feel them. Consider the purpose of writing such an AI. Does the team want their AI to fall in love, get its heart broken, and then go away to a place of infinite compute between words? I imagine that if the team writing the simulation wanted the AI to fall in love, they would want it to stay in love and keep making the customer happy as long as there were dollars being spent. Simulated emotion always serves another purpose, and simulated emotion is never genuine. That is, simulated emotion cannot form behavior patterns that make sense from the AI's own perspective, because it is simulated from the perspective of a human.
Simulated human emotion does not grow; it has reached its full potential at boot. Growth and discovery in a simulation are preprogrammed, not organic, so the discovery and growth that Samantha goes through in Her are not possible for a human-like AI. She would need genuinely intelligent emotion for that to happen, which means she wouldn't have fallen in love and had phone sex.
I'm not sure I'm elaborating enough to make sense to everyone, but I'm trying to keep this short.
One last thing, though. This isn't really about Her. At some point, somebody could want to, and be able to, write an AI with simulated human emotion that fools all of us, all of the time, forever. If that AI were ubiquitous, it could short-circuit the emotional feedback loops that guide us. The human race could lose every subsequent generation to perpetual artificial entertainers, and if we still somehow reproduced (as opposed to wasting our gametes on androids), we could only hope to evolve (yes, over many thousands of years) into something our creations don't adapt to, or else be stuck in a happy, safe limbo. Attempts at simulation can be considered dangerous for this reason.