Two stars means “I think it was ok,” but review culture has somehow normalized three stars as average and two as HAZARD AHEAD (and one star, obviously, as bad in a way worthy of extreme note), so two stars seems a harsh rating for a book that is, at the end of the day, simply ok.
I really liked Never Let Me Go, where I didn’t know the ending but did know the twist (aka I knew that all the main characters were clones bred for organ donation but didn’t know that they all die at the end). You’d think that here, knowing neither, I’d be more engrossed, but instead it took me a solid three days to finish the last ~15-20 minutes of the book. And that despite the fact that things do happen toward the end that change the course of the novel and how the characters’ lives play out.
The other issue might, of course, be a surfeit of “kindly/intelligent AI robots confused by human beings” books that I’ve read recently, and seeing how this book stacks up against them. My favorite Murderbot series, for one, and, more recently, Set My Heart to Five touch on similar topics and, honestly, do so in a way that’s more touching and meaningful than Klara does here. While some reviewers said they wanted to give Klara a hug, I mostly wanted to verbally berate the creators of the Artificial Friend series of companion robots. Her kindly misunderstandings of human interactions come off less as “pure-of-heart robot comes face to face with casual human cruelty” or “exasperated AI must deal with human flaws” and more as “robot given no guidelines for well-known but nevertheless confusing human norms runs afoul of said norms.”
A good, non-spoilery example: in an interaction between Josie (her human child owner), Klara, and Josie’s friends (with whom Josie does not particularly get along, hence the artificial friend), Klara refuses to engage when she is introduced to the friends. This is because her programming tells her to prioritize Josie’s commands, and Josie isn’t directly commanding her to interact with them. All Josie has done is say, “hey everyone, meet Klara.”
We’re meant to feel sad(?) or some such that Klara is doing the wrong thing and thereby creating tension in her relationship with Josie through no fault of her own: she’s trying to do her best by Josie, but hurts her instead. But. Why didn’t someone teach her how human interactions work? She comes off as badly programmed, not as out of sync with terrible humans.
At the end of the day, I think Ishiguro is a writer very much in line with Ted Chiang re: elaborating upon worlds with slight (or large) modifications. This particular one, though, was a swing and a miss for me.