Finished: August 3, 2023
Why I read this
I have a bit of a unique problem after the recent book care package from my mother. Normally I can take books from my to-read list and split them between audiobooks and physical copies, usually choosing the more thought-intensive ones (like psychology or history) as physical copies and lighter fiction as audiobooks. However, now that my list of 30-something books for the year was all purchased in physical form, I have to add new books to fill my audio queue. Moreover, I've now caught up with the Enderverse, and I'm not sure if I will dive into another series in the same universe by Card. Regardless, the idea for this book came from a discussion with my girlfriend's brother, who mentioned it was the hardest book he had ever read in English. Between that recent comment and the general cultural impact of Blade Runner, I figured it would be a nice addition to my list.
What I learned
Reading this at the same time as Sapiens by Yuval Noah Harari was a really interesting experience. On one side, I am learning about how humanity has made up a bunch of myths about what is right and wrong, or how we are the rulers of the universe, and it is unclear where we will go from there. Philip K. Dick then offers a possible future answer to Harari's questions: what happens when we produce near-human androids, and what are the ethical problems associated with them?
The "us" vs. "them" question is turned over and over in Blade Runner. The constant review of what empathy really means, and to whom (or what) it can (and should) apply, places the reader on an ethics roller coaster. I found that throughout the book I changed my opinion on the ethics of androids numerous times, forcing me to think and rethink my positions. On top of that, the constant comparison to the ethics of animals in a world we have created that cannot support them raises many more questions and theories.
Overall this book is absolutely rife with ethical and moral dilemmas, which to me makes it an engaging and interesting read. Echoing a quote from the preface to 1984 that I recently read, I found that, contrary to most books, this story has become more relevant with the passing of time, not less. In a world now dominated by the buzzwords of AI, machine learning, animatronics, etc., the problems raised in Blade Runner will likely become real-life challenges in the rather near future. It is not science fiction now. In our lifetimes we will likely have to answer the question of whether AI is truly alive and, if so, what rights it has. This will likely come with a revolution in how we see our existing systems: if we grant rights to AI, how must we re-evaluate the rights we currently grant to animals, or even to other humans?
What I didn’t like
One challenge I had with this book was following the characters' constant shifts in opinion and action. On one page Rick Deckard has empathy for the androids, on the next he says he absolutely cannot kill a certain model, and then he does just that. I understand that characters are meant to change with new information, but the speed at which it happens here is jarring. This could also have been a consequence of listening to the audio version instead of reading the physical book: with less ability to look back at the pages where major changes happen, I may have missed some of the development.
Additionally, a problem I have with many books recently is that some of the minor characters do not appear to act as real people would. For example, Deckard's wife doesn't seem to have very realistic reactions to events. It is the future and I'm no expert on humans, but it appears to me that characters like her exist mostly to drive character development for the main characters like Deckard.
Questions I asked
Can too much empathy be bad for society?
If in this universe Mercerism had become popular before the atomic war, would the increased empathy and combined human spirit have prevented this war?
If given a chemical way of controlling our emotions, would it be improper to use it all the time to maintain positive moods? For example, if the negative aspects of heroin could be removed, would it be unethical to use it constantly?
My Favorite Quote
“You will be required to do wrong no matter where you go. It is the basic condition of life, to be required to violate your own identity.”
Philip K. Dick
Books I liked like this one
Dune: Frank Herbert (For high science fiction that combines modern ethical problems with futuristic back-drops)
Humankind: Rutger Bregman (For the questions about the role empathy plays in our society)