
Why Westerners fear robots and the Japanese do not

Paper Boat Creative/Getty Images

As a Japanese, I grew up watching anime like “Neon Genesis Evangelion,” which depicts a future in which machines and humans merge into cyborg ecstasy. Such programs caused many of us kids to become giddy with dreams of becoming bionic superheroes. Robots have always been part of the Japanese psyche—our hero, Astro Boy, was officially entered into the legal registry as a resident of the city of Niiza, just north of Tokyo, which, as any non-Japanese can tell you, is no easy feat. Not only do we Japanese have no fear of our new robot overlords, we’re kind of looking forward to them.

It’s not that Westerners haven’t had their fair share of friendly robots like R2-D2 and Rosie, the Jetsons’ robot maid. But compared to the Japanese, the Western world is warier of robots. I think the difference has something to do with our different religious contexts, as well as historical differences with respect to industrial-scale slavery.

The Western concept of “humanity” is limited, and I think it’s time to seriously question whether we have the right to exploit the environment, animals, tools, or robots simply because we’re human and they are not.

Sometime in the late Eighties, I participated in a meeting organized by the Honda Foundation in which a Japanese professor—I can’t remember his name—made the case that the Japanese had more success integrating robots into society because of their country’s indigenous Shinto religion, which remains deeply woven into Japanese life today.

Shinto practitioners, unlike Judeo-Christian monotheists and the Greeks before them, do not believe that humans are particularly “special.” Instead, there are spirits in everything, rather like “The Force” in Star Wars. Nature doesn’t belong to us, we belong to Nature, and spirits live in everything, including rocks, tools, homes, and even empty spaces.

The West, the professor contended, has a problem with the idea of things having spirits and feels that anthropomorphism, the attribution of human-like attributes to things or animals, is childish, primitive, or even bad. He argued that the Luddites who smashed the automated looms that were eliminating their jobs in the 19th century were an example of that, and for contrast he showed an image of a Japanese robot in a factory wearing a cap, having a name and being treated like a colleague rather than a creepy enemy.

The general idea that the Japanese accept robots far more easily than Westerners is fairly common these days. Osamu Tezuka, the Japanese cartoonist and the creator of Astro Boy, noted the relationship between Buddhism and robots, saying, “Japanese don’t make a distinction between man, the superior creature, and the world about him. Everything is fused together, and we accept robots easily along with the wide world about us, the insects, the rocks—it’s all one. We have none of the doubting attitude toward robots, as pseudohumans, that you find in the West. So here you find no resistance, simply quiet acceptance.” And while the Japanese did of course become agrarian and then industrial, Shinto and Buddhist influences have caused Japan to retain many of the rituals and sensibilities of a more pre-humanist period.

In Sapiens, Yuval Noah Harari, an Israeli historian, describes the notion of “humanity” as something that evolved in our belief system as we morphed from hunter-gatherers to shepherds to farmers to capitalists. As early hunter-gatherers, nature did not belong to us—we were simply part of nature—and many indigenous people today still live with belief systems that reflect this point of view. Native Americans listen to and talk to the wind. Indigenous hunters often use elaborate rituals to communicate with their prey and the predators in the forest. Many hunter-gatherer cultures, for example, are deeply connected to the land but have no tradition of land ownership, which has been a source of misunderstandings and clashes with Western colonists that continues even today.

It wasn’t until humans began engaging in animal husbandry and farming that we began to have the notion that we own and have dominion over other things, over nature. The notion that anything—a rock, a sheep, a dog, a car, or a person—can belong to a human being or a corporation is a relatively new idea. In many ways, it’s at the core of an idea of “humanity” that makes humans a special, protected class and, in the process, dehumanizes and oppresses anything that’s not human, living or non-living. Dehumanization and the notion of ownership and economics gave birth to slavery at scale.

In Stamped from the Beginning, the historian Ibram X. Kendi describes the colonial era debate in America about whether slaves should be exposed to Christianity. British common law stated that a Christian could not be enslaved, and many plantation owners feared that they would lose their slaves if they were Christianized. They therefore argued that Blacks were too barbaric to become Christian. Others argued that Christianity would make slaves more docile and easier to control. Fundamentally, this debate was about whether Christianity—giving slaves a spiritual existence—increased or decreased the ability to control them. (The idea of permitting spirituality is fundamentally foreign to the Japanese because everything has a spirit and therefore it can’t be denied or permitted.)

This fear of being overthrown by the oppressed, or somehow becoming the oppressed, has weighed heavily on the minds of those in power since the beginning of mass slavery and the slave trade. I wonder if this fear is almost uniquely Judeo-Christian and might be feeding the Western fear of robots. (While Japan had what could be called slavery, it was never at an industrial scale.)

Lots of powerful people (in other words, mostly white men) in the West are publicly expressing their fears about the potential power of robots to rule humans, driving the public narrative. Yet many of the same people wringing their hands are also racing to build robots powerful enough to do that—and, of course, underwriting research to try to keep control of the machines they’re inventing, although this time it doesn’t involve Christianizing robots … yet.

Douglas Rushkoff, whose book, Team Human, is due out early next year, recently wrote about a meeting in which one of the attendees’ primary concerns was how rich people could control the security personnel protecting them in their armored bunkers after the money/climate/society armageddon. The financial titans at the meeting apparently brainstormed ideas like using neck control collars, securing food lockers, and replacing human security personnel with robots. Douglas suggested perhaps simply starting to be nicer to their security people now, before the revolution, but they thought it was already too late for that.

When I draw a connection between slaves and robots, friends express concern that I may be dehumanizing slaves or the descendants of slaves, thus exacerbating an already tense and advanced war of words and symbols. While fighting the dehumanization of minorities and underprivileged people is important, and something I spend a great deal of effort on, focusing strictly on the rights of humans, and not the rights of the environment, of animals, and even of things like robots, is one of the attitudes that got us into this awful mess with the environment in the first place. In the long run, maybe it’s not so much about humanizing or dehumanizing, but rather about the problem of creating a privileged class—humans—that we use to arbitrarily justify ignoring, oppressing, and exploiting everything else.

Technology is now at a point where we need to start thinking about what, if any, rights robots deserve and how to codify and enforce those rights. Simply imagining that our relationships with robots will be like those of the human characters in Star Wars with C-3PO, R2-D2 and BB-8 is naive.

As Kate Darling, a researcher at the MIT Media Lab, notes in a paper on extending legal rights to robots, there is a great deal of evidence that human beings are sympathetic to and respond emotionally to social robots—even non-sentient ones. I don’t think this is some gimmick; rather, it’s something we must take seriously. We have a strong negative emotional response when someone kicks or abuses a robot—in one of the many gripping examples Kate cites in her paper, a U.S. military officer called off a test using a leggy robot to detonate and clear minefields because he thought it was inhumane. This is a kind of anthropomorphization, and, conversely, we should think about what effect abusing a robot has on the abusing human.

My view is that merely replacing oppressed humans with oppressed machines will not fix the fundamentally dysfunctional order that has evolved over centuries. As a follower of Shinto, I’m obviously biased, but I think that taking a look at “primitive” belief systems might be a good place to start. Thinking about the development and evolution of machine-based intelligence as an integrated “Extended Intelligence,” rather than as an artificial intelligence that threatens humanity, will also help.

As we make rules for robots and their rights, we will likely need to set policy before we know what their societal impact will be. The Golden Rule teaches us to treat others the way we would like to be treated; abusing and “dehumanizing” robots, by contrast, prepares children and structures society to continue reinforcing the hierarchical class system that has been in place since the beginning of civilization.

It’s easy to see how the shepherds and farmers of yore could easily come up with the idea that humans were special, but I think AI and robots may help us begin to imagine that perhaps humans are just one instance of consciousness and that “humanity” is a bit overrated. Rather than just being human-centric, we must develop a respect for, and emotional and spiritual dialogue with, all things.
