The following is the full interview with Yuval Noah Harari, author of the international bestsellers "Sapiens" and "Homo Deus." ― ED.

Q. How do you think that the development of algorithms, artificial intelligence (AI), will affect North Korea's political system that is dominated by third-generation dictator Kim Jong-un?

A. It could affect it in various and even contradictory ways. One scenario is that the North Korean regime will not be able to adapt to the new realities. Since North Korea at present has no advanced Infotech industry, as AI becomes more and more critical both for the economy and for the military, North Korea might become economically poorer and militarily weaker than ever before. Unable to provide for its own citizens or to blackmail its neighbors, the Kim regime will collapse.

Another scenario is that North Korea will leapfrog ahead, becoming for example the first country in the world where all vehicles are self-driving. There are advantages to being an underdeveloped centralized dictatorship. Consider what might happen if South Korea tries to ban human drivers and switch to a completely self-driving transport system. South Korean citizens privately own millions of vehicles, and many might object to losing their freedom and their property. There will also be objections from taxi drivers, bus drivers, truck drivers and even traffic cops, who will all lose their jobs. There will be strikes and demonstrations. The initiative could also be forestalled by legal and even philosophical conundrums: if a self-driving car causes an accident, whom do you sue? Or suppose a self-driving car loses its brakes due to some malfunction, and has to choose between driving forward and killing five innocent pedestrians, or swerving to the side and endangering its own passengers. What should the car do?

In a free market democracy like South Korea, it will not be easy to confront all these challenges. Now consider North Korea. There are very few vehicles there, taxi drivers cannot demonstrate, truck drivers cannot strike, and all legal and philosophical difficulties can be solved in a single afternoon with the stroke of a single pen. You really need to convince just one person, and the country could switch overnight to a fully automatic transport system.

A third scenario is that North Korea uses AI to become a full-fledged Orwellian dystopia. Each citizen will be required to wear a biometric device that not only monitors everything you do and say, but could even monitor your blood pressure and brain activity. In Soviet days the KGB could never really know everything about you, and could not be sure what you were thinking, because the KGB did not have the data and the computing power necessary. But using our growing understanding of the human brain, and using the immense powers of machine learning, the North Korean regime might be able for the first time in history to know what each and every citizen is thinking each and every moment. If you look at a picture of Kim Jong-un on the wall, and the biometric sensors pick up the telltale signs of anger (higher blood pressure, increased activity in the amygdala), you'll be in the gulag tomorrow morning.

Q. How do you think that the fast rise of AI will affect the South-North relationship?

A. In the immediate future, the greatest impact might come from cyberwarfare. The South will face a constant danger of cyber-attacks from the North, and will find it difficult to deter the North from such attacks or retaliate against them. The North's very backwardness will protect it. If the North disrupts online banking services in the South, what will the South do? It cannot disrupt online banking services in the North, because there aren't any. Will it bombard Pyongyang in retaliation?

The rise of AI might also make any future integration of North and South more difficult, for both cultural and economic reasons. AI is likely to transform the culture and even psychology of South Koreans, and if North Koreans do not undergo a similar revolution, the gap between the populations would become bigger than ever before. It is beginning to happen even today. Just think of the cultural gap between a South Korean teenager glued to her smartphone, YouTube, Instagram and Twitter, and a North Korean teenager who might well be dumbfounded to see people walking down the street constantly looking at small screens in their palms.

As for the economic gap, in the past the South was apprehensive of unification because of the expected economic cost. Nevertheless, there were also potential economic benefits, because the North had one important economic asset to contribute to a united Korea: a disciplined and cheap work force. However, AI is likely to make cheap labor irrelevant. As algorithms and robots replace truck drivers, factory workers and even doctors, cheap labor would lose its value. That might make future integration even harder. South Korea might face a crisis taking care of its own unemployed masses; it will not want to take care of millions of northerners too.

Of course there might be positive influences as well. As noted earlier, the profound changes caused by the rise of AI might destabilize the North Korean regime and lead to its collapse, while the South Korean hi-tech industry may generate so much prosperity in the South that it will actually find it easier to integrate the North. Nobody really knows for sure.

Q. In your new book "Homo Deus," you talked about "logic bombs." Is it safe to assume that North Korea has also installed many logic bombs in South Korea?

A. It is extremely likely.

Q. How would you evaluate U.S. President Donald Trump's policies such as anti-trade measures (the renegotiation of NAFTA and the KORUS FTA, and the withdrawal from TPP)? How would you evaluate Trump's policies including travel bans on immigrants from Muslim countries and refugees?

A. Donald Trump is part of a larger nationalist wave that is sweeping over much of the world. This is a very dangerous development. In the past, nationalism was dangerous because it bred war. In the 21st century, nationalism is even more dangerous, because in addition to fostering wars it is likely to prevent humankind from solving the existential problems we face.

Today all our major problems are global in nature: global warming, global inequality, and the rise of disruptive technologies such as AI and bioengineering. In order to face these challenges successfully, we need global cooperation. For example, no nation can regulate bioengineering single-handedly. It won't help much if the US forbids genetically engineering human babies, as long as China or North Korea allows it. Similarly, no nation can stop global warming by itself. Can Donald Trump build a wall against rising oceans? Because nationalism has no answer to global warming, it tends to simply deny the problem. But the problem is real. Hence I think the current wave of nationalism is a kind of escapism: people refusing to confront the unprecedented problems of the 21st century by closing their eyes and minds and by seeking refuge in the fold of traditional local identities. I hope that people will wake up in time. For that, we probably need a new global ideology that can unite humankind.

There is still plenty of room in the world for the kind of patriotism that celebrates the uniqueness of my nation and stresses my special obligations towards it. Yet if we want to survive and flourish, humankind has little choice but to complement such local loyalties with substantial obligations towards a global community. If people nevertheless insist that the interests of their nation override all other loyalties and considerations, I would be curious to know how they plan to solve global warming and how they plan to handle the risks of nuclear war, artificial intelligence and biotechnology.

Q. Do you think that Google and Baidu are not doing the right thing by collecting personal data and taking advantage of them?

A. Collecting and analyzing personal data on billions of people certainly has its advantages, not just to the corporations doing it, but also to the public. It could be used, for example, to provide people with much better healthcare. If Google or Baidu monitors my biometric data and compares it with the biometric data of millions of other people, they could detect the early signs of cancer, when it is possible to nip it in the bud. They could detect the early signs of a flu epidemic, when it is relatively easy to prevent it from spreading. They could alert me that my nutrition is sub-optimal, and tailor a diet for my unique body. I believe that in the 21st century there will be a huge battle between privacy and health, and that most people will choose to sacrifice their privacy for the sake of better healthcare.

At the same time, there are obvious dangers inherent in allowing corporations and governments to know so much about us and to monopolize our personal data. Given enough data and enough computing power, external algorithms could know us better than we know ourselves. And then Google, or the government, could predict our decisions, manipulate our emotions, and gain absolute control over our lives. Not just in North Korea, but also in South Korea, Big Data might create a Big Brother that knows everything and controls everything.

In ancient times land was the most valuable resource, hence much of politics revolved around the question of who owns the land. In the wake of the Industrial Revolution the question became who owns the means of production. In the 21st century the question will be who owns the data. At present, people are giving up their most valuable asset, their personal data, in exchange for free email services and funny cat videos. People should be reminded of the famous maxim that "if you get something for free, you are probably the product".

Q. In the era of a-mortals, whenever that may come, do you think that religions will lose their raison d'être and disappear? Or will they continue to exist?

A. First, we should understand what religion is. Religion is not belief in gods. Rather, religion is any system of human norms and values that is founded on a belief in super-human laws. Religion tells us that we must obey certain laws that were not invented by humans, and that humans cannot change at will. Some religions, such as Islam, Christianity and Hinduism, believe that these super-human laws were created by the gods. Other religions, such as Buddhism, Capitalism and Nazism, believe that these super-human laws are natural laws. Thus Buddhists believe in the natural laws of karma, Nazis argued that their ideology reflected the laws of natural selection, and Capitalists believe that they follow the natural laws of economics.

No matter whether they believe in divine laws or in natural laws, all religions have exactly the same function: to give legitimacy to human norms and values, and to give stability to human institutions such as states and corporations. Without some kind of religion, it is simply impossible to maintain social order. During the modern era religions that believe in divine laws went into eclipse. But religions that believe in natural laws became ever more powerful. In the future, they are likely to become more powerful yet. Silicon Valley is today a hot-house of new techno-religions. They promise all the old religious prizes – happiness, peace, prosperity, and eternal life – but here on earth with the help of technology, rather than after death with the help of celestial beings.

Q. Is there anything else you would like to say to Korean readers?

A. I would like to emphasize that technology is never deterministic. We can use the same technological breakthroughs to create very different kinds of societies and situations. For example, in the 20th century people could use the technology of the Industrial Revolution – trains, electricity, radio, telephone – in order to create communist dictatorships, fascist regimes or liberal democracies. Just think about South Korea and North Korea: They have had access to exactly the same technology, but they have chosen to employ it in very different ways.

In the 21st century the rise of AI and biotechnology will certainly transform the world, but it does not mandate a single deterministic outcome. We can use them to create very different kinds of societies. How to use them wisely is the most important question facing humankind today. It is far more important than the global economic crisis, the wars in the Middle East, or the refugee crisis in Europe. The future not only of humanity, but probably of life itself, depends on how we choose to use AI and biotechnology.