Part 1: The challenges
We are living in ‘interesting times’, as the Chinese saying goes, amid many global challenges. Geopolitics has heated up again, having cooled off for a while after the collapse of the Soviet Union 30 years ago. Forces of populism and authoritarianism are rising in many parts of the world, creating concern among liberals about the future of democracy. Terrorism of many sorts, and with many causes, is striking in many places. In a world where slow job growth and increasing inequality have been fuelling social unrest, rapid advances in automation and artificial intelligence (AI) technologies are causing even greater anxieties.
Technology is being implicated, directly and indirectly, in these global problems. Job destruction is one of the major problems attributed to technology; the spread of fake news and hatred on social media is another. The wise solution is not to stop the development of new technologies, but to understand how they can be used and misused, and to regulate them. Indeed, all new technologies with immense potential have been regulated to prevent them from causing harm: nuclear power, electricity, chemicals, new medicines, etc. Technologies are developed by humans. They are used by humans. And humans must harness them to produce the best outcomes for humans.
Technology and Jobs
First, consider the bundle of technologies that, it is feared, will cause large-scale destruction of jobs—robots, AI, 3-D printers, etc.—collectively bundled into ‘Industry 4.0’. The anticipation of its consequences, and the preparation for it, have become a multi-billion-dollar consulting industry. Governments in developed and developing countries are being advised to develop strategies. It seems they are being sold medicines for a disease they may get in the future, when they should be diagnosing the disease they already have: slow job growth in their economies, even before these technologies have spread.
The recent report from the World Bank, Trouble in the Making?, says that the impact of these new technologies may have been exaggerated. It estimates that technology will eliminate less than 8% of present jobs in any country in the foreseeable future. Governments should therefore focus on why jobs are not being created now, and understand the mix of forces that is creating so-called ‘jobless growth’, of which technology is only one. A joint report by FICCI and NASSCOM on The Future of Jobs in India-2022, prepared by EY, says that, whereas new technologies will be disruptive for the IT/IT-enabled services (ITES), retail, and financial services sectors, their effect on sectors such as apparel, textiles, and leather, which are the primary sources of jobs in India, will be relatively marginal in the short term.
‘Sewbots’ developed by SoftWear Automation, based in Atlanta, US, can replace human beings in sewing apparel. Sewbots can already make simple items like pillowcases and bath mats, and a sewbot will be able, as soon as next year, to tailor a T-shirt. However, this will not reduce jobs in garment factories in Bangladesh, says Palaniswamy Rajan, the firm’s founder. While a sewbot can produce 17 times the number of T-shirts a traditional garment worker can, it may not be economically sensible to replace cheap Bangladeshi labour with expensive sewbots. Rajan estimates that sewbots will automate only 20-25% of the garment industry even 20-25 years into the future.
Why is India not creating more jobs in apparel and textiles, industries in which it should have competitive advantages? A special report in The Economist on premature de-industrialisation in emerging markets suggests many causes, including poor logistics, and an import policy, designed to protect some Indian manufacturers, that raises the cost of man-made fibres, the principal staples of the global textile and apparel business. The point is that all governments, including India’s, must look inwards at the condition of their economies and their policies, and improve them, before they rush to adopt Industry 4.0 strategies that providers of technology would love to sell them.
Technology and Democracy
‘Will technology destroy democracy?’ was an alarming question this year at Forum 2000, the annual conference in Prague founded 21 years ago by Václav Havel, the visionary leader of the Velvet Revolution that helped end communist rule in Eastern Europe. The internet and social media were expected to be liberating forces. They enabled the Arab Spring in 2010-11, which turned too soon to winter again with a return of authoritarian forces. The internet and social media are tools, like all technologies: they can be used by both sides, those who oppose and those who defend, for good and for evil.
The internet and social media have the potential, theoretically, to bring the whole world together by enabling people anywhere to reach out to people everywhere. However, the world of social media is not a harmonious one. It is forcing people into tribes of common, often visceral, interests. Because it is practically impossible for a human mind to pay attention to everything, we are compelled to choose what we will follow and what we must ignore. Indeed, walls around ‘people like us’ have become higher with the spread of social media. People throw hate bombs over the walls. They shout at, and do not listen at all to, those on the other side.
A hundred years ago, long before the invention of social media, Rabindranath Tagore feared that ‘a world broken up into fragments by narrow domestic walls’ would not be a world of freedom, in which ‘the mind is without fear and the head is held high’. Walls between nations are rising again in the new millennium, and walls within nations too, between people of different religions and different cultures. The internet and social media are not the primary causes of people’s dissatisfactions and fears. However, they have become accelerators.
The speed with which communication technology now operates is causing ‘the clear stream of reason’ to ‘lose its way into the dreary desert sand of dead habit’, to quote Tagore again. We must think fast and act fast to keep up with the barrage of information thrust upon us. Algorithms developed by technologists direct our minds to more of what we have indicated we like. Thus, they harden the ‘dead habits’—the stereotypes and unthinking responses of our minds. The ‘thinking fast’, instinctual portion of our minds (as Nobel Laureate Daniel Kahneman called it) is used more, and the ‘thinking slow’, reflective portion less. In fact, neuro-psychologists are observing measurable changes in the brain structures of children who are active users of the internet and social media, compared with the brain structures of children two decades ago. They can multi-task and respond to stimuli faster, but they may be reflecting less.
The ‘thinking fast’ portion of an animal’s mind (including a human being’s) helps it decide very quickly whether to fight or to flee, whereas the ‘thinking slow’ portion accepts questions to which there are no immediate answers, and enables reflection. The ‘thinking slow’ part of our minds makes us pause to look behind the stereotype at the actual reality. Empathy—the ability to put oneself in another’s shoes—can arise when we pause and reflect. Not surprisingly, therefore, social scientists have observed a 40% decline over the last ten years in markers of empathy among college students in the USA who are active users of social media. Estimates of how much less time students are spending in the physical company of others range, in some surveys, from 40% to 80%. Moreover, when they are together, they are very likely to be ‘alone together’, looking into their smartphones rather than at each other. They are connected to someone else, somewhere else.
Will democracy regulate technology?
Technology has the power to make the world better. It also has the power to make it unsafe. Therefore, powerful technologies must not be allowed to fall into the wrong hands and their use must be regulated. Just as nuclear energy, powerful chemicals, and new medicines are regulated, rapidly developing communication, computational, and AI technologies will have to be regulated too.
Democracy demands that human beings (and all of them equally) should be able to determine the rules by which they will be governed. Those who own technologies have great incentives to monopolise their use and prevent others from using them. The incentives can be financial, as they are with intellectual property; hence the expensive legal battles between companies, and the emerging trade battles over intellectual property between rich and poorer countries. Or the incentive to own the technology can be security, as it is with nuclear weapons. While there must be regulation, those who have the power to make the rules will make rules that preserve their power, which is not democratic.
Democratic deliberation requires that everyone has an equal voice. Moreover, since the global problems challenging humanity in these ‘interesting times’ are complex, many points of view must be combined to understand their causes and to find sustainable solutions. Therefore, people must listen to each other, across the walls that divide them, and understand each other’s perspectives. More reflective conversations are necessary among people who are not like each other, and may not even like each other. Yet social media is making listening across the walls harder, not easier.
Part 2: How to find solutions?
Who will regulate technology?
The premise of democracy is that the regulations that govern citizens are developed by a democratic process in which they can participate. Democratic governments are elected by all the people within defined boundaries, and governments cannot have jurisdiction over people outside those boundaries who have not participated in electing them. Of course, if the people within those boundaries are not free to elect their own government, either because they are ruled by an autocratic, non-elected government, or because they are a colony of another country, there is no democracy (even if the colonising country has a democratically elected government).
It follows from this premise that citizens must have the right to determine the policies that will govern the use of technology in their country. Even if a country’s government is not democratically elected, another country cannot impose its policies on it, for that would not be democratic.
Citizens have many expectations of their governments, whether democratically elected or not. They expect that a good government will create the conditions for them to obtain what they need to live good lives: decent jobs with adequate incomes, adequate housing and infrastructure, and good social services for health and education. They also expect their government to ensure their safety from external and internal threats. And they would prefer that their government give them liberty, freedom of expression, and the right to criticise their government too, which are the visible markers of democracies.
Tolstoy said, “Happy families are all alike; every unhappy family is unhappy in its own way.” People in all countries want even better conditions for themselves and for their children and grandchildren. That is the path to progress. However, countries are at different places along the path, and may even be on different paths towards a better future. Therefore, the mix of what citizens of each country need for progress is different. Some may value greater economic security now, and a stronger government to deliver it. Others may want more liberty now. Indeed, the rise of populism in the West is an indication that people have requirements that even democratic governments were not fulfilling.
Like Tolstoy’s unhappy families, unhappy in their own ways, countries will need policies and mechanisms to regulate technology that fit their own requirements. One-size-fits-all solutions will not be appropriate. Champions of democracy must encourage the development of solutions with the participation of stakeholders within countries; this engagement must be the test of democracy’s strength. Imposition of solutions developed by experts, whether from outside or within the country, would be undemocratic.
With the increasing globalisation of finance and trade since the 1970s, and with it the increasing power of corporations that want the freedom to be anywhere, and whose loyalty lies with no country, a parallel world of global governance has grown. In this parallel world, sometimes called the ‘Davos world’, leaders of governments, large multinational corporations, and economists evangelising globalisation are seen actively consulting each other. The agenda on the mountain top has been to develop policies providing more freedom for capital and investors. On the ground, in many countries, citizens began to feel disconnected from the global elite. Their alienation from the elites’ agenda has spurred the rise of populist, and nationalist, political forces.
Who should be trusted to regulate the use of internet and social media technologies that have such power over citizens’ minds? Can governments be trusted by their citizens? Or can the owners of the companies that make money from these technologies, sitting in some other country, be trusted more? It is also an ideological issue: more private enterprise and less government, or more socialism? And it is a practical issue: how will the regulations be enforced? Can corporations be freed from regulation by national governments and trusted to regulate themselves?
Dialogues about what citizens need, and what sort of regulation of technology they will support, are necessary for democratic solutions. Such dialogues must not be confined to private business and government leaders; they must include civil society. Many points of view must be considered. No doubt, ideologies will complicate these deliberations, as will defences of vested interests. Therefore, the quality of the dialogues will determine the quality of the regulations.
The consensus at Forum 2000 was that the democratic conversations now necessary to determine how to sensibly regulate the use of new digital, computational, and communication technologies will have to be conducted ‘offline’, in old-fashioned, analogue formats. People must switch off their smartphones and online chatter, and learn to switch off the chatter of dead habits in their minds. We must pause and listen to other points of view. We must learn to reflect together on the shape of the elephants in the room that will emerge when we combine our perspectives and think systemically.
Listening is the first of the three wisdom tools in Buddhist tradition, His Holiness the Dalai Lama says in his Foreword to my book, Listening for Well-Being: Conversations with People Not Like Us. The other two are contemplating and meditating. If one listens deeply, one can learn something new, about the world and about other people, that is not already in one’s mind. From listening comes something to contemplate and to understand. Without listening, the clear stream of reason is lost in the dreary desert sand of dead habits. And the world is broken into fragments, as Tagore warned.
The skills of communication taught in schools and leadership programmes are oriented heavily towards how to get one’s point of view across forcefully, in inspiring speeches, in debates, and in tweets. Skills for the other side of communication, for listening, are hardly taught. When everyone is shouting and tweeting, there is only a cacophony. There is no communication. There is no understanding. There is no wisdom.
The conclusion is that the technology most urgently required for developing better regulation of technology democratically is an old-fashioned one: dialogue, with deep listening to many points of view, to find solutions that are fair to all. Listening to, and dialogue with, people not like us will also strengthen democracy.