A special report in The Economist (March 29-April 4, 2014) about the rise of robots took me back to an article I had written in December 2010 about NAO.
I cited from the website, which began by asking not ‘what’ but ‘who’ NAO is. The answer was: ‘NAO is a humanoid robot that is 58cm tall and weighs 5kg developed by Aldebaran Robotics. NAO is an autonomous and interactive robot that is completely programmable. NAO is used today for research and education around the world in prestigious universities and research institutes.’
The website went on to describe what NAO could do: ‘NAO has the ability to see, hear, speak, feel and communicate. Using the latest and most innovative technology, NAO is a unique combination of hardware and software…’ and referred to ‘NAO’s special characteristics of fluid movements, the ability to sense and avoid obstacles and the capability to be fully programmed.’ (italics added)
In more recent updates to the website, NAO is described as the ‘most widely used humanoid robot for academic purposes worldwide. It is a versatile platform used to explore a wide variety of research topics in robotics as well as computer science, human-machine interaction, and the social sciences. NAO’s many sensors and actuators, convenient size, and attractive appearance, combined with sophisticated embedded software, makes it a unique humanoid robot ideal for many research fields. NAO boasts, for example, face and object recognition, automatic speech recognition, text-to-speech in seven languages, and whole body motion.’
I had commented that ‘The italicized words beg the question – who has written the programme? Obviously not NAO, but human beings.’ Even if the day comes – and it is not an impossibility – when robots are able to write their own programmes, the fact remains that they will still have to be programmed to do so, by humans. Nevertheless, it is also a fact that, because of innovations in robotics driven by a combination of factors which include ‘exponential growth in computing power, progressive digitization of things that people work with…, and the opportunities for innovators to combine an ever-growing stock of things, ideas and processes into ever more new products and services’, the robots now being developed will have increasing sophistication ‘as machines that sense their environment, analyse it and respond accordingly.’
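The ‘sense, analyse, respond’ pattern quoted above can be made concrete with a small sketch. Everything in it – the sensor names, the thresholds, the actions – is hypothetical and chosen purely for illustration, which is precisely the point: each rule the ‘robot’ follows has been authored by a human programmer.

```python
# A minimal sense-analyse-respond loop. Every rule here is human-written:
# whatever "decision" the robot appears to make is a product of this code.

def sense(environment):
    """Read hypothetical sensor values from a simulated environment."""
    return {
        "obstacle_cm": environment.get("obstacle_cm", 1000),
        "heard_voice": environment.get("heard_voice", False),
    }

def analyse(readings):
    """Turn raw readings into a decision, using human-chosen thresholds."""
    if readings["obstacle_cm"] < 30:  # threshold picked by a human
        return "avoid_obstacle"
    if readings["heard_voice"]:
        return "turn_towards_voice"
    return "continue"

def respond(decision):
    """Map each decision to an action description."""
    actions = {
        "avoid_obstacle": "stop and steer around the obstacle",
        "turn_towards_voice": "turn head towards the speaker",
        "continue": "carry on with the current task",
    }
    return actions[decision]

# One pass of the loop with a simulated environment; the obstacle rule
# takes priority over the voice rule because a human ordered it so.
env = {"obstacle_cm": 20, "heard_voice": True}
print(respond(analyse(sense(env))))
```

However sophisticated the sensing and the analysis become, the priorities encoded in such a loop remain a human choice.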
We are already familiar with robots that perform routine tasks in industry: on production lines in various sectors, most notably car manufacturing, they function as extensions of humans, with parts resembling arms and hands, capable of picking up and positioning things in a pre-programmed manner. These were the early versions, but robots have gradually become more humanoid, like NAO, because ‘form has to follow function’: not only are robots being required to ‘function in an environment shaped to human specifications,’ they will now have to fulfil ‘one of the main aims of current robotic research: learning to operate flexibly in an environment designed for humans, not robots.’
Hospitals are a case in point: Tug robots, made by the Pittsburgh company Aethon, are in use in nearly 150 hospitals across America. They are ‘limbless, faceless… but reliable heavy duty trundlebots designed to move hospital trolleys around.’ More robots are in the pipeline for other non-factory environments where automation can make significant contributions.
In Japan, work begun in the early 1990s by Takanori Shibata led to his invention of Paro, a 57 cm long robot which ‘looks like a baby harp seal.’ And what can Paro do? ‘…it responds amiably to stroking; and although it cannot walk, it can turn its head at the sound of a human voice and tell one voice from another.’ Even more, ‘it is a comfortable and gentle presence in your arms, on your lap or on a table top, where it gives the impression of following a conversation. The best thing about it is that it seems to be helping in the care of people with dementia and other health problems.’
It is foreseen that robots such as Paro will make it easier to look after old people in homes. Further, the general purpose home-help robots being developed will ‘make it possible for old people to stay independent in their own homes for longer.’ Machines have been present in our environment for so long that we have come to accept them almost as desirable extensions of ourselves, without any deep qualms about how we interact with them.
If we just think about the gadgets around the house, whose range expanded considerably with the advent of electronics, we soon realise that they have become an indispensable part of our lives. We feel impotent without them: just one has to break down – and, well, we almost break down too, so used to and dependent on our machines have we become. And these are mechanical machines, objects that respond to our commands at the press of a button. Unless there is a defect in the machine or a problem with the power supply, they will ‘obey’ instantly.
But imagine a machine that can feel, can decide to respond or not depending on what it is sensing and what analysis it makes of that information. Will we be in a position to fight back?
The future will tell. Until such a time comes, we can perhaps still take comfort in the fact that, as I wrote in my previous article, ‘a robot can be humanoid – human-like – but can never become a human being, however many characteristics of the latter we may programme into it. Similarly, a human being can behave like a machine – but can never become a machine. However, it is a pity that we have become so mechanical in our habits, behaving like machines in an automatic way.’ Humanoid in shape, and with a semblance of the humane perhaps – but human? Most probably never.
I found these words by the late Jacob Bronowski – the famed mathematician, biologist and author of ‘The Identity of Man’ (in which the issue of man as machine is discussed) and ‘The Ascent of Man’ – very relevant to our current human predicament: ‘We have to cure ourselves of the itch for absolute knowledge and power. We have to close the distance between the push-button order and the human act. We have to touch people.’
* Published in print edition on 1 May 2014