@ roberfgay: Yes ... what's your point though?
'It is the application of robots by people that concerns me and not the robots themselves.'
Let's assume that the doomsday scenarios portrayed in popular culture are indeed far off, that robots are far from outsmarting humans, and that they are no threat to humanity. If what we are worried about is truly the application of robots by people, then how is this problem different from the application of other types of technology?
Let's look at a few examples:
According to the Straits Times article you linked,
Professor Sharkey worries how robots - and particularly the people who control them - will be held accountable when the machines work with 'the vulnerable', namely children and the elderly ...
What makes robots special? Television sets also "work" with children (to stick with the strange terminology). They have been around for many years, and as we became accustomed to the technology, we learned how to integrate it into our lives. This took some time, but required little in the way of ethical guidelines or special legislation.
As a second example, consider the semi-autonomous war robots currently on duty in Afghanistan and Iraq. They are just another form of smart weapon, like a torpedo, a laser-guided missile, or a smart bomb. Again, similar systems have been around for many years.
Should we instead be having a much larger discussion about taking humans out of the loop in any system - robotic or non-robotic: cars, airplanes, or tanks?