Is nanotechnology inherently dangerous? Calls for regulation of this rapidly developing and diverse discipline seem to imply that it is. Only last week, for instance, Britain's Better Regulation Task Force, in a report on the regulation of scientific research, urged the UK government “to demonstrate it has clear policies in place to ensure the safety of individuals, animals and the environment”, in the face of developments in the field.

The task force's recommendations are unremarkable, even anodyne: pleas for openness, informed public debate, and foresight over potential risks. More noteworthy is the fact that nanotech was singled out for attention in a report that otherwise dealt with established flashpoints for public concern, such as genetically modified crops and human embryonic stem cells. The task force's fear seems to be that nanotechnology might soon become the topic of public unease — and that the resulting debate will take place in an informational vacuum that will quickly be filled with hot air and hysteria.

This is a valid point. Possible risks of nanotech are already under a media spotlight, but the debate is informed largely by science fiction. Literally so: Michael Crichton's new thriller Prey, in which self-replicating nanoscale robots run amok, has for weeks been riding high in the best-seller lists. When the inevitable movie appears, a nanotech Armageddon will become popular fare.

Stories told by scientists are always fair game for novelists looking for a racy plot device. And this is an old story, related most famously in 1986 by Eric Drexler in his book Engines of Creation. This visionary account of what nanotech might offer to medicine and other human endeavours also warned of a darker side: the possibility that 'nanobots' created to build structures and materials, including copies of themselves, atom by atom, might start replicating endlessly.

These rogue nanobots, Drexler envisaged, would then pull apart everything around them and use it to build copies of themselves, ultimately turning the world into a 'grey goo'. This has become a favourite scenario, and was revived most prominently by Bill Joy, chief scientist at Sun Microsystems, in an article in Wired magazine in April 2000.

But the way in which nanotech has developed since Drexler's words of warning makes the grey-goo nightmare an unlikely prospect. No one is considering building nanobots that replicate by manipulating atoms one at a time, and several leading figures in nanotech research argue that the whole idea is unfeasible. Self-replication is certainly one focus of nanotechnology, but it is hard to see how any of the avenues currently being explored could even become autonomous, let alone get out of control. Anyone wishing to make mischief with self-replicators would start with cells or viruses, not hypothetical nanobots.

There may well be dangers in nanotechnology, as in any emerging area of research. We should certainly look at the potential toxicity of nanoparticles being touted as medical diagnostic tools. But nanotechnology is a diverse field, united only by the factor of scale. So it is not even clear how one would go about regulating nanotech in a manner unique to the discipline.

It is tempting to conclude that nanotech would not be subjected to suspicious scrutiny at all were it not for the enduring but outdated image of grey goo. But that is all the more reason why the Better Regulation Task Force's proposal for good public information should be heeded. It would do no one any favours to leave public opinion, and government regulations, to be shaped by the writers of science-fiction thrillers.