
About Nanotechnology

Nanotechnology comprises any technological development on the nanometer scale, usually 0.1-100 nm. (One nanometer equals one thousandth of a micrometer, or one millionth of a millimeter.) The term is sometimes applied to any microscopic technology. At this size scale, quantum-based phenomena become significant and often produce counterintuitive results. These nanoscale phenomena include quantum size effects and molecular forces such as van der Waals forces. Furthermore, the vastly increased ratio of surface area to volume opens new possibilities in surface-based science, such as catalysis.

The device density of modern computer components (i.e. the number of transistors per unit area) continues to grow exponentially, but fundamental electronic limitations prevent the trend described by Moore's law from continuing indefinitely. Current estimates predict ten to fifteen years of continued improvement before economic costs grow exponentially. Nanotechnology is seen as the next logical step for continued advances in computational architecture.

The term nanotechnology is often used interchangeably with molecular nanotechnology (also known as "MNT"), a hypothetical advanced form of nanotechnology that is expected to be developed only far in the future, although estimates of when vary. The term nanoscience describes the interdisciplinary field of science devoted to the advancement of nanotechnology.
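
To make the surface-area-to-volume point above concrete, the short Python sketch below (illustrative only, using idealized spherical particles) computes the ratio for spheres of decreasing radius. For a sphere the ratio works out to 3/r, so shrinking a particle from millimeter to nanometer dimensions increases its relative surface area by several orders of magnitude.

    import math

    def surface_to_volume_ratio(radius_m: float) -> float:
        """Surface-area-to-volume ratio of a sphere: (4*pi*r^2) / ((4/3)*pi*r^3) = 3/r."""
        area = 4 * math.pi * radius_m ** 2
        volume = (4 / 3) * math.pi * radius_m ** 3
        return area / volume

    # Compare a millimeter-scale particle with nanoscale particles.
    for label, radius in [("1 mm", 1e-3), ("1 um", 1e-6), ("100 nm", 1e-7), ("10 nm", 1e-8)]:
        print(f"r = {label:>6}: surface/volume = {surface_to_volume_ratio(radius):.3e} per meter")

Running this shows the ratio growing from about 3e3 per meter at 1 mm to 3e8 per meter at 10 nm, a gain of five orders of magnitude, which is why surface-dominated processes such as catalysis benefit so strongly from nanoscale structuring.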

History

The first mention of nanotechnology (not yet using that name) occurred in a talk given by Richard Feynman in 1959, entitled There's Plenty of Room at the Bottom. Feynman suggested a means of developing the ability to manipulate atoms and molecules "directly", by building a set of one-tenth-scale machine tools analogous to those found in any machine shop. These small tools would then help to develop and operate a next generation of one-hundredth-scale machine tools, and so forth. As the sizes get smaller, some tools would have to be redesigned because the relative strengths of various forces would change: gravity would become less important, while surface tension and van der Waals attraction would become more important, as the rough sketch at the end of this section illustrates. Feynman mentioned these scaling issues during his talk, and nobody has yet effectively refuted the feasibility of his proposal.

The term 'nanotechnology' was coined by Tokyo Science University professor Norio Taniguchi in 1974 to describe the precision manufacture of materials with nanometer tolerances. In the 1980s the term was reinvented and its definition expanded by K. Eric Drexler, particularly in his 1986 book Engines of Creation: The Coming Era of Nanotechnology. He explored the subject in much greater technical depth in his MIT doctoral dissertation, later expanded into Nanosystems: Molecular Machinery, Manufacturing, and Computation. Computational methods play a key role in the field today because nanotechnologists can use them to design and simulate a wide range of molecular systems.

Early discussions of nanotechnology involved the notion of a general-purpose assembler with a broad range of capability to build different molecular structures. The possibility of self-replication, the idea that assemblers could build more assemblers, suggests that nanotechnology could reduce the price of many physical goods by several orders of magnitude. Self-replication is also the basis for the grey goo scenario. More recent thinking has focused instead on a more factory-oriented approach to construction: the smallest elements of a product would be built on assembly lines, then combined into progressively larger assemblies until the final product is complete.
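
Feynman's scaling argument can be illustrated with a back-of-the-envelope comparison. The Python sketch below assumes a cube of side L with a water-like density and a water-like surface tension coefficient (both values are assumptions chosen only for illustration): weight scales as L cubed while a surface-tension force along one edge scales as L, so their ratio swings by many orders of magnitude as the object shrinks.

    # Rough scaling comparison for a cube of side L.
    # Weight scales as L^3; a surface-tension force along an edge scales as L,
    # so (surface force / weight) grows as 1/L^2 when L shrinks.

    DENSITY = 1000.0          # kg/m^3, water-like density (assumption)
    G = 9.81                  # m/s^2, gravitational acceleration
    SURFACE_TENSION = 0.07    # N/m, roughly that of water (assumption)

    def weight(side_m: float) -> float:
        return DENSITY * side_m ** 3 * G      # newtons, scales as L^3

    def surface_tension_force(side_m: float) -> float:
        return SURFACE_TENSION * side_m       # newtons, scales as L

    for label, side in [("1 m", 1.0), ("1 mm", 1e-3), ("1 um", 1e-6)]:
        ratio = surface_tension_force(side) / weight(side)
        print(f"L = {label:>4}: surface force / weight = {ratio:.2e}")

For a meter-scale object the surface force is negligible compared with its weight, but at micrometer scales it dominates by roughly a factor of a million, which is why tools at each successively smaller scale in Feynman's scheme would need to be redesigned.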