Richard Jones over at the excellent Soft Machines takes a look at the ranking of nanotech countries based on publications, and more importantly how this fails to reveal the true picture. The same is true when looking at government spending – how do you compare one nanotechnology program that adds 'nano' to an existing facility with one that pays for the construction and staffing of a new facility?
If we were to go a little deeper, then publications in high-impact journals such as Nature should be weighted more heavily than those in low-impact ones, which are often the forum for publishing research results whose interpretation is still open to discussion.
You could take a similar approach to funding, looking at its effectiveness to arrive at a value for the ROI (return on investment). This would factor in publications (and their impact/citations), an institution's ability to transfer technology from universities to industry, and the number of links with external bodies, while taking into account the number of researchers per capita and purchasing power parity.
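Such an ROI measure would amount to a weighted composite of normalized factors. A minimal sketch of what that arithmetic might look like follows; the factor names, values, and weights are purely illustrative assumptions, not an established metric:

```python
# Hypothetical composite "nanoscience ROI" score - a sketch only.
# All factor names, country values, and weights below are invented
# for illustration; each factor is assumed pre-normalized to 0-1.

def roi_score(factors: dict, weights: dict) -> float:
    """Weighted sum of normalized factors."""
    return sum(weights[name] * factors[name] for name in weights)

# A made-up country profile, each factor scaled 0-1.
country_a = {
    "weighted_publications": 0.8,   # impact/citation-weighted output
    "tech_transfer": 0.3,           # university-to-industry transfer
    "external_links": 0.5,          # links with external bodies
    "researchers_per_capita": 0.6,
    "ppp_adjusted_funding": 0.7,    # purchasing-power-parity adjusted
}
weights = {
    "weighted_publications": 0.35,
    "tech_transfer": 0.25,
    "external_links": 0.15,
    "researchers_per_capita": 0.15,
    "ppp_adjusted_funding": 0.10,
}

print(round(roi_score(country_a, weights), 2))  # prints 0.59
```

Of course, the choice of weights is exactly where such a ranking would smuggle in its conclusions, which is the problem the next paragraph raises.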
That may give you a semi-quantitative measure of nanoscience activity, but it would be only as useful as a similar indicator of chemistry or physics. If one country decided to spend 90% of its resources on nanoelectronics while another went for drug delivery, would we be able to rank them?
While rankings may be essentially useless, due to the sheer diversity of nanotechnologies, they do make good press. Everyone likes a story about how their country leads the world, or, usually in the case of the UK press, trails it, so they do at least provide some sport.