24 December 2006

A year in review

It being year's end, I'd like to reflect on what has gone into this blog. I began with a broad agenda to advocate research and describe research efforts around the world.

I followed a few hot topics, for example biodiversity and the application of systems biology to biomedical research. But science communication is just one facet of my blogging interests.

I am also interested in the centers that do the research, and especially, the research policy behind these centers. I am keen to compare the policy experiments in Europe, the US and Asia.

Places like Biopolis, in Singapore, are successfully forging links between government and industry to support research. Biopolis is well funded and now employs high calibre researchers trained in Europe and the US.

Attempts to fund research with joint public/private partners can be found everywhere. Biopolis represents one approach to joint funding, with research interests extending across the biological sciences.

Another approach is found at the Kluyver Centre in the Netherlands, which focuses on a particular application, namely industrial fermentation. Kluyver's strategy has also been successful, as is evident from the generous support added by its new industrial partner Tate & Lyle.

I learned a lot while writing this blog. I discovered Euractiv, Cordis and AlphaGalileo, three excellent news distribution organizations with a focus on research news. Euractiv and AlphaGalileo are independent, while Cordis is an official news channel for European Commission research activities.

All three channels are rich sources for breaking news about scientific discoveries and for science policy developments. I gained from them a steady stream of facts and figures with which to form opinions about European research.

What defines a good research environment? Should research be conducted in fast-paced research hotels with precise goals and short to medium term time horizons? What role should institutes and university faculties play in research?

Should research aim to bring a direct commercial benefit to the funding agencies that support it? Should the private sector pay for fundamental research, or should it become involved only during the final commercialization phase of research and development?

These are key questions for understanding where research is headed, and the kinds of research cultures we should expect to see in future.

21 November 2006

Europe a laggard in research?

Europe is sometimes described as a laggard when it comes to the commercialization of intellectual property. New research comparing technology transfer on both sides of the Atlantic shows that the reality is rather different.

Researchers at the United Nations University (UNU) recently published what they believe is the first critical comparison of patent and commercialization activity between Europe and the US. They sought to shed light on the so-called European Paradox, the notion that Europe has good ideas but doesn't make any money from them.    

They find that for every million dollars invested, Europe produces 20% more licenses, 40% more startups and earns within 10% of the overall return on investment enjoyed by investors in the US. Measured by these formal channels of technology transfer, Europe is not such a laggard after all.

In characteristic self-deprecating fashion, the European researchers hail the findings as a warning rather than a sign of success. They believe Europe may be focusing too much on formal technology transfer, in turn threatening the open exchange of ideas.

Borrowing from the language of software development, the UNU publication calls for a strengthening of Open Source Science, that is, for greater sharing of basic knowledge and intellectual property.

The US, they believe, is actually much stronger in this regard. Despite an explicit rhetoric favouring the Market over sharing, technology transfer in the US is conducted with a strong orientation towards sharing knowledge.

If anything, this is the paradox that deserves attention. How is it that the country that believes in unfettered competitiveness could simultaneously be home to the planet's greatest achievements in open source software?

Might this come down to the difference between talk and action?

I recently attended a public lecture in Zurich, Switzerland, in which talk and action came into sharp contrast. A member of Switzerland's seven-member governing committee was warmly acknowledging Europe's communitarian instincts.

The Swiss governor explained that competition created jealousy, and that jealousy wasted precious human energies. Much better to do research as a network and share, said the Governor.

There followed a talk by a high profile Harvard professor. He began with the words "I luuurve competition". The lecture hall froze as he described how competitiveness was the foundation of the US dominance in research.

The Swiss Governor's face tightened visibly.

But as the Harvard professor went on, it became clear that competitive advantage was not the only factor behind the US's superior research profile. The professor acknowledged a much simpler, and more insidious, factor behind US successes.

That factor was access to journals.

US journals, he acknowledged, had a clear bias towards US research. Everyone else was somewhat off the radar, he explained.

You can talk the talk, but the action came down to jealous US journal editors.

Talk. Action.

If the US can't resolve the disparities between the two, let's hope Europe can, and that Europe's Open Source Science will flourish.

19 November 2006

Filling the gap between R&D and commercialization

The European Institute of Technology (EIT) moved back onto the agenda on October 18th with the publication of a revised proposal for its funding and organizational structure. According to its supporters, the EIT will bridge the gap between Europe's rich knowledge-base and commercially valuable innovation.

What began as a plan to create Europe's answer to MIT has now changed considerably. In the new proposal, the EIT is a two-tier institution comprising a small organizational body and a network of collaborative groups called Knowledge & Innovation Communities (KICs).

The KICs comprise researchers and entrepreneurs employed at universities, other public sector bodies and in the private sector. They will be expected to come up with the innovation, as well as €2.1 billion of the €2.4 billion budget.

Industry and academic interest in the proposal is lukewarm. Industry groups wonder why they should give generously to something that is, in effect, little more than an administrative department of 100 people.

Supporters of the EIT battle on. One name that stands out at the moment is Polish MEP Jerzy Buzek. Buzek talks about Europe's poor ability to deliver innovation, by which he seems to mean products and services that generate a direct profit.

"It is impossible to finance innovation directly through FP7", remarked Buzek in a reference to the EC's research funding program. He believes the task comes down to filling a gap between research and commercialization.

The EIT would fill this gap, claims Buzek, and would not participate directly in either research or education. Direct participation, he claims, would create unnecessary competition between the EIT and Europe's universities and research institutes.

Some comment that the latest proposals for the EIT raise more questions than they answer. Eurochambres, the 17-million-member association of European chambers of commerce, raises several.

They believe the proposal lacks clarity about how the KICs will be organized. They also claim that the proposal leaves open the question of how the EIT will rate, and therefore rank, the projects it will become involved with.

Buzek claims these questions reveal the EIT's strengths rather than its weaknesses. On the issue of how KICs would be organized, Buzek believes that the innovation programs will benefit from being able to decide themselves on their composition and organization.

This is a curious response from Buzek.

If the EIT will not participate directly in research, then its existence will be justified in terms of its guidance in technology transfer and commercialization. Filling the gap and all that.

But if it cannot describe how those goals would be translated into some kind of organizational formula or plan, how can we evaluate its quality as an organisation? Seems that the gap between research and commercial success is as empty and unclear as ever.

31 October 2006

Around the world in patents

This month's international research roundup looks at patents, the deceptively simple method for earning money from ideas.

First stop India. India's Council for Scientific and Industrial Research filed 542 US patents between 2002 and 2005. With the cost of filing a patent standing at about $25 thousand, and maintenance costs of $4 thousand per year, it's no surprise that this feverish activity has come to receive scrutiny from a skeptical public.
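
A back-of-the-envelope calculation shows why. The sketch below, in Python, tallies up what a portfolio like CSIR's might cost; the ten-year maintenance horizon is my own assumption, purely for illustration.

    FILING_COST = 25_000    # USD per US patent filing (figure quoted above)
    MAINTENANCE = 4_000     # USD per patent per year (figure quoted above)
    N_PATENTS = 542         # CSIR's US filings, 2002-2005

    def portfolio_cost(years_maintained):
        """Filing plus maintenance costs for the whole portfolio."""
        return N_PATENTS * (FILING_COST + MAINTENANCE * years_maintained)

    print(f"filing alone: ${N_PATENTS * FILING_COST / 1e6:.1f}M")
    print(f"after 10 years of maintenance: ${portfolio_cost(10) / 1e6:.1f}M")

Filing alone comes to some $13.6 million; a decade of maintenance would push the bill past $35 million. Small wonder the public is skeptical.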

Nature ran an editorial arguing that India's state subsidized program for registering US patents is being abused and having a detrimental effect on research. Successful patent applications have come to be used in lieu of peer-reviewed publications. They provide a short cut to promotion and grant success.

The Indian newswire teluguporta.com carried a story (September 7th) entitled "Public Money Wasted on Useless Patents". The story described a patent for a substance extracted from cow pee that purportedly conferred an antibiotic action. The claim was not supported by any experimental evidence.

Moving east to China. Twenty Chinese delegates flew even further east, to San Francisco, on the 9th of October for a training course on intellectual property rights. The course was given by Berkeley's Haas School of Business, no less than a paragon of righteous money making.

The US, and Europe too, would like to curb infringements of their patents in China. Beyond platitudes about getting to know each other, the press release on the event mentioned the need to address "Chinese misunderstanding of US values and priorities".

Now let's spin the compass around to examine Europe and its westward gaze towards the US. Here the issue is not about policing patent infringements. It's about Europe's desire to replicate the US's success at making money from ideas.

What caught my eye recently was a report published by the EC in September about the benefits of intellectual capital reporting for small to medium-sized enterprises (SMEs). The EU is pulling hard on all the levers to help SMEs. This particular report advised that "articulating intangible resources" (intellectual property) could drive value creation.

Unfortunately the report seemed to struggle to back the claim. It cited evidence that investors presented with information about a company's conceptual crown jewels were more likely to give the company a thumbs down, in the form of lower forecasts. And anecdotal evidence suggested that fund managers and financial analysts don't take the information seriously.

Keep pulling on those levers, everyone! After all, money makes the world go around.

23 October 2006

Productivity and Innovation

Briefly:
The focus on productivity issues in pharmaceutical research seems to ignore the trade-off between efficiency gains and the investment required to refine production methods.

---

Productivity is an oft-used word. In drug research, it is the focus of nearly all discussions about how the pharmaceutical industry will cope with a combination of rising research costs and a downward trend in the number of drug discoveries.

If the goal is obvious, what then should be the approach to reducing the costs of discovering new drugs?

The logic of productivity is attractively simple. Reduce the cost of a single production step, and the overall productivity will rise. Want improved cupcake productivity? Then head over to the cake factory and have a detailed look at the process in which the ingredients are mixed, placed in a small paper cup, cooked, packed, and distributed.

None of this is rocket science. Or even science. It's process engineering and the engineer's job is to study a process that works reasonably well and make it work even better.

An obvious difference between cupcake production and drug research is that between 99% and 99.9% of the products will fail somewhere along the development "pipeline". As cupcake industry insiders will know, the occasional cupcake does go awry before it reaches the light of day. But nothing like 999 out of 1000.

Which pipeline?
Alarm bells should be ringing. Investment in productivity enhancement is a trade-off between the cost of studying a process and the benefit of making the process work better. Drug discovery, indeed innovation in general, is just not the place for it. For those few successful drug research pipelines, the efficiency gains will pay off. But for the other 999 cases, the whole exercise is little more than a wasted overhead.
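
To put the trade-off in the simplest possible terms, here is a minimal sketch. The success rate echoes the failure figures above, but the costs and gains are invented purely for illustration.

    def expected_net_benefit(p_success, study_cost, efficiency_gain):
        """The study cost is paid on every pipeline; the efficiency
        gain is realized only if the pipeline's product succeeds."""
        return p_success * efficiency_gain - study_cost

    # With a 1-in-1000 success rate, even a large downstream gain
    # rarely covers the up-front cost of studying the process.
    print(expected_net_benefit(p_success=0.001,
                               study_cost=1_000_000,
                               efficiency_gain=50_000_000))  # -950000.0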

Once I thought that the productivity mantra was confined to the pharmaceutical trade press. Now I see that it dominates international research conferences and falls from the lips of research heads, even in private conversation.

Why focus on increasing the productivity of the drug discovery pipeline if it means that we will spend most of our time peering long and hard down the wrong pipe?

11 October 2006

Europe takes lead in safe chemical production

Briefly:
New safety rules applying to Europe's chemical industry will squeeze low value-added chemical production.

The European Union will shortly take the final vote on REACH, a brave new world in safety regulations for chemicals in Europe. The REACH initiative, which stands for Registration, Evaluation and Authorisation of Chemicals, aims to improve Europe's industrial competitiveness and prompt innovation towards the use of safer chemicals.

Industry has reacted strongly to the plan, which will require that as many as 100 thousand chemicals undergo a round of health and safety testing at the expense of manufacturers. Europe's chemical industry employs 1.7 million people and creates a trade surplus of €41 billion annually.

The Commission received 6000 responses from industry, NGOs and governments during a short consultation period in 2003. And a trial of the program in 2004, involving 29 chemical producers, spawned a report with more than 40 recommendations on how the program could be made "workable".

Of the two stated aims of REACH, the safety argument is mentioned most frequently in reports and press releases made by the Commission. One press release claims that safety information is "sketchy for around 99%" of chemicals in the market place, "raising questions about the possible impact on human health".

Supposing this is true, the question on my lips is how REACH will prompt innovation towards safer chemicals. The toxicologists I speak to are fairly divided about which direction this innovation could take. Some talk about the new field of toxicogenomics, which combines conventional toxicology insights with genome-wide experimental investigations.

Others talk of computational approaches involving machine learning algorithms, Bayesian prediction and other exotic methodologies. REACH makes no mention of these new methods. Indeed, the only statement I could find about how REACH would work in practice was a lonely objective that it should not increase the amount of animal testing.

Wading through the flurry of recent reports and press releases on REACH, I found a small section that compared the new proposals with the existing chemical safety rules. REACH will exempt chemicals used in quantities of less than 1 tonne from the new screening requirements. Currently, all new substances produced in quantities of more than 10 kilograms require safety screening.

With this change in policy, only 30 thousand of the 100 thousand chemicals classed as "sketchy" by the Commission will qualify for screening. So safety compliance will actually become easier under REACH, assuming today's annual production rates.
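
The arithmetic of the two thresholds is easy to sketch. The substances and production volumes below are hypothetical, and note that the existing 10 kilogram rule applies to new substances.

    OLD_THRESHOLD_KG = 10      # current rule: new substances above 10 kg/year
    REACH_THRESHOLD_KG = 1000  # REACH: substances above 1 tonne/year

    # Hypothetical products and annual production volumes, in kg
    annual_production_kg = {
        "memory-device component": 0.001,
        "specialty catalyst": 50,
        "bulk solvent": 250_000,
    }

    for name, kg in annual_production_kg.items():
        print(f"{name}: screened now: {kg > OLD_THRESHOLD_KG}, "
              f"under REACH: {kg > REACH_THRESHOLD_KG}")

It is the producer in the middle of the range, above 10 kilograms but below a tonne, that REACH lets off the hook.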

My guess is that this new rule will have an impact on the second stated aim of REACH, namely, improving Europe's industrial competitiveness. Many of the nano-particle producers should be able to satisfy demand by producing less than the 1 tonne annual threshold. An exotic component in a high-end memory device might weigh less than a microgram per unit.

The same rule will make large scale production of low added-value products unprofitable under REACH. That is because each product made currently could contain tens or even hundreds of individual chemicals that require testing under REACH. With testing costs eating into low profit margins, production is likely to move away from Europe.

So in summary, REACH looks to me like a friendly move for manufacturers of low-volume, high value-added chemicals. For my money, this could have a positive impact on Europe's tradition as a place of high value-added chemical manufacturing.

22 September 2006

Arise, European entrepreneurs

This month's international research roundup focuses on policies designed to foster entrepreneurialism in European research.

Ján Figel, the European Commissioner for Education, Training, Culture and Multilingualism called for stronger support for entrepreneurial mindsets through training and education. He cites figures showing that 60% of EU citizens had never considered starting a business and that 50% were overly averse to taking business risks.

Meanwhile the European Science Foundation has announced a new initiative to foster "a more coordinated approach to R&D investment", according to its Director, Wouter Spek. EuroBioFund will have an annual conference, this year in Helsinki on December 14-15, and separate divisions to take care of networking and brokerage, the organization of grass roots research communities, and joint investment and funding tasks.

EuroBioFund's launch has been timed to coincide with the inauguration of the EU's new Framework Program 7, due January 1st 2007. Organizers hope the initiative will redress fragmentation in the funding of life science research in Europe.

France declared its public-private innovation and technology clusters, the 'pôles de compétitivité', a success in a press release issued on September 4th. The purpose of the Poles is to raise the international profile of French technology and promote regional growth and job creation in high value-added industries.

A total of 67 clusters, 6 of which were deemed internationally competitive, have been selected to share €1.5 billion over 3 years. The French government claimed that small to medium-sized enterprises (SMEs) account for 40% of the business beneficiaries. The money is split between corporate tax exemptions, lower social security charges and direct funding. Funding will cover up to 35% of R&D costs incurred by business partners.

Finally, plans by the European Commission (EC) to establish a European Institute of Technology met opposition from Euroscience, a grass-roots research advocacy organization with 2100 members across 40 European countries. Euroscience argue that the EC's proposal would not achieve its goals of promoting innovation and would ignore existing structural problems with academic research and education in Europe.

Euroscience cite large student numbers, dispersed research capabilities and "a serious lack of differentiation" among the woes of the existing research and education system in Europe. They call for centers of excellence and a more "bottom-up" research policy and believe the EC should instead establish a European agency that would stimulate innovative companies, provide training and foster technology transfer.

13 September 2006

A narrative, darkly

Nature published some interesting pieces on narrative over the last few months. What is narrative, you might ask. For me, it is the idea of sharing knowledge by telling a (true) story. I tend to see narrative as a way to help my audience understand and remember. But there is a darker side of narration, as Nature discusses.

The discussion (Nature 441, p. 922) revolves around a new film, A Scanner Darkly (Dir. Richard Linklater), which is based on a Philip K. Dick novel. The film exploits a technique called rotoscoping, in which real film images are overlaid with a cartoon-like skin, frame-by-frame, to create a cartoon film based on real-life action.

This unreal imagery is experienced as somehow more real than the real thing. Various lines of evidence suggest that people find things more believable when the original content is papered over with an engaging exterior. In the Nature piece, several prominent neuroscientists claim this to be evidence that "the brain will swallow almost anything, provided it comes in the form of a story".

A scary conclusion. Can it be that the act of creating a narrative is motivated, deep down, by the desire to manipulate? Next came a piece on "interactional expertise" (Nature 442, p. 8), in which sociologist Harry Collins, of Cardiff University, claims that a non-expert can develop a kind of scientific expertise without possessing the underlying scientific knowledge.

As evidence, Collins duped several physicists into believing that his treatise on gravity-waves could have been written by one of their own. One of the physicists admitted that "it's not obvious that [Collins's brief explanation of gravity-wave measurement was] not by a graduate scientist".

Collins claims that interactional expertise might be important for grant reviewers, who must evaluate topics outside their immediate field. And the author of the piece claims that interactional expertise constitutes evidence that one can understand a culture vastly different to one's own, a hot topic among anthropologists who claim that we don't.

I must confess to being mystified by all this. Did Collins's text make sense from a physics perspective, or not? If the reasoning is flaky, then his text must surely be regarded as neither interactional, nor "contributory": the other kind of expertise discussed, and the stuff that is required for "doing experiments and developing theories".

These themes are close to my heart. If I could distill my writing activities to a single sentence, it would be that I make digestible stories from indigestible lists of technical content. But I have developed a special review process to ensure the content is valid. And I'd like to think that my theme, that research is valuable, isn't an especially sinister message to get people to swallow.

05 September 2006

Who watches the Watchperson?

Nature's letters section recently included a piece entitled "Reviewers' reports should in turn be peer reviewed" (Nature, July 6th 2006, p. 26). The letter explains how peer-review of reviewers' comments would hold reviewers more accountable and result in a fairer process.

At first, this might sound like another example of over-regulation: an endless regress over "Who watches the Watchperson?" ...until, finally, researchers have no time left to do research.

But I have recently come to wonder whether peer review, the 350-year-old foundation of research publishing, is in need of a health check.

I recently read the peer-review comments returned on a paper submitted by a former colleague. Having been out of the business for a couple of years, reading the comments gave me the impression that I had landed on Mars.

It was not just the menacing tone I found alien; it was also the lack of any meaningful review commentary on how the paper could be improved, or how the methods could be refined, experimental controls added and so on.

Indeed, within the space of 4 lines the reviewer had opined that the paper would damage my colleague's reputation forever, and moreover, had proposed totally new experiments and recommended that, in all reasonability, the paper should apply itself to a rather different question. This was not peer-review as I recalled it.

Would the scrutiny of peers have moderated this tirade, or at least encouraged the reviewer, perhaps, to address the paper rather than focus on his/her own research agenda?

I can only hope so. Somehow, extra checks and balances in peer-review don't seem like such a bad idea...

27 August 2006

Research roundup - August 2006

Here is another roundup of international news about research. Some problems confront the entire global research community, while others are specific to particular countries.

Singapore continues to flex its muscles in the international research arena with the announcement, on July 7th, that it would increase its research budget to 3% of GDP over the next 3-5 years. Its newly established National Research Foundation will have S$13 billion in the pot for the budget period 2006-2010 and will focus on biomedical research, the environment and digital media.

Then there was news, also in July, from Nigeria that oil revenues have made possible a new $500 million annual research budget. To put that into an African context, South Africa spends $200 million annually. President Olusegun Obasanjo has asked that publicly funded research be "one of his legacies", according to UNESCO science policy advisor Folarin Osotimehin.

The first step will be to mint a brand-new, US-style national science foundation for Nigeria. Organizers are working quickly to launch the plan before the end of Obasanjo's presidency, now only months away. "It has to be set up before he leaves. Otherwise we could have a president without enthusiasm for science", a key organizer said.

Meanwhile the EU's Innovative Medicines Initiative (IMI) has seen cuts to its future budget. An announcement was made in June that total funding would be cut and that current funding would be "backloaded": released at a later stage than initially planned.

I find the news saddening because of the potential knock-on effect it could have on the EU's efforts to encourage partnerships with small to medium-sized enterprises (SMEs). SMEs (companies with fewer than 250 employees and less than €50M in annual turnover) make up 99% of the companies in Europe and create 50% of new jobs.

06 August 2006

Innovation: the new business mantra

Harvard Business Review's June edition carries a long interview with Jeffrey Immelt, CEO of General Electric since 2001, about the importance of innovation-driven growth.

I enjoyed reading it as a welcome relief from the productivity mantra uttered by so many captains of industry. Immelt believes that growth and future contributions to shareholder value will be achieved by "innovation", or research as it used to be known, rather than by increased productivity alone.

But what interests me most are Immelt's comments on where this innovation will come from: India and China. With developed nations growing only very slowly, Immelt is talking about developing technologies "in China, for the Chinese market".

John Thackara's December 2005 blog (www.doorsofperception.com) has a lot to say about the movement of innovation to developing nations. He cites a UK trade and industry report (the Cox Review of Creativity in Business) that heralds this process as all but complete.

Thackara describes a benchmarking exercise revealing that "innovation processes taking 24 steps in the US took seven steps in Bangalore", and concludes, "They are cheaper, and better".

In this regard, I read with interest a recent article in Nature's business section (K. S. Jayaraman, Nature, July 6th 2006, p. 17) about IBM's activities in India. According to the article, Cold War-era India saw IBM leave the country completely in 1978 and operate little more than a skeleton crew during the 1990s.

It is only since 2003 that IBM has had a serious Indian presence, totaling 43 000 employees at the last head count - its largest outside the US. Compare this to the 2200 scientists and engineers at GE's John F. Welch Technology Centre in Bangalore.

But according to Jayaraman, research and major product development in India is "modest". Consider the facts: a mere 110 IBM employees in India are involved in basic research (3% of IBM's research staff). The remainder work in Zurich, Switzerland, Yorktown Heights, NY and Almaden, CA.

So. Does Big Business have a new answer to servicing the globe's need for research and innovation? Let's wait and see.

30 July 2006

All in the genes?

Two startling things can be said about the picture to the right, which shows two generations of a high-pedigree race horse.

First, notice that there is no genetic mother. Smart Little Lena stands alone as the genetic forebear. The five horses pictured beneath Smart Little Lena are his clones, produced by surrogate mothers. That is startling enough, because it has been a long hard road to producing substantial numbers of horse offspring from adult somatic cells.

But there is something else that I hope has not escaped your attention. Ask yourself: do any of these kiddies look like their father? (It might have helped if they had used a baby photo of Smart Little Lena ;-).

The clones don't even look like each other.

But don't go rushing to the conclusion that I am breaking a scandal about faked clones. These are clones, alright. Their obvious differences arise from what are broadly called "epigenetic factors". This is a nice neat term for a host of poorly understood mechanisms that affect how the genetic code, which is identical in all 6 horses shown, is translated into a living, breathing, animal.

The picture shows the state of the art in horse cloning. The situation would, in all likelihood, be the same for humans. Serious ethical questions aside, would anybody want to clone themselves under these circumstances? No matter what you look like, your cloned offspring could turn out looking just a little bit like Alex, Bogy, Camby, Dave and Eli.

Source: Stephanie L. Church, Nature Biotechnology 24, 605-607 (2006)

20 July 2006

Massively Multi-User Medicine

I've just had a medical check-up. In my working life I am saturated with news about biomarkers and translational medicine, so I was curious to know just what goes into a health assessment these days. To what extent has medicine embraced biomolecular science?

There are thousands of genetic variations associated with disease and disease states. I wanted to put myself under this microscope. I figured here in Basel, Europe's pharmaceutical capital, I would have as good a chance as any to experience the state of the art.

I was not prepared for the perfunctory taps to the knee, and the ear, nose and eye spying that awaited me. As a child, I had tapped and spied using the same tools taken from my father's medical kit.

I told my check-up story to a colleague. He had developed software for a medical diagnostic kit that used 5 markers to detect periodontitis from a mouth swab. What struck him about the project was the sheer triviality of the analysis and the enormous emphasis placed on usability: it all had to be dead simple.

What can we conclude here? Things being as they are, translational medicine is progressing at a snail's pace. It will need to speed up if we want to see anything resembling the state of the art in a clinical setting. Change is unlikely to be driven internally, by doctors. It will only come when people ask for it.

More than that, people will themselves need to develop, collectively, methods of using this information. It doesn't take a genius to do this and there are ample numbers of qualified people who could make a start. Genome-wide assays of the transcriptome and, more recently, DNA, are now in the $1000 range. All that's needed beyond that is a familiarity with the data itself.

High throughput biological data, such as SNP (single nucleotide polymorphism) chips, can be analyzed at different levels of detail. A simple analysis needs no more than a spreadsheet and some gene annotation data, which is readily available on the Internet.

Using these resources alone, a SNP assay could reveal all kinds of insights about one's susceptibility to disease and sensitivity to different treatment alternatives.

This simple kind of analysis is the sort of thing that could develop very quickly in a collaborative setting. With some basic coordination and peer review, important gene annotation could be gathered and developed quickly as people pooled their knowledge.
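
To give a flavour of what I mean, here is a minimal sketch in Python using the pandas library. The file names, column names and the risk-allele test are all hypothetical stand-ins for whatever annotation source one pools.

    import pandas as pd

    # Personal genotype calls exported from a SNP chip
    genotypes = pd.read_csv("my_snp_chip.csv")      # columns: rsid, genotype

    # Public annotation mapping SNP ids to genes and reported traits
    annotation = pd.read_csv("snp_annotation.csv")  # columns: rsid, gene, trait, risk_allele

    # Join the two tables and flag genotypes carrying a reported risk allele
    merged = genotypes.merge(annotation, on="rsid")
    merged["carries_risk_allele"] = merged.apply(
        lambda row: row["risk_allele"] in row["genotype"], axis=1)

    print(merged.loc[merged["carries_risk_allele"], ["rsid", "gene", "trait"]])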

04 July 2006

Data's in on importance of diversity

Nature has published (Vol. 441, 629-632) an empirical study linking diversity to ecological stability.

Tilman et al. show that ecosystem stability (the ratio of mean above-ground biomass to its temporal standard deviation) is positively correlated with the number of perennial grassland species growing at the Cedar Creek Natural History Area, Minnesota, USA.

Skeptics won't have much wiggle-room with the findings of this 12-year field study. The methods section describes an experimental labour of love involving 30 replicates per experimental condition and a design that painstakingly manipulated biodiversity by carefully weeding 168 plots of 9 m x 9 m so that each contained either 1, 2, 4, 8 or 16 species of grasses, legumes and woody plants.
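
For readers who like their measures in code, the sketch below implements the stability calculation. The biomass numbers are simulated, and deliberately rigged so that variability shrinks with diversity, so only the arithmetic, not the result, stands in for the study.

    import numpy as np

    rng = np.random.default_rng(0)
    # 30 replicate plots at each planted diversity level
    species_counts = np.repeat([1, 2, 4, 8, 16], 30)

    stabilities = []
    for n in species_counts:
        # 12 "annual" biomass measurements per plot (simulated)
        biomass = rng.normal(loc=400.0, scale=120.0 / np.sqrt(n), size=12)
        # Stability = temporal mean / temporal standard deviation
        stabilities.append(biomass.mean() / biomass.std())

    r = np.corrcoef(np.log(species_counts), stabilities)[0, 1]
    print(f"correlation of log(species number) with stability: {r:.2f}")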

Be glad that scientists have now had sufficient time to silence dubious arguments that modern industrial agriculture is sustainable. But have we enough time left to save the rainforests and myriad other sensitive ecosystems harboring the secrets of a stable biosphere?

30 June 2006

Lacking a sticky solution

I was recently reading a BBC News item describing an orb spinner spider trapped in amber 115-121 million years ago (the lower Cretaceous period). The webs of the orb spinner have a combination of strong, rigid silk and weaker, but more stretchy silk. This combination is ideal for netting fast-moving insect prey.

The authors emphasize the evolutionary success of this adaptation. This ancient specimen's descendants are still spinning at least 115 million years later, and are represented by more than 2800 separate species.

It made me think about gene-based evolution and just how sticky (!) it can be. Once an adaptation is in the genome, it is there for the duration.

What about cultural adaptations? Generally speaking we don't talk of cultural adaptations being inherited directly via our genes. Each cultural adaptation must be passed from generation to generation through schooling. This provides much more scope and flexibility for adaptation but comes at a cost: each new generation runs the risk of missing out on beneficial schooling.

Compared to spiders, humans lack a sticky solution to guarantee that their most important adaptations are passed on.

23 June 2006

Three types of models...

I distinguish three kinds of models used in research.

The first kind generates unexpected predictions that can be tested and thus reveal new insights. These models are necessarily quite simple, because simplicity and elegance are preconditions for acceptance of a model that makes novel predictions. Why? Because simple models can be understood readily. This makes it possible for people to evaluate how the model's unexpected predictions come about.

The second kind are modifications of simple, established models that explain things the established models couldn't explain. An established model may fall from favour because it does not predict important new observations. Models that are modifications of existing models can save the original model, which may otherwise have been abandoned for a completely new, and perhaps immature model. The usefulness of this second kind of model lies in conserving and preserving valuable accepted wisdom.

Finally, there is a third kind of model. For my taste, all models belonging to this category should undergo refinement until they fall into one of the first two categories...

So. Don't you agree there are only two kinds of useful models?

17 June 2006

Taking the high throughput plunge

Scientists hesitate before embarking on a genome-wide investigation because some future shift in technology could reduce massively the time and cost of doing large scale work. But without making a start, these technologies won't come into being.

Without a genome-wide, high throughput approach an investigation could remain confined to a search beneath some lamp-lit corner of the genome, when the true key could be languishing elsewhere…

Or as Thomas Jenuwein of the Research Institute of Molecular Pathology, Vienna, puts it: "One can never be 100% ready... The rest will happen once the momentum is built up" (ref).

Jenuwein is referring to proposals by the International Human Epigenome Project to catalogue epigenetic features including DNA methylation and histone modification, the most well understood mechanisms underlying epigenetic influences on gene activity.

DNA methylation and histone modifications are specific to tissue and developmental stage. That means they must be studied in each tissue separately, and at each developmental phase.

And neither process can currently be studied in a high throughput context. Methylation assays are accurate but slow and expensive, while large scale identification of histone marks is prone to problems with accuracy.

As with the Human Genome Project, there is no alternative but to take the plunge.

Ref: Jane Qiu, editor of Nature Reviews Neuroscience, in Nature, May 11th 2006.

31 May 2006

Solving differential equations?

Just now I wandered up to a colleague, a physicist with the bioinformatics team, lost in thought.

He was standing in the canteen line for lunch, face expressionless and oblivious to all around him.

"Solving differential equations?" I asked.

"No," he replied, calmly. "I've got to work out the model first".

10 May 2006

Funding novel technologies

Australia's research universities received additional funding in this year's budget. For example, the government announced Aus$200M in new initiatives to assist small to medium-sized businesses to commercialize new technologies.

Virginia Walsh, executive director of Australia's Group of Eight top research universities, welcomed the increased spending. But Walsh also highlighted the lack of funding for so-called "proof of concept", pre-commercial research investment, arguing that this lack "restricts the flow of new technology ventures".

It's a brave advocate of research that raises such a point during the relatively tight economic conditions that prevail today. But if Walsh is right, shifting investment from pre-commercial to commercial "innovation investment" might stymie the very conditions necessary for such investment to be a success.

The Group of Eight has also been outspoken about the way research budgets are allocated, giving special attention to the Research Quality Framework. Group of Eight Chair Glyn Davis pointed to the “lack of detail about the amount of funding to be distributed on the basis of RQF performance”.

I found Davis’ quote most interesting of all. When performance measures are used to determine funding levels, one should be reassured that performance would be rewarded. Otherwise, it would seem that the stick remains in place, but the carrot is nowhere to be seen.

27 April 2006

Future shock

We look to advanced technologies to solve our problems, but what happens when we are presented with a solution that is cutting edge?

Talk to a researcher these days and they will tell you that the rate limiting step in research is no longer the gathering of data, but rather the interpretation of it. Gathering information is highly automated, but analyzing it remains a fairly manual process.

This problem is felt acutely in industrial research. I have some experience with two industrial research fields confronting this problem: pharmaceutical research in toxicology and financial risk management.

Both industries claim to be overwhelmed by the volumes of data that they must analyze and interpret. This has triggered interest in quantitative methods to analyze large, complicated (high dimensional) datasets.

These fields produce experimental results that are just too large to hold in your head. Methods such as self-organising maps, principal components analysis (PCA) and supervised machine learning algorithms are ideal for analyzing such data. They have been around for years, but fast, inexpensive computers and user-friendly interfaces have made them available on everyone's desktop.

PCA can be used to create a view of complex (high dimensional) data based on a few dimensions. Supervised machine learning can fish out patterns in data that exist in high dimensional spaces; patterns that have a complexity that our mind cannot grasp.
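
For the curious, a PCA of this kind is only a few lines these days. The sketch below uses the scikit-learn library on random numbers standing in for, say, a toxicology screen with hundreds of measurements per sample.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(42)
    X = rng.normal(size=(200, 500))    # 200 samples, 500 measured variables

    pca = PCA(n_components=2)
    projected = pca.fit_transform(X)   # each sample reduced to 2 coordinates

    print("shape of projection:", projected.shape)
    print("variance explained:", pca.explained_variance_ratio_)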

So you might expect that industry would jump at the chance to use these methods. Well, not so fast.

Before I describe my experience of industry's reaction to such methods, it's worth looking at a bit of background on the life of a company toxicologist or financial risk manager.

Toxicologists and financial risk managers don't have it easy. Suppose that a medicine produces a serious unexpected side effect. All drugs have been very carefully tested in development to reduce the chance of this happening. The buck stops with the toxicologist who performs these tests.

In finance there is a similar problem. Fund managers invest money using pre-agreed strategies. The strategies have been assessed in terms of their profitability and their risk. Financial risk managers must answer for financial losses caused by events that were not anticipated in their risk estimates.

Within this context, it should not come as a surprise that these industries do not exactly leap at the chance to use PCA and machine learning algorithms. I would go as far as to say that there can be a disconnect between the eager mathematical physicist pitching a statistical method and the industrial practitioners who are their target audience.

I suspect that the problem goes deeper than mere suspicion of a new and relatively untested approach. Arguably, it lies with the very nature of the methods themselves.

Consider this. Most data analysis begins with a hunch about what the data will ultimately show. This hunch might be a correlation between two known variables, or perhaps a simple pattern of results across two or three experimental conditions. We can create a picture of these results in our head.

PCA and machine learning algorithms don't work this way. They produce projections from high dimensional spaces onto low dimensional spaces.

It is beyond our mind's capacity to visualize the exact combination of the original variables that is captured by a principal component. The low dimensional projection is delivered to us without a name. The same goes for patterns derived by machine learning.

Without names, these results don't tell a story.

Skillful interpretation of a PCA can yield a story of sorts. But the power of these methods is that they create an unbiased view of the data, one that doesn't need to adhere to a pre-existing story.
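
In practice, that skillful interpretation means inspecting a component's loadings to see which of the original variables dominate it, something like the sketch below (again with random stand-in data):

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(42)
    X = rng.normal(size=(200, 500))

    pca = PCA(n_components=2).fit(X)
    loadings = pca.components_[0]             # PC1's weight on each variable
    top = np.argsort(np.abs(loadings))[-5:]   # the five most influential
    print("variables dominating PC1:", top)

The indices come back as numbers, not names; whether variable 137 "means" anything is left entirely to the analyst.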

I don't know about you but I find this distinction rather deep. And if you compare these approaches with more familiar analytical approaches (ones that involve hunches, stories and easy intuitions about the patterns of results), then it's not surprising that my friends in industry tend to shy away from them.

But analytically, this unbiased view is a big, big plus. These are methods that can reveal unexpected patterns in results and they scale well with the size of the data set.

That is what excites the mathematical physicists.

Collaboration between Biopolis and RIKEN

My eyes are on Biopolis, Singapore's biosciences research initiative occupying a futuristic campus next to the National University of Singapore. Championed by Philip Yeo, Chair of Singapore's Agency for Science, Technology and Research (A*STAR), Biopolis is still at a relatively early stage. But it already includes a crop of several shiny buildings connected by soaring above-ground walkways.

Biopolis has been making impressive connections lately, and actively working itself into the public imagination. Recent news describes a collaborative agreement with Japan's RIKEN (translated as "Institute of Physical and Chemical Research", but these days doing plenty of top biological research).

Biopolis goes into the RIKEN collaboration with an interest to expand its biomedical research focus beyond infectious disease to cancer drug development. The collaboration will focus on the exchange of ideas as well as training programs that will send Singapore's burgeoning supply of enthusiastic trainee scientists to Japan.

22 April 2006

Reporting research news

In journalism one must report Who, What, When, Where, Why, and How. Journalists have methods to obtain this information, and I am interested in tailoring these methods to reporting research news.

A typical reader of research news expects high quality background information surrounding the news. This information gives the reader a crash course on scientific aspects of the news topic.

This raises the reporting challenge. The journalist must describe what, why and how in depth, but cannot hope to have first hand knowledge in all cases.

The journalist therefore needs a way to capture expert knowledge to include as background. I'm interested in refining methods of capturing expert knowledge.

To a professional journalist, the methods will probably end up looking like "good old-fashioned journalism". But please humour me while I find out how to give researchers a good hearing in the Press.

16 April 2006

mission statement

I am interested in research and how it can be fostered. Virtually all
kinds of research interest me and I believe research should be defined
broadly to include any activity that creates knowledge.

Why should research be fostered? It's difficult to fund research,
to train people in research methods and to do research.

But worthwhile.

Researchers find things out for us. They gather information with which
to make sound decisions. They discover new things. They refine the
methods that make these discoveries possible. If we lost our
researchers we would make less informed decisions, fail to discover
new things and would lack the tools to train new people to replace
them.

My aim is to raise awareness of research efforts and fuel debate about
how these efforts can be refined and improved.