
AHM 2007



Tuesday 11 September 2007

www.allhands.org.uk

It was suggested by David Wallom at this morning's BoF on Public Understanding of e-Science, in the context of using Web 2.0 tools to communicate what e-Science is doing, that we should use the eSI Wiki to keep a running commentary on this year's AHM 2007. So here is the starter page to kick things off.


Keynote Plenary Sessions

Keynote 1: Malcolm Atkinson

The Future of e-Science

Powerpoint

e-Science is uncharted territory - we have to be adventurous navigators.

Thus far, e-Science has been very successful, but it also has to develop and change.

There are three interlocking elements: research using e-Science, research enabling e-Science, and e-infrastructure supporting research and innovation. e-Science has to focus on making them work together more effectively, accelerating the process of learning from one another over a variety of disciplines. e-Scientists are the experts in making effective collaboration happen.

"Computational thinking" is a new kind of intellectual approach, enabled by ubiquitous computers, analogous to the intellectual revolution brought on by the printing press.

The past year in e-Science has seen significant successes over a wider range of disciplines than ever before. Our responsibility now is to make sure we don't waste that momentum, but build on it. The whole process must be driven by research goals. We have to educate people in order to build capacity in computational thinking: articulate the challenges and attract the researchers. We should engage more researchers, understand their requirements, and make sure the e-infrastructure we provide really does support research needs.

Today's research already stretches what we can provide: tomorrow's research will be even more demanding. As a community we need to take on the challenge, and we need to work together. Our long-term strategy must be dynamic and responsive. Our goal is to raise the level of engagement and provision for UK research.

Keynote 2: Satoshi Sekiguchi

A Design of the GEO Grid: Systems of Systems Federating Geospatial Data and Services

PDF

GEO Grid aims to provide an e-Infrastructure for worldwide earth sciences, integrating geospatial data behind an easy-to-use interface. It uses elevation data from satellite remote sensing instruments, geology archives and in situ sensors to build up 3-D maps of the Earth's surface at high spatial resolution.

Example: before and after images of an earthquake region in Afghanistan show that there has been a huge landslide. This is particularly valuable because the political situation in that country makes conventional surveying essentially impossible.

Example: change of land use around Suvarnabhumi airport due to development. This is very useful for environmental scientists.

Use case: volcano monitoring. Generating a 3-D elevation model and combining it with a simulation of pyroclastic flows of ash from the peak of the volcano can assist with disaster prevention and mitigation, by creating a hazard map for evacuation planning. A similar thing can be done for landslides, combining high-resolution elevation data with large-scale computer simulation to create an early warning system for evacuation.

Other applications include environment monitoring (global warming, CO2 flux estimation) and natural resource exploitation (oil, gas).

Keynote 3: Thomas Kirkwood

The Grand Challenge of Population Ageing: e-Science to the Rescue

Increasing lifespans have led to an extraordinary change in demographics. The impact of population ageing over the next century will be greater than that of climate change, globalisation or terrorism.

Life expectancy continues to increase, even though forecasts have said it should level off. By the time we'd got to the 1960s-70s, there was little more to do in terms of preventing early deaths, so it was expected that life expectancy would hit a ceiling. It did not. In the UK, life expectancy is increasing by 5 hours a day - roughly 76 days, or two and a half months, of extra life expectancy per calendar year. But what are all those accumulated 5 hours going to be like when we come to use them at the ends of our lives?

Twin studies have shown that longevity is about a quarter genetic - in other words, three quarters of the factors influencing lifespan come from non-genetic sources. Some of the genes that affect lifespan have been identified, but it turns out to be a very complex picture.

Organisms invest resources in the maintenance and repair of their tissues sufficient to keep them in good shape for as long as they might reasonably expect to survive in the wild, but not much more. For example, a mouse lives up to 3 years in the lab or as a pet, while in the wild it would be lucky to reach its first birthday.

Ageing is caused primarily by damage; longevity is regulated by resistance and repair. Therefore there are multiple mechanisms of ageing. It is a complex and inherently stochastic process. Traditional reductive methods struggle to deal with this.

This is where e-Science comes in. Collaborative projects across several UK universities are engaged in data management and integration, modelling and developing statistical methods for studying ageing.

BASIS (Biology of Ageing e-Science Integration and Simulation system) provides tools for the quantitative study of ageing, modelling and stochastic simulation. The system features user interfaces designed to be used by people with no knowledge of modelling techniques. The idea is to be able to explore the complexity and inherent stochasticity, not to run away from it. BASIS aims to be as user-friendly, adaptable and extensible as possible.
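
To give a flavour of the kind of stochastic modelling involved, here is a minimal sketch - not BASIS itself, and with an invented toy model and invented parameters - of random damage accumulation and repair in a population of cells:

 import random
 
 # Toy stochastic model of ageing: each cell accumulates random damage,
 # repair removes some of it, and a cell dies when damage passes a
 # threshold. All parameters are illustrative, not taken from BASIS.
 DAMAGE_RATE = 0.5      # chance of a damage event per cell per time step
 REPAIR_PROB = 0.3      # chance of repairing one unit of damage per step
 DEATH_THRESHOLD = 10   # damage level at which a cell dies
 
 def simulate_cell(rng, max_steps=1000):
     """Return the time step at which a single simulated cell dies."""
     damage = 0
     for t in range(max_steps):
         if rng.random() < DAMAGE_RATE:   # stochastic damage event
             damage += 1
         if damage > 0 and rng.random() < REPAIR_PROB:  # stochastic repair
             damage -= 1
         if damage >= DEATH_THRESHOLD:
             return t
     return max_steps
 
 rng = random.Random(42)
 lifespans = [simulate_cell(rng) for _ in range(1000)]
 print("mean lifespan:", sum(lifespans) / len(lifespans))
 print("min/max:", min(lifespans), max(lifespans))

Repeated runs give a distribution of lifespans rather than a single number - exactly the inherent stochasticity that the talk says we should explore rather than run away from.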

We need to understand the complexity of the underlying mechanisms, so that we can exploit the intrinsic malleability of the ageing process to extend our lifespans.

Keynote 4: Anders Ynnerman

Medical Visualization Beyond 2D Images

Why visualise? Images enable high bandwidth between computer and brain (up to 50% of neurons in the brain are involved with vision) - this is very important when dealing with huge amounts of data. Computer graphics are getting better and cheaper, thanks to computer games - keep your kids playing! Humans are vastly better than computers at understanding visual information.

Data explosion: yesterday we were dealing with 100 slices per patient (50 MB); today, 24,000 slices (20 GB); tomorrow, looking at time-varying images, 1 TB per patient!

We can't put 24,000 images on the wall and look at them - we have to do something more intelligent -> e-Science solutions.

Combine 2-D slices into a 3-D volume. Different types of tissue are better or worse at absorbing x-rays. The transfer function maps scalar values of absorption to colour and opacity in the visualisation. Changing the transfer function allows you to display specific types of tissue: you can show the veins, remove skin to look at bone, etc.
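
As a concrete sketch of what a transfer function does - the value ranges and tissue assignments below are invented for illustration, not taken from the talk - each scalar sample is mapped to a colour and opacity, and samples are composited front-to-back along each viewing ray:

 import numpy as np
 
 def transfer_function(v):
     """Map a scalar absorption value in [0, 1] to RGBA.
     Illustrative breakpoints: low = air, mid = soft tissue, high = bone."""
     if v < 0.3:
         return (0.0, 0.0, 0.0, 0.0)   # air/background: fully transparent
     elif v < 0.6:
         return (0.8, 0.2, 0.2, 0.1)   # soft tissue: translucent red
     else:
         return (1.0, 1.0, 1.0, 0.9)   # bone: nearly opaque white
 
 def composite_ray(samples):
     """Front-to-back alpha compositing of the samples along one ray."""
     colour = np.zeros(3)
     alpha = 0.0
     for v in samples:
         r, g, b, a = transfer_function(v)
         colour += (1.0 - alpha) * a * np.array([r, g, b])
         alpha += (1.0 - alpha) * a
         if alpha > 0.99:              # early ray termination
             break
     return colour
 
 print(composite_ray([0.1, 0.4, 0.5, 0.8, 0.9]))

Editing the breakpoints in transfer_function is the "changing the transfer function" step: dropping the opacity of the soft-tissue range removes the skin and leaves the bone visible.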

But transfer functions are blunt tools. If the knowledge of domain experts is encoded into the visualisation system, the results can be made more precise and reliable.

Encoding domain knowledge also helps in dealing with large volumes of data, by enabling you to decompress only the data that will be of interest to the radiographer. You can have the highest level of detail at boundaries between different types of tissue, with lower resolution in more homogeneous parts, and can simply throw away data that won't contribute to the image at all. This lets you deal with tens of gigabytes of data on relatively limited computers.
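
A minimal sketch of that idea, with invented block sizes and thresholds: partition the volume into blocks and use a cheap homogeneity statistic to decide which blocks need full resolution, which can stay coarse, and which can be skipped entirely:

 import numpy as np
 
 def choose_levels(volume, block=8, skip_below=0.05, coarse_below=0.05):
     """Assign a level of detail to each block of a 3-D volume.
     'skip'   - nothing visible here (e.g. air), don't decode it at all
     'coarse' - homogeneous tissue, a low-resolution version suffices
     'full'   - spans a tissue boundary, decode at full resolution"""
     nx, ny, nz = (s // block for s in volume.shape)
     levels = {}
     for i in range(nx):
         for j in range(ny):
             for k in range(nz):
                 blk = volume[i*block:(i+1)*block,
                              j*block:(j+1)*block,
                              k*block:(k+1)*block]
                 if blk.max() < skip_below:
                     levels[(i, j, k)] = "skip"
                 elif blk.std() < coarse_below:
                     levels[(i, j, k)] = "coarse"
                 else:
                     levels[(i, j, k)] = "full"
     return levels
 
 vol = np.random.rand(32, 32, 32) * 0.04   # mostly "air" toy volume
 vol[6:18, 6:18, 6:18] += 0.8              # one dense structure
 print(sorted(set(choose_levels(vol).values())))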

Virtual autopsies: a high-power CT scan of the body bag prior to the normal autopsy - this only takes a matter of minutes. Forensic scientists and radiologists collaborate in these investigations. This is of value to the police because it enables them to come back and do the autopsy again if necessary in the course of an investigation. This technique does not replace a standard autopsy, but it does have certain advantages: it can detect fractures that are difficult to detect in a normal autopsy, and can readily detect foreign objects such as bullets.

Looking at advanced ways of introducing lighting and shading, giving improved depth perception. Can even make tissue glow, which can be helpful in some applications: for example, making the brain glow can show up a crack in the skull.

Knowledge encoding is the next step forward for visualisation, in order to properly interpret images. Neighbourhood analysis: instead of dealing with one pixel at a time, exploit the fact that adjoining pixels will be related. This uses a priori knowledge to constrain the statistics.
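
A sketch of the neighbourhood idea, using a deliberately crude stand-in classifier: label each voxel from the statistics of its surrounding neighbourhood rather than from its own value alone, so isolated noise voxels are rejected:

 import numpy as np
 
 def classify_with_neighbourhood(volume, threshold=0.5, radius=1):
     """Label voxels as tissue (1) or background (0) using the mean of
     the (2*radius+1)^3 neighbourhood instead of the voxel alone."""
     labels = np.zeros(volume.shape, dtype=np.uint8)
     r = radius
     for x, y, z in np.ndindex(volume.shape):
         nb = volume[max(0, x-r):x+r+1,
                     max(0, y-r):y+r+1,
                     max(0, z-r):z+r+1]
         labels[x, y, z] = 1 if nb.mean() > threshold else 0
     return labels
 
 vol = np.zeros((10, 10, 10))
 vol[3:7, 3:7, 3:7] = 1.0   # a solid cube of "tissue"
 vol[0, 0, 0] = 1.0         # an isolated noise voxel
 labels = classify_with_neighbourhood(vol)
 print(labels[0, 0, 0])     # 0: the noise voxel is outvoted by its neighbours
 print(labels[5, 5, 5])     # 1: inside the cube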

Haptics: stereoscopic images with a mechanical pen device so that you can feel the image - you can feel the surface of the skin, then push through and feel the skull, then push further and feel around the brain, for example. Very useful in determining streamlines within the heart. With time-resolved data, you can touch a beating heart and feel how it is moving. This needs sophisticated interpolation methods to ensure that the pen doesn't go astray between frames.
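
The interpolation problem can be sketched like this - a toy linear scheme with invented rates; real haptic rendering needs something more sophisticated, as the talk notes. The haptic device updates far faster (around 1 kHz) than the image frames arrive (around 25 per second), so the force must be blended between frames:

 def force_at(frames, frame_rate, t, position):
     """Linearly interpolate force between two time-resolved frames so
     the haptic pen sees a smooth signal. 'frames' is a list of force
     functions, one per image frame (a scalar toy stand-in here)."""
     ft = t * frame_rate                 # fractional frame index
     i = min(int(ft), len(frames) - 2)
     w = ft - i                          # blend weight between frames i, i+1
     return (1.0 - w) * frames[i](position) + w * frames[i + 1](position)
 
 # Toy frames: the force grows as the "heart wall" moves towards the pen.
 frames = [lambda p, k=k: k * 0.1 * p for k in range(25)]
 for step in range(5):
     t = step / 1000.0                   # 1 kHz haptic update rate
     print(round(force_at(frames, 25.0, t, position=1.0), 4))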

Keynote 5: John Wood

Building a UK e-Infrastructure

Powerpoint

JISC's role in developing a UK e-Infrastructure.

JISC already provides SuperJANET, UKLight and a new dark fibre testbed for photonics research. It also offers access management for researchers, and advice and guidance through initiatives like the Digital Curation Centre.

JISC is working with government on an e-infrastructure strategy. A recent report from the Office of Science and Innovation concluded that all researchers should have easy access to e-infrastructure, and confidence in the authenticity and quality of data. Authentication of data is a real challenge - how do you know if online data is trustworthy, and hasn't been corrupted? Longevity of preservation is another major issue, and one that is not being tackled by governments at the moment. This needs to be raised politically.

UK was well ahead of the game six years ago, but now we're slipping and other countries are catching up.

In global terms, we need much more coordination and faster decision making in Europe. There is too much duplication of large facilities in Europe, which is inefficient and inhibits the formation of a critical mass of resources and expertise at any one site. Europe has a long-standing tradition of excellence in research and innovation, but it is no longer feasible to compete globally as 27 individual member states. Interoperability across Europe and across the world is key.

This period of e-Science is a dramatic change in the way people think: future sociologists and historians will look back on what we are doing now and try to understand how we did it - and why.

Keynote 6: Timothy Foresman

Digital Earth: The New Digital Commons

Powerpoint

Ancient commons were supplanted by enclosure - private/corporate ownership. This stimulated the agricultural economy, but some people lost out in the process. The alternative is the Tragedy of the Commons, where individual selfish action destroys the resource for everyone.

There are now major challenges to the planet: global fisheries collapse by 2048; loss of biodiversity - the so-called "sixth mass extinction"; increasing human population; 1 billion people without access to safe water; peak oil; record high CO2 (now at 380 ppm - higher than at any time in the last million years); temperature rise; ice caps melting.

Is the solution the new Digital Commons? Humans have great capacities for thinking about problems - if all that brainpower can be harnessed, perhaps we can be optimistic about the future. Digital communications are already empowering grass-roots ecological protests.

The Digital Earth story starts with Buckminster Fuller's idea of a "geoscope", to recognise global patterns and predict the consequences of decisions. Vice President Gore started a programme at NASA to provide a global vision of the world - society, technology, and sustainability. China essentially copied the American programme word-for-word, with government commitment at the highest levels. Consequently, the International Society for Digital Earth was inaugurated in 2006.

Geobrowsers can act as a visualisation tool for protecting ecological resources. This has already been done successfully in Africa.

Most of us naturally work in silos - we're just wired that way. This isn't a problem as long as we create good metadata, and maintain the provenance and credibility of information. This combined mass of information from an array of silos creates the Digital Earth Commons. We then have to focus on pulling this information together and integrating it in geobrowser-type applications, such as Google Earth. Computing per se is not the problem, given the supercomputing power we now have available. The big challenge is in organising information.

Currently, proprietary or classified information is a minority of the internet - most of it is open source / free. A proposed corporate control scenario would set up a two-tier internet. This social discrimination would not be in our best interest. The commons must be fought for, by us. This will not be a trivial task.

We need some organising principles to boldly go into the future. There are two potential paths: evolution or design. An evolutionary approach would leave progress up to disconnected entrepreneurs, with successes determined by market feedback. An intelligent design approach would be better, because evolved systems don't care if we survive.

It is important to find ways to effectively communicate not just amongst ourselves in the scientific community, but with people in pubs.

We find commonality when we empower people to create maps of how the world looks to them.

Keynote 7: Thomas Hartkens

IXI(CO): Progressing a scientific GRID project to an end-to-end solution

IXICO is an imaging service for the pharmaceutical community, based on the grid project IXI. The past two years have seen its journey from a scientific project to a business solution.

Grids are very valuable for medical image analysis. There is a huge volume of data, both source data and intermediate results. The algorithms used in medical imaging are computationally expensive. Analysis generally involves multiple processing steps - workflows play a central role. Many different institutions collaborate on this kind of work. All of these lead us towards grid solutions.

The project began with a grid demonstrator - the dynamic brain atlas (2001-2), based on a large set of MRI scans of brains. This got a lot of press coverage, encouraging the setting up of a follow-up project called IXI: Information eXtraction from Images. This extended the dynamic atlas beyond the brain. It included a workflow engine and a web portal interface.

One application is the automatic delineation of bones. Identifying specific structures in images is a time-consuming task, especially when it involves hundreds of images at a time. IXI incorporated an automatic structure identification system, which compares the subject image to an appropriately deformed reference image.
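
A heavily simplified sketch of that approach - translation-only registration by exhaustive search, where a real system uses non-rigid deformation: align a labelled reference image to the subject image, then carry the reference's anatomical labels across:

 import numpy as np
 
 def register_translation(subject, reference, max_shift=5):
     """Find the integer (dy, dx) shift of the reference that best
     matches the subject, by exhaustive search over the
     sum-of-squared-differences. A stand-in for real non-rigid methods."""
     best, best_shift = np.inf, (0, 0)
     for dy in range(-max_shift, max_shift + 1):
         for dx in range(-max_shift, max_shift + 1):
             shifted = np.roll(np.roll(reference, dy, axis=0), dx, axis=1)
             ssd = ((subject - shifted) ** 2).sum()
             if ssd < best:
                 best, best_shift = ssd, (dy, dx)
     return best_shift
 
 def propagate_labels(labels, shift):
     """Apply the recovered shift to the reference's structure labels."""
     dy, dx = shift
     return np.roll(np.roll(labels, dy, axis=0), dx, axis=1)
 
 # Toy data: a "bone" blob, shifted by (2, 3) in the subject image.
 reference = np.zeros((32, 32)); reference[10:16, 10:16] = 1.0
 labels = np.zeros((32, 32), dtype=int); labels[10:16, 10:16] = 7
 subject = np.roll(np.roll(reference, 2, axis=0), 3, axis=1)
 
 shift = register_translation(subject, reference)
 print(shift)                                    # (2, 3)
 print(propagate_labels(labels, shift)[13, 14])  # 7: structure located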

The business idea was to use this technology to do automated image analysis for drug development. Taking a baseline image at the start of the trial drug treatment, and then taking images periodically throughout the test period, can reveal what effects the drug is having. For example, changes in bones during arthritis treatment can be observed.
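
The quantitative readout can be as simple as this sketch, with invented numbers: segment the structure of interest at each visit and track its volume relative to the baseline scan:

 # Toy longitudinal readout: volume (mm^3) of a segmented structure at
 # baseline and follow-up visits. The numbers are invented.
 visits = {"baseline": 1520.0, "month 3": 1495.0,
           "month 6": 1460.0, "month 12": 1452.0}
 
 baseline = visits["baseline"]
 for name, volume in visits.items():
     change = 100.0 * (volume - baseline) / baseline
     print(f"{name:>9}: {volume:7.1f} mm^3  ({change:+.1f}% vs baseline)")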

The drug development market is currently in crisis. Costs have increased rapidly: it now costs about 1 billion dollars to bring a drug to market. This is not sustainable, and the industry is searching for ways to reduce costs. Imaging could help to accelerate the development process and reduce the costs of clinical trials.

IXICO's customers are pharmaceutical companies. Imaging suppliers are not selling software, they are providing a service. The competitors are small companies that are focused on traditional, subjective radiological reading.

Pharmaceutical companies require a full end-to-end solution. This meant IXICO had to add site management, data logistics, and documentation for regulators to the core imaging package in order to sell it.

Good Clinical Practice requires an audit trail, a quality management system, and software validation. The audit trail has to meet the stringent requirements of regulators. A quality management system preserves and improves your knowledge base: it is dangerous to rely on knowledge in the head of a single employee.

Innovation in technology is not necessarily the selling point. Clients don't care how you are doing the analysis; they care about the quality of the product. A company cannot be run as a business without quality standards. There are big differences between a research implementation and a production system.

BoF Sessions

Public Understanding of e-Science

After a very successful session on Public Engagement in e-Science, here are the outputs we agreed on at the end:

- Need science tools for blogging
- Wikis open to the public, etc.
- Web 2.0 technology should be the scientist's friend
- Public as well as scientific contributions
- School engagement
  - Desirable to be in the curriculum (Key Stage)
  - Teaching guides
- Need specific PeS funding on all research awards
- Journalists need education
  - The balance shown is sometimes not representative
- Use of content identification
- Royal Met Soc MetLink - join in!
- Two different types of communication:
  - Outreach
  - Media
- Not just the end result but also the underlying components and results need to be made public


