A new analysis has found that supplementing the diet with fish oil may prevent muscle and weight loss that commonly occurs in cancer patients who undergo chemotherapy. Published early online in Cancer, a peer-reviewed journal of the American Cancer Society, the study indicates that fish oil may help combat cancer-related malnutrition.
Chemotherapy can cause cancer patients to lose muscle mass and become malnourished, leading to fatigue, a decreased quality of life, an inability to receive necessary treatments, and shorter survival.
Researchers suspect that supplementing the diet with fish oil—which contains omega-3 fatty acids such as eicosapentaenoic acid—may help patients maintain or gain muscle. To test this hypothesis, Vera Mazurak, PhD, of the University of Alberta in Edmonton, Canada, led a team that compared the effects of fish oil with those of standard care (no intervention) on weight, muscle, and fat tissue in newly referred non-small cell lung cancer patients.
The trial involved 16 patients who took fish oil (2.2 grams of eicosapentaenoic acid/day) and 24 patients who did not. The study ran until patients completed their first-line (initial) chemotherapy treatments, which lasted about 10 weeks. Muscle and fat were periodically measured using computed tomography images. Blood was collected and weight was recorded at the start of the study and throughout chemotherapy.
Patients who did not take fish oil lost an average of 2.3 kilograms whereas patients receiving fish oil maintained their weight. Patients with the greatest increase in eicosapentaenoic acid concentration in the blood following fish oil supplementation had the greatest gains in muscle. Sixty-nine percent of patients in the fish oil group gained or maintained muscle mass. Comparatively, only 29 percent of patients in the standard care group maintained muscle mass, and overall, patients in this group lost 1 kilogram of muscle. No difference in total fat tissue was observed between the two groups.
The authors concluded that nutritional intervention with two grams of fish oil per day provides a benefit over standard care, allowing patients to maintain their weight and muscle mass during chemotherapy. "Fish oil may prevent loss of weight and muscle by interfering with some of the pathways that are altered in advanced cancer," said Dr. Mazurak. "This holds great promise because currently there is no effective treatment for cancer-related malnutrition," she added. Dr. Mazurak noted that fish oil is safe and non-toxic with virtually no side effects. It may be beneficial to patients with other forms of cancer and other chronic diseases that are associated with malnutrition, as well as to elderly individuals who are at risk for muscle loss.
Developed by a Franco-Israeli partnership, this innovative solar power technology introduces a new paradigm in energy production. Solar power plays a dominant role in the worldwide effort to reduce greenhouse gases: it is considered a clean and efficient source of electricity. Yet several obstacles have been holding back the expansion of this sector, and many of its players are looking for a new approach to the market.
The project is the result of a collaboration between Solaris Synergy from Israel and the EDF Group from France. EUREKA provided the supporting platform that allowed the two companies to build their partnership. After receiving the "EUREKA label", the project, called AQUASUN, also found support from the Israeli Ministry of Industry, Trade and Labor. "We are very pleased with the collaborative dimension of the project," says Dr. Elyakim Kassel, coordinator of the AQUASUN project and business development manager at Solaris Synergy.
A win-win situation
Soon after the design phase ended, at the end of March 2010, fabrication of a prototype began, and the team is now aiming to launch the implementation phase in September 2011. The tests will take place at Cadarache, in the south-east of France, a site with a privileged position on the French electric grid and close to a local hydro-electric facility that provides the water surface on which the system will be installed. It will operate on-site for a period of nine months, during which the system's performance and productivity will be assessed through seasonal changes and varying water levels. The research team believes that by June 2012 it will have all the information required to bring the technology to market.
As even leading photovoltaic companies struggle to find land on which to install solar power plants, the project team identified the almost untapped potential of solar installations on water. The water basins on which the plants could be built are not nature reserves, tourist resorts or open sea; rather, they are industrial water basins already in use for other purposes. This ensures that the new solar plants will not have a negative impact on natural landscapes. "It's a win-win situation," declares Dr. Kassel, "since there are many water reservoirs with energy, industrial or agricultural uses that are open for energy production use".
After solving the question of space, the team also took on the problem of cost. "It sounds magical to combine sun and water to produce electricity, but we also have to prove that it carries a financial logic for the long run," explains Dr. Kassel. The developers were able to reduce the costs of implementing the technology in two ways. First, they reduced the number of solar cells required, thanks to a mirror-based solar concentration system, while keeping the amount of power produced steady.
Made of modules
Secondly, the team devised a creative cooling system that uses the water on which the solar panels float. Thanks to this efficient cooling method, the photovoltaic system can use silicon solar cells, which are prone to overheating and need to be cooled for the system to work correctly, instead of the more expensive standard-type cells. The silicon cells also achieve a higher efficiency than the standard ones, delivering both reliability and cost reduction.
Still with the aim of making the technology efficient and market-ready, the system is designed in such a way that as many identical modules as needed can be assembled on a solar platform to reach the desired power rating. Each module produces a standard 200 kilowatts of electricity, and more power can be achieved simply by adding more modules to the plant.
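As a rough illustration of this modular scaling, the minimal sketch below assumes only the 200-kilowatt-per-module figure quoted above; the function name and the example target ratings are invented for illustration.

import math

MODULE_POWER_KW = 200  # assumed nominal output per floating module (figure quoted in the article)

def modules_needed(target_plant_kw):
    """Number of identical modules needed to reach a desired plant rating."""
    return math.ceil(target_plant_kw / MODULE_POWER_KW)

for target_kw in (200, 1000, 5000):
    n = modules_needed(target_kw)
    print(f"{target_kw} kW plant -> {n} modules ({n * MODULE_POWER_KW} kW installed)")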
The team also worked on the environmental impact of the technology. The floating platform in fact acts as a breathing surface through which oxygen can penetrate into the water. This feature ensures that sufficient oxygen will sustain the underwater life of plants and animals. Dr. Kassel adds: "One of the implementation phase's goals is to closely monitor the possible effects of this new technology on the environment with the help of specialists" and "a preliminary check shows no detrimental environmental impact on water quality, flora or fauna. Our choices of materials were always made with this concern in mind".
Great ambitions
The project was featured at the 4th International Eilat-Eilot Renewable Energy Conference in Israel, giving the public the opportunity to see the first floating concentrated photovoltaic system in operation.
According to Solaris Synergy's CEO, Yossi Fisher, the installation in Eilat has been a milestone, opening the way for "many future implementations in Israel and throughout the world".
Dr. Kassel sees one last benefit to the project: "Today, each country must consider the best resources it has in order to produce clean energy. For example, hydroelectric power is good where there are waterfalls, geothermal energy is productive in countries with thermal springs, and solar power is very efficient where there is sun. Our system could be of great use in places that are exposed to sun but do not necessarily have sufficient natural water. Even dry countries, such as Israel or the North African countries, have industrial waters that are not rain dependent. This fact makes the floating solar power plant a reliable method for them to produce renewable energy".
Thriving Coral Triangle depends on S. China Sea and Solomon Is. for reef diversity.
University of Miami (UM) Rosenstiel School of Marine & Atmospheric Science faculty were part of an international scientific team showing that strong links between the coral reefs of the South China Sea, West Pacific and Coral Triangle hold the key to preserving fish and marine resources in the Asia-Pacific region.
Rosenstiel School researchers Drs. Claire Paris and Robert Cowen and colleagues from the ARC Centre of Excellence for Coral Reef Studies at James Cook University and University of California - Los Angeles, have established that the richest marine region on Earth – the Coral Triangle between Indonesia, Malaysia and the Philippines – depends vitally for its diversity and resilience on coral and fish larvae swept in from the South China Sea and Solomon Islands.
"By evaluating the directionality of larval transport over multiple generations, we could describe the signature of the extraordinary genetic diversity of the Coral Triangle. Preserving diversity is key to the health of marine systems," said Claire Paris, Rosenstiel School assistant professor of Applied Marine Physics. "This kind of work will help us anticipate and manage changes of connectivity networks in the future."
The authors provide evidence showing that the regions' biology is closely interconnected, suggesting that it is in the interests of all Asia-Pacific littoral countries to work together more closely to protect it.
"Maintaining the network of links between reefs allowing larvae to flow between them and re-stock depleted areas, is key to saving coral ecosystems threatened by human pressure and climate change," said the paper's lead author Johnathan T. Kool of James Cook University, who is also an alumnus of UM. "The science shows the region's natural resources are closely interconnected. Nations need to cooperate to look after them – and that begins with recognizing the resources are at risk and that collective action is needed to protect them."
The Coral Triangle is home to more than one third of all the world's coral reefs, including over 600 different species of reef-building coral and 3,000 species of reef fish. These coral ecosystems provide food and income for more than 100 million people working in marine-based industries throughout the region.
Six nations within the Coral Triangle – Indonesia, the Philippines, Malaysia, Papua New Guinea, the Solomon Islands and Timor-Leste – are now working together to strengthen coral reef governance and management, under an arrangement known as the Coral Triangle Initiative.
The paper, titled "Connectivity and the development of population genetic structure in Indo-West Pacific coral reef communities" by Johnathan T. Kool, Claire B. Paris, Paul H. Barber and Robert K. Cowen is available online and will be published in the March issue of the journal Global Ecology and Biogeography.
The University of Miami's mission is to educate and nurture students, to create knowledge, and to provide service to our community and beyond. Committed to excellence and proud of the diversity of our University family, we strive to develop future leaders of our nation and the world. Founded in the 1940s, the Rosenstiel School of Marine & Atmospheric Science has grown into one of the world's premier marine and atmospheric research institutions. Offering dynamic interdisciplinary academics, the Rosenstiel School is dedicated to helping communities to better understand the planet, participating in the establishment of environmental policies, and aiding in the improvement of society and quality of life.
Scientists have reviewed the potential for worldwide development of geothermal energy systems in old, unused mines. The technology has been proven at many sites and could therefore help increase the share of renewable energy sources in the energy mix, offering sustainability and job creation, which may make mining operations more appealing to investors, communities and policymakers.
The mineral industry is energy intensive and is therefore focusing on developing energy efficiency and cleaner production technologies. It is also expensive to open, operate and close mines, yet few are ever considered useful once they have been closed. Mines are often located on marginal, unproductive land in remote, harsh environments, and nearby communities may live in the area only as a direct result of the mine. Often highly dependent on the mine and on imports of (fossil) fuel, such communities are highly unsustainable.
The researchers argue that mines could be used post-closure for energy generation (heating and cooling) using the natural heat contained in the mine water. Geothermal energy systems using heat pumps could be implemented to extract this heat from both closed and, potentially, working mines. This would offer local employment and energy resilience to the surrounding communities. Other uses of the energy could include melting snow on icy roads (instead of using salt) or supplying heat to fish farms and greenhouses.
Several other technologies can extract energy from abandoned mines, including compressed air storage or the direct use of warm mine water to regulate the temperature of microalgae raceway ponds, allowing a longer growing season for cultivation; key products that can be obtained from the microalgae include nutraceuticals and biofuels. If biodiesel is used to fuel the energy extraction operations, there could be further reductions in CO2 and air pollution, with particular health benefits for workers if used in working mines. Competition with other fuels may limit geothermal development in less remote locations, but rising prices and CO2 reduction incentives could see this barrier decline over time.
Without ongoing dewatering, mines may fill with water permeating from surrounding rocks, possibly leading to contaminated floodwater escaping into local land or water bodies. Measures which contribute to the ongoing economic viability of mines will increase the motivation to continue water monitoring and pumping operations, preventing such events and providing energy resources from the controlled flows.
The type, size and flexibility of the geothermal energy system depend on the water quality, volume and temperature, and the system must be designed to avoid degrading the energy reserve by extracting too much heat. However, detailed site and water balance models can provide heat maps of geothermal resources and identify opportunities and customers for heat recovery. Additional efficiency measures to maximise the benefit include designing the distribution system to minimise water loss and upgrading the insulation of customer buildings.
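For a sense of the quantities involved, the following minimal sketch estimates the recoverable heat from a mine-water flow using the standard relation Q = flow x specific heat x temperature drop; the flow rate, temperature drop and heat pump coefficient of performance (COP) are illustrative assumptions, not values from the study.

WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K), specific heat capacity of water

def thermal_power_kw(flow_kg_per_s, delta_t_k):
    """Heat extracted from the mine-water stream, Q = m_dot * c_p * dT, in kW."""
    return flow_kg_per_s * WATER_SPECIFIC_HEAT * delta_t_k / 1000.0

def heat_delivered_kw(extracted_kw, cop):
    """Heat delivered by a heat pump: source heat plus compressor work."""
    return extracted_kw * cop / (cop - 1.0)

q_source = thermal_power_kw(flow_kg_per_s=50.0, delta_t_k=4.0)  # assumed flow and temperature drop
print(f"Heat extracted from mine water: {q_source:.0f} kW")
print(f"Heat delivered with a COP-4 heat pump: {heat_delivered_kw(q_source, 4.0):.0f} kW")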
Worldwide there may be over 1 million abandoned mines. The study lists several feasibility studies for potential geothermal projects across North America and Europe, plus many examples of operational systems worldwide, including:
a district heating system at Heerlen, Netherlands, with multiple heat pumps, built as part of regeneration scheme for an area devastated by the closure of coal mines;
a Norwegian copper/zinc mine closed in 1941, providing heat since 1998 to an underground cavern used for concerts and banquets;
small coal mines heating a few tens of houses in Scotland, UK.
Source: Hall, A., Scott, J.A. & Shang, H. (2011) Geothermal energy recovery from underground mines. Renewable and Sustainable Energy Reviews. 15: 916-924.
National Forest Programmes (NFPs) aim to incorporate the views of a wide range of stakeholders into the management of national forests. However, an analysis of NFPs in Bulgaria and Germany found they had little impact on forest policy. Despite this, stakeholders who took part in the NFP negotiations welcomed the opportunity to contribute to the policy-making process, even though they realised they may have little impact.
Since their introduction in the 1990s, NFPs have been seen as one way in which governments can establish a robust policy for their forests in the absence of a legally-binding global regime for sustainable forest management1. Ideally, NFPs allow many different national stakeholders to contribute to the development of future forest management policies. They have been established in many European countries and are often seen as excellent examples of open and participatory government. However, the latest results suggest that even the most successful NFPs might have had little effect on forest policy or on policy-making processes.
The researchers used an approach called an 'Advocacy Coalition Framework' to analyse the policy change caused by NFPs in Bulgaria and Germany from the point of view of the political coalitions involved – for example, coalitions of different policy makers, NGOs, forest workers and industries may club together to advocate nature protection in forests or to promote forest-based industry development. Using interviews with stakeholders and evidence from scientific papers, political articles and press releases, they studied the roles of these coalitions in creating and implementing the NFPs.
The results reveal that, although both NFPs met the goals of the 'ideal' NFP by involving a wide range of stakeholders in an inclusive policy process, they did not lead to a change in forest policy. For example, the Bulgarian NFP successfully involved three interested coalitions, but resulted in an unworkable compromise. Eventually, all three coalitions abandoned the process in favour of other political strategies.
The findings also suggest that NFPs are sometimes used by political 'coalitions' to achieve strategic political goals. For example, the highly-regarded German NFP was established at a time when a coalition with strong interests in nature conservation gained political influence and challenged the dominance of the forest industry coalition. The NFP led to a long-lasting dispute, which continued until the political situation changed. The result was that few changes were made to German forest policy. The researchers therefore recommend that future analyses of NFPs and other participatory forms of government should consider the broader political role they can play, rather than focussing exclusively on how they increase participation.
However, participants in both countries' NFPs appreciated the opportunity to take part in the decision-making process, even though they thought the impact on policy of the NFP was low. This led to greater expectations that stakeholders would be invited to take part in future policy-making, which could lead to governments gradually adopting a more open and democratic forest policy process in future.
Source: Winkel, G. & Sotirov, M. (2011). An obituary for national forest programmes? Analyzing and learning from the strategic use of 'new modes of governance' in Germany and Bulgaria. Forest Policy and Economics. 13: 143–154.
New research has reviewed the Irish implementation of the first phase of the EU Noise Directive. So far 31 different organisations have been involved and this will increase throughout the second phase of the Directive's implementation. More standardisation is needed to harmonise activities, perhaps by establishing a national expert steering group.
Noise pollution is estimated to cause about €12 billion of financial losses per year in the EU and exposure to environmental noise is thought to reduce the quality of life of more than 25 per cent of the EU population. To address this, the EU's Environmental Noise Directive (END)1 requires Member States to produce maps of noise levels in cities and around transport hubs and roads. It also requires the production of action plans to manage noise issues and disseminate information to the public.
The study investigated the implementation of the END in Ireland after the first phase of noise mapping and action planning, which the Directive requires of all EU Member States by June 2007. This can inform the second phase of implementation, which will require a greater scope of noise mapping and action planning by July 2012. It may also be useful to other Member States who could be experiencing similar problems in implementation.
The first phase involved mapping noise levels and creating action plans for one major city (Dublin), its airport and about 600 km of major roads outside Dublin. This represented minimum compliance with the END, as there was a low level of expertise and data available. The overall responsibility for implementation lies with the Irish Environmental Protection Agency (EPA), but mapping involved five different organisations and action planning involved 26 organisations.
The EPA has now issued guidance notes on noise mapping. However, there are still issues with the calculations, as no standard method was available. In addition, exposure estimates were not made on the basis of the most exposed façade of each building, as suggested by the END.
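To illustrate the kind of calculation that national methods handle differently, the sketch below combines several incoherent noise sources into a single level using generic energetic (decibel) addition; this is a textbook formula shown for illustration only, not the assessment method prescribed by the Directive, and the example levels are invented.

import math

def combine_levels_db(levels_db):
    """Energetic sum of sound levels: L_total = 10 * log10(sum(10^(L_i / 10)))."""
    return 10.0 * math.log10(sum(10.0 ** (level / 10.0) for level in levels_db))

# e.g. a facade exposed to road (65 dB), rail (60 dB) and industrial (55 dB) noise
print(f"Combined level: {combine_levels_db([65.0, 60.0, 55.0]):.1f} dB")  # ~66.5 dB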
Most of the authorities have developed action plans and the EPA has issued guidance notes on these. These state that the plans should aim to avoid significant health impacts and to preserve 'quiet' areas. However, the guidelines have a number of flaws, such as a lack of consideration of the different annoyance levels of different noises; for example, aircraft noise is usually more irritating than road noise. Lastly, although noise maps are available to the public, there is little coherent dissemination and communication.
The researchers made several recommendations to counteract some of the issues from the first implementation phase. Firstly, they suggest creating a noise expert steering group to ensure a universal approach. The group should have expertise in mapping techniques and measurement of exposure. Secondly, there should be greater harmonisation of calculation methods. This is being addressed by a common European assessment method, but the demands required by this method may be high and practical implications of applying it should be considered in terms of available resources and data. To achieve complete standardisation, authorities would be required to use the same noise prediction software.
Thirdly, there needs to be greater standardisation on the measurement of exposure and some assumptions may need to be addressed, such as assuming that all individuals reside and sleep in their dwelling all year round. Finally, a centralised body to maintain a national dataset could help ensure data accuracy and information sharing.
Source: King, E.A., Murphy, E. & Rice, H.J. (2010) Implementation of the EU environmental noise directive: Lessons from the first phase of strategic noise mapping and action planning in Ireland. Journal of Environmental Management. 92(3): 756-764.
The number and impacts of disasters have increased in Europe in the period 1998-2009, a new report by the European Environment Agency (EEA) concludes. The report assesses the frequency of disasters and their impacts on humans, the economy and ecosystems and calls for better integrated risk management across Europe.
The report addresses three different types of hazards: hydrometeorological or weather-related (storms, extreme temperature events, forest fires, droughts, floods), geophysical (snow avalanches, landslides, earthquakes, volcanoes) and technological (oil spills, industrial accidents, toxic spills from mining activities). These hazards caused nearly 100,000 fatalities and affected more than 11 million people during 1998-2009. Natural hazards caused significant financial losses, estimated at over €150 billion in the 32 member countries of the European Environment Agency. However, technological hazards caused more damage to ecosystems, both immediate and long-term, than natural hazards.
The report finds that the 2003 heat wave over western and southern Europe was the hazard that caused the highest number of fatalities, estimated at around 70,000, while the 1999 earthquake in Turkey caused 17,000 fatalities. Flooding (213 events), storms (155 events) and extreme temperatures (101 events) caused the highest numbers of disastrous events, while industrial accidents (339 events) topped the list of technological hazards.
In terms of economic impacts, flooding cost about €52 billion and storms about €44 billion during the study period. Extreme temperature is thought to have cost around €10 billion and drought around €5 billion. Ecosystems were hit hardest by oil spills from tankers (the Erika in 1999 and the Prestige in 2002) and toxic waste spills from mining activities (Aznalcóllar, Spain, in 1998 and Baia Mare, Romania, in 2000).
The report suggests that increases in human activity, accumulation of economic assets in hazard-prone areas and better reporting contributed to an increase in the number and impact of disasters. It also suggests that the potential harm caused by a hazard depends crucially on how vulnerable an exposed community is to the hazard. Policy makers have a role to play in developing measures that can reduce the impact of hazards on human health and the economy. For example, stricter regulation of the oil shipping industry appears to have reduced the number of oil spills.
It is not known to what extent climate change has contributed to the increase in disasters. However, projections reveal that the severity and frequency of extreme weather events are expected to increase and thus the share of losses attributable to climate change could also increase in future.
The report calls for better Integrated Risk Management1 across Europe, covering prevention, preparedness, response and recovery for all hazards. Policymakers should in particular consider actions to reduce the vulnerability of communities to hazards, including:
Improved early warning systems and increased efforts to raise public awareness. This could help reduce the impact of storms and floods in the future.
More precise forecasting of extreme temperature events and new tools to help integrate forecasts with other data, such as socio-economic factors that affect vulnerability to extreme heat. Heat-related deaths are largely preventable, so strategies should focus on delivering the infrastructure needed to support those at risk.
Better prevention of forest fires, an issue which is addressed, inter alia, in the EC Green Paper on Forest Protection and Information2 in the EU.
A recent study has proposed changes to the way REDD+ strategies are categorised, from an implementation perspective. This will simplify the monitoring, reporting and verification of the schemes. In addition, to encourage countries to make an early start on the REDD+ programmes, the study suggests that monitoring should initially focus on forests where it is easiest to implement REDD+ actions.
REDD (Reducing emissions from deforestation and forest degradation) is a policy approach negotiated under the UN Framework Convention on Climate Change (UNFCCC). It is designed to support developing countries in slowing, stopping and reversing the loss of forest cover and forest carbon, and in managing their forest resources more sustainably, by rewarding reduced deforestation and degradation of forests in order to help mitigate climate change. REDD has been expanded to REDD+1 to include forest enhancement, sustainable management of forests and forest conservation.
In order to appropriately credit countries that have increased their forest carbon stocks (or decreased them by less than under a business-as-usual scenario), it is necessary to determine changes in national forest stocks through monitoring, reporting and verification (MRV) systems, using a combination of remote sensing techniques and collection of forest data on the ground2. This study offers two proposals to make MRV of REDD+ programmes easier.
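As a minimal sketch of the bookkeeping such an MRV system rests on, the snippet below applies a simple stock-difference calculation and credits the loss avoided against a business-as-usual baseline; the figures and the crediting rule are illustrative assumptions, since actual REDD+ crediting rules were still under negotiation.

def stock_change_tc(stock_t1_tc, stock_t2_tc):
    """Change in forest carbon stock between two inventories (tonnes of carbon)."""
    return stock_t2_tc - stock_t1_tc

def credited_reduction_tc(measured_loss_tc, baseline_loss_tc):
    """Credit the loss avoided relative to the business-as-usual baseline loss."""
    return max(baseline_loss_tc - measured_loss_tc, 0.0)

measured_loss = -stock_change_tc(stock_t1_tc=1_000_000, stock_t2_tc=980_000)  # 20,000 tC lost
print(f"Measured loss: {measured_loss:,.0f} tC")
print(f"Credited reduction against a 50,000 tC baseline loss: "
      f"{credited_reduction_tc(measured_loss, 50_000):,.0f} tC")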
The first proposal would change the way different strategies of REDD+ are clustered, as far as monitoring is concerned. Degradation and deforestation are usually grouped together, sometimes on the assumption that degradation leads to deforestation. The authors argue that this is not always true. Deforestation refers to the (usually abrupt) loss of forest through land use change, for example, felling forests for agriculture or urbanisation. Degradation, on the other hand, is related to human activities that result in a gradual loss of carbon stocks in forests without completely changing land use. Therefore, in terms of climate change impacts, it may be more akin to unsustainable forest practice, better dealt with by improved and sustainable management policies. This approach fits in with two of the new elements of REDD+ (sustainable management of forests and forest enhancement).
Forest conservation is a specific type of forest management where carbon stocks remain intact, but there is uncertainty as to how to credit this carbon under REDD+. Credits for deforestation, degradation, forest enhancement and sustainable forest management are based on changes in the rate of change of carbon stock, e.g. if the rate slows down. However, the aim of forest conservation is a zero rate of change. Conservation may therefore require different instruments to those used for other elements of REDD+, possibly related to the conservation of biodiversity.
The study therefore suggests it would be more realistic to organise the five strategies of REDD+ into three categories: (1) reduced deforestation, (2) conservation (maintaining forest stocks) and (3) activities resulting in positive impacts on carbon stock in forests that remain as forest (including forest enhancement, sustainable forest management and reducing degradation).
Proposal two suggests that, in the early stages, it would not be necessary to apply the same intensity of MRV to all forests in every part of a country. While full national monitoring is required, it would be up to each country to decide where it is easiest to implement REDD+ measures and which approaches are best suited to each area. Therefore, in the short term, most MRV efforts to track human activities and support implementation would be focused on areas where REDD+ activities occur. Concentrating efforts in this way will initially reduce costs and enable nations to build capacity, so that in the longer term a fully developed MRV system covering the whole country and the whole range of activities can be implemented. This will also help countries to make an early start on REDD+ programmes. Monitoring could be carried out by involving local stakeholders, with independent verification.
Source: Herold, M and Skutsch, M. (2011) Monitoring, reporting and verification for national REDD+ programmes: two proposals. Environmental Research Letters. 6: 014002 (10pp). This study is free to view at: http://iopscience.iop.org/1748-9326/6/1/014002
Past economic activity is more likely to explain the current pattern of biological invasions across Europe than recent human activities, according to a new study. It can take several decades before a newly introduced species becomes established and spreads, which may mean that recent invasions caused by current economic activities could create an ‘invasion debt’ for future generations.
Non-native or alien plant and animal species can have a range of negative impacts on biodiversity, ecosystems and the economy: for example, competing with native species for resources and affecting human health or infrastructure. Earlier research suggests human economic activities are more important in influencing biological invasions than either climate or geography, but there is less certainty about the role of past activities on existing patterns of invasion.
In this study, the researchers used data from the EU-funded DAISIE project1 to explore the influence of socioeconomic factors in the years 1900 and 2000 on patterns of alien invaders in 10 groups (vascular plants, bryophytes, fungi, birds, mammals, reptiles, amphibians, fish, terrestrial insects, and aquatic invertebrates) currently found across 28 European countries. Known social and economic factors that affect biological invasions were measured using population density, gross domestic product (GDP) per person and the share of exports in GDP (as a predictor of the intensity or openness of trade).
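As a sketch of the kind of cross-country analysis this implies, the snippet below fits an ordinary least squares model relating alien species richness to those three indicators; the data are invented placeholders and the log-linear model form is an assumption, not the authors' exact method.

import numpy as np

# One row per country: [population density, GDP per person, exports share of GDP]
predictors_1900 = np.array([
    [50.0,  3000.0, 0.10],
    [120.0, 4500.0, 0.25],
    [80.0,  2500.0, 0.15],
    [200.0, 6000.0, 0.30],
])
alien_richness = np.array([300.0, 750.0, 420.0, 980.0])  # invented counts per country

# Ordinary least squares on log-transformed richness, with an intercept column.
X = np.column_stack([np.ones(len(predictors_1900)), predictors_1900])
coefficients, *_ = np.linalg.lstsq(X, np.log(alien_richness), rcond=None)
print("Intercept and coefficients:", coefficients)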
Typically, the current pattern of invasive species in Europe is explained more by historical activities in 1900 than by recent activities in 2000. The rate of introduction of alien species into Europe has significantly increased over the last 50-60 years, so there is likely to be a delay of at least a few decades before the full impact of these invaders is seen. This is described as an 'invasion debt' for the future.
A critical threshold of individuals is needed for a species to become established and spread, which could explain the delay after introduction. The time taken to meet this threshold depends on a number of factors, including how often and how many alien species are introduced, the routes by which they are introduced, how suitable new habitats are for alien species and how species are able to adapt to new environments.
For example, some introduced birds and insects can disperse more quickly than other types of species because they can fly and are therefore more mobile and able to find suitable habitats. However, such explanations do not always fit the detected patterns of alien invaders. Further work is needed to understand the relationships between those factors that influence the delay between introduction and establishment of alien species.
Despite current preventative measures to reduce the introduction of biological invaders in Europe, globalisation is increasing socioeconomic activity which could increase the rate of introductions. Given the likely time lag between the introduction, establishment and spread of invasive species, it is recommended that existing control measures should be strengthened.
DAISIE (Delivering Alien Invasive Species Inventories for Europe) was supported by the European Commission under the Sixth Framework Programme. See: www.europe-aliens.org
Source: Essl, F., Dullinger, S., Rabitsch, W. et al. (2011) Socioeconomic legacy yields an invasion debt. Proceedings of the National Academy of Sciences. 108: 203-207.
A young female monk seal has been shot and killed on the Aegean island of Evia, reports Greek NGO MOm. An onsite necropsy by the organisation’s rescue team indicated that the still nursing pup had been in good nutritional health before being shot in the head with a hunting rifle by “person or persons unknown” on 16 February. …
Three-quarters of the world's coral reefs are at risk due to overfishing, pollution, climate change and other factors, says a major new assessment. Reefs at Risk Revisited collates the work of hundreds of scientists and took three years to compile. The biggest threat is exploitative fishing, the researchers say, though most reefs will be feeling the impact of climate change within 20 years. But, they say, there are measures that can be taken to protect at least some. …
Three-quarters of the world's coral reefs are at risk from overfishing, pollution and climate change, according to a report. By 2050 virtually all of the world's coral reefs – from the waters of the Indian Ocean to the Caribbean to Australia – will be in danger, the report warns. The consequences – especially for countries such as the Philippines or Haiti which depend on the reefs for food – will be severe. "These are dire results," said Lauretta Burke, a lead author of Reefs at Risk, a collaborative effort led by the World Resources Institute in Washington and 25 other research organisations. …
Coral reefs and their wildlife already face threats like ocean acidification and overfishing. The international trade in coral reef wildlife for “ornamental” uses like household aquariums, home décor, and jewelry also harms these ecosystems and reduces their ability to recover from other threats. …
A pioneering project applies concepts of personalised medicine to coral reefs to decode signals that corals put out when under stress from poisons. Robert Richmond never meant to go to court. Yet there he was before a judge on the Micronesian island of Yap in the Pacific Ocean. The coral forensics expert at the University of Hawaii was testifying in a case brought by Yap's tribal elders against the owner of the Kyowa Violet cargo ship, which in 2002 had slammed into a reef just outside Yap's main harbour, spewing 200,000 litres of oil into the lagoon. The elders had turned to Richmond to find out whether the spill had damaged the reef even in areas that were not visibly oiled. As a member of Coral Whisperer, a pioneering project which applies the concepts of personalised medicine to coral reefs, he was the perfect choice. …
Until now, how species such as loggerhead sea turtles manage to migrate thousands of miles across oceans with no visual landmarks has been a mystery. Now researchers from the University of North Carolina believe they have found the answer. Loggerhead sea turtles appear to be able to determine their longitude using two sets of magnetic cues. It is the first time this ability has been shown in any migratory animal. This research is published in the journal Current Biology. …
Fresh from its Japanese victory, marine conservation vigilante organisation Sea Shepherd will be back in the Mediterranean for the tuna fishing season with two vessels, not one. “We’re returning to the Mediterranean in May, June, July,” Sea Shepherd Conservation Society founder Paul Watson said. “We’ll be looking for poaching operations and if we find them we’ll cut the nets and release the tuna like we did last year.” The organisation has just celebrated a victory after last week Japan recalled its whaling ships in the Antarctic due to safety concerns in the face of stepped-up actions by the organisation, raising hopes of an outright ban on whaling in the region. …
Fifteen Spanish-flagged fishing vessels will no longer operate out of Montevideo because labour disputes with crew members can lead to vessel seizures or demands for significant collateral deposits by the Uruguayan courts before the vessels can return to sea. …
Uruguay’s leading and largest fish processing company Fripur S.A. claimed in full page ads in Montevideo’s newspapers that it is the victim of a “strong media offensive” that “irresponsibly” questions the way in which it processes products that are sold both in Uruguay and overseas. …
U.S. retailer Costco improves seafood policies in win for oceans!
After eight months of pressure from Greenpeace campaigners to improve its seafood policies, the wholesale giant Costco announced a new policy aimed at helping to save the oceans rather than plunder them.
The death toll of dolphins found washed ashore along the U.S. Gulf Coast since last month climbed to nearly 60 on Thursday, as puzzled scientists scrambled to determine what was killing the marine mammals. …
As darkness falls and thoughts turn to slumber, waves of sleep wash over seagulls huddling against the elements. This is not poetry, but a discovery made by a scientist who has been studying sleep in bird colonies. He found that seagulls learn from each other when it is safe to nod off, resulting in "waves of sleep" passing through seagull colonies as the birds enter differing states of vigilance. …
Penguins may owe their survival in the coldest and most inhospitable place on Earth to evolutionary chance during a period of global warming millions of years ago. Far from adapting to the cold in Antarctica, where temperatures can plunge below minus 60C and wind speeds can exceed 200mph, they have been able to thrive because of a form of central heating in their wings, which they evolved when the climate on Earth was hot. …
Tawny owls turn brown to survive in warmer climates, according to scientists in Finland. Feather colour is hereditary, with grey plumage dominant over brown. But the study, published in the journal Nature Communications, found that the number of brown owls was increasing. As winters become milder, the scientists say, grey feathered tawny owls are likely to disappear. This study indicates that the birds are evolving in response to climate change. …
Shuttle Discovery: Review of 39 missions
The US shuttle Discovery has launched from the Kennedy Space Center for the last time. First launched in 1984, the shuttle is now making its 39th outing. When it lands back on Earth in nearly two weeks' time, it will have covered a cumulative career distance of 230m km (143m miles). That is further than the distance from the Earth to the Sun (149m km).