STATS | Materials Management a Challenge for Shale Operators

Materials management has consistently been a challenge for the oil and gas industry, where the main focus has always been on ensuring material availability, regardless of costs.
01

> more news in this sector

California Resources Corporation Announces 3rd Quarter and Nine Month 2014 Financial Results

LOS ANGELES--(BUSINESS WIRE)--California Resources Corporation, a subsidiary of Occidental Petroleum Corporation (NYSE: OXY), announced net income of $188 million for the third quarter of 2014, compared with $235 million for the third quarter of 2013. Net income for the first nine months of 2014 was unchanged from the same period of 2013 at $657 million. In announcing the results, Todd Stevens, President and Chief Executive Officer, said, "As we near separation from Occidental, California Resour[...]
03

> more news in this sector

New high-speed transatlantic network to benefit science collaborations across the U.S.

ESnet to build high-speed extension for faster data exchange between United States and Europe. Image: ESnet
This Fermilab press release came out on Oct. 20, 2014.

Scientists across the United States will soon have access to new, ultra-high-speed network links spanning the Atlantic Ocean, thanks to a project currently under way to extend ESnet (the U.S. Department of Energy’s Energy Sciences Network) to Amsterdam, Geneva and London. Although the project is designed to benefit data-intensive science throughout the U.S. national laboratory complex, the heaviest users of the new links will be particle physicists conducting research at the Large Hadron Collider (LHC), the world’s largest and most powerful particle collider.

The high capacity of this new connection will provide U.S. scientists with enhanced access to data at the LHC and other European-based experiments by accelerating the exchange of data sets between institutions in the United States and computing facilities in Europe. DOE’s Brookhaven National Laboratory and Fermi National Accelerator Laboratory—the primary computing centers for U.S. collaborators on the LHC’s ATLAS and CMS experiments, respectively—will make immediate use of the new network infrastructure once it is rigorously tested and commissioned. Because ESnet, based at DOE’s Lawrence Berkeley National Laboratory, interconnects all national laboratories and a number of university-based projects in the United States, tens of thousands of researchers from all disciplines will benefit as well.
The ESnet extension will be in place before the LHC at CERN in Switzerland—currently shut down for maintenance and upgrades—is up and running again in the spring of 2015. Because the accelerator will be colliding protons at much higher energy, the data output from the detectors will expand considerably—to approximately 40 petabytes of raw data per year, compared with 20 petabytes for all of the previous lower-energy collisions produced over the three years of the LHC’s first run between 2010 and 2012. The cross-Atlantic connectivity during the first successful run for the LHC experiments, which culminated in the discovery of the Higgs boson, was provided by the US LHCNet network, managed by the California Institute of Technology.
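To put those numbers in perspective, a quick back-of-envelope calculation (my arithmetic, not a figure from the release) shows what a sustained 40 petabytes per year implies:

\[ \frac{40\times10^{15}\ \mathrm{B/yr}}{3.15\times10^{7}\ \mathrm{s/yr}} \approx 1.3\ \mathrm{GB/s} \approx 10\ \mathrm{Gb/s} \]

That is a continuous 10-gigabit-per-second stream from raw detector output alone, before any replication or reprocessing between sites, which is why dedicated high-capacity transatlantic links matter.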
In recent years, major research and education networks around the world—including ESnet, Internet2, California’s CENIC, and European networks such as DANTE, SURFnet and NORDUnet—have increased their backbone capacity by a factor of 10, using sophisticated new optical networking and digital signal processing technologies. Until recently, however, higher-speed links were not deployed for production purposes across the Atlantic Ocean—creating a network “impedance mismatch” that can harm large, intercontinental data flows.

An evolving data model

This upgrade coincides with a shift in the data model for LHC science.
Previously, data moved in a more predictable and hierarchical pattern strongly influenced by geographical proximity, but network upgrades around the world have now made it possible for data to be fetched and exchanged more flexibly and dynamically. This change enables faster science outcomes and more efficient use of storage and computational power, but it requires networks around the world to perform flawlessly together.

“Having the new infrastructure in place will meet the increased need for dealing with LHC data and provide more agile access to that data in a much more dynamic fashion than LHC collaborators have had in the past,” said physicist Michael Ernst of DOE’s Brookhaven National Laboratory, a key member of the team laying out the new and more flexible framework for exchanging data between the Worldwide LHC Computing Grid centers.
Ernst directs a computing facility at Brookhaven Lab that was originally set up as a central hub for U.S. collaborators on the LHC’s ATLAS experiment. A similar facility at Fermi National Accelerator Laboratory has played this role for the LHC’s U.S. collaborators on the CMS experiment.
These computing resources, dubbed Tier 1 centers, have direct links to the LHC at the European laboratory CERN (Tier 0). The experts who run them will continue to serve scientists under the new structure. But instead of serving as hubs for data storage and distribution only among U.S.-based collaborators at Tier 2 and 3 research centers, the dedicated facilities at Brookhaven and Fermilab will be able to serve the data needs of the entire ATLAS and CMS collaborations throughout the world. And likewise, U.S. Tier 2 and Tier 3 research centers will have higher-speed access to Tier 1 and Tier 2 centers in Europe.

“This new infrastructure will offer LHC researchers at laboratories and universities around the world faster access to important data,” said Fermilab’s Lothar Bauerdick, head of software and computing for the U.S. CMS group. “As the LHC experiments continue to produce exciting results, this important upgrade will let collaborators see and analyze those results better than ever before.” Ernst added, “As centralized hubs for handling LHC data, our reliability, performance and expertise have been in demand by the whole collaboration, and now we will be better able to serve the scientists’ needs.”

An investment in science

ESnet is funded by DOE’s Office of Science to meet the networking needs of DOE labs and science projects. The transatlantic extension represents a financial collaboration, with partial support coming from DOE’s Office of High Energy Physics (HEP) for the next three years. Although LHC scientists will get a dedicated portion of the new network once it is in place, all science programs that make use of ESnet will now have access to faster network links for their data transfers. “We are eagerly awaiting the start of commissioning for the new infrastructure,” said Oliver Gutsche, Fermilab scientist and member of the CMS Offline and Computing Management Board. “After the Higgs discovery, the next big LHC milestones will come in 2015, and this network will be indispensable for the success of the LHC Run 2 physics program.” This work was supported by the DOE Office of Science.

Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance, LLC.

Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @FermilabToday. Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security.
Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by the Research Foundation for the State University of New York on behalf of Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit applied science and technology organization.

Visit Brookhaven Lab’s electronic newsroom for links, news archives, graphics, and more at http://www.bnl.gov/newsroom, follow Brookhaven Lab on Twitter, http://twitter.com/BrookhavenLab, or find us on Facebook, http://www.facebook.com/BrookhavenLab/.

The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

Media contacts:
Karen McNulty-Walsh, Brookhaven Media and Communications Office, kmcnulty@bnl.gov, 631-344-8350
Kurt Riesselmann, Fermilab Office of Communication, media@fnal.gov, 630-840-3351
Jon Bashor, Computing Sciences Communications Manager, Lawrence Berkeley National Laboratory, jbashor@lbnl.gov, 510-486-5849

Computing contacts:
Lothar Bauerdick, Fermilab, US CMS software computing, bauerdick@fnal.gov, 630-840-6804
Oliver Gutsche, Fermilab, CMS Offline and Computing Management Board, gutsche@fnal.gov, 630-840-8909

04

> more news in this sector

Have we detected Dark Matter Axions?

An interesting headline piqued my interest when browsing the social networking and news website Reddit the other day. It simply said: “The first direct detection of dark matter particles may have been achieved.”

Well, that was news to me!

Obviously, the key word here is “may”. Nonetheless, I was intrigued, not being aware of any direct detection experiments publishing such results around this time. As a member of LUX, there are usually collaboration-wide emails sent out when a big paper is published by a rival group, most recently the DarkSide-50 results.
Often an email like this is followed by a chain of comments, both good and bad, from the senior members of our group. I can’t imagine there being a day where I think I could read a paper and instantly have intelligent criticisms to share like those guys – but maybe when I’ve been in the dark matter business for 20+ years I will! It is useful to look at other work similar to our own.

We can learn from the mistakes and successes of the other groups within our community, and most of the time rivalry is friendly and professional.

So obviously I took a look at this claimed direct detection. Note that there are three methods of dark matter detection (see figure: “The three routes to dark matter detection”). To summarise quickly:
• Direct detection is the observation of an interaction of a dark matter particle with a standard model one.
• Indirect detection is the observation of annihilation products that have no apparent standard model source and so are assumed to be the products of dark matter annihilation.
• Production is the measurement of missing energy and momentum in a particle interaction (generally a collider experiment) that could signify the creation of dark matter (this method must be very careful, as this is how neutrinos are measured in collider experiments).
So I was rather surprised to find the article linked was about a space telescope – the XMM-Newton observatory.
These sort of experiments are usually for indirect detection.

The replies on the Reddit link reflected my own doubt – aside from the personification of x-rays, this comment was also my first thought: “If they detected x-rays who are produced by dark matter axions then it’s not direct detection.” These x-rays supposedly come from a particle called an axion – a dark matter candidate. But to address the comment, I considered LUX, a direct dark matter detector, where what we are actually detecting is photons. These are produced by the recoil of a xenon nucleus that interacted with a dark matter particle, and yet we call it direct – because the dark matter has interacted with a standard model particle, the xenon.

So to determine whether this possible axion detection is direct, we need to understand the effect producing the x-rays. And for that, we need to know about axions.

I haven’t personally studied axions much at all.
At the beginning of my PhD, I read a paper called “Expected Sensitivity to Galactic/Solar Axions and Bosonic Super-WIMPs based on the Axio-electric Effect in Liquid Xenon Dark Matter Detectors” – but I couldn’t tell you a single thing from that paper now, without re-reading it.

After some research I have a bit more understanding under my belt, and for those of you that are physicists, I can summarise the idea: The axion is a light boson, proposed by Roberto Peccei and Helen Quinn in 1977 to solve the strong CP problem (why does QCD not break CP-symmetry when there is no theoretical reason it shouldn’t?). The introduction of the particle causes the strong CP violation to go to zero (by some fancy maths that I can’t pretend to understand!). It has been considered as a cold dark matter candidate because it is neutral and very weakly interacting, and could have been produced with the right abundance.

Image: Conversion of an axion to a photon within a magnetic field (Yamanaka, Masato et al.)

For non-physicists, the key thing to understand is that the axion is a particle predicted by a separate theory (nothing to do with dark matter) that solves another problem in physics. It just so happens that its properties make it a suitable candidate for dark matter. Sounds good so far – the axion kills two birds with one stone.
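For the physicists, the interaction that makes detection possible is the standard axion–photon coupling (textbook background, not something spelled out in the original article):

\[ \mathcal{L}_{a\gamma\gamma} = -\frac{g_{a\gamma\gamma}}{4}\, a\, F_{\mu\nu}\tilde{F}^{\mu\nu} = g_{a\gamma\gamma}\, a\, \mathbf{E}\cdot\mathbf{B} \]

Because the coupling involves \(\mathbf{E}\cdot\mathbf{B}\), an axion traversing a static magnetic field can convert into a photon, with a probability that grows with the field strength and the path length over which the field is coherent.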

We could detect a dark matter axion via an effect that converts an axion to an x-ray photon within a magnetic field. The XMM-Newton observatory orbits the Earth and looks for x-rays produced by the conversion of an axion within the Earth’s magnetic field. Although there is no particular interaction with a standard model particle (one is produced), the axion is not annihilating to produce the photons, so I think it is fair to call this direct detection. What about the actual results? What has actually been detected is a seasonal variation in the cosmic x-ray background. The conversion signal is expected to be greater in summer due to the changing visibility of the magnetic field region facing the sun, and that’s exactly what was observed.

In the paper’s conclusion the authors state: “On the basis of our results from XMM-Newton, it appears plausible that axions – dark matter particle candidates – are indeed produced in the core of the Sun and do indeed convert to soft X-rays in the magnetic field of the Earth, giving rise to a significant, seasonally-variable component of the 2-6 keV CXB”.

Image: Conversion of solar axions into photons within the Earth’s magnetic field (University of Leicester)

Note the language used – “it appears plausible”. This attitude of physicists to always be cautious and hold back from bold claims is a wise one – look what happened to BICEP2. It is something I am personally becoming familiar with, last week having come across a lovely LUX event that passed my initial cuts and looked very much like it could have been a WIMP. My project partner from my master’s degree at the University of Warwick is now a new PhD student at UCL – and he takes great joy in embarrassing me in whatever way he can. So after I shared my findings with him, he told everyone we came across that I had found WIMPs. Even upon running into my supervisor, he asked “Have you seen Sally’s WIMP?”.

I was not pleased – that is not a claim I want to make as a mere second year PhD student. Sadly, but not unexpectedly, my “WIMP” has now been cut away. But not for one second did I truly believe it could have been one – surely there’s no way I’m going to be the one that discovers dark matter! (Universe, feel free to prove me wrong.) These XMM-Newton results are nice, but tentative – they need confirming by more experiments.
I can’t help but wonder how many big discoveries end up delayed or even discarded due to the cautiousness of physicists, who can scarcely believe they have found something so great.

I look forward to the time when someone actually comes out and says “We did it – we found it” with certainty. It would be extra nice if it were LUX. But realistically, to really convince anyone that dark matter has been found, detection via several different methods and in several different places is needed. There is a lot of work to do yet. It’s an exciting time to be in this field, and papers like the XMM-Newton one keep us on our toes! LUX will be starting up again soon for what we hope will be a 300 day run, and an increase in sensitivity to WIMPs of around 5x. Maybe it’s time for me to re-read that paper on the axio-electric effect in liquid xenon detectors!
05

> more news in this sector

New Insights on Carbonic Acid in Water

Though it garners few public headlines, carbonic acid, the hydrated form of carbon dioxide, is critical to both the health of the atmosphere and the human body. However, because it exists for only a fraction of a second before changing into a mix of hydrogen and bicarbonate ions, carbonic acid has remained an enigma. A new study by Berkeley Lab researchers has yielded valuable new information about carbonic acid with important implications for both geological and biological concerns.
Richard Saykally, a chemist with Berkeley Lab’s Chemical Sciences Division and a professor of chemistry at the University of California (UC) Berkeley, led a study that produced the first X-ray absorption spectroscopy (XAS) measurements for aqueous carbonic acid. These XAS measurements, which were obtained at Berkeley Lab’s Advanced Light Source (ALS), were in strong agreement with supercomputer predictions obtained at the National Energy Research Scientific Computing Center (NERSC).

The combination of theoretical and experimental results provides new and detailed insights into the hydration properties of aqueous carbonic acid that should benefit the development of carbon sequestration and mitigation technologies, and improve our understanding of how carbonic acid regulates the pH of blood.
“Our results support an average hydration number of 3.17 with the acid’s two protons each donating a strong hydrogen bond to solvating waters, the carbonyl oxygen accepting a strong hydrogen bond from solvating water, and the hydroxyl oxygens accepting weak hydrogen bonds from the water,” says Saykally. “XAS data must be interpreted by comparing measurements to results from a calculated spectrum, which is a serious challenge.
The strong agreement between our calculated and observed X-ray spectra is a new and significant achievement.”

The molecular dynamics simulations and first principles density functional theory method used to model and interpret the XAS measurements were carried out under the leadership of David Prendergast, a staff scientist in the Theory of Nanostructures Facility at Berkeley Lab’s Molecular Foundry. The Molecular Foundry, NERSC and the ALS are all DOE Office of Science national user facilities hosted at Berkeley Lab.
“Using our first-principles molecular dynamics model and molecular dynamic simulations, we were able to simulate how carbonic acid is solvated by water,” Prendergast says. “We then converted this information into a predicted XAS absorption spectrum that could be directly compared with experimental measurements at the ALS.”

Image: (From left) Richard Saykally, David Prendergast, Jacob Smith and Royce Lam were part of a team that has provided valuable new insight into aqueous carbonic acid. (Photo by Roy Kaltschmidt)

Saykally and Prendergast have published their results in Chemical Physics Letters. The paper is titled “The hydration structure of aqueous carbonic acid from X-ray absorption spectroscopy.” Saykally is the corresponding author. Other co-authors, in addition to Prendergast, are Royce Lam, Alice England, Alex Sheardy, Orion Shih, Jacob Smith and Anthony Rizzuto. When carbon dioxide dissolves in water, about one percent of it forms carbonic acid, which almost immediately dissociates to bicarbonate anions and protons.
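The chemistry being probed is the familiar aqueous carbon dioxide equilibrium (standard textbook chemistry, not notation from the paper itself):

\[ \mathrm{CO_2(aq)} + \mathrm{H_2O} \;\rightleftharpoons\; \mathrm{H_2CO_3} \;\rightleftharpoons\; \mathrm{H^+} + \mathrm{HCO_3^-} \]

Only a small fraction of the dissolved carbon dioxide sits in the middle of this equilibrium, as H2CO3, at any instant, which is why a fast mixing-and-probing scheme is needed to catch it.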

Despite its fleeting existence – about 300 nanoseconds – carbonic acid is a crucial intermediate species in the equilibrium between carbon dioxide, water and many minerals.
It plays a crucial role in the carbon cycle – the exchange of carbon dioxide between the atmosphere and the oceans – and in the buffering of blood and other bodily fluids. The short life span of carbonic acid in water has made it extremely difficult to study. Saykally and his research group overcame this obstacle with their development of a unique liquid microjet mixing technology in which two aqueous samples rapidly mix and flow through a finely tipped nozzle that is made from fused silica and features an opening only a few micrometers in diameter.
The resulting liquid beam travels a few centimeters in a vacuum chamber before it is intersected by an X-ray beam, then collected and condensed out.

Saykally and his group have set up their liquid microjet system at ALS Beamline 8.0.1, a high flux undulator beamline that produces X-ray beams optimized for XAS studies. “The key to our success was an advance in our liquid microjet technology that enables us to achieve a rapid mixing of our reactants, bicarbonate and hydrochloric acid, and immediate probing of the carbonic acid products,” Saykally says. For this study, he and his group used a variation of XAS called Near Edge X-ray Absorption Fine Structure (NEXAFS) spectroscopy, an atom-specific probe of both the electronic structure of a molecule and its local chemical environment.
NEXAFS is ideal for obtaining detailed characterizations of hydration interactions; however, it has largely been limited to studies in gases and solids because of the difficulties of working with liquid samples in a high vacuum. By incorporating their microjet technology into the high-vacuum environment of a synchrotron X-ray beamline, Saykally and his group are able to perform NEXAFS on liquid samples. The researchers behind this study say that their results are important for understanding and modeling how the chemical equilibrium between carbonic acid and carbon dioxide proceeds in saline aquifers and other proposed carbon sequestration media.
The same equilibrium process governs respiration in living organisms. “As carbonic acid in both the gas and solid phases has been fairly well studied, our new water solution work will facilitate the development of detailed models for the reversible gas-liquid chemistry of carbon dioxide,” Saykally says.

This research was supported by the DOE Office of Science.
Additional Information
• For more about the research of Richard Saykally, go here
• For more about the research of David Prendergast, go here
• For more about the Advanced Light Source, go here
• For more about the Molecular Foundry, go here
• For more about NERSC, go here
06

> more news in this sector

Scientists are skeptical of ‘brain games’ for older adults

Nearly 70 scientists have issued a statement saying they’re skeptical about claims that computer-based “brain games” actually help older adults sharpen their mental powers. Laura Carstensen, a Stanford University psychology professor and the director of the Center for Longevity, says that as baby boomers enter their golden years, commercial companies are all too often promising quick fixes for cognition problems through products that are unlikely to produce broad improvements in everyday functioning. “It is customary for advertising to highlight the benefits and overstate potential advantages of their products,” she says. “But in the case of brain games, companies also assert that the products are based on solid scientific evidence developed by cognitive scientists and neuroscientists. So we felt compelled to issue a statement directly to the public.”

One problem is that while brain games may target very specific cognitive abilities, there is very little evidence that improvements transfer to more complex skills that really matter, like thinking, problem solving and planning, according to the scholars.

No magic bullet

While it is true that the human mind is malleable throughout a lifetime, improvement on a single task—like playing computer-based brain games—does not imply a general, all-around and deeper improvement in cognition beyond performing better on just a particular game. “Often, the cited research is only tangentially related to the scientific claims of the company, and to the games they sell,” says Carstensen.

Agreeing with this view were the experts who signed the Stanford-Planck consensus statement, which reads in part: “We object to the claim that brain games offer consumers a scientifically grounded avenue to reduce or reverse cognitive decline when there is no compelling scientific evidence to date that they do. … The promise of a magic bullet detracts from the best evidence to date, which is that cognitive health in old age reflects the long-term effects of healthy, engaged lifestyles.”

Lifestyle matters more

As the researchers point out, the time spent on computer games takes away from other activities like reading, socializing, gardening, and exercising that may benefit cognitive functions.

“When researchers follow people across their lives, they find that those who live cognitively active, socially connected lives and maintain healthy lifestyles are less likely to suffer debilitating illness and early cognitive decline,” as the statement describes it. “In psychology,” the scientists note, “it is good scientific practice to combine information provided by many tasks to generate an overall index representing a given ability.” The same standards should be applied to the brain game industry, the experts maintain. But this has not been the case, they add. “To date, there is little evidence that playing brain games improves underlying broad cognitive abilities, or that it enables one to better navigate a complex realm of everyday life,” the participants state.

    One reason is the so-called “file drawer effect,” which refers to the practice of researchers filing away studies with negative outcomes. For example, brain game studies proclaiming even modest positive results are more likely to be published, cited, and publicized than ones that do not produce those affirming results.

Source: Stanford University

The post Scientists are skeptical of ‘brain games’ for older adults appeared first on Futurity.
    07

    > more news in this sector

Feathers have ‘custom’ shafts for flight

The shafts of feathers are made of a multi-layered fibrous composite material—a lot like carbon fiber—that lets the feather bend and twist in flight. Since their appearance more than 150 million years ago, feather shafts (rachises) have evolved to be some of the lightest, strongest, and most fatigue-resistant natural structures. However, relatively little work has been done on their morphology, especially from a mechanical perspective, and never at the nanoscale. The study, which appears in the Journal of the Royal Society Interface, is the first to use nano-indentation, a materials-testing technique, on feathers.
It reveals that the number, proportion, and relative orientation of rachis layers are not fixed, as previously thought, and vary according to flight style. “We started looking at the shape of the rachis and how it changes along the length of it to accommodate different stresses.

Then we realized that we had no idea how elastic it was, so we indented some sample feathers,” says lead author Christian Laurent of Ocean and Earth Science at the University of Southampton. “Previously, the only mechanical work on feathers was done in the 1970s, but under the assumption that the material properties of feathers are the same when tested in different directions, known as isotropic—our work has now invalidated this.” The researchers tested the material properties of feathers from three birds of different species with markedly different flight styles: the Mute Swan (Cygnus olor), the Bald Eagle (Haliaeetus leucocephalus), and the partridge (Perdix perdix). “Our results indicate that the number, and the relative thickness, of layers around the circumference of the rachis and along the feather’s length are not fixed, and may vary either in order to cope with the stresses of flight particular to the bird or to the lineage that the individual belongs to,” adds Laurent, who led the study as part of his research degree in vertebrate paleontology.
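For context on the technique: nano-indentation presses a tiny calibrated tip into the sample and records a force–depth curve. In the standard Oliver–Pharr analysis (general materials-science background, not a formula quoted from the study), the reduced elastic modulus follows from the unloading stiffness \(S = dP/dh\) and the projected contact area \(A_c\):

\[ E_r = \frac{\sqrt{\pi}}{2}\,\frac{S}{\sqrt{A_c}} \]

Indenting at different positions and orientations on the rachis and obtaining different moduli is what reveals the anisotropy the authors describe.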
    The researchers hope to fully model feather functions and link morphological aspects to particular flight styles and lineages.

    Those findings would have implications for paleontology and engineering. “We hope to be able to scan fossil feathers and finally answer a number of questions—What flew first? Did flight start from the trees down, or from the ground up? Could Archaeopteryx fly? Was Archaeopteryx the first flying bird?” asks Laurent.
“In terms of engineering, we hope to apply our future findings in materials science to yacht masts and propeller blades, and to apply the aeronautical findings to build better micro-air vehicles in a collaboration [with] engineers at the university.”

Source: University of Southampton

The post Feathers have ‘custom’ shafts for flight appeared first on Futurity.
    08

    > more news in this sector

Sharing the planet: Hen harrier conservation and grouse shooting

Dr Juliette Young from the Centre for Ecology & Hydrology was on Radio 4 earlier this week, being interviewed for the Shared Planet programme. This week’s episode looked at conflicts between people over wildlife and following the programme there has been a reasonable amount of online discussion of the issues raised, including a piece by former RSPB Conservation Director Mark Avery. One of the examples discussed during the recording was the conflict in a number of areas of the British Isles between hen harrier conservation interests and land management for grouse shooting. Dr Young’s comments in the programme built on research she and colleagues have carried out over the past ten years working on conflicts and stakeholder involvement in biodiversity conservation (e.g. Young et al., 2005, 2007, 2010; Redpath et al., 2013). The work involves speaking to a wide range of stakeholders with different perspectives on conflicts, including government advisers, scientists, conservation NGOs and land managers, including gamekeepers, who often feel portrayed by the media and other stakeholders in a negative light, despite their belief that their management can be beneficial to a variety of species. Dr Young said, “I sincerely hope that a solution can be found to ensure the conservation of hen harriers and other protected species.
Whilst I condemn illegal activities against protected species, my research has examined how, why and in which contexts conflicts emerge, and aims to analyse how shared understanding and solutions can be found.

My research therefore reflects a wide range of different interests and values, all of which need to be understood to navigate through complex conservation conflicts.” Dr Young’s research has shown that conflicts can be managed effectively through dialogue among all relevant stakeholders and this can lead to shared solutions where different human activities, including conservation, co-exist in the managed landscape (see also Redpath et al., 2013; Young et al., 2010). During the Shared Planet recording she highlighted one good example where this approach has succeeded. When the Scottish government implemented a seal conservation order in 2002 this was a catalyst by which all the local groups felt affected and understood the need to make changes. This catalyst led to the Moray Firth Seal Management Plan (MFSMP) that focused on the need to balance seal and salmon conservation. A local champion emerged who brought all relevant stakeholders, and their knowledge, together, to seek a shared solution to the conflict (Young et al., 2012, 2013a and b).

Additional information

Dr Juliette Young is a social scientist at CEH’s site near Edinburgh, where she has been working since 2002.

She initially trained as an ecologist at the University of London (BSc) and University of Leeds (MSc), spent time rehabilitating chimpanzees in Sierra Leone and chasing fig wasps in the Cook Islands before joining CEH. She has a PhD in political science.

Her current work focuses on four main areas:
• public attitudes towards biodiversity, including views on how it should be or is managed, and the values associated with biodiversity
• the communication between scientists and decision-makers
• the understanding of human conflicts over nature conservation
• the role of stakeholder engagement in nature conservation, particularly in the context of protected areas and species
References

Young, J., Jordan, A., Searle, K.R., Butler, A., Simmons, P. 2013a. Framing scale in participatory biodiversity management may contribute to more sustainable solutions. Conservation Letters 6(5): 333-340.

Young, J., Jordan, A., Searle, K.R., Butler, A., Chapman, D., Simmons, P., Watt, A.D. 2013b. Does stakeholder involvement really benefit biodiversity conservation? Biological Conservation 158: 359-370.

Redpath, S., Young, J., Evely, A., Adams, W.M., Sutherland, W.J., Whitehouse, A., Amar, A., Lambert, R., Linnell, J., Watt, A.D. 2013. Understanding and managing conflicts in biodiversity conservation. Trends in Ecology and Evolution 28(2): 100-109.

Young, J., Butler, J.R.A., Jordan, A., Watt, A.D. 2012. Less government intervention in biodiversity management: Risks and opportunities. Biodiversity and Conservation 21(4): 1095-1100.

Young, J., Marzano, M., White, R.M., McCracken, D.I., Redpath, S.M., Carss, D.N., Quine, C.P., Watt, A.D. 2010. The emergence of biodiversity conflicts from biodiversity impacts: characteristics and management strategies. Biodiversity & Conservation 19(14): 3973-3990.

Henle, K., Alard, D., Clitherow, J., Cobb, P., Firbank, L., Kull, T., McCracken, D., Moritz, R.F.A., Niemelä, J., Rebane, M., Wascher, D., Watt, A., Young, J. 2008. Identifying and managing the conflicts between agriculture and biodiversity conservation in Europe – a review. Agriculture, Ecosystems & Environment 124(1-2): 60-71.

Young, J., Richards, C., Fischer, A., Halada, L., Kull, T., Kuzniar, A., Tartes, U., Uzunov, U. and Watt, A. 2007. Conflicts between biodiversity conservation and human activities in the Central and Eastern European Countries. Ambio 36(7): 545-550.

Young, J., Watt, A., Nowicki, P., Alard, D., Clitherow, J., Henle, K., Johnson, R., Laczko, E., McCracken, D., Matouch, S., Niemelä, J. 2005. Towards sustainable land use: identifying and managing the conflicts between human activities and biodiversity conservation in Europe. Biodiversity and Conservation 14(7): 1641-1661.

Many scientific publications are on subscription websites. Authors may be able to send individuals full copies of their papers.
    09

    > more news in this sector

    Revised Alabama Maps Feature New Design

Summary:
• US Topo maps now have a crisper, cleaner design, enhancing readability of maps for online and printed use
• Newly designed US Topo maps covering Alabama are now available online for free download

Contact Information: Mark Newell, APR (Phone: 573-308-3850); Bob Davis (Phone: 573-308-3554)

US Topo maps now have a crisper, cleaner design, enhancing readability of maps for online and printed use. Map symbols are easier to read over the digital aerial photograph layer whether the imagery is turned on or off. Improvements to symbol definitions (color, line thickness, line symbols, area fills), layer order, and annotation fonts are additional features of this latest update. The maps also have transparency for some features and layers to increase visibility of multiple competing layers.
This new design was launched earlier this year and is now part of the new US Topo quadrangles for Alabama (840 maps), replacing the first edition US Topo maps for the state. “Users in our state are very excited about the three year revision cycle of the US Topo maps,” said George Heleine, the Geospatial Liaison for Alabama and Mississippi. “The Alabama Department of Transportation says that due to increased growth within the state, updated maps will significantly increase their utility across all disciplines within State Government.”
US Topo maps are updated every three years. The initial round of coverage of the 48 conterminous states was completed in September of 2012. Hawaii and Puerto Rico maps have recently been added. Nearly 1,000 new US Topo maps for Alaska have been added to the USGS Map Locator & Downloader, but complete coverage of Alaska will take several years.

Re-design enhancements and new features:
• Crisper, cleaner design improves online and printed readability while retaining the look and feel of traditional USGS topographic maps
• New functional road classification schema has been applied
• A slight screening (transparency) has been applied to some features to enhance visibility of multiple competing layers
• Updated free fonts that support diacritics
• New PDF Legend attachment
• Metadata formatted to support multiple browsers
• New shaded relief layer for enhanced view of the terrain
• Military installation boundaries, post offices and cemeteries
• The railroad dataset is much more complete

The previous versions of US Topo maps for these states, published in 2011, can still be downloaded from USGS web sites. Also, scanned images of older topographic maps from the period 1884-2006 can be downloaded from the USGS Historical Topographic Map Collection. These scanned images of legacy paper maps are available for free download from The National Map and the USGS Map Locator & Downloader website. US Topo maps are created from geographic datasets in The National Map, and deliver visible content such as high-resolution aerial photography, which was not available on older paper-based topographic maps.
The new US Topo maps also provide modern technical advantages that support wider and faster public distribution and on-screen geographic analysis tools for users. The new digital electronic topographic maps are delivered in GeoPDF® image software format and may be viewed using Adobe Reader, available as a no-cost download.

For more information, go to: http://nationalmap.gov/ustopo/

Image: 2014 US Topo map of the Florence, Alabama area with the shaded relief and image layer turned on.
Image: 1914 USGS legacy topographic map of the Muscle Shoals, Alabama area.
    10

    > more news in this sector

    Wellcome Trust Science Writing Prize 2014: The winners are…

    The winners of the fourth Wellcome Trust Science Writing Prize were announced this evening at a ceremony held at Wellcome Trust HQ in London. With over 600 entries to choose from, picking a single winner in each category was no simple task… “Communicating with the public in getting their insight into the work you do can help inform your research questions,” says Wellcome Trust Director Jeremy Farrar. That’s one of the reasons it is so important for us to nurture the next generation of science writers and encourage scientists to think about ways of communicating their work. The Wellcome Trust science writing prize, run in conjunction with the Guardian and the Observer, is an opportunity for aspiring science communicators to write about research that inspires them, and we’re always delighted with the high quality and number of entries that we receive.
    Split into two categories – professional scientists (postgraduate and above) and non-professionals (including undergraduates) – the entries are read by over 40 representatives from the Wellcome Trust, the Guardian and the Observer before a final shortlist is selected by a judging panel, this year headed up by materials scientist and broadcaster Mark Miodownik. Competition is stiff, so we congratulate all of those who earned a place on the shortlist.

Pushed to make a decision, after a passionate, four-hour-long meeting, the judges picked Richard Stephens and Kate Széll as this year’s winners. Richard’s piece on smiling – ‘Don’t say cheese, say cheeks’ – earned him the crown (okay, trophy!) in the professional scientists category, while Kate’s article on facial blindness entitled ‘Prosopagnosia – a common problem, commonly overlooked’ was the winner of the non-professional and undergraduate category. Presenter Mark Miodownik commented on Kate’s piece that the judges were “quite bamboozled by it”, adding that “we fact checked it all and it checked out”. Both winners were presented with prize money of £1,000 and their articles will appear in full in the Guardian/the Observer and on the Wellcome Trust blog in the coming weeks.
Science and health stories are on the front pages on a daily basis with the continuing Ebola outbreak in West Africa, and there is an ongoing need for people who are able to accurately and engagingly explain scientific issues.

    We hope that the Wellcome Trust Science Writing Prize, schemes such as the Wellcome/New Statesman Scholarship and the science journalism funding we offer will help to develop the next generation of science journalists and communicators. For tips on how to start a science blog, how to avoid common mistakes in science writing, and even how to pitch to an editor, check out our science writing ‘How to’ guides on the blog.
    11

    > more news in this sector

    Gordon and Betty Moore Foundation Awards Data Science Grant to Carnegie Mellon Researcher

By Byron Spice / 412-268-9068

PITTSBURGH—The Gordon and Betty Moore Foundation has announced the selection of Carl Kingsford, associate professor in Carnegie Mellon University's Lane Center for Computational Biology, as one of 14 recipients of its Moore Investigators in Data-Driven Discovery awards. The five-year, $1.5 million grant will support Kingsford's efforts to develop efficient new methods for searching the massive amounts of DNA and RNA sequencing data now available worldwide. Many insights into the most basic and important processes of life are awaiting discovery within that data.
    The databases are of such scale, however, that existing search methods are inadequate to fully explore them. For instance, the National Institutes of Health alone archive more than 2 quadrillion bases of sequence.

    The latest Data-Driven Discovery Awards, totaling $21 million over five years, are unrestricted awards that will enable Kingsford and his fellow recipients to make a profound impact on scientific research by unlocking new types of knowledge and advancing new data science methods across a wide spectrum of disciplines.
    The awards are part of a five-year, $60 million Data-Driven Discovery Initiative within the Gordon and Betty Moore Foundation's Science Program. The initiative — one of the largest privately funded data scientist programs of its kind — is committed to enabling new types of scientific breakthroughs by supporting interdisciplinary, data-driven researchers. "Science is generating data at unprecedented volume, variety and velocity, but many areas of science don't reward the kind of expertise needed to capitalize on this explosion of information," said Chris Mentzel, program director of the Data-Driven Discovery Initiative.
    "We are proud to recognize these outstanding scientists, and we hope these awards will help cultivate a new type of researcher and accelerate the use of interdisciplinary, data-driven science in academia.

    " Kingsford directs a computational biology group that works on understanding protein interactions, gene expression, chromatic structure and viral evolution. His team collaborates across disciplines to create efficient computational methods that can deal with diverse and high-throughput datasets.
    Earlier this year, he and a collaborator at the University of Maryland announced Sailfish, a new computational method that dramatically speeds up estimates of gene activity from RNA sequencing (RNA-seq) data. With this method, estimates of gene expression that previously took many hours can be completed in a few minutes, with accuracy that equals or exceeds previous methods.
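Sailfish's actual algorithm involves considerably more machinery (perfect hashing of k-mers and an expectation-maximization step, among other things), but the core alignment-free idea the paragraph describes, matching k-mers from reads against an index of transcript k-mers instead of aligning whole reads, can be sketched in a few lines. All sequences and names below are hypothetical toy data:

```python
# Toy sketch of alignment-free, k-mer based transcript quantification.
# Illustrative only -- not Sailfish's actual algorithm.
from collections import Counter

K = 5  # toy k-mer size; real tools use k around 20-31

def kmers(seq, k=K):
    """All overlapping substrings of length k."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

# Hypothetical transcripts and sequencing reads.
transcripts = {
    "tx1": "ACGTACGTGGCATTAG",
    "tx2": "TTGACCGTACGTAAGC",
}
reads = ["ACGTACGT", "CGTACGTA", "TTGACCGT"]

# Index: which transcripts contain each k-mer.
index = {}
for name, seq in transcripts.items():
    for km in set(kmers(seq)):
        index.setdefault(km, set()).add(name)

# Credit each read's k-mers to the transcripts that contain them;
# these raw counts stand in for the EM step a real quantifier runs.
counts = Counter()
for read in reads:
    for km in kmers(read):
        for name in index.get(km, ()):
            counts[name] += 1

print(counts)  # rough, un-normalized evidence of transcript abundance
```

Because every step is a hash lookup rather than an alignment, the work scales with the number of k-mers in the reads, which is the intuition behind the hours-to-minutes speedup described above.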

    "To me, Carl's work represents an outstanding example of the best approach to computational biology: careful framing of a biological problem followed by rigorous development and application of appropriate computer science methods," said Robert F. Murphy, director of the Lane Center for Computational Biology. "As the volume and complexity of biomedical data increases exponentially, his scalable approaches and commitment to open source software will be critical to enabling new and clinically important discoveries. " Kingsford is the recipient of an Alfred P.
    Sloan Research Fellowship in computational and evolutionary molecular biology and a National Science Foundation CAREER Award. He received a Ph.

    D.
    in computer science from Princeton University and also trained at Duke University. ### Carl Kingsford (pictured above), associate professor in Carnegie Mellon University's Lane Center for Computational Biology, has received a five-year, $1. 5 million grant to support his efforts to develop efficient new methods for searching the massive amounts of DNA and RNA sequencing data now available worldwide.
    12

    > more news in this sector

    Public Health News Roundup: October 22

EBOLA UPDATE: WHO Plans on Ebola Vaccine Tests in January
(NewPublicHealth is monitoring the public health crisis in West Africa.) The World Health Organization plans to begin testing two experimental Ebola vaccines in West Africa by January. The vaccines will likely be tested on more than 20,000 frontline health care workers and others in the region. The global health agency also announced that a blood serum treatment could be available for use in Liberia within two weeks.
    Read more on Ebola.

Study: Automated Tracking Improves Vaccine Compliance in Health Care Workers
Automated tracking of influenza vaccinations increases vaccination compliance in health care personnel while also reducing the workload burden on human resources and occupational health staff, according to a new study in the journal Infection Control and Hospital Epidemiology. Researchers analyzed data on nearly 7,000 people included in a mandatory vaccination program, finding that “automated reminders and tracking accounted for more than 98 percent of compliance among healthcare personnel.” "Mandatory vaccination programs help protect vulnerable patients, but can be tremendously time and resource dependent," said Susan Huang, MD, MPH, an author of the study, in a release. "By successfully automating a system to track and provide feedback to healthcare personnel who have not received their seasonal flu vaccine, we are providing safer places for care and reducing the administrative burden of our mandatory vaccination program."

Read more on vaccines.

Study: Living with a Smoker is as Bad as Living in a Highly Polluted City
Living with a smoker is the same as living in a smoke-free home in a heavily polluted city such as Beijing or London, with the non-smokers exposed to three times the World Health Organization’s officially recommended safe levels of damaging air particles, according to a new study in the journal Tobacco Control. In a collection of four studies, researchers determined that the concentration of fine particulate matter was approximately 10 times higher in smoking homes than it was in non-smoking homes. “Smokers often express the view that outdoor air pollution is just as much a concern as the second-hand smoke in their home,” said Sean Semple, MD, of the University of Aberdeen, in a release. “These measurements show that second-hand tobacco smoke can produce very high levels of toxic particles in your home: much higher than anything experienced outside in most towns and cities in the UK. Making your home smoke-free is the most effective way of dramatically reducing the amount of damaging fine particles you inhale.”

Read more on air quality.
    13

    > more news in this sector

    Smart meter analytics: Gateway to power distribution network insights

    The Utility Analytics Week conference opens today in California -- a great opportunity for utilities to consider how they can progress toward realizing full benefits from smart grid technology. Fortunately, applying analytics to smart meter data can provide a useful window into what's happening on a distribution network, even before deploying other technology for grid intelligence. For too many utilities, current conditions on their power distribution network are a "black box" -- difficult to see and comprehend. Real-time insight into a distribution network can help utilities quickly diagnose and respond to underlying issues, adapt to changes in customer demand and behavior, identify emerging opportunities, and better plan for the future.
    Comprehensive distribution network monitoring involves installing sensors and communication equipment at substations and other distribution assets in the field. But as an initial step, utilities that are rolling out smart meters today can leverage this deployment to provide some key grid analytics (in addition to traditional meter-to-cash benefits).

    Applying analytics to smart meter data can help utilities operate more smoothly and avoid considerable expenses associated with distribution.
Specifically, smart meter data can help utilities:
• Understand and manage the impact of renewables on the grid. For customers with on-site solar, applying analytics to their smart meter data can help utilities monitor reverse load and power quality issues.
• Monitor equipment stress and minimize outages. With smart meter data and analytics, utilities can create "virtual meters" to track specific distribution assets. Tracking transformer loading in near-real-time can indicate which transformers, fuses, switches, or other assets are being overloaded, and for how long. This helps utilities estimate likely outage risk as well as impacts to equipment lifespan.
• Combat energy theft. Energy thieves keep getting more creative about concealing illicit energy use. Smart meter analytics can help spot energy theft automatically: machine learning allows analytics to spot emerging suspicious patterns that might be too subtle for a utility analyst to notice (a simple sketch of this idea follows below). Analytics can also provide supporting documentation for decisions and actions taken in investigations.

To gain these useful grid insights -- and more -- utilities can apply meter data analytics right from the start of a smart meter deployment.
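As an illustration of the kind of pattern-based screening described in the list above, here is a hypothetical sketch (not the Siemens EnergyIP product, and far simpler than a production machine-learning model) of flagging meters whose usage drops suspiciously below their own history:

```python
# Hypothetical sketch: flag meters whose consumption falls far below
# their own trailing baseline -- one simple screen for possible
# meter tampering. Not a production analytics implementation.
import statistics

def flag_suspicious(daily_kwh, window=30, drop_ratio=0.5):
    """Return indices of days where usage is below drop_ratio times
    the trailing median; a sudden sustained drop can indicate theft."""
    flags = []
    for i in range(window, len(daily_kwh)):
        baseline = statistics.median(daily_kwh[i - window:i])
        if baseline > 0 and daily_kwh[i] < drop_ratio * baseline:
            flags.append(i)
    return flags

# Hypothetical meter readings: steady usage, then an abrupt drop.
readings = [10.2, 9.8, 10.5, 11.0, 10.1] * 8 + [3.1, 2.9, 3.0, 2.8]
print(flag_suspicious(readings))  # -> indices of the suspicious days
```

A production system would learn per-customer seasonal baselines and combine many such signals, but the principle is the same: compare each meter against an expected pattern and surface the outliers for human investigation.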
    Treating analytics as an afterthought sacrifices substantial potential savings and operational benefits throughout the network. The EnergyIP Analytics Suite from Siemens includes several tools that can yield immediate, actionable insights from smart meter data.
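
    To make the "virtual meter" idea concrete, here is a minimal sketch in Python. It is only an illustration under assumed inputs -- the interval feed, the meter-to-transformer mapping, and the ratings table below are hypothetical, and this is not the Siemens EnergyIP API. It rolls 15-minute demand readings up to the transformer level and flags intervals where aggregate demand exceeds the transformer's rating.

        from collections import defaultdict

        # Hypothetical 15-minute AMI feed: (meter_id, interval_start, kW demand).
        interval_reads = [
            ("M1", "2014-10-22T18:00", 3.2),
            ("M2", "2014-10-22T18:00", 4.1),
            ("M3", "2014-10-22T18:00", 5.0),
        ]

        # Assumed connectivity model and nameplate ratings.
        meter_to_transformer = {"M1": "T7", "M2": "T7", "M3": "T9"}
        transformer_rating_kw = {"T7": 5.0, "T9": 25.0}

        def virtual_meter_load(reads, mapping):
            """Aggregate metered demand per (transformer, interval): the 'virtual meter'."""
            load = defaultdict(float)
            for meter_id, interval, kw in reads:
                load[(mapping[meter_id], interval)] += kw
            return load

        # Flag intervals where the rolled-up demand exceeds the rating.
        for (xfmr, interval), kw in sorted(virtual_meter_load(interval_reads, meter_to_transformer).items()):
            rating = transformer_rating_kw[xfmr]
            if kw > rating:
                print(f"{xfmr} overloaded at {interval}: {kw:.1f} kW vs {rating:.1f} kW rating")

    Keeping these per-transformer time series also yields how long each overload lasts, which feeds the outage-risk and equipment-lifespan estimates mentioned above.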

    Learn more in this free Siemens e-book: Getting Smart about Smart Meter Analytics.
    14

    > more news in this sector

    Recommended Reading: Cities Take the Lead in Public Health Advances

    Bruce Katz, vice president and director of the Metropolitan Policy Program at the Brookings Institution, recently spoke at the Mailman School of Public Health at Columbia University about his thesis that in the absence of federal leadership, cities are taking the lead on public health innovation in many ways, including passing new laws that address public health concerns and partnering with university research centers. Mailman recently published an interview with Katz about his belief that cities are driving public health changes and improvements.

    One topic Katz addressed was cities working together to improve population health:

    Cities watch each other closely. When one innovates, others replicate the innovation or adapt and tailor it to their own circumstances. We used to think if you were going to have dramatic change in a country on any number of issues, you needed the scale of the national government. Today it's more likely that a city will innovate in such a way that other cities can say, “We can do that, and maybe we can do it better.” For example, when Portland, Oregon, finds a way to promote itself as the place that builds green cities and exports sustainable products and services to growing cities in Latin America or Asia, other cities begin to think, “Wait a second, we have our own clusters of clean energy or clean economy. Perhaps we can do the same branding and marketing and export promotion.” The success of individual cities, or what Michael Bloomberg has done with C40 [a city-centric climate leadership group], rests on this notion that cities can learn from each other, share with each other, and then replicate the best innovations to have impact in their locales.

    That's a very different, 21st-century model of how society goes about solving its major problems. Read the full interview.
    15

    > more news in this sector

    LEGO-inspired microfluidic blocks from 3D printer

    Image: modular fluidic and instrumentation components
    Pictured is a microfluidic system assembled from modular components that were fabricated using 3D printing at the USC Viterbi School of Engineering. Krisna C. Bhargava et al. used stereolithographic printing techniques to manufacture standardized, interchangeable fluidic blocks of about 1 cm³ and assembled them by hand to produce variations of complex 3D circuits.
    Circuit behavior was predicted using design rules analogous to those used in electronic circuit design, and the modular approach alleviated the design limitations imposed by 2D circuit layouts. Microfluidic systems promise to improve the analysis and synthesis of materials, biological or otherwise, by lowering the required volume of fluid samples, offering a tightly controlled fluid-handling environment, and simultaneously integrating various chemical processes (applications include DNA analysis, pathogen detection, clinical diagnostic testing and synthetic chemistry).

    To build these systems, designers have depended on microfabrication techniques that restrict them to arranging their designs in two dimensions and to fabricating the entire device in a single step.
    This study introduces modular, reconfigurable components containing fluidic and sensor elements adaptable to many different microfluidic circuits. These elements can be assembled to allow for 3D routing of channels. This assembly approach allows for the application of network analysis techniques like those used in classical electronic circuit design, facilitating the straightforward design of predictable flow systems.
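
    As a concrete illustration of that electronic-circuit analogy -- a sketch only, with assumed channel dimensions, drive pressure, and a water-viscosity default that are not values from the paper -- laminar pressure-driven flow obeys a fluidic Ohm's law, Q = ΔP / R_hyd, so series and parallel channel networks reduce exactly like resistor networks:

        import math

        def hydraulic_resistance_circular(length_m, radius_m, viscosity_pa_s=1.0e-3):
            """Hagen-Poiseuille resistance of a circular channel (default: water)."""
            return 8.0 * viscosity_pa_s * length_m / (math.pi * radius_m**4)

        def series(*resistances):
            # Series resistances add, as in an electronic circuit.
            return sum(resistances)

        def parallel(*resistances):
            # Parallel resistances combine by reciprocal sum, as for resistors.
            return 1.0 / sum(1.0 / r for r in resistances)

        # Illustrative network: a 1 cm feed channel of 100 um radius in series
        # with two identical branches in parallel.
        r = hydraulic_resistance_circular(0.01, 100e-6)
        network = series(r, parallel(r, r))

        delta_p = 1000.0       # assumed 1 kPa drive pressure
        q = delta_p / network  # volumetric flow rate in m^3/s, from Q = dP / R_hyd
        print(f"Total flow: {q * 1e9:.2f} uL/s")

    Because the resistance of every block is known in advance, a designer can predict the flow through each branch of an assembled circuit before anything is printed, which is the kind of predictability the study's design rules aim to provide.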
    The authors devised computer models for eight modular fluidic and instrumentation components (MFICs), each of which performs a simple task.

    They said that their work in developing these MFICs marks the first time that a microfluidic device has been broken down into individual components that can be assembled, disassembled and re-assembled repeatedly. They attribute their success to recent breakthroughs in high-resolution, micron-scale 3D printing technology.
    Krisna C. Bhargava, Bryant Thompson, and Noah Malmstadt, “Discrete elements for 3D microfluidics,” PNAS 2014, 111(42): 15013–15018, doi: 10.1073/pnas.1414764111.
    16

    > more news in this sector

    Astellas Announces Revision of Package Insert in Japan for XTANDI® (enzalutamide) Capsules, a Prostate Cancer Treatment

    Tokyo, Japan – October 22, 2014 – Astellas Pharma Inc. (TSE: 4503) (“Astellas”) announced the revision of the package insert for the oral androgen receptor signaling inhibitor XTANDI® Capsules 40mg (generic name: enzalutamide, “XTANDI”). Based on the results of the Phase 3 AFFIRM trial for the treatment of advanced castration-resistant prostate cancer (“CRPC”) in patients who had previously received docetaxel (chemotherapy), XTANDI, which is being jointly developed and commercialized with the US-based company Medivation, Inc. (NASDAQ: MDVN), obtained marketing approval in Japan for use in CRPC treatment in March 2014 and went on sale in May 2014. The Phase 3 PREVAIL trial, which evaluated XTANDI against placebo in chemotherapy-naïve metastatic CRPC patients, demonstrated the clinical benefits and favorable tolerability of XTANDI in metastatic CRPC patients who have not received chemotherapy.
    Based on the results of the PREVAIL trial, the “Indication” item of the package insert has been revised: the sentence “The efficacy and safety of the drug have not been established in patients with prostate cancer who have not received chemotherapy” has been deleted from the “Precautions regarding indication” item. The “Side effects” and “Clinical results” items of the package insert have also been revised.

    In addition to these revisions, “thrombocytopenia” has been added to the “Significant side effects” section of the “Side effects” item in accordance with the authorities' instructions.
    The revision of the “Precautions regarding indication” item of the package insert triggers a $45 million milestone payment to Medivation under its collaboration agreement with Astellas.
    17

    > more news in this sector

    James Webb Space Telescope's Heart Survives Deep Freeze Test

    After 116 days of being subjected to extremely frigid temperatures like those in space, the heart of the James Webb Space Telescope, the Integrated Science Instrument Module (ISIM) and its sensitive instruments, emerged unscathed from the thermal vacuum chamber at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. The Webb telescope's images will reveal the first galaxies forming 13.5 billion years ago. The telescope will also pierce through interstellar dust clouds to capture stars and planets forming in our own galaxy.
    At the telescope's final destination in space, one million miles away from Earth, it will operate at an incredibly cold temperature of minus 387 degrees Fahrenheit (40 kelvins). This is 260 degrees Fahrenheit colder than any place on the Earth’s surface has ever been.
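
    For reference, the two quoted figures agree under the standard kelvin-to-Fahrenheit conversion:

        T_F = (9/5) * T_K - 459.67 = (9/5) * 40 - 459.67 ≈ -387.7 °F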

    To create temperatures that cold on Earth, the team uses the massive thermal vacuum chamber at Goddard called the Space Environment Simulator, or SES, which duplicates the vacuum and extreme temperatures of space.
    This 40-foot-tall, 27-foot-diameter cylindrical chamber eliminates the tiniest trace of air with vacuum pumps and uses liquid nitrogen and even colder liquid helium to drop the temperature, simulating the space environment. The James Webb Space Telescope is the scientific successor to NASA's Hubble Space Telescope. It will be the most powerful space telescope ever built.
    Webb is an international project led by NASA with its partners, the European Space Agency and the Canadian Space Agency.

    > More: NASA Webb's Heart Survives Deep Freeze Test. Image credit: NASA/Chris Gunn
    18

    > more news in this sector

    Formation and large-scale confinement of jets emitted by young stars finally elucidated

    Image: jets of matter emitted by young stars
    An international team of scientists has succeeded in explaining the formation and propagation over astronomical distances of jets of matter emitted by young stars--one of the most fascinating mysteries of modern astronomy. Using a patented experimental device and large-scale numerical simulations, the team obtained data consistent with astrophysical observations.

    Full story at http://www.inrs.ca/english/actualites/formation-and-large-scale-confinement-jets-emitted-young-stars-elucidated. Source: INRS (Institut national de la recherche scientifique). This is an NSF News From the Field item.
    19

    > more news in this sector

    Imaging electric charge propagating along microbial nanowires

    Image: the microbe Geobacter pili
    University of Massachusetts Amherst physicists working with Derek Lovley and colleagues report in the current issue of Nature Nanotechnology that they've used a new imaging technique, electrostatic force microscopy, to resolve the biological debate with evidence from physics, showing that electric charges do indeed propagate along microbial nanowires just as they do in carbon nanotubes, a highly conductive man-made material.

    Full story at http://www.umass.edu/newsoffice/article/imaging-electric-charge-propagating-along. Source: University of Massachusetts Amherst. This is an NSF News From the Field item.
    20

    > more news in this sector
