Saturday, 30 November 2019

Lack of oxygen doesn't kill infant brain cells

Nearly 15 million babies are born prematurely, or before 37 weeks of pregnancy, around the world each year. When born too early, a baby's immature respiratory center in the brain often fails to signal it to breathe, resulting in low oxygen levels, or hypoxia, in the brain.
Research published in the Journal of Neuroscience shows that even a brief 30-minute period of hypoxia is enough to persistently disrupt the structure and function of the brain region known as the hippocampus, which is vital for learning and memory.
"Our findings raise new concerns about the vulnerability of the preterm brain to hypoxia. They are concerning for the long-term impact that oxygen deprivation can have on the ability of these preterm babies to learn as they grow to school age and adulthood," said the study's principal investigator, Stephen Back, M.D., Ph.D., Clyde and Elda Munson Professor of Pediatric Research and Pediatrics, OHSU School of Medicine, OHSU Doernbecher Children's Hospital.
In the neonatal intensive care unit, preemies can experience up to 600 short but impactful periods of hypoxia each week. Consequently, more than one-third of babies who survive preterm birth are likely to have smaller brains, presumably due to brain cell loss, compared with the brains of full-term infants. This can increase the risk of significant life-long neurodevelopmental challenges that will affect learning, memory, attention and behavior.
Using a twin preterm fetal sheep model, Back and colleagues studied the impact of both hypoxia alone, as well as in combination with ischemia -- or insufficient blood flow -- on the developing hippocampus. The results confirm that, similar to human preterm survivors, growth of the hippocampus is impaired. However, brain cells do not die as previously believed. Rather, hippocampal cells fail to mature normally, causing a reduction in long-term potentiation, or the cellular basis of how the brain learns.
Remarkably, the severity of the hypoxia predicted the degree to which cells in the hippocampus failed to mature normally, explains Back. These findings are all the more unexpected because it was not appreciated that the preterm hippocampus was already capable of these learning processes.
"We want to understand next how very brief or prolonged exposure to hypoxia affects the ability for optimal learning and memory," says Back. "This will allow us to understand how the hippocampus responds to a lack of oxygen, creating new mechanisms of care and intervention both at the hospital, and at home."

Story Source:
Materials provided by Oregon Health & Science University. Note: Content may be edited for style and length.

Microbes harvest electrons: Novel process discovered

Bacteria illustration

Ever since scientists discovered that certain microbes can get their energy from electrical charges, researchers have wondered how they do it.
Bacteria don't have mouths, so they need another way to bring their fuel into their bodies. New research from Washington University in St. Louis reveals how one such bacterium pulls in electrons straight from an electrode source. The work from the laboratory of Arpita Bose, assistant professor of biology in Arts & Sciences, was published Nov. 5 in the scientific journal mBio.
"The molecular underpinning of this process has been difficult to unravel until our work," Bose said. "This is mostly due to the complex nature of the proteins involved in this process. But now, for the first time, we understand how phototrophic microbes can accept electrons from solid and soluble substances."
Dinesh Gupta, a PhD candidate in the Bose laboratory, is the first author on this new study. "I was excited when we found that these phototrophic bacteria use a novel processing step to regulate the production of a key electron-transfer protein involved in this process," Gupta said. "This study will aid in designing a bacterial platform where bacteria can feed on electricity and carbon dioxide to produce value-added compounds such as biofuels."
Getting the electricity across the outer layer of the bacteria is the key challenge. This barrier is both nonconductive and impermeable to insoluble iron minerals and/or electrodes.
Bose and her collaborators, including Robert Kranz, professor of biology, showed that the naturally occurring strain Rhodopseudomonas palustris TIE-1 builds a conduit to accept electrons across its outer membrane. The bacterium relies on an iron-containing helper molecule called a deca-heme cytochrome c. By processing this protein, TIE-1 can form an essential bridge to its electron source.
Extracellular electron uptake, or EEU, can help microbes to survive under nutrient-scarce conditions.
Now that Bose has documented the mechanism behind EEU, she hopes to use it as a biological marker to identify other electricity-eating bacteria in the wild. The findings will help researchers understand the importance of this functionality in metabolic evolution and microbial ecology.

Story Source:
Materials provided by Washington University in St. Louis. Original written by Talia Ogliore. Note: Content may be edited for style and length.

Pig-Pen effect: Mixing skin oil and ozone can produce a personal pollution cloud

Ozone can produce a personal pollution cloud


When ozone and skin oils meet, the resulting reaction may help remove ozone from an indoor environment, but it can also produce a personal cloud of pollutants that affects indoor air quality, according to a team of researchers.
In a computer model of indoor environments, the researchers show that a range of volatile and semi-volatile gases and substances is produced when ozone, a form of oxygen that can be toxic, reacts with the skin oils carried on soiled clothes. Some researchers have likened the resulting personal cloud to the one surrounding Pig-Pen, the famously untidy Peanuts comic strip character.
"When the ozone is depleted through human skin, we become the generator of the primary products, which can cause sensory irritations," said Donghyun Rim, assistant professor of architectural engineering and an Institute for CyberScience associate, Penn State. "Some people call this higher concentration of pollutants around the human body the personal cloud, or we call it the 'Pig-Pen Effect.'"
The substances that are produced by the reaction include organic compounds, such as carbonyls, that can irritate the skin and lungs, said Rim. People with asthma may be particularly vulnerable to ozone and ozone reaction products, he said.
According to the researchers, who reported their findings in a recent issue of Communications Chemistry, a Nature Research journal, skin oils contain substances such as squalene, fatty acids and wax esters. If a person wears the same clothes for too long -- for example, more than a day -- without washing, the clothes become more saturated with these oils, raising the chance of a reaction with ozone, which is an unstable gas.
"Squalene can react very effectively with ozone," said Rim. "Squalene has a higher reaction rate with ozone because it has a double carbon bond and, because of its chemical makeup, the ozone wants to jump in and break this bond."
Indoors, ozone concentration can range from 5 to 25 parts per billion -- ppb -- depending on how the air is circulating from outside to inside and what types of chemicals and surfaces are used in the building. In a polluted city, for example, the amount of ozone in indoor environments may be much higher.
"A lot of people think of the ozone layer when we talk about ozone," said Rim. "But, we're not talking about that ozone, that's good ozone. But ozone at the ground level has adverse health impacts."
Wearing clean clothes might be a good idea for many reasons, but it does not necessarily reduce exposure to ozone, said Rim. A single soiled t-shirt, for example, helps keep ozone out of the breathing zone by removing about 30 to 70 percent of the ozone circulating near a person.
"If you have clean clothes, that means you might be breathing in more of this ozone, which isn't good for you either," said Rim.
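The trade-off described above can be illustrated with a toy steady-state box model. This is an illustrative sketch with assumed parameter values, not the researchers' actual model: outdoor ozone enters through ventilation, and soiled clothing removes it at an assumed first-order rate.

```python
# Toy indoor-ozone box model: dC/dt = ach*(C_out - C) - k_cloth*C.
# All parameter values below are illustrative assumptions, not study data.

def indoor_ozone_ppb(outdoor_ppb, ach, clothing_loss_per_h):
    """Steady-state indoor ozone concentration (set dC/dt = 0)."""
    return ach * outdoor_ppb / (ach + clothing_loss_per_h)

outdoor = 40.0  # ppb, an assumed polluted-day outdoor level
ach = 0.5       # air changes per hour (ventilation rate), assumed

for k_cloth in (0.0, 0.25, 1.0):  # per-hour loss rate to soiled clothing
    c = indoor_ozone_ppb(outdoor, ach, k_cloth)
    removed = 1 - c / indoor_ozone_ppb(outdoor, ach, 0.0)
    print(f"k_cloth={k_cloth:.2f}/h -> {c:5.1f} ppb (removed {removed:.0%})")
```

With these assumed rates, the clothing term removes roughly one-third to two-thirds of the indoor ozone, consistent with the 30 to 70 percent range quoted for a soiled t-shirt; the catch is that the removed ozone has reacted into the "personal cloud" of byproducts near the wearer.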
Rim said that the research is one part of a larger project to better understand the indoor environment where people spend most of their time.
"The bottom line is that we, humans, spend more than 90 percent of our time in buildings, or indoor environments, but, as far as actual research goes, there are still a lot of unknowns about what's going on and what types of gases and particles we're exposed to in indoor environments," said Rim. "The things that we inhale, that we touch, that we interact with, many of those things are contributing to the chemical accumulations in our body and our health."
Rather than advising people whether to wear clean or dirty clothes, the researchers suggest that people should focus on keeping ground ozone levels down. Better building design and filtration, along with cutting pollution, are ways that could cut the impact of the Pig-Pen Effect, they added.
To build and validate the models, the researchers used experimental data from prior experiments investigating reactions between ozone and squalene, and between ozone and clothing. The researchers then analyzed further how the squalene-ozone reaction creates pollutants in various indoor conditions.
The team relied on computer modeling to simulate indoor spaces that vary with ventilation conditions and how inhabitants of those spaces manage air quality, Rim said.
In the future, the team may look at how other common indoor sources, such as candle and cigarette smoke, could affect the indoor air quality and its impact on human health.

Story Source:
Materials provided by Penn State. Original written by Matt Swayne. Note: Content may be edited for style and length.

Babies in the womb may see more than we thought

Illustration of fetus inside womb

By the second trimester, long before a baby's eyes can see images, they can detect light.
But the light-sensitive cells in the developing retina -- the thin sheet of brain-like tissue at the back of the eye -- were thought to be simple on-off switches, presumably there to set up the 24-hour, day-night rhythms parents hope their baby will follow.
University of California, Berkeley, scientists have now found evidence that these simple cells actually talk to one another as part of an interconnected network that gives the retina more light sensitivity than once thought, and that may enhance the influence of light on behavior and brain development in unsuspected ways.
In the developing eye, perhaps 3% of ganglion cells -- the cells in the retina that send messages through the optic nerve into the brain -- are sensitive to light and, to date, researchers have found about six different subtypes that communicate with various places in the brain. Some talk to the suprachiasmatic nucleus to tune our internal clock to the day-night cycle. Others send signals to the area that makes our pupils constrict in bright light.
But others connect to surprising areas: the perihabenula, which regulates mood, and the amygdala, which deals with emotions.
In mice and monkeys, recent evidence suggests that these ganglion cells also talk with one another through electrical connections called gap junctions, implying much more complexity in immature rodent and primate eyes than imagined.
"Given the variety of these ganglion cells and that they project to many different parts of the brain, it makes me wonder whether they play a role in how the retina connects up to the brain," said Marla Feller, a UC Berkeley professor of molecular and cell biology and senior author of a paper that appeared this month in the journal Current Biology. "Maybe not for visual circuits, but for non-vision behaviors. Not only the pupillary light reflex and circadian rhythms, but possibly explaining problems like light-induced migraines, or why light therapy works for depression."
Parallel systems in developing retina
The cells, called intrinsically photosensitive retinal ganglion cells (ipRGCs), were discovered only 10 years ago, surprising those like Feller who had been studying the developing retina for nearly 20 years. She played a major role, along with her mentor, Carla Shatz of Stanford University, in showing that spontaneous electrical activity in the eye during development -- so-called retinal waves -- is critical for setting up the correct brain networks to process images later on.
Hence her interest in the ipRGCs that seemed to function in parallel with spontaneous retinal waves in the developing retina.
"We thought they (mouse pups and the human fetus) were blind at this point in development," said Feller, the Paul Licht Distinguished Professor in Biological Sciences and a member of UC Berkeley's Helen Wills Neuroscience Institute. "We thought that the ganglion cells were there in the developing eye, that they are connected to the brain, but that they were not really connected to much of the rest of the retina, at that point. Now, it turns out they are connected to each other, which was a surprising thing."
UC Berkeley graduate student Franklin Caval-Holme combined two-photon calcium imaging, whole-cell electrical recording, pharmacology and anatomical techniques to show that the six types of ipRGCs in the newborn mouse retina link up electrically, via gap junctions, to form a retinal network that the researchers found not only detects light, but responds to the intensity of the light, which can vary nearly a billionfold.
Gap junction circuits were critical for light sensitivity in some ipRGC subtypes, but not others, providing a potential avenue to determine which ipRGC subtypes provide the signal for specific non-visual behaviors that light evokes.
"Aversion to light, which pups develop very early, is intensity-dependent," suggesting that these neural circuits could be involved in light-aversion behavior, Caval-Holme said. "We don't know which of these ipRGC subtypes in the neonatal retina actually contributes to the behavior, so it will be very interesting to see what role all these different subtypes have."
The researchers also found evidence that the circuit tunes itself in a way that could adapt to the intensity of light, which probably has an important role in development, Feller said.
"In the past, people demonstrated that these light-sensitive cells are important for things like the development of the blood vessels in the retina and light entrainment of circadian rhythms, but those were kind of a light on/light off response, where you need some light or no light," she said. "This seems to argue that they are actually trying to code for many different intensities of light, encoding much more information than people had previously thought."
The research was supported by the National Institutes of Health (NIH F31EY028022-03, RO1EY019498, RO1EY013528, P30EY003176).

Story Source:
Materials provided by University of California - Berkeley. Original written by Robert Sanders. Note: Content may be edited for style and length.

Using fungi to search for medical drugs

Wild mushrooms

An enormous library of products derived from more than ten thousand fungi could help us find new drugs. Researchers from the group of Jeroen den Hertog at the Hubrecht Institute, in collaboration with researchers from the Westerdijk Institute and Utrecht University, have set up this library and screened it for biologically active compounds. They tested the biological activity of these fungal products first using zebrafish embryos, which allow the analysis of effects on many cell types at the same time in a working body, and which are physiologically very similar to humans. The researchers have already found various known compounds, among them the cholesterol-lowering drug lovastatin. The library of fungal products offers ample opportunity to search for new drugs.
The results of this research were published on the 26th of November in the scientific journal Scientific Reports.
Fungal products
We constantly need new therapeutic compounds in the clinic, for reasons ranging from our increasing age, with its corresponding illnesses, to resistance against existing drugs. Fungi are an excellent but underexplored source of such compounds; one example is lovastatin, a compound produced by the fungus Aspergillus terreus that is used as a cholesterol-lowering drug. Jelmer Hoeksma, one of the researchers at the Hubrecht Institute, explains: "Every year new compounds produced by fungi are identified, but so far we have only investigated a very small subset of all existing fungi. This suggests that many more biologically active compounds remain to be discovered."
Ten thousand fungi
The collaboration with the Westerdijk Fungal Biodiversity Institute, home to the largest collection of live fungi in the world, enabled the researchers to set up a large library of filtrates derived from more than ten thousand different fungi. A filtrate contains all the products that the fungus excretes. To search for therapeutic compounds, the researchers investigated the effects of this large library of fungal products first on zebrafish embryos. The zebrafish embryos enabled the researchers to study effects on the whole body during development. Zebrafish are vertebrates that are physiologically very similar to humans and are often used to test drugs for a variety of disorders. Within a few days these embryos develop most of their organs, making biological activity of the fungal compounds readily detectable. In addition, comparison to known drugs may result in identification of new drugs and also point towards the underlying mechanisms of action of these compounds.
Pigmentation
The researchers found 1526 filtrates that contain biologically active compounds with an effect on zebrafish embryos, from which they selected 150 filtrates for further analysis. From these, they isolated 34 known compounds, including the cholesterol-lowering drug lovastatin, which was produced by the fungus Resinicium furfuraceum. It was previously unknown that this fungus produces lovastatin. In addition, the researchers found filtrates that affect pigmentation in zebrafish embryos. Other studies have shown that factors involved in pigmentation can also play a crucial role in the development of skin cancer. The researchers are currently isolating from the filtrates the active compounds that cause pigmentation defects in zebrafish embryos.
Tip of the iceberg
This study underlines the large variety of biologically active compounds that are produced by fungi and the importance of further investigating these compounds in the search for new drugs. Hoeksma: "The large library of fungal filtrates that we have set up can also be tested in many other systems, such as models for antibiotic resistance in bacteria and tumor development, making this study only the tip of the iceberg."

Story Source:
Materials provided by the Hubrecht Institute. Note: Content may be edited for style and length.

Amazon deforestation and number of fires show summer of 2019 not a 'normal' year

Burning in Amazon rainforest

The fires that raged across the Brazilian Amazon this summer were not 'normal' and large increases in deforestation could explain why, scientists show.
The perceived scale of the Amazon blazes received global attention this summer. However, international concerns raised at the time were countered by the Brazilian Government, which claimed the fire situation in August was 'normal' and 'below the historical average'.
An international team of scientists writing in the journal Global Change Biology say the number of active fires in August was actually three times higher than in 2018 and the highest number since 2010.
Although fires in the Amazon can occur in a number of ways, the scientists show that there is strong evidence to link this year's increases to deforestation.
They have used evidence collected from the Brazilian Government's DETER-b deforestation detection system -- which calculates deforestation by interpreting images taken by NASA satellites.
This shows that deforestation in July this year was almost four times the average from the same period in the previous three years. This is important as deforestation is almost always followed by fire -- the cut vegetation is left to dry before being burned.
Professor Jos Barlow, lead author of the paper, said: "The marked upturn in both active fire counts and deforestation in 2019 therefore refutes suggestions by the Brazilian Government that August 2019 was a normal fire month in the Amazon."
August's blazes occurred at a time without a strong drought. Droughts can provide conditions favourable to the spreading of human-made fires. The scientists also show that the 'enormous' smoke plumes that reached high into the atmosphere, which were captured by media footage of the blazes, could only have been caused by the combustion of large amounts of biomass.
The researchers acknowledge that the number of active fires decreased by 35 per cent in September, though they say it is not clear whether that fall was due to rains or to President Bolsonaro's two-month moratorium on fires.
Images from DETER-b show that deforestation continued at a rate well above average in September, despite the President's moratorium.
The extent of August's fires is unclear: although the numbers of fires are counted, their extent is not, the researchers acknowledge in their paper, 'Clarifying Amazonia's burning crisis'.
Dr Erika Berenguer, a Brazilian researcher jointly affiliated with Lancaster University and the University of Oxford, said: "Our paper clearly shows that without tackling deforestation, we will continue to see the largest rainforest in the world being turned to ashes. We must curb deforestation.
"Brazil has for the past decade been an environmental leader, showing to the world that it can successfully reduce deforestation. It is both economically and environmentally unwise to revert this trend."
The paper's authors are Jos Barlow of Lancaster University, Erika Berenguer of Lancaster University and the University of Oxford, Rachel Carmenta of the University of Cambridge, and Filipe França of the Universidade Federal do Pará.

Story Source:
Materials provided by Lancaster University. Note: Content may be edited for style and length.

The world is getting wetter, yet water may become less available for North America and Eurasia

Drips from faucet in dry environment

With climate change, plants of the future will consume more water than in the present day, leading to less water available for people living in North America and Eurasia, according to a Dartmouth-led study in Nature Geoscience. The research suggests a drier future despite anticipated precipitation increases for places like the United States and Europe, populous regions already facing water stresses.
The study challenges an expectation in climate science that plants will make the world wetter in the future. Scientists have long thought that as carbon dioxide concentrations increase in the atmosphere, plants will reduce their water consumption, leaving more freshwater available in our soils and streams. This is because as more carbon dioxide accumulates in our atmosphere plants can photosynthesize the same amount while partly closing the pores (stomata) on their leaves. Closed stomata means less plant water loss to the atmosphere, increasing water in the land. The new findings reveal that this story of plants making the land wetter is limited to the tropics and the extremely high latitudes, where freshwater availability is already high and competing demands on it are low. For much of the mid-latitudes, the study finds, projected plant responses to climate change will not make the land wetter but drier, which has massive implications for millions of people.
"Approximately 60 percent of the global water flux from the land to the atmosphere goes through plants, called transpiration. Plants are like the atmosphere's straw, dominating how water flows from the land to the atmosphere. So vegetation is a massive determinant of what water is left on land for people," explained lead author Justin S. Mankin, an assistant professor of geography at Dartmouth and adjunct research scientist at Lamont-Doherty Earth Observatory at Columbia University. "The question we're asking here is, how do the combined effects of carbon dioxide and warming change the size of that straw?"
Using climate models, the study examines how freshwater availability may be affected by projected changes in the way precipitation is divided among plants, rivers and soils. For the study, the research team used a novel accounting of this precipitation partitioning, developed earlier by Mankin and colleagues to calculate the future runoff loss to future vegetation in a warmer, carbon dioxide-enriched climate.
The new study's findings reveal how three key effects of climate change on plants interact to reduce regional freshwater availability. First, as carbon dioxide increases in the atmosphere, plants require less water to photosynthesize, wetting the land. Yet, second, as the planet warms, growing seasons become longer and warmer: plants have more time to grow and consume water, drying the land. Finally, as carbon dioxide concentrations increase, plants are likely to grow more, as photosynthesis becomes amplified. For some regions, these latter two impacts, extended growing seasons and amplified photosynthesis, will outpace the closing stomata, meaning more vegetation will consume more water for a longer amount of time, drying the land. As a result, for much of the mid-latitudes, plants will leave less water in soils and streams, even if there is additional rainfall and vegetation is more efficient with its water usage. The result also underscores the importance of improving how climate models represent ecosystems and their response to climate change.
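The three competing effects can be sketched as multipliers on plant transpiration in a toy annual water balance. All numbers below are illustrative assumptions, not the study's model output:

```python
# Toy water balance: runoff = precipitation - transpiration - other ET.
# Climate-change effects enter as multipliers on baseline transpiration:
#   stomatal closure (<1) saves water; longer growing seasons (>1) and
#   CO2-amplified growth (>1) spend it.

def runoff_mm(precip, transp_base, stomatal, season, growth, other_et):
    transpiration = transp_base * stomatal * season * growth
    return precip - transpiration - other_et

# Assumed mid-latitude numbers, in mm per year
present = runoff_mm(800, 400, 1.00, 1.00, 1.00, other_et=200)
future = runoff_mm(850, 400, 0.90, 1.20, 1.15, other_et=200)

print(f"present runoff: {present:.0f} mm/yr")
print(f"future  runoff: {future:.0f} mm/yr")  # lower, despite more rain
```

With these assumed multipliers, runoff falls even though precipitation rises by 50 mm/yr, because the season and growth terms outpace the stomatal savings: the qualitative pattern the study projects for much of the mid-latitudes.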
The world relies on freshwater for human consumption, agriculture, hydropower, and industry. Yet, for many places, there's a fundamental disconnect between when precipitation falls and when people use this water, as is the case with California, which gets more than half of its precipitation in the winter, but peak demands are in the summer. "Throughout the world, we engineer solutions to move water from point A to point B to overcome this spatiotemporal disconnect between water supply and its demand. Allocating water is politically contentious, capital-intensive and requires really long-term planning, all of which affects some of the most vulnerable populations. Our research shows that we can't expect plants to be a universal panacea for future water availability. So, being able to assess clearly where and why we should anticipate water availability changes to occur in the future is crucial to ensuring that we can be prepared," added Mankin.
Richard Seager, Jason E. Smerdon, A. Park Williams and Benjamin I. Cook (who is also affiliated with the NASA Goddard Institute for Space Studies), all researchers at the Lamont-Doherty Earth Observatory of Columbia University, contributed to this study.

Story Source:
Materials provided by Dartmouth College. Note: Content may be edited for style and length.

A runaway star ejected from the galactic heart of darkness


Astronomers have spotted an ultrafast star, traveling at a blistering 6 million km/h, that was ejected by the supermassive black hole at the heart of the Milky Way five million years ago.
The discovery of the star, known as S5-HVS1, was made by Carnegie Mellon University Assistant Professor of Physics Sergey Koposov as part of the Southern Stellar Stream Spectroscopic Survey (S5). Located in the constellation of Grus -- the Crane -- S5-HVS1 was found to be moving ten times faster than most stars in the Milky Way.
"The velocity of the discovered star is so high that it will inevitably leave the galaxy and never return," said Douglas Boubert from the University of Oxford, a co-author on the study.
Astronomers have wondered about high-velocity stars since their discovery only two decades ago. S5-HVS1 is unprecedented due to its high speed and its close passage to the Earth, "only" 29 thousand light years away. With this information, astronomers could track its journey back to the center of the Milky Way, where a four million solar mass black hole, known as Sagittarius A*, lurks.
"This is super exciting, as we have long suspected that black holes can eject stars with very high velocities. However, we never had an unambiguous association of such a fast star with the galactic center," said Koposov, the lead author of this work and member of Carnegie Mellon's McWilliams Center for Cosmology. "We think the black hole ejected the star with a speed of thousands of kilometers per second about five million years ago. This ejection happened at the time when humanity's ancestors were just learning to walk on two feet."
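As a back-of-envelope consistency check (our arithmetic, not the study's analysis, and ignoring any gravitational deceleration), a star coasting at roughly 6 million km/h for roughly 5 million years covers a distance of the same order as the reported 29 thousand light years:

```python
# Rough kinematics check using the figures quoted in the article.
SPEED_KM_PER_H = 6.0e6        # ~6 million km/h reported ejection speed
YEARS = 5.0e6                 # ejected ~5 million years ago
HOURS_PER_YEAR = 24 * 365.25
KM_PER_LIGHT_YEAR = 9.4607e12

distance_km = SPEED_KM_PER_H * YEARS * HOURS_PER_YEAR
distance_ly = distance_km / KM_PER_LIGHT_YEAR
print(f"~{distance_ly:,.0f} light-years traveled")  # roughly 28,000 ly
```

The result, about 28,000 light-years, matches the scale of the star's reported distance, which is why the ejection time and speed hang together.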
Superfast stars can be ejected by black holes via the Hills Mechanism, proposed by astronomer Jack Hills thirty years ago. Originally, S5-HVS1 lived with a companion in a binary system, but the pair strayed too close to Sagittarius A*. In the gravitational tussle, the companion star was captured by the black hole, while S5-HVS1 was thrown out at extremely high speed.
"This is the first clear demonstration of the Hills Mechanism in action," said Ting Li from Carnegie Observatories and Princeton University, and leader of the S5 Collaboration. "Seeing this star is really amazing as we know it must have formed in the galactic center, a place very different to our local environment. It is a visitor from a strange land."
The discovery of S5-HVS1 was made with the 3.9-meter Anglo-Australian Telescope (AAT) near Coonabarabran, NSW, Australia, coupled with superb observations from the European Space Agency's Gaia satellite, which allowed the astronomers to reveal the full speed of the star and its journey from the center of the Milky Way.
"The observations would not be possible without the unique capabilities of the 2dF instrument on the AAT," said Daniel Zucker, an astronomer at Macquarie University in Sydney, Australia, and a member of the S5 executive committee. "It's been conducting cutting-edge research for over two decades and still is the best facility in the world for our project."
These results were published on November 4 online in the Monthly Notices of the Royal Astronomical Society, and the S5 collaboration unites astronomers from the United States, United Kingdom, Australia and Chile.
"I am so excited this fast-moving star was discovered by S5," says Kyler Kuehn, at Lowell Observatory and a member of the S5 executive committee. "While the main science goal of S5 is to probe the stellar streams -- disrupting dwarf galaxies and globular clusters -- we dedicated spare resources of the instrument to searching for interesting targets in the Milky Way, and voila, we found something amazing for 'free.' With our future observations, hopefully we will find even more!"

Story Source:
Materials provided by Carnegie Mellon University. Note: Content may be edited for style and length.

Thursday, 28 November 2019

New study looks to biological enzymes as source of hydrogen fuel


Research from the University of Illinois and the University of California, Davis brings chemists one step closer to recreating nature's most efficient machinery for generating hydrogen gas. This new development may help clear the path for the hydrogen fuel industry to move into a larger role in the global push toward more environmentally friendly energy sources.
The researchers report their findings in the Proceedings of the National Academy of Sciences.
Currently, hydrogen gas is produced using a very complex industrial process that limits its attractiveness to the green fuel market, the researchers said. In response, scientists are looking toward biologically synthesized hydrogen, which is far more efficient than the current human-made process, said chemistry professor and study co-author Thomas Rauchfuss.
Biological enzymes, called hydrogenases, are nature's machinery for making and burning hydrogen gas. These enzymes come in two varieties, iron-iron and nickel-iron -- named for the elements responsible for driving the chemical reactions. The new study focuses on the iron-iron variety because it does the job faster, the researchers said.
The team came into the study with a general understanding of the chemical composition of the active sites within the enzyme. They hypothesized that the sites were assembled using 10 parts: four carbon monoxide molecules, two cyanide ions, two iron ions and two groups of a sulfur-containing amino acid called cysteine.
The team discovered that it was instead more likely that the enzyme's engine was composed of two identical groups containing five chemicals: two carbon monoxide molecules, one cyanide ion, one iron ion and one cysteine group. The groups form one tightly bonded unit, and the two units combine to give the engine a total of 10 parts.
But analysis of the lab-synthesized enzyme revealed a final surprise, Rauchfuss said. "Our recipe is incomplete. We now know that 11 bits are required to make the active site engine, not 10, and we are in the hunt for that one final bit."
Team members say they are not sure what type of applications this new understanding of the iron-iron hydrogenase enzyme will lead to, but the research could provide an assembly kit that will be instructive to other catalyst design projects.
"The take-away from this study is that it is one thing to envision using the real enzyme to produce hydrogen gas, but it is far more powerful to understand its makeup well enough to be able to reproduce it for use in the lab," Rauchfuss said.
Researchers from the Oregon Health and Science University also contributed to this study.
The National Institutes of Health supported this study.

Story Source:
Materials provided by University of Illinois at Urbana-Champaign, News Bureau. Note: Content may be edited for style and length.

Teens are using a highly potent form of marijuana

Cannabis sativa plant

Nearly one in four Arizona teens have used a highly potent form of marijuana known as marijuana concentrate, according to a new study by Arizona State University researchers.
Among nearly 50,000 eighth, 10th, and 12th graders from the 2018 Arizona Youth Survey, a biennial survey of Arizona secondary school students, one-third (33%) had tried some form of marijuana, and nearly a quarter (24%) had tried marijuana concentrate.
Marijuana concentrates have about three times more THC, the constituent of marijuana that causes the "high," than a traditional marijuana flower. This is concerning because higher doses of THC have been linked to increased risk of marijuana addiction, cognitive impairment and psychosis, said the study's lead researcher, Madeline Meier, an ASU assistant professor of psychology.
The research team also found that teens who used concentrates had more risk factors for addiction. The researchers compared teens who had used marijuana concentrates, teens who had used some form of marijuana but not concentrates, and teens who had never used any form of marijuana on known risk factors for addiction, such as lower perceived risk of harm from marijuana, peer substance use, parental substance use, academic failure and greater perceived availability of drugs in the community. They found that teens who had used marijuana concentrates were worse off on every addiction risk factor.
"This is important because it shows that teens who have a diverse array of risk factors for developing marijuana addiction may be further amplifying their risk for addiction by using high-THC marijuana concentrates," explained study co-author, Dustin Pardini, an associate professor in ASU's School of Criminology & Criminal Justice.
The study "Cannabis Concentrate Use in Adolescents," is published in the early online edition (Aug. 26, 2019) of Pediatrics.
The team -- which includes ASU researchers Meagan Docherty, School of Criminology & Criminal Justice; Scott Leischow, College of Health Solutions; and Kevin Grimm, Department of Psychology -- also found that teens who had used concentrates had much higher rates of e-cigarette use. One explanation for this might be that teens are using e-cigarettes to vape marijuana concentrate, according to Meier. Earlier studies, including those by Meier, have shown that youth put marijuana in e-cigarettes to conceal their marijuana use.
"Vaping marijuana can be passed off as nicotine vaping," Meier explained.
This finding reinforces the recent decision by the Food and Drug Administration to impose new restrictions on e-cigarettes and their constituents as a means of reducing marijuana use, according to the researchers.
Marijuana concentrates don't look like the traditional marijuana flower. Concentrates can look like wax, oil, or a brittle substance that shatters easily.
"What concerns me most is that parents might have no idea that their child is using marijuana, especially if their child is using marijuana concentrate," said Meier. "Marijuana is not harmless, particularly for adolescents."
Meier's earlier research suggests that frequent marijuana use from adolescence through adulthood is associated with IQ decline. Pardini's prior research has linked regular marijuana use during adolescence with the emergence of persistent subclinical psychotic symptoms.
The researchers' next steps are to ascertain if concentrate users do in fact exhibit higher rates of addiction, cognitive impairment and psychosis.
The Arizona Youth Survey is administered by the Arizona Criminal Justice Commission (ACJC) through funds appropriated by the Arizona Legislature.

Wednesday, 27 November 2019

Social media stress can lead to social media addiction

Social network users risk becoming more and more addicted to social media platforms even as they experience stress from their use.
Social networking sites (SNS) such as Facebook and Instagram are known to cause stress in users, a phenomenon termed technostress from social media. However, when faced with such stress, instead of switching off or using them less, people are moving from one aspect of the social media platforms to another -- escaping the causes of their stress without leaving the medium on which it originated.
Research into the habits of 444 Facebook users revealed they would switch between activities such as chatting to friends, scanning news feeds and posting updates as each began to cause stress. This leads to an increased likelihood of technology addiction, as they use the various elements of the platform over a greater timespan.
Researchers from Lancaster University, the University of Bamberg and Friedrich-Alexander-Universität Erlangen-Nürnberg, writing in Information Systems Journal, found that users were seeking distraction and diversion within the Facebook platform as a coping mechanism for stress caused by the same platform, rather than switching off and undertaking a different activity.
Professor Monideepa Tarafdar, Professor of Information Systems and Co-Director of the Centre for Technological Futures at Lancaster University Management School, who co-authored the study, said: "While it might seem counter-intuitive, social media users are continuing to use the same platforms that are causing them stress rather than switching off from them, creating a blurring between the stress caused and the compulsive use."
Assistant Professor Christian Maier, of the University of Bamberg, who collected the data from the Facebook users along with Professor Sven Laumer, Schöller Endowed Professor and Chair of Information Systems and the Deputy Director of the Dr. Theo und Friedl Schöller Research Center, said: "Because social network sites offer such a wide range of features, users can find they act both as stressors and as a distraction from that stress."
"Even when users are stressed from SNS use, they are using the same platforms to cope with that stress, diverting themselves through other activities on the SNS, and ultimately building compulsive and excessive behaviour. As a result, they embed themselves in the social network environment rather than getting away from it, and an addiction is formed."
The research team looked at various different forms of technostress caused by using social media, such as users feeling that SNS were invading their personal life, adapting their SNS use to conform to that of their friends, experiencing excessive social demands and too much social information, and facing constant changes and updates to the SNS platform.
They further examined two separate ways of coping with the stress. The first included users creating a diversion by partaking in other activities away from social media, which is the more obvious path. They would switch off, talk to friends or family about issues they were experiencing and spend less time on the platform.
However, the other method consisted of diversion through engaging in different activities within the same SNS app itself, and potentially moving on a pathway towards SNS addiction. This method was more prevalent among those social media users who used the sites more regularly.
Professor Sven Laumer said: "We found that those users who had a greater social media habit needed less effort to find another aspect of the platforms, and were thus more likely to stay within the SNS rather than switch off when they needed to divert themselves. The stronger the user's SNS habit, the higher the likelihood they would keep using it as a means of diversion as a coping behaviour in response to stressors, and possibly develop addiction to the SNS."
"Users go to different areas of the platform which they see as being separate and that they use in different ways. With Facebook, there are features that take you into different worlds within the same platform. You can be in many different places all from the same application, for example following friends' activities, posting pictures about daily activities, switching to a chat feature or playing games."
Professor Monideepa Tarafdar added: "The idea of using the same environment that is causing the stress as means of coping with that stress is novel. It is an interesting phenomenon that seems distinctive to technostress from social media."

Story Source:
Materials provided by Lancaster University. Note: Content may be edited for style and length.

Tuesday, 26 November 2019

Measuring methane from coal and gas in Pennsylvania proves informative

Measuring methane from coal and gas

While methane pollution caused by natural gas production in Pennsylvania is underestimated by the U.S. Environmental Protection Agency, natural gas still has half the carbon footprint of underground coal mining, according to an international team of researchers.
"At the rates we found for methane, natural gas in Pennsylvania is still much, much cleaner than coal," said Zachary Barkley, research associate in meteorology, Penn State. "Obviously, renewable energy would be better, but there is no debate, switching to natural gas is worth it in the short run."
The researchers looked at methane in the atmosphere by flying transects over the southwestern portion of Pennsylvania and adjacent portions of West Virginia and Ohio. Researchers from the University of Maryland collected data from the flights.
"The southwestern part of the state has huge amounts of natural gas and coal and we were getting methane from there during a previous project in northeastern Pennsylvania," said Barkley.
The southwestern part of Pennsylvania has long been a coal mining region, and drilling into the Marcellus shale has also increased the number of unconventional natural gas wells there. The EPA estimates the amounts of methane put into the atmosphere from both coal mining and natural gas drilling every year.
Because both sources of methane exist in the same area, the researchers could not just measure methane and separate the sources. To determine the split between coal and natural gas, the researchers looked at the amounts of both methane and ethane and determined that more ethane is produced from natural gas wells than from coal mines. They reported their results in a recent issue of Geophysical Research Letters.
Using the ratio of methane to ethane, the researchers found that the amount of methane released to the atmosphere from coal mines was very close to the EPA estimates, but that the amounts of methane from unconventional natural gas wells were two to eight times the EPA estimates. The EPA estimate is about 0.1 percent leakage, while the study found about 0.5 percent leakage. However, they note that because the wells in the Marcellus shale are such high-production wells, the net impact on climate of this natural gas production is still much lower than from coal mines after accounting for emissions from energy generation.
"High-producing wells have a much lower leakage rate than older wells, which only produce 2 to 3 percent of Pennsylvania gas, but are estimated to produce about 40 percent of the state's total emission of methane from natural gas," said Barkley.
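The leakage-rate reasoning above can be sketched in a few lines. The volumes below are hypothetical, chosen only to illustrate why the same absolute leak is a far smaller fraction of a high-producing well's output than of an older, low-producing well's:

```python
def leakage_rate(methane_emitted, gas_produced):
    """Fraction of produced gas lost to the atmosphere (same units)."""
    return methane_emitted / gas_produced

# Hypothetical volumes: an identical absolute leak of 10 units.
high_producer = leakage_rate(methane_emitted=10.0, gas_produced=10_000.0)  # 0.001, i.e. 0.1%
older_well = leakage_rate(methane_emitted=10.0, gas_produced=500.0)        # 0.02, i.e. 2%
```

This is why a small number of older wells can account for a large share of total methane emissions while contributing only a few percent of the state's gas production.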
The researchers suggest that using the ratio of methane to ethane in other areas where emissions are from mixed sources could help to tease out the percentages of the carbon footprint from each source.
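As a rough illustration of that ratio-based approach, a two-source mixing model can attribute an observed methane enhancement between gas wells and coal mines, given the ethane-to-methane ratio characteristic of each source. The ratios and observed values below are hypothetical, not measurements from the study:

```python
def split_methane_sources(total_methane, total_ethane, r_gas, r_coal):
    """Attribute observed methane to gas wells vs. coal mines.

    Solves the two-equation mixing model:
        m_gas + m_coal                 = total_methane
        r_gas*m_gas + r_coal*m_coal    = total_ethane
    where r_gas and r_coal are each source's ethane/methane ratio.
    """
    if r_gas == r_coal:
        raise ValueError("source ratios must differ to separate the sources")
    m_gas = (total_ethane - r_coal * total_methane) / (r_gas - r_coal)
    m_coal = total_methane - m_gas
    return m_gas, m_coal

# Hypothetical ratios: gas wells emit more ethane per unit methane
# than coal mines, which is what makes the separation possible.
m_gas, m_coal = split_methane_sources(
    total_methane=100.0, total_ethane=2.75, r_gas=0.05, r_coal=0.005)
```

The closer the two source ratios are, the more sensitive the split becomes to measurement error, which is one reason mixed-source regions are hard to apportion from methane alone.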
"Burning fossil fuel, whether coal or gas, is bad for the climate," said Kenneth J. Davis, professor of atmospheric and climate science, Penn State. "These underground coal mines are clearly more damaging than Marcellus gas production, but the gas production isn't as clean as we thought. We need more data like this to inform energy policy."

Story Source:
Materials provided by Penn State. Note: Content may be edited for style and length.

Microbes harvest electrons: Novel process discovered

Bacteria illustration

Ever since scientists discovered that certain microbes can get their energy from electrical charges, researchers have wondered how they do it.
Bacteria don't have mouths, so they need another way to bring their fuel into their bodies. New research from Washington University in St. Louis reveals how one such bacterium pulls in electrons straight from an electrode source. The work from the laboratory of Arpita Bose, assistant professor of biology in Arts & Sciences, was published Nov. 5 in the scientific journal mBio.
"The molecular underpinning of this process has been difficult to unravel until our work," Bose said. "This is mostly due to the complex nature of the proteins involved in this process. But now, for the first time, we understand how phototrophic microbes can accept electrons from solid and soluble substances."
Dinesh Gupta, a PhD candidate in the Bose laboratory, is the first author on this new study. "I was excited when we found that these phototrophic bacteria use a novel processing step to regulate the production of a key electron-transfer protein involved in this process," Gupta said. "This study will aid in designing a bacterial platform where bacteria can feed on electricity and carbon dioxide to produce value-added compounds such as biofuels."
Getting the electricity across the outer layer of the bacteria is the key challenge. This barrier is both nonconductive and impermeable to insoluble iron minerals and/or electrodes.
Bose and her collaborators, including Robert Kranz, professor of biology, showed that the naturally occurring strain of Rhodopseudomonas palustris TIE-1 builds a conduit to accept electrons across its outer membrane. The bacterium relies on an iron-containing helper molecule called a deca-heme cytochrome c. By processing this protein, TIE-1 can form an essential bridge to its electron source.
Extracellular electron uptake, or EEU, can help microbes to survive under nutrient-scarce conditions.
Now that Bose has documented these mechanisms behind EEU, she hopes to use them as biological markers to identify other electricity-eating bacteria in the wild. The findings will help researchers to understand the importance of this functionality in metabolic evolution and microbial ecology.

Story Source:
Materials provided by Washington University in St. Louis. Original written by Talia Ogliore. Note: Content may be edited for style and length.