Sunday, February 28, 2021

We Don’t Know Why Lungs Wheeze

But We Just Got a New Clue

From: Sciencealert.com

By David Nield, February 26, 2021 – Wheezing is a common occurrence that most of us will have experienced, whether temporarily through something like hay fever or a cold, or more long-term through a condition such as asthma or chronic obstructive pulmonary disease (COPD). But scientists don't understand much about why it happens.

New research has used a combination of modelling and high-resolution video to try to shed some light on the mechanisms of wheezing, finding that there's a "violent" process that can cause the airways of our lungs to make these raspy sounds.

With this new information available, the team is hoping that wheezing might be better understood and diagnosed in the future.

"Because wheezing makes it harder to breathe, it puts an enormous amount of pressure on the lungs," says engineer Alastair Gregory, from the University of Cambridge in the UK.

"The sounds associated with wheezing have been used to make diagnoses for centuries, but the physical mechanisms responsible for the onset of wheezing are poorly understood, and there is no model for predicting when wheezing will occur."

To get to the bottom of wheezing, scientists had to get to the end of the flexible bronchiole tubes that make up the branching network in the lungs. They built their own lung substitute by adapting a device called a Starling resistor, made up of thin elastic tubes of varying lengths and thicknesses.

Air was forced through the tubes at different degrees of tension and then filmed with a multi-camera stereoscopy setup. The scientists were then able to observe how wheezing might begin and be sustained, through a series of oscillations in the tubes (or in the lungs).

"It surprised us just how violent the mechanism of wheezing is," says Gregory.

"We found that there are two conditions for wheezing to occur: the first is that the pressure on the tubes is such that one or more of the bronchioles nearly collapses, and the second is that air is forced through the collapsed airway with enough force to drive oscillations."

When both conditions are met, the oscillations are sustained through a fluttering mechanism, in which the travelling waves of air have the same frequency as the opening and closing of the tube. The same sort of resonance can collapse bridges and cause aircraft wings to fail, which shows how damaging it could be to the lungs.

The scientists went on to develop a 'tube law', which factors in the material properties and the geometry of the tubes, as well as the amount of tension they're under, to calculate when oscillations might occur. If the law can be adapted to the human lungs, we might have a new way of analyzing wheezes and identifying both the type and the location of serious bronchial problems.
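As a rough sketch of the kind of calculation a tube law enables, consider a textbook thin-shell estimate with made-up numbers. This illustrates only the two-condition structure described above; it is not the model published by the Cambridge team:

# A rough sketch of the two wheezing conditions, using a classical
# thin-shell buckling estimate for when an elastic tube nearly collapses.
# Illustration only -- NOT the tube law from the Cambridge paper; all
# parameter values are invented.

def buckling_pressure(E, h, r, nu=0.5):
    """Critical external pressure for a long thin-walled elastic tube
    (mode-2 buckling): p_c = E / (4 * (1 - nu**2)) * (h / r)**3."""
    return E / (4.0 * (1.0 - nu**2)) * (h / r) ** 3

def wheeze_expected(transmural_pressure, E, h, r, airspeed, flutter_speed):
    """Condition 1: the pressure on the tube approaches its collapse
    threshold.  Condition 2: air is forced through the narrowed tube
    fast enough to drive self-sustained (flutter) oscillations."""
    near_collapse = transmural_pressure >= buckling_pressure(E, h, r)
    drives_flutter = airspeed >= flutter_speed
    return near_collapse and drives_flutter

# Hypothetical numbers for a small, bronchiole-like elastic tube:
E = 6.0e3    # Young's modulus, Pa (soft-tissue scale)
h = 1.0e-4   # wall thickness, m
r = 1.0e-3   # tube radius, m
print(buckling_pressure(E, h, r))                  # collapse threshold, Pa
print(wheeze_expected(3.0, E, h, r, 12.0, 10.0))   # True -> wheeze expected

In the published model, geometry, material properties, and axial tension all enter the threshold; the point here is only that a tube law turns the two conditions into a concrete, computable test.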

More work will be needed to fine-tune the system and to better pick up on wheezing sounds, but the researchers are hopeful that a simple audio recording setup could in some circumstances replace X-rays and MRI scans, which are expensive and time-consuming to operate.

"Since wheezing is associated with so many conditions, it is difficult to be sure of what is wrong with a patient just based on the wheeze, so we're working on understanding how wheezing sounds are produced so that diagnoses can be more specific," says engineer Anurag Agarwal, from the University of Cambridge.

The research has been published in Royal Society Open Science.

https://www.sciencealert.com/nobody-really-understands-why-lungs-make-wheezing-sounds-but-now-we-have-some-clues

Saturday, February 27, 2021

Microbes Survive Deep Beneath Seafloor

They are sustained primarily by chemicals created by the natural irradiation of water molecules. Results of this research may have implications for life on Mars.

From:  University of Rhode Island

February 26, 2021 -- A team of researchers from the University of Rhode Island's Graduate School of Oceanography and their collaborators have revealed that the abundant microbes living in ancient sediment below the seafloor are sustained primarily by chemicals created by the natural irradiation of water molecules.

The team discovered that the creation of these chemicals is amplified significantly by minerals in marine sediment. In contrast to the conventional view that life in sediment is fueled by products of photosynthesis, an ecosystem fueled by irradiation of water begins just meters below the seafloor in much of the open ocean. This radiation-fueled world is one of Earth's volumetrically largest ecosystems.

The research was published today in the journal Nature Communications.

"This work provides an important new perspective on the availability of resources that subsurface microbial communities can use to sustain themselves. This is fundamental to understand life on Earth and to constrain the habitability of other planetary bodies, such as Mars," said Justine Sauvage, the study's lead author and a postdoctoral fellow at the University of Gothenburg who conducted the research as a doctoral student at URI.

The process driving the research team's findings is radiolysis of water -- the splitting of water molecules into hydrogen and oxidants as a result of being exposed to naturally occurring radiation. Steven D'Hondt, URI professor of oceanography and a co-author of the study, said the resulting molecules become the primary source of food and energy for the microbes living in the sediment.

"The marine sediment actually amplifies the production of these usable chemicals," he said. "If you have the same amount of irradiation in pure water and in wet sediment, you get a lot more hydrogen from wet sediment. The sediment makes the production of hydrogen much more effective."

Why the process is amplified in wet sediment is unclear, but D'Hondt speculates that minerals in the sediment may "behave like a semiconductor, making the process more efficient."

The discoveries resulted from a series of laboratory experiments conducted in the Rhode Island Nuclear Science Center. Sauvage irradiated vials of wet sediment from various locations in the Pacific and Atlantic Oceans, collected by the Integrated Ocean Drilling Program and by U.S. research vessels. She compared the production of hydrogen to that from similarly irradiated vials of seawater and distilled water. The sediment amplified hydrogen production by as much as a factor of 30.
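For a sense of the quantities involved, here is a back-of-the-envelope sketch of radiolytic hydrogen yield. The dose rate is an invented trace level, and the G-value of about 0.45 hydrogen molecules per 100 eV absorbed is a commonly cited figure for gamma radiolysis of pure water; neither number comes from the URI study:

# Back-of-the-envelope radiolysis arithmetic, to make the "factor of 30"
# sediment amplification concrete. Assumed numbers, not the study's data.

AVOGADRO = 6.022e23
EV_PER_GRAY_PER_KG = 6.242e18   # 1 gray = 1 J/kg = 6.242e18 eV per kg

def h2_mol_per_kg_per_year(dose_gray_per_year, g_value=0.45,
                           sediment_amplification=1.0):
    """Moles of H2 produced per kg of water per year at a given dose rate.
    g_value: molecules of H2 per 100 eV absorbed (~0.45 for gamma
    radiolysis of pure water).  sediment_amplification: the study found
    wet sediment boosts yields by as much as ~30x over pure water."""
    molecules = dose_gray_per_year * EV_PER_GRAY_PER_KG * (g_value / 100.0)
    return molecules * sediment_amplification / AVOGADRO

dose = 1.0e-3   # Gy/yr, an assumed trace level of natural irradiation
print(h2_mol_per_kg_per_year(dose))                               # pure water
print(h2_mol_per_kg_per_year(dose, sediment_amplification=30.0))  # wet sediment

The absolute yields are minuscule on human timescales, which is exactly why a 30-fold catalytic boost matters to organisms scraping by on this energy source.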

"This study is a unique combination of sophisticated laboratory experiments integrated into a global biological context," said co-author Arthur Spivack, URI professor of oceanography.

The implications of the findings are significant.

"If you can support life in subsurface marine sediment and other subsurface environments from natural radioactive splitting of water, then maybe you can support life the same way in other worlds," said D'Hondt. "Some of the same minerals are present on Mars, and as long as you have those wet catalytic minerals, you're going to have this process. If you can catalyze production of radiolytic chemicals at high rates in the wet Martian subsurface, you could potentially sustain life at the same levels that it's sustained in marine sediment."

Sauvage added, "This is especially relevant given that the Perseverance Rover has just landed on Mars, with its mission to collect Martian rocks and to characterize its habitable environments."

D'Hondt said the research team's findings also have implications for the nuclear industry, including for how nuclear waste is stored and how nuclear accidents are managed. "If you store nuclear waste in sediment or rock, it may generate hydrogen and oxidants faster than in pure water. That natural catalysis may make those storage systems more corrosive than is generally realized," he said.

The next steps for the research team will be to explore the effect of hydrogen production through radiolysis in other environments on Earth and beyond, including oceanic crust, continental crust and subsurface Mars. They also will seek to advance the understanding of how subsurface microbial communities live, interact and evolve when their primary energy source is derived from the natural radiolytic splitting of water.

https://www.sciencedaily.com/releases/2021/02/210226103802.htm

Friday, February 26, 2021

Winter Storm Uri of 2021

The February 13–17, 2021 North American winter storm, unofficially referred to as Winter Storm Uri, was a major winter and ice storm that had widespread impacts across the United States, Northern Mexico, and parts of Canada from February 13 to 17. The storm started out in the Pacific Northwest and quickly moved into the Southern United States, before moving on to the Midwestern and Northeastern United States a couple of days later.

The storm resulted in over 170 million Americans being placed under winter weather alerts issued by the National Weather Service, and it caused blackouts for over 9.9 million people in the U.S. and Mexico, most notably during the 2021 Texas power crisis.  The blackouts were the largest in the U.S. since the Northeast blackout of 2003.  The storm also brought severe destructive weather to the Southeastern United States, including several tornadoes. On February 16, there were at least 20 direct fatalities and 13 indirect fatalities attributed to the storm; by February 19, the death toll had risen to at least 70, including 58 people in the United States and 12 people in Mexico.  The damages from the blackouts are expected to reach $19 billion (2021 USD).

Meteorological History

On February 13, a frontal storm developed off the coast of the Pacific Northwest and moved ashore, before moving southeastward, with the storm becoming disorganized in the process.  During this time, the storm reached a minimum pressure of 992 millibars (29.3 inHg) over the Rocky Mountains.  On the same day, The Weather Channel gave the storm the unofficial name Winter Storm Uri, due to the expected impacts from the storm; the Federal Communications Commission later adopted the name in its reports after February 17.  Over the next couple of days, the storm began to redevelop as it entered the Southern United States and moved into Texas.  On February 15, the system developed a new surface low off the coast of the Florida Panhandle, as the storm turned northeastward and expanded in size.

On February 16, the storm developed another low-pressure center to the north as the system grew more organized, while moving towards the northeast.  Later that day, the storm broke in half, with the newer storm moving northward into Quebec, while the original system moved off the East Coast of the U.S.  By the time the winter storm exited the U.S. late on February 16, the combined snowfall from the multiple winter storms within the past month had left nearly 75% of the contiguous United States covered by snow.  On February 17, the storm's secondary low dissipated as the system approached landfall on Newfoundland, intensifying in the process.  At 12:00 UTC that day, the storm's central pressure reached 985 millibars (29.1 inHg), as the center of the storm moved over Newfoundland.  On the same day, the storm was given the name Belrem by the Free University of Berlin.  The storm continued to strengthen as it moved across the North Atlantic, with the storm's central pressure dropping to 960 millibars (28 inHg) by February 19.  On February 20, the storm developed a second low-pressure area and gradually began to weaken, as it moved northwestward towards Iceland.  Afterward, the storm turned westward and moved across southern Greenland on February 22, weakening even further as it did so.  The storm then stalled south of Greenland, while continuing to weaken, before dissipating on February 24.

https://en.wikipedia.org/wiki/February_13%E2%80%9317,_2021_North_American_winter_storm

 

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =    

2021 Texas Power Crisis

The 2021 Texas power crisis is an ongoing crisis in the state of Texas in the United States involving mass utility failures, such as power outages, water and food shortages, and dangerous weather conditions.  The crisis was the result of two severe winter storms sweeping across the United States on February 10–11 and 13–17.  More than 4.5 million homes and businesses in Texas were left without power, some for several days. The power outages were initially blamed on frozen wind turbines by some Republican government officials, including Texas governor Greg Abbott, but frozen natural gas lines and instruments were found to be the main cause.  The crisis caused many experts to call into question the state's preparedness for such a storm, especially in light of its deregulated energy market.  The damages from the blackouts are expected to reach $19 billion (2021 USD).

Background of the Storm

In mid-February 2021, a series of severe winter storms swept across the United States. This outbreak was due to the jet stream dipping particularly far south into the U.S., stretching from Washington to Texas, and running back north along the East Coast, allowing a polar vortex to bring very cold air across the country and spawning multiple storms along the jet stream track as a result.  This weather phenomenon resulted in record low temperatures throughout Texas, with temperatures in Dallas, Austin and San Antonio falling below temperatures in Anchorage, Alaska.

On February 10, a winter storm formed north of the Gulf coast, dropping significant amounts of sleet and ice on many states in the Deep South and the Ohio Valley, including Texas, Georgia, Louisiana, Arkansas, Tennessee, and states on the East Coast.  A second storm developed off the Pacific Northwest on February 13 and began to gradually organize as it tracked southward toward Texas. It grew even more organized as it turned toward the northeastern U.S. before splitting in half, one half continuing into Quebec and the other moving out over the Atlantic Ocean.  This storm, along with various other storms from the previous two weeks, resulted in over 75% of the contiguous U.S. being covered in snow.  This storm was directly responsible for nearly 10 million people losing power, with 5.2 million in the U.S. and 4.7 million in Mexico.  At least 58 people lost their lives, and a tornado outbreak from the storm spanned Florida, Georgia, and North Carolina.

The winter storm caused a record low temperature at Dallas-Fort Worth International Airport of −2 °F (−19 °C) on February 16, the coldest in North Texas in 72 years.  Power equipment in Texas was not winterized, leaving it vulnerable to extended periods of cold weather.  Texas Governor Greg Abbott and some other politicians initially blamed renewable energy sources for the power outages, citing frozen wind turbines as an example of their unreliability.  However, renewable energy accounts for only 23% of Texas's power output; moreover, failures at facilities using other energy sources, such as natural gas plants that froze up or suffered mechanical breakdowns, bore more of the responsibility.  A viral image of a helicopter de-icing a Texas wind turbine had actually been taken in Sweden in 2015.  Governor Abbott later acknowledged that every source of power, not just renewables, had failed.  Five times more natural gas generation than wind power had been lost.  When power was cut, it disabled some compressors that push gas through pipelines, knocking out further gas plants due to lack of supply.

During the 2011 Groundhog Day blizzard, Texas had faced similar power outages due to frozen power equipment, after which the Federal Energy Regulatory Commission reported that more winterizing of power infrastructure was necessary.  ERCOT said that some generators had since implemented new winter "best practices," but these were voluntary, and mandatory regulation had not been established.

Gov. Abbott's appointees to the Public Utility Commission of Texas ended a contract with the Texas Reliability Entity in November 2020, reducing oversight of the grid.  In July, Abbott's commissioners disbanded the Commission's Oversight and Enforcement Division, dropping pending cases meant to ensure reliability.  While not a direct cause, the Commission's minimal oversight of utility companies, limited budget, and voluntary standards restricted its ability to secure consistent performance.

Power Outages

In addition to equipment problems, demand for electricity in Texas hit a record 69,150 megawatts (MW) on February 14—3,200 MW higher than the previous record set in January 2018.  The Electric Reliability Council of Texas (ERCOT) initiated rotating outages at 1:25am on February 15.  The rotating outages prevented electricity demand from overwhelming the grid, a scenario which could have caused equipment to catch fire and power lines to go down, potentially resulting in a much more severe blackout.

At the peak, over 5 million people in Texas were without power, some for more than 3 days.  These outages have been felt disproportionately in lower-income and minority ethnic ZIP codes.

During the period of outages, wholesale electric prices rose as high as $9,000 per megawatt-hour, the "system cap" set by the Electric Reliability Council of Texas, compared to a more typical $50/MWh, and they stayed at that cap for about four days. Customers whose pricing plans were based on wholesale prices, and who still had power, faced enormous bills: some Griddy customers, who had signed up for the wholesale variable-rate plans allowed by Texas's deregulated electricity market, found themselves facing bills of over $5,000 for five days of service during the storm.
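The arithmetic behind those bills is simple. In the sketch below, the prices come from the paragraph above, while the household load is an assumption for illustration:

# Quick arithmetic behind the storm-week bills. The load profile is an
# assumption; the prices are from the text above.

def bill_usd(avg_load_kw, days, price_usd_per_mwh):
    kwh = avg_load_kw * 24 * days              # energy used over the period
    return kwh * price_usd_per_mwh / 1000.0    # $/MWh -> $/kWh

# A home averaging 4 kW (say, electric heat running hard in the cold snap)
# over five days:
print(bill_usd(4.0, 5, 50))      # at a typical ~$50/MWh:        $24
print(bill_usd(4.0, 5, 9000))    # at the $9,000/MWh system cap: $4,320

A household drawing somewhat more than 4 kW on average lands squarely in the $5,000-plus range that Griddy customers reported.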

2021 Texas power crisis - Wikipedia

Thursday, February 25, 2021

Single Cell Memory Without a Brain

How a single cell slime mold makes smart decisions without a central nervous system

From:  Technical University of Munich (TUM)

February 23, 2021 -- Researchers have identified how the slime mold Physarum polycephalum saves memories -- although it has no nervous system.

Having a memory of past events enables us to make smarter decisions about the future. Researchers at the Max-Planck Institute for Dynamics and Self-Organization (MPI-DS) and the Technical University of Munich (TUM) have now identified how the slime mold Physarum polycephalum saves memories -- although it has no nervous system.

The ability to store and recover information gives an organism a clear advantage when searching for food or avoiding harmful environments. Traditionally it has been attributed to organisms that have a nervous system.

A new study authored by Mirna Kramar (MPI-DS) and Prof. Karen Alim (TUM and MPI-DS) challenges this view by uncovering the surprising abilities of a highly dynamic, single-celled organism to store and retrieve information about its environment.

Window into the past

The slime mold Physarum polycephalum has been puzzling researchers for many decades. Existing at the crossroads between the kingdoms of animals, plants and fungi, this unique organism provides insight into the early evolutionary history of eukaryotes -- to which humans also belong.

Its body is a giant single cell made up of interconnected tubes that form intricate networks. This single amoeba-like cell may stretch several centimeters or even meters, and is listed in the Guinness Book of World Records as the largest cell on Earth.

Decision making on the most basic levels of life

The striking abilities of the slime mold to solve complex problems, such as finding the shortest path through a maze, earned it the label "intelligent." It intrigued the research community and kindled questions about decision making on the most basic levels of life.

The decision-making ability of Physarum is especially fascinating given that its tubular network constantly undergoes fast reorganization -- growing and disintegrating its tubes -- while completely lacking an organizing center.

The researchers discovered that the organism weaves memories of food encounters directly into the architecture of the network-like body and uses the stored information when making future decisions.

The network architecture as a memory of the past

"It is very exciting when a project develops from a simple experimental observation," says Karen Alim, head of the Biological Physics and Morphogenesis group at the MPI-DS and professor on Theory of Biological Networks at the Technical University of Munich.

The researchers followed the migration and feeding process of the organism and observed a distinct imprint of a food source on the pattern of thicker and thinner tubes of the network long after feeding.

"Given P. polycephalum's highly dynamic network reorganization, the persistence of this imprint sparked the idea that the network architecture itself could serve as memory of the past," says Karen Alim. However, they first needed to explain the mechanism behind the imprint formation.

Decisions are guided by memories

For this purpose the researchers combined microscopic observations of the adaption of the tubular network with theoretical modeling. An encounter with food triggers the release of a chemical that travels from the location where food was found throughout the organism and softens the tubes in the network, making the whole organism reorient its migration towards the food.

"The gradual softening is where the existing imprints of previous food sources come into play and where information is stored and retrieved," says first author Mirna Kramar. "Past feeding events are embedded in the hierarchy of tube diameters, specifically in the arrangement of thick and thin tubes in the network."

"For the softening chemical that is now transported, the thick tubes in the network act as highways in traffic networks, enabling quick transport across the whole organism," adds Mirna Kramar. "Previous encounters imprinted in the network architecture thus weigh into the decision about the future direction of migration."

Design based on universal principles

"Given the simplicity of this living network, the ability of Physarum to form memories is intriguing. It is remarkable that the organism relies on such a simple mechanism and yet controls it in such a fine-tuned manner," says Karen Alim.

"These results present an important piece of the puzzle in understanding the behavior of this ancient organism and at the same time points to universal principles underlying behavior. We envision potential applications of our findings in designing smart materials and building soft robots that navigate through complex environments," concludes Karen Alim.

https://www.sciencedaily.com/releases/2021/02/210223121643.htm

Tuesday, February 23, 2021

A Critique of Western Child Raising Standards

A BBC article from February 22, 2021, discusses the unusual ways Westerners raise children, especially compared to the practices common in the rest of the world.  The practice of having infants and children sleep in rooms separate from their parents, for example, is largely particular to Western families.  See

Is the Western way of raising kids weird? - BBC Future

Dogs Synchronize Their Behavior with Children

But not as much as with adults, study finds

From Oregon State University

January 25, 2021 -- CORVALLIS, Ore. – Dogs synchronize their behavior with the children in their family, but not as much as they do with adults, a new study from Oregon State University researchers found.

The findings are important because there is a growing body of evidence that dogs can help children in many ways, including with social development, increasing physical activity, managing anxiety or as a source of attachment in the face of changing family structures, the researchers said. Yet, very little research has focused on how dogs perceive and socially engage with children.

“The great news is that this study suggests dogs are paying a lot of attention to the kids that they live with,” said Oregon State animal behaviorist Monique Udell, the lead author of the study. “They are responsive to them and, in many cases, behaving in synchrony with them, indicators of positive affiliation and a foundation for building strong bonds.

“One interesting thing we have observed is that dogs are matching their child’s behavior less frequently than what we have seen between dogs and adult caretakers, which suggests that while they may view children as social companions, there are also some differences that we need to understand better.”

The paper was recently published in the journal Animal Cognition. Co-authors were Shelby Wanser, a faculty research assistant in Udell’s lab, and Megan MacDonald, an associate professor in Oregon State’s College of Public Health and Human Sciences, who studies how motor skills and physically active lifestyles improve the lives of children with and without disabilities.

The researchers recruited 30 youths between the ages of 8 and 17 – 83% of whom had a developmental disability – to take part in the study with their family dog. The experiments took place in a large empty room. Color-coded taped lines were placed on the floor, and the children were given instructions on how to walk the lines in a standardized way with their off-leash dog.

The researchers videotaped the experiments and analyzed behavior based on three things: (1) activity synchrony, which means how much time the dog and child were moving or stationary at the same time; (2) proximity, or how much time the dog and child were within 1 meter of each other; and (3) orientation, how much time the dog was oriented in the same direction as the child.
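Reduced to code, the three measures are simple proportions over frame-by-frame annotations. The data and thresholds below are hypothetical; the study's actual coding scheme may differ:

# Sketch of the three synchrony measures as proportions over hypothetical
# frame-by-frame annotations of the videotaped trials.

import numpy as np

rng = np.random.default_rng(1)
n = 600                                   # e.g., 600 annotated frames
dog_moving   = rng.random(n) < 0.6        # True = moving, False = stationary
child_moving = rng.random(n) < 0.6
distance_m   = rng.uniform(0, 4, n)       # dog-child distance per frame
heading_diff = rng.uniform(0, 180, n)     # body-orientation difference, deg

# (1) activity synchrony: moving or stationary at the same time
activity_sync = np.mean(dog_moving == child_moving)
# (2) proximity: within 1 meter of each other
proximity = np.mean(distance_m < 1.0)
# (3) orientation: facing the same direction (here, within 45 degrees)
orientation = np.mean(heading_diff < 45.0)

print(f"activity synchrony: {activity_sync:.1%}")
print(f"proximity:          {proximity:.1%}")
print(f"orientation:        {orientation:.1%}")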

They found that dogs exhibited behavioral synchronization with the children at a higher rate than would be expected by chance for all three variables. During their assessments, they found:

  • Active synchrony for an average of 60.2% of the time. Broken down further, the dogs were moving an average of 73.1% of the time that the children were moving and were stationary an average of 41.2% of the time the children were stationary.
  • Proximity within 1 meter of each other for an average of 27.1% of the time.
  • Orientation in the same direction for an average of 33.5% of the time.

While child-dog synchrony occurred more often than would be expected by chance, those percentages are all lower than what other researchers have found when studying interactions between dogs and adults in their household. Those studies found “active synchrony” 81.8% of the time with family dogs, but 49.1% with shelter dogs. They found “proximity” 72.9% of the time with family dogs and 39.7% with shelter dogs. No studies on dog-human behavioral synchronization have previously assessed body orientation.

The Oregon State researchers are conducting more research to better understand factors that contribute to differences in levels of synchrony and other aspects of bond quality between dogs and children compared to dogs and adults, including participation in animal assisted interventions and increasing the child’s responsibility for the dog’s care.

While research has found dogs can have a lot of positive impacts on a child’s life, there are also risks associated with the dog-child relationship, the researchers said. For example, other studies have found dogs are more apt to bite children than adults.

“We still have a lot to learn about the dog-child relationship,” Udell said. “We’re hoping this research can inform the best ways to shape positive outcomes and mitigate risks by helping children interact with dogs in a manner that improves the relationship and ultimately the welfare of both individuals.”

Based on this study, Udell also offered some takeaways for families with children and dogs.

“What we are finding is that kids are very capable of training dogs, and that dogs are paying attention to the kids and can learn from them,” she said. “Sometimes we don’t give children and dogs enough credit. Our research suggests that with some guidance we can provide important and positive learning experiences for our kids and our dogs starting at a much earlier age, something that can make a world of difference to the lives of both.”

Dogs synchronize their behavior with children, but not as much as with adults, study finds | Oregon State University

Monday, February 22, 2021

Five Clues Science Finds In Cryptocurrencies

By Ross Pomeroy, Real Clear Science

February 22, 2021

With cryptocurrency prices skyrocketing to all-time highs, digital monies based on the ultra-secure blockchain are making some people very wealthy and catching eyes across the Internet.

Scientists are also taking notice. While research into cryptocurrency is still in its infancy, dozens of studies have been published that shed light on these upstart digital assets and the people who purchase them.

Here are five takeaways:

1. Cryptocurrency consumes a lot of energy. Cryptocurrencies are energy-intensive because they require computers to solve complex puzzles to verify transactions. People who dedicate their computers to this process are rewarded with coins of a specific currency.  The University of Cambridge estimates that Bitcoin, with roughly a trillion dollars' worth in circulation, consumes around 118 terawatt-hours of electricity each year, about the same amount as the country of Norway.  Add in all of the other cryptocurrencies out there and the energy consumption rivals Thailand's. Some scientists contend that this process is inherently unsustainable and requires revision, while others disagree, insisting that digital currencies generate real wealth and the only "fix" needed is to replace fossil fuels with zero-emission energy sources.

2. Cryptocurrency traders tend to be more erratic.  A study conducted in South Korea and published last fall compared Bitcoin investors with traditional stock investors on various psychological measures. The researchers found that Bitcoin investors tended to seek out novelty more often, bought and sold frequently, demonstrated a higher inclination to chase losses, and were more likely to gamble.

3. Cryptocurrency is a healthy part of a balanced portfolio. Whatever you think of it, cryptocurrency is undeniably innovative in the world of finance, and likely merits inclusion in a long-term investment portfolio. Last year, an international team of economists explored whether the addition of five cryptocurrencies – Bitcoin, Ethereum, Ripple, Bitcoin Cash, and Litecoin – from November 2015 to November 2019 would have boosted the returns of four traditional asset portfolios. "The results show that the diversification increased the returns in most of the cases, and reduced the portfolio volatility in all portfolios, and also provided higher returns as compared to the traditional portfolios for the same level of risk," they reported. (A toy version of this comparison is sketched just after this list.)

4. Cryptocurrency investors display significant herding behavior. In behavioral economics, herding is "when people do what others are doing instead of using their own information or making independent decisions." As cryptocurrency prices remain highly speculative, people who buy and sell these digital assets often trade based upon what others seem to be doing.  If prices rise, they rise rapidly as more people jump on the bandwagon. On the other hand, if they fall, they fall hard.

5. Ethereum might be a better long-term investment than Bitcoin. The cryptocurrency Ethereum ranks second to Bitcoin in terms of popularity, yet two studies have shown that it tends to be more stable and a better "safe-haven" investment during difficult economic times. As a team of researchers from Singapore wrote in the journal PLoS ONE, "Although both Bitcoin and Ethereum are digital tokens that serve as decentralized currency based on blockchain technology, there are crucial differences between them. While Bitcoin has positioned itself as an alternative monetary system in the financial market, Ethereum has mostly focused on monetizing smart contracts.  Also, being the first cryptocurrency, Bitcoin has been widely used for speculative purposes.  These traits are reflected in the user composition... where the behavior of Ethereum users is observed to be more stable as these users are more optimistic of the market. In contrast, the behavior of the Bitcoin users tend to fluctuate according to the trend of the market, with a loss of optimism when the market goes down."
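As promised under takeaway 3, here is a minimal sketch of that diversification comparison, run on synthetic daily returns. The return and volatility figures are invented for illustration and are not the economists' data or method:

# Minimal diversification sketch on synthetic daily returns -- illustrative
# only. A small allocation to a volatile, higher-mean, uncorrelated asset
# can raise portfolio return without raising volatility.

import numpy as np

rng = np.random.default_rng(42)
days = 1000
stocks = rng.normal(0.0003, 0.010, days)   # traditional assets: low mean/vol
crypto = rng.normal(0.0020, 0.045, days)   # higher mean, much higher vol

def annualized(returns):
    return returns.mean() * 252, returns.std() * np.sqrt(252)

portfolios = {
    "traditional":    stocks,
    "with 5% crypto": 0.95 * stocks + 0.05 * crypto,
}
for name, port in portfolios.items():
    ret, vol = annualized(port)
    print(f"{name:>15}: return {ret:.1%}, volatility {vol:.1%}, "
          f"return/vol {ret/vol:.2f}")

Because the two return streams are uncorrelated here, the small crypto sleeve adds more expected return than volatility, which is the effect the economists reported.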

       Five Things Science Has Told Us About Cryptocurrency | RealClearScience

Sunday, February 21, 2021

Women Who Were Great Spies

The world is all about spies like James Bond, but when it comes to female spies, how many names really come to mind? There have been some incredible heroines in espionage – you just haven’t heard of them. Here are some of the most famous ones that should be on your radar.

1. Anna Chapman

Russian spy and model Anna Chapman was part of a Russian sleeper ring that infiltrated the U.S. for years; after her arrest and deportation, she even publicly proposed marriage to NSA whistle-blower Edward Snowden. She used her model status to gain access to covert government information and secrets, finally getting arrested in NYC in 2010.

2. Ana Montes

Ms. Montes was a spy for the Cuban government and started working for the U.S. in 1985, with the Defense Intelligence Agency. An expert on everything Cuban, Ana had a photographic memory and memorized documents with ease. She even passed polygraph tests for the U.S., but alas was sentenced to 25 years in prison in 2002.

3. Josephine Baker

This one might have you scratching your head. Wasn’t Josephine Baker a dancer and singer in the 1920s? She was also a spy during WWII on behalf of the French Resistance, smuggling messages in her sheet music.

4. Stephanie Rader

Born to Polish immigrants in Ohio, Stephanie Rader joined the Women’s Army Auxiliary Corps in 1942 before being recruited to the Office of Strategic Services. Feigning the identity of a woman searching for lost family members in war, she got a job at the embassy. Undercover, she gathered info on Russian troop movements, political statistics, and the police. All in plain clothing, and with no gun. Badass, much?

5. Nancy Wake

At first glance, Nancy was just your everyday journalist. After marrying a rich French industrialist, she joined the ranks of high-society France. A pioneer in the French Resistance, Wake established lines of communication between the French Resistance and the British military, saving the lives of Allies along the way. According to some, this Nazi-fighting vigilante killed many a German soldier, one even with her bare hands.

6. Melita Norwood

Melita was a secretary at the British Non-Ferrous Metals Research Association (BNF) in the 1930s. Only thing is, the BNF did work for Britain’s nuclear weapons program, known as the Tube Alloys project. She identified with the communist values of the Soviets, and hence got involved with the KGB, staying at BNF facilities after hours, removing files from the safes, and sending copies to her KGB handlers.

7. Noor Inayat Khan

This original baddie was the first British Indian spy, as well as the first female radio operator sent by Britain into occupied France. Under the code name Madeleine, Noor worked for a resistance movement in Paris, avoiding capture with smooth and swift relocation. When captured by the Gestapo, she refused to give any information, and was eventually executed at the Dachau concentration camp.

8. Christine Granville

A beauty queen turned spy, Christine Granville was a model before she got involved with World War II. Carrying messages through Poland to Allied forces, Christine risked her life as she saved soldiers from execution and used her beauty and charm as an asset.

9. Mata Hari

An undercover spy with the feigned identity of an exotic Asian dancer is something out of a movie, but it was real life for Miss Hari, who toured Europe performing dance shows and narrating extravagantly exaggerated stories about her life. She was seductive and convincing, and served as a messenger for the Allies’ opposition. She had affairs with high-up military officials and coaxed secrets from them, but was eventually found out and executed by the French.

https://herbeauty.co/en/entertainment/10-of-the-worlds-most-famous-female-spies/?BlackIP_Blocker&VPN&adclid=730625d8ec8dafb803c18fddb406b0de&utm_campaign=herbeauty_Proxy&utm_content=8164860&utm_medium=cpc&utm_source=herbeauty_mock&utm_term=57306943

  

Saturday, February 20, 2021

Depression, Anxiety and Loneliness Peak in College

New nationwide survey data uncovers college students' current mental health challenges and needs.

From: Boston University

February 19, 2021 -- A survey by a Boston University researcher of nearly 33,000 college students across the country reveals that the prevalence of depression and anxiety in young people continues to increase, now reaching its highest levels, a sign of the mounting stress factors due to the coronavirus pandemic, political unrest, and systemic racism and inequality.

"Half of students in fall 2020 screened positive for depression and/or anxiety," says Sarah Ketchen Lipson, a Boston University mental health researcher and a co-principal investigator of the nationwide survey published on Februray 11, 2021, which was administered online during the fall 2020 semester through the Healthy Minds Network. The survey further reveals that 83 percent of students said their mental health had negatively impacted their academic performance within the past month, and that two-thirds of college students are struggling with loneliness and feeling isolated -- an all-time high prevalence that reflects the toll of the pandemic and the social distancing necessary to control it.

Lipson, a BU School of Public Health assistant professor of health law, policy, and management, says the survey's findings underscore the need for university teaching staff and faculty to put mechanisms in place that can accommodate students' mental health needs.

"Faculty need to be flexible with deadlines and remind students that their talent is not solely demonstrated by their ability to get a top grade during one challenging semester," Lipson says.

She adds that instructors can protect students' mental health by having class assignments due at 5 pm, rather than midnight or 9 am, times that Lipson says can encourage students to go to bed later and lose valuable sleep to meet those deadlines.

Especially in smaller classroom settings, where a student's absence may be more noticeable than in larger lectures, instructors who notice someone missing classes should reach out to that student directly to ask how they are doing.

"Even in larger classes, where 1:1 outreach is more difficult, instructors can send classwide emails reinforcing the idea that they care about their students not just as learners but as people, and circulating information about campus resources for mental health and wellness," Lipson says.

And, crucially, she says, instructors must bear in mind that the burden of mental health is not the same across all student demographics. "Students of color and low-income students are more likely to be grieving the loss of a loved one due to COVID," Lipson says. They are also "more likely to be facing financial stress." All of these factors can negatively impact mental health and academic performance in "profound ways," she says.

At a higher level within colleges and universities, Lipson says, administrators should focus on providing students with mental health services that emphasize prevention, coping, and resilience. The fall 2020 survey data revealed a significant "treatment gap," meaning that many students who screen positive for depression or anxiety are not receiving mental health services.

"Often students will only seek help when they find themselves in a mental health crisis, requiring more urgent resources," Lipson says. "But how can we create systems to foster wellness before they reach that point?" She has a suggestion: "All students should receive mental health education, ideally as part of the required curriculum."

It's also important to note, she says, that rising mental health challenges are not unique to the college setting -- instead, the survey findings are consistent with a broader trend of declining mental health in adolescents and young adults. "I think mental health is getting worse [across the US population], and on top of that we are now gathering more data on these trends than ever before," Lipson says. "We know mental health stigma is going down, and that's one of the biggest reasons we are able to collect better data. People are being more open, having more dialogue about it, and we're able to better identify that people are struggling."

The worsening mental health of Americans, more broadly, Lipson says, could be due to a confluence of factors: the pandemic, the impact of social media, and shifting societal values that are becoming more extrinsically motivated (a successful career, making more money, getting more followers and likes), rather than intrinsically motivated (being a good member of the community).

The crushing weight of historic financial pressures is an added burden. "Student debt is so stressful," Lipson says. "You're more predisposed to experiencing anxiety the more debt you have. And research indicates that suicidality is directly connected to financial well-being."

With more than 22 million young people enrolled in US colleges and universities, "and with the traditional college years of life coinciding with the age of onset for lifetime mental illnesses," Lipson stresses that higher education is a crucial setting where prevention and treatment can make a difference.

One potential bright spot from the survey was that the stigma around mental health continues to fade. The results reveal that 94 percent of students say that they wouldn't judge someone for seeking out help for mental health, which Lipson says is an indicator that also correlates with those students being likely to seek out help themselves during a personal crisis (although, paradoxically, almost half of students say they perceive that others may think more poorly of them if they did seek help).

"We're harsher on ourselves and more critical of ourselves than we are with other people -- we call that perceived versus personal stigma," Lipson says. "Students need to realize, your peers are not judging you."

Story Source:

Materials provided by Boston University. Original written by Kat J. McAlpine.

        https://www.sciencedaily.com/releases/2021/02/210219190939.htm

Friday, February 19, 2021

Plasma Safety from A.I. for Planet Orbits

A novel computer algorithm, or set of rules, that accurately predicts the orbits of planets in the solar system could be adapted to better predict and control the behavior of the [super-heated] plasma that fuels fusion facilities designed to harvest on Earth the fusion energy that powers the sun and stars.

From:  DOE/Princeton Plasma Physics Laboratory

February 12, 2021 -- The algorithm, devised by a scientist at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL), applies machine learning, the form of artificial intelligence (AI) that learns from experience, to develop the predictions. "Usually in physics, you make observations, create a theory based on those observations, and then use that theory to predict new observations," said PPPL physicist Hong Qin, author of a paper detailing the concept in Scientific Reports. "What I'm doing is replacing this process with a type of black box that can produce accurate predictions without using a traditional theory or law."

Qin (pronounced Chin) created a computer program into which he fed data from past observations of the orbits of Mercury, Venus, Earth, Mars, Jupiter, and the dwarf planet Ceres. This program, along with an additional program known as a "serving algorithm," then made accurate predictions of the orbits of other planets in the solar system without using Newton's laws of motion and gravitation. "Essentially, I bypassed all the fundamental ingredients of physics. I go directly from data to data," Qin said. "There is no law of physics in the middle."
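The "data to data" idea can be demonstrated in a few lines. The sketch below captures the flavor of the approach, not Qin's published algorithm: fit a map from two consecutive observed positions to the next one, then roll the map forward, with no law of gravitation anywhere in the middle:

# "Data to data" orbit prediction in miniature -- an illustration of the
# flavor of the approach, not Qin's algorithm. We learn the map
# (position at t-1, position at t) -> position at t+1 from observations
# of a circular orbit, then roll it forward with no physics in the middle.

import numpy as np

dt, steps = 0.1, 200
t = np.arange(steps) * dt
orbit = np.stack([np.cos(t), np.sin(t)], axis=1)   # observed (x, y) positions

# Training data: two consecutive positions -> the next one.
X = np.hstack([orbit[:-2], orbit[1:-1]])           # shape (steps-2, 4)
Y = orbit[2:]                                      # shape (steps-2, 2)
W, *_ = np.linalg.lstsq(X, Y, rcond=None)          # the linear "black box"

# Predict 100 future positions from the last two observations.
prev, cur = orbit[-2], orbit[-1]
preds = []
for _ in range(100):
    nxt = np.hstack([prev, cur]) @ W
    preds.append(nxt)
    prev, cur = cur, nxt

future = t[-1] + dt * np.arange(1, 101)
truth = np.stack([np.cos(future), np.sin(future)], axis=1)
print("max error over 100 predicted steps:",
      np.abs(np.stack(preds) - truth).max())       # tiny: orbit recovered

A linear map suffices for this toy orbit; Qin's discrete field theories play the analogous role for far more complicated dynamics, learning a discrete update rule directly from observations.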

The program does not happen upon accurate predictions by accident. "Hong taught the program the underlying principle used by nature to determine the dynamics of any physical system," said Joshua Burby, a physicist at the DOE's Los Alamos National Laboratory who earned his Ph.D. at Princeton under Qin's mentorship. "The payoff is that the network learns the laws of planetary motion after witnessing very few training examples. In other words, his code really 'learns' the laws of physics."

Machine learning is what makes computer programs like Google Translate possible. Google Translate sifts through a vast amount of information to determine how frequently one word in one language has been translated into a word in the other language. In this way, the program can make an accurate translation without actually learning either language.

The process also appears in philosophical thought experiments like John Searle's Chinese Room. In that scenario, a person who did not know Chinese could nevertheless "translate" a Chinese sentence into English or any other language by using a set of instructions, or rules, that would substitute for understanding. The thought experiment raises questions about what, at root, it means to understand anything at all, and whether understanding implies that something else is happening in the mind besides following rules.

Qin was inspired in part by Oxford philosopher Nick Bostrom's philosophical thought experiment that the universe is a computer simulation. If that were true, then fundamental physical laws should reveal that the universe consists of individual chunks of space-time, like pixels in a video game. "If we live in a simulation, our world has to be discrete," Qin said. The black box technique Qin devised does not require that physicists believe the simulation conjecture literally, though it builds on this idea to create a program that makes accurate physical predictions.

The resulting pixelated view of the world, akin to what is portrayed in the movie The Matrix, is known as a discrete field theory, which views the universe as composed of individual bits and differs from the theories that people normally create. While scientists typically devise overarching concepts of how the physical world behaves, computers just assemble a collection of data points.

Qin and Eric Palmerduca, a graduate student in the Princeton University Program in Plasma Physics, are now developing ways to use discrete field theories to predict the behavior of particles of plasma in fusion experiments conducted by scientists around the world. The most widely used fusion facilities are doughnut-shaped tokamaks that confine the plasma in powerful magnetic fields.

Fusion, the power that drives the sun and stars, combines light elements in the form of plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei that represents 99% of the visible universe -- to generate massive amounts of energy. Scientists are seeking to replicate fusion on Earth for a virtually inexhaustible supply of power to generate electricity.

"In a magnetic fusion device, the dynamics of plasmas are complex and multi-scale, and the effective governing laws or computational models for a particular physical process that we are interested in are not always clear," Qin said. "In these scenarios, we can apply the machine learning technique that I developed to create a discrete field theory and then apply this discrete field theory to understand and predict new experimental observations."

This process opens up questions about the nature of science itself. Don't scientists want to develop physics theories that explain the world, instead of simply amassing data? Aren't theories fundamental to physics and necessary to explain and understand phenomena?

"I would argue that the ultimate goal of any scientist is prediction," Qin said. "You might not necessarily need a law. For example, if I can perfectly predict a planetary orbit, I don't need to know Newton's laws of gravitation and motion. You could argue that by doing so you would understand less than if you knew Newton's laws. In a sense, that is correct. But from a practical point of view, making accurate predictions is not doing anything less."

Machine learning could also open up possibilities for more research. "It significantly broadens the scope of problems that you can tackle because all you need to get going is data," Palmerduca said.

The technique could also lead to the development of a traditional physical theory. "While in some sense this method precludes the need of such a theory, it can also be viewed as a path toward one," Palmerduca said. "When you're trying to deduce a theory, you'd like to have as much data at your disposal as possible. If you're given some data, you can use machine learning to fill in gaps in that data or otherwise expand the data set."

Support for this research came from the DOE Office of Science (Fusion Energy Sciences).

        https://www.sciencedaily.com/releases/2021/02/210212094120.htm 

Thursday, February 18, 2021

Machine Diagnosis to Rival a Dog’s Nose

Trained dogs can detect cancer and other diseases by smell. A miniaturized detector can analyze trace molecules to mimic the process.

By David L. Chandler at the MIT News Office

February 17, 2021 -- Numerous studies have shown that trained dogs can detect many kinds of disease — including lung, breast, ovarian, bladder, and prostate cancers, and possibly Covid-19 — simply through smell. In some cases, involving prostate cancer for example, the dogs had a 99 percent success rate in detecting the disease by sniffing patients’ urine samples.

But it takes time to train such dogs, and their availability and time is limited. Scientists have been hunting for ways of automating the amazing olfactory capabilities of the canine nose and brain, in a compact device. Now, a team of researchers at MIT and other institutions has come up with a system that can detect the chemical and microbial content of an air sample with even greater sensitivity than a dog’s nose. They coupled this to a machine-learning process that can identify the distinctive characteristics of the disease-bearing samples.

The findings, which the researchers say could someday lead to an automated odor-detection system small enough to be incorporated into a cellphone, are being published today in the journal PLOS One, in a paper by Claire Guest of Medical Detection Dogs in the U.K., Research Scientist Andreas Mershin of MIT, and 18 others at Johns Hopkins University, the Prostate Cancer Foundation, and several other universities and organizations.

 “Dogs, for now 15 years or so, have been shown to be the earliest, most accurate disease detectors for anything that we’ve ever tried,” Mershin says. And their performance in controlled tests has in some cases exceeded that of the best current lab tests, he says. “So far, many different types of cancer have been detected earlier by dogs than any other technology.”

What’s more, the dogs apparently pick up connections that have so far eluded human researchers: When trained to respond to samples from patients with one type of cancer, some dogs have then identified several other types of cancer — even though the similarities between the samples weren’t evident to humans.

These dogs can identify “cancers that don’t have any identical biomolecular signatures in common, nothing in the odorants,” Mershin says. Using powerful analytical tools including gas chromatography mass spectrometry (GCMS) and microbial profiling, “if you analyze the samples from, let’s say, skin cancer and bladder cancer and breast cancer and lung cancer — all things that the dog has been shown to be able to detect — they have nothing in common.” Yet the dog can somehow generalize from one kind of cancer to be able to identify the others.

Mershin and the team over the last few years have developed, and continued to improve on, a miniaturized detector system that incorporates mammalian olfactory receptors stabilized to act as sensors, whose data streams can be handled in real-time by a typical smartphone’s capabilities. He envisions a day when every phone will have a scent detector built in, just as cameras are now ubiquitous in phones. Such detectors, equipped with advanced algorithms developed through machine learning, could potentially pick up early signs of disease far sooner than typical screening regimes, he says — and could even warn of smoke or a gas leak as well.

In the latest tests, the team tested 50 samples of urine from confirmed cases of prostate cancer and controls known to be free of the disease, using both dogs trained and handled by Medical Detection Dogs in the U.K. and the miniaturized detection system. They then applied a machine-learning program to tease out any similarities and differences between the samples that could help the sensor-based system to identify the disease. In testing the same samples, the artificial system was able to match the success rates of the dogs, with both methods scoring more than 70 percent.

The miniaturized detection system, Mershin says, is actually 200 times more sensitive than a dog’s nose in terms of being able to detect and identify tiny traces of different molecules, as confirmed through controlled tests mandated by DARPA. But in terms of interpreting those molecules, “it’s 100 percent dumber.” That’s where the machine learning comes in, to try to find the elusive patterns that dogs can infer from the scent, but humans haven’t been able to grasp from a chemical analysis.
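In sketch form, that machine-learning step is a classifier trained on feature vectors extracted from the sensor's data stream. Everything below (features, labels, and the planted "signature") is synthetic, and the team's actual pipeline is certainly more involved:

# Sketch of the pattern-finding step: a classifier on per-sample feature
# vectors from an olfactory sensor. Synthetic data, for illustration only.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_features = 50, 32           # e.g., 50 urine samples, 32 channels
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 2, size=n_samples)   # 1 = confirmed cancer, 0 = control
X[y == 1, :4] += 0.8                     # plant a weak scent "signature"

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)    # held-out accuracy, 5 folds
print(f"cross-validated accuracy: {scores.mean():.0%}")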

 “The dogs don’t know any chemistry,” Mershin says. “They don’t see a list of molecules appear in their head. When you smell a cup of coffee, you don’t see a list of names and concentrations, you feel an integrated sensation. That sensation of scent character is what the dogs can mine.”

While the physical apparatus for detecting and analyzing the molecules in air has been under development for several years, with much of the focus on reducing its size, until now the analysis was lacking. “We knew that the sensors are already better than what the dogs can do in terms of the limit of detection, but what we haven’t shown before is that we can train an artificial intelligence to mimic the dogs,” he says. “And now we’ve shown that we can do this. We’ve shown that what the dog does can be replicated to a certain extent.”

This achievement, the researchers say, provides a solid framework for further research to develop the technology to a level suitable for clinical use. Mershin hopes to be able to test a far larger set of samples, perhaps 5,000, to pinpoint in greater detail the significant indicators of disease. But such testing doesn’t come cheap: It costs about $1,000 per sample for clinically tested and certified samples of disease-carrying and disease-free urine to be collected, documented, shipped, and analyzed, he says.

Reflecting on how he became involved in this research, Mershin recalled a study of bladder cancer detection, in which a dog kept misidentifying one member of the control group as being positive for the disease, even though he had been specifically selected based on hospital tests as being disease free. The patient, who knew about the dog’s test, opted to have further tests, and a few months later was found to have the disease at a very early stage. “Even though it’s just one case, I have to admit that did sway me,” Mershin says.

The team included researchers at MIT, Johns Hopkins University in Maryland, Medical Detection Dogs in Milton Keynes, U.K., the Cambridge Polymer Group, the Prostate Cancer Foundation, the University of Texas at El Paso, Imagination Engines, and Harvard University. The research was supported by the Prostate Cancer Foundation, the National Cancer Institute, and the National Institutes of Health.

                https://news.mit.edu/2021/disease-detection-device-dogs-0217

Wednesday, February 17, 2021

Complex Climate of Texas

Texas' weather varies widely, from arid in the west to humid in the east. The huge expanse of Texas encompasses several regions with distinctly different climates: Northern Plains, Trans-Pecos Region, Texas Hill Country, Piney Woods, and South Texas. Generally speaking, the part of Texas that lies to the east of Interstate 35 is subtropical, while the portion that lies to the west of Interstate 35 is arid desert.

Texas ranks first in tornado occurrence with an average of 139 per year. Tropical cyclones can affect the state, either from the Gulf of Mexico or from an overland trajectory originating in the eastern Pacific Ocean. Those originating from the Gulf of Mexico are more likely to strike the upper Texas coast than elsewhere. Significant floods have occurred across the state throughout history, both from tropical cyclones and from stalled weather fronts.

Cold and Snow

Northern and western sections of the state receive snowfall annually due to their colder average winter temperatures. For one week in February 1956, a snow storm of historic proportions struck northern Texas. The maximum amount measured was 61 inches (150 cm) at Vega, with Plainview receiving 24 inches (61 cm) in one day.  El Paso, in Far West Texas, received 22.4 in (57 cm) of snow during a 24-hour period December 13–14, 1987.  For central and southern sections, snowfall is considerably more unusual. In February 1895, a large area of southeastern Texas received over 12 inches (30 cm) of snow, with peak amounts near 30 inches (76 cm) at Port Arthur.  More recently, around Christmas of 2004, up to 13 inches (33 cm) of snow fell along the middle coast, with the maximum occurring at Victoria.

The worst cold snap to occur statewide occurred during the last half of December in 1983. Four stations recorded their longest continuous readings at or below 32 °F (0 °C) on record. At Austin, the temperature remained at or below freezing for 139 hours.  At Abilene, the period at or below  freezing totaled 202 hours.  Lubbock saw temperatures at or below freezing for 207 hours. The Dallas-Fort Worth airport measured temperatures at or below freezing for a total of 296 consecutive hours.  Snow which fell on December 14 and 15 across northern Texas stayed on the ground until New Year's Day of 1984.

Severe Weather

Thunderstorms are very common in Texas, especially the eastern and northern portion. Texas is part of the Tornado Alley section of the country. The state experiences the most tornadoes in the Union, an average of 139 a year. These strike most frequently in North Texas and the Panhandle.  Tornadoes in Texas generally occur in April, May, and June.

Hurricanes

Texas's position at the northwestern end of the Gulf of Mexico makes it vulnerable to hurricanes. Some of the most destructive hurricanes in U.S. history have impacted Texas. A hurricane in 1875 killed approximately 400 people in Indianola, followed by another hurricane in 1886 that destroyed the town, which was at the time the most important port city in the state. This allowed Galveston to take over as the chief port city, but it was subsequently devastated by a hurricane in 1900 that killed approximately 8,000 people (possibly as many as 12,000), making it the deadliest natural disaster in U.S. history. Other devastating Texan hurricanes include the 1915 Galveston Hurricane, Hurricane Carla in 1961, Hurricane Beulah in  1967, Hurricane Alicia in 1983, Hurricane Rita in 2005, and Hurricane Ike in 2008.

The climatology of where tropical cyclone strikes are most likely within the state appears to be changing. In the early 1980s, the most favored region during the previous century was the middle coast.  However, that region of the coastline has been rarely impacted since the 1960s, and a recent study indicates that the most vulnerable location to a tropical cyclone strike since 1851 is the upper coast, which has received 56 percent of all tropical cyclone landfalls, of which 66 percent originate from the Gulf of Mexico. This is in contrast with Louisiana and the lower Texan coast, where only 39 percent of the landfalls are from tropical cyclones of Gulf of Mexico origin.

Flooding

The most serious threat from tropical cyclones for Texans is from flooding. The worst aspect about tropical cyclones is that the weaker they are, the more efficient they can be at producing heavy rains and catastrophic flooding. Systems with sprawling circulations, such as Hurricane Beulah, also tend to make good rainmakers.  Slow moving systems, such as Tropical Storm Amelia (1978) and Hurricane Harvey (2017) can produce significant rainfall.  Tropical cyclones from the eastern Pacific and Atlantic Basins can impact the Lone Star State.  In general, flooding across Texas is more common during the spring and early autumn months, and it can also be due to nearby stationary fronts interacting with strong upper level cyclones.  The most likely location for floods statewide is the Balcones Escarpment, an area of steep elevation gradient in central Texas at the boundary between the Edwards Plateau and the coastal plain.

Extreme Temperatures

The highest temperature ever measured in Texas was 120 °F (48.9 °C), recorded on August 12, 1936 in Seymour, during the 1936 North American Heatwave, and again on June 28, 1994 in Monahans. The lowest temperature ever measured in Texas was −23 °F (−30.6 °C), recorded on February 8, 1933 in Seminole.

                                 Climate of Texas - Wikipedia