Monday, May 31, 2021

Secret Children's Graveyard Found in Canada

The Kamloops Indian Residential School was part of the Canadian Indian residential school system.  Located in Kamloops, British Columbia, it was once the largest residential school in Canada, with its enrolment peaking at 500 in the 1950s.

The school opened in 1890 and operated until 1969, when it was taken over from the Catholic Church by the federal government to be used as a residence for a day school. It closed in 1978.  The school building still stands today on the Tk’emlúps te Secwépemc First Nation.  In May 2021, the remains of 215 children buried in a mass grave were found at the site.

History of the School

What would become the Kamloops Indian Residential School was established in 1893, after initially opening in 1890 as the Kamloops Industrial School, which had the aim of acculturating Indigenous children.  D. Ross of Kamloops was awarded the $10,000 contract to erect the initial set of industrial school buildings in April 1889.  Three two-storey wooden structures were the first buildings on the site, consisting of separate living quarters for boys and girls, and the school's teachers, along with classrooms and a recreation area.  Michael Hagan served as the industrial school's first principal, resigning in 1892, at which time the government charged the Oblates of Mary Immaculate with running the school.  Father Carrion was named principal of the school in March 1893, following Hagan's departure.

In 1927, John Duplanil succeeded J. Maguire as principal of the school, following Maguire's appointment as curate of St. Patrick's Church in Lethbridge, Alberta.  G.P. Dunlop took over as head of the school in 1958, relocating from a position at the Eugene Mission Indian School in Cranbrook, British Columbia.

The school continued as the Kamloops Indian Residential School until 1978, located on the traditional territory of the Secwepemc (Secwépemcúl'ecw). The building was the first location of the Secwepemc Museum, which opened in 1982.  In 1991, a special edition of Secwepemc News reported that the public policy which led to the operation of the school for more than 80 years had "done its job; English is now the predominant language within the Shuswap Nation and the survival of the Shuswap language is uncertain."

The school was featured in the 1962 Christmas-themed film Eyes of the Children.  Produced by George Robertson, the film followed 400 students as they prepared for Christmas and aired on the CBC on Christmas Day.

Discovery of the Mass Grave

In May 2021, the buried remains of 215 children, some as young as three years old, were found at the school site with the assistance of a ground-penetrating radar specialist.  There had long been rumours of unmarked graves at residential schools, but none had previously been uncovered.  Tk’emlúps te Secwépemc First Nation Chief Rosanne Casimir said the finding represented "an unthinkable loss ... never documented by the school's administrators", and that work was underway to determine whether the Royal British Columbia Museum holds relevant records.  She also said the radar scanning was not yet complete, and that she expected further discoveries to be made.  The National Centre for Truth and Reconciliation has so far released official recognition of some 51 students who died at the school, with dates of death ranging from 1919 to 1964.

A provincial Indigenous leader said in May 2021 there were plans being made for forensic experts to exhume, identify and repatriate the remains of the children from the school.

Reactions to the Mass Grave

In a statement released by the First Nations Health Authority, CEO Richard Jock said: "That this situation exists is sadly not a surprise and illustrates the damaging and lasting impacts that the residential school system continues to have on First Nations people, their families and communities."  Premier of British Columbia John Horgan said that he was "horrified and heartbroken" at the discovery, and that he supported further efforts to bring to "light the full extent of this loss".  Federal Minister of Indigenous Services Marc Miller also offered his support.  Prime Minister Justin Trudeau called the discovery "heartbreaking" the day of the announcement, and on May 30 ordered flags on federal buildings to be flown at half-mast until further notice.  Flags were also lowered at the BC and Manitoba legislatures and in individual municipalities, including Ottawa, Montreal, Edmonton, Mississauga, Brampton, and Toronto, which also ordered the 3D Toronto sign dimmed for 215 hours.

Angela White, executive director for the Indian Residential School Survivors Society, has also called on the Canadian federal government and Catholic Church to take action and responsibility around reconciliation efforts, stating: "Reconciliation does not mean anything if there is no action to those words...[w]ell-wishes and prayers only go so far. If we are going to actually create positive strides forward there needs to be that ability to continue the work, like the Indian Residential School Survivors Society does, in a meaningful way."

A community memorial at the Vancouver Art Gallery laid out 215 pairs of children's shoes in rows, inspiring similar memorials across Canada, including in front of government buildings and the church buildings that had been in charge of running the residential school system. At the Ontario Legislative Building, security initially ordered the shoes removed before acquiescing. The Anishinabek Nation tweeted in support of social media calls to put out teddy bears on porches on May 31, similar to what was done with hockey sticks after the 2018 Humboldt Broncos bus crash. Another popular campaign called on people to wear orange on May 31.

                 Kamloops Indian Residential School - Wikipedia

Sunday, May 30, 2021

Leading Victorian Critic John Ruskin

John Ruskin (8 February 1819 – 20 January 1900) was the leading English art critic of the Victorian era, as well as an art patron, draughtsman, watercolourist, philosopher, prominent social thinker and philanthropist. He wrote on subjects as varied as geology, architecture, myth, ornithology, literature, education, botany and political economy.

His writing styles and literary forms were equally varied. He wrote essays and treatises, poetry and lectures, travel guides and manuals, letters and even a fairy tale. He also made detailed sketches and paintings of rocks, plants, birds, landscapes, architectural structures and ornamentation.

The elaborate style that characterised his earliest writing on art gave way in time to plainer language designed to communicate his ideas more effectively. In all of his writing, he emphasised the connections between nature, art and society.

He was hugely influential in the latter half of the 19th century and up to the First World War. After a period of relative decline, his reputation has steadily improved since the 1960s with the publication of numerous academic studies of his work. Today, his ideas and concerns are widely recognised as having anticipated interest in environmentalism, sustainability and craft.

Ruskin first came to widespread attention with the first volume of Modern Painters (1843), an extended essay in defence of the work of J. M. W. Turner in which he argued that the principal role of the artist is "truth to nature". From the 1850s, he championed the Pre-Raphaelites, who were influenced by his ideas. His work increasingly focused on social and political issues.  Unto This Last (1860, 1862) marked the shift in emphasis. In 1869, Ruskin became the first Slade Professor of Fine Art at the University of Oxford, where he established the Ruskin School of Drawing.  In 1871, he began his monthly "letters to the workmen and labourers of Great Britain", published under the title Fors Clavigera (1871–1884). In the course of this complex and deeply personal work, he developed the principles underlying his ideal society. As a result, he founded the Guild of St George, an organisation that endures today.

Social Critic and Reformer: Unto This Last

Although in 1877 Ruskin said that in 1860 "I gave up my art work and wrote Unto This Last ... the central work of my life", the break was not so dramatic or final.  Following his crisis of faith, and influenced in part by his friend Thomas Carlyle (whom he had first met in 1850), Ruskin shifted his emphasis in the late 1850s from art towards social issues. Nevertheless, he continued to lecture on and write about a wide range of subjects including art and, among many other matters, geology (in June 1863 he lectured on the Alps), art practice and judgement (The Cestus of Aglaia), botany and mythology (Proserpina and The Queen of the Air). He continued to draw and paint in watercolours, and to travel extensively across Europe with servants and friends. In 1868, his tour took him to Abbeville, and in the following year he was in Verona (studying tombs for the Arundel Society) and Venice (where he was joined by William Holman Hunt). Yet increasingly Ruskin concentrated his energies on fiercely attacking industrial capitalism, and the utilitarian theories of political economy underpinning it. He repudiated his sometimes grandiloquent style, writing now in plainer, simpler language, to communicate his message straightforwardly.

Ruskin's social view broadened from concerns about the dignity of labour to consider issues of citizenship and notions of the ideal community. Just as he had questioned aesthetic orthodoxy in his earliest writings, he now dissected the orthodox political economy espoused by John Stuart Mill, based on theories of laissez-faire and competition drawn from the work of Adam Smith, David Ricardo and Thomas Malthus.  In his four essays Unto This Last, Ruskin rejected the division of labour as dehumanising (separating the labourer from the product of his work), and argued that the false "science" of political economy failed to consider the social affections that bind communities together. He articulated an extended metaphor of household and family, drawing on Plato and Xenophon to demonstrate the communal and sometimes sacrificial nature of true economics.  For Ruskin, all economies and societies are ideally founded on a politics of social justice.  His ideas influenced the concept of the "social economy", characterised by networks of charitable, co-operative and other non-governmental organisations.

The essays were originally published in consecutive monthly instalments of the new Cornhill Magazine between August and November 1860 (and published in a single volume in 1862).  However, the Cornhill's editor, William Makepeace Thackeray, was forced to abandon the series by the outcry of the magazine's largely conservative readership and the fears of a nervous publisher (Smith, Elder & Co.).  The reaction of the national press was hostile, and Ruskin was, he claimed, "reprobated in a violent manner".  Ruskin's father also strongly disapproved.  Others were enthusiastic, including Ruskin's friend Thomas Carlyle, who wrote, "I have read your paper with exhilaration... such a thing flung suddenly into half a million dull British heads... will do a great deal of good."

Ruskin's political ideas, and Unto This Last in particular, later proved highly influential. The essays were praised and paraphrased in Gujarati by Mohandas Gandhi; a wide range of autodidacts cited their positive impact; and the economist John A. Hobson and many of the founders of the British Labour Party credited them as an influence.

Ruskin believed in a hierarchical social structure. He wrote "I was, and my father was before me, a violent Tory of the old school."  He believed in man's duty to God, and while he sought to improve the conditions of the poor, he opposed attempts to level social differences and sought to resolve social inequalities by abandoning capitalism in favour of a co-operative structure of society based on obedience and benevolent philanthropy, rooted in the agricultural economy.

If there be any one point insisted on throughout my works more frequently than another, that one point is the impossibility of Equality. My continual aim has been to show the eternal superiority of some men to others, sometimes even of one man to all others; and to show also the advisability of appointing such persons or person to guide, to lead, or on occasion even to compel and subdue, their inferiors, according to their own better knowledge and wiser will.

— John Ruskin, Unto This Last: Cook and Wedderburn 17.34

       https://en.wikipedia.org/wiki/John_Ruskin

Saturday, May 29, 2021

Important New Data for Climate Models

A fiery past sheds new light on the future of global climate change.  Ice core samples reveal significant smoke aerosols in the pre-industrial Southern Hemisphere

From: Harvard (the John A. Paulson School of Engineering and Applied Sciences)

May 28, 2021 -- Centuries-old smoke particles preserved in the ice reveal a fiery past in the Southern Hemisphere and shed new light on the future impacts of global climate change, according to new research published in Science Advances.

"Up till now, the magnitude of past fire activity, and thus the amount of smoke in the preindustrial atmosphere, has not been well characterized," said Pengfei Liu, a former graduate student and postdoctoral fellow at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and first author of the paper. "These results have importance for understanding the evolution of climate change from the 1750s until today, and for predicting future climate."

One of the biggest uncertainties when it comes to predicting the future impacts of climate change is how fast surface temperatures will rise in response to increases in greenhouse gases. Predicting these temperatures is complicated since it involves the calculation of competing warming and cooling effects in the atmosphere. Greenhouse gases trap heat and warm the planet's surface while aerosol particles in the atmosphere from volcanoes, fires and other combustion cool the planet by blocking sunlight or seeding cloud cover. Understanding how sensitive surface temperature is to each of these effects and how they interact is critical to predicting the future impact of climate change.

Many of today's climate models rely on past levels of greenhouse gases and aerosols to validate their predictions for the future. But there's a problem: while pre-industrial levels of greenhouse gases are well documented, the amount of smoke aerosols in the preindustrial atmosphere is not.

To model smoke in the pre-industrial Southern Hemisphere, the research team looked to Antarctica, where the ice trapped smoke particles emitted from fires in Australia, Africa and South America. Ice core scientists and co-authors of the study, Joseph McConnell and Nathan Chellman from the Desert Research Institute in Nevada, measured soot, a key component of smoke, deposited in an array of 14 ice cores from across the continent, many provided by international collaborators.

"Soot deposited in glacier ice directly reflects past atmospheric concentrations so well-dated ice cores provide the most reliable long-term records," said McConnell.

What they found was unexpected.

"While most studies have assumed less fire took place in the preindustrial era, the ice cores suggested a much fierier past, at least in the Southern Hemisphere," said Loretta Mickley, Senior Research Fellow in Chemistry-Climate Interactions at SEAS and senior author of the paper.

To account for these levels of smoke, the researchers ran computer simulations that account for both wildfires and the burning practices of indigenous people.

"The computer simulations of fire show that the atmosphere of the Southern Hemisphere could have been very smoky in the century before the Industrial Revolution. Soot concentrations in the atmosphere were up to four times greater than previous studies suggested. Most of this was caused by widespread and regular burning practiced by indigenous peoples in the pre-colonial period," said Jed Kaplan, Associate Professor at the University of Hong Kong and co-author of the study.

This result agrees with the ice core records that also show that soot was abundant before the start of the industrial era and has remained relatively constant through the 20th century. The modelling suggests that as land use changes decreased fire activity, emissions from industry increased.

What does this finding mean for future surface temperatures?

By underestimating the cooling effect of smoke particles in the pre-industrial world, climate models might have over-estimated the warming effect of carbon dioxide and other greenhouse gases in order to account for the observed increases in surface temperatures.
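That compensation logic can be sketched with a toy energy-balance calculation. This is an illustrative back-of-the-envelope sketch, not the study's model; every forcing and temperature number below is a placeholder chosen only to show the direction of the effect.

```python
# Toy sketch (hypothetical numbers): if observed warming is explained as
#   dT = sensitivity * (ghg_forcing + aerosol_forcing),
# then assuming too large an aerosol cooling change forces a *higher*
# inferred sensitivity to match the same observed warming.

def implied_sensitivity(observed_warming, ghg_forcing, aerosol_forcing):
    """Temperature response per unit forcing (K per W/m^2) implied by a
    simple energy-balance view of the historical record."""
    return observed_warming / (ghg_forcing + aerosol_forcing)

# Placeholder values, for illustration only.
observed_warming = 1.1   # K of warming since pre-industrial times
ghg_forcing = 2.7        # W/m^2 from greenhouse gases

# If the pre-industrial atmosphere was already smoky, the *change* in
# aerosol cooling since then is smaller (less negative) than assumed.
assumed_aerosol = -0.9   # W/m^2: large cooling change (old assumption)
revised_aerosol = -0.4   # W/m^2: smaller change (smokier baseline)

s_assumed = implied_sensitivity(observed_warming, ghg_forcing, assumed_aerosol)
s_revised = implied_sensitivity(observed_warming, ghg_forcing, revised_aerosol)

# A smaller aerosol cooling change implies a lower sensitivity is needed
# to reproduce the same observed warming.
print(round(s_assumed, 3), round(s_revised, 3))
```

The direction of the result, not the numbers, is the point: with less aerosol cooling to offset, the same observed warming is consistent with a less sensitive climate.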

"Climate scientists have known that the most recent generation of climate models have been over-estimating surface temperature sensitivity to greenhouse gases, but we haven't known why or by how much," said Liu. "This research offers a possible explanation."

"Clearly the world is warming but the key question is how fast will it warm as greenhouse gas emissions continue to rise. This research allows us to refine our predictions moving forward," said Mickley.

       https://www.sciencedaily.com/releases/2021/05/210528152517.htm

Friday, May 28, 2021

Protecting the Brain from Hypoxia

Serendipitous discovery could lead to treatment for strokes, cardiac arrest

From: Massachusetts General Hospital

May 28, 2021 -- Lack of oxygen, which is harmful to the brain, causes hydrogen sulfide 'sewer gas' to accumulate in the brain. The brains of lab animals repeatedly exposed to hydrogen sulfide became tolerant to the gas and lack of oxygen. Researchers identified the mechanism that induces this tolerance, which could lead to new treatments for brain injuries caused by oxygen deprivation.

In a surprising discovery, researchers at Massachusetts General Hospital (MGH) identified a mechanism that protects the brain from the effects of hypoxia, a potentially lethal deprivation of oxygen. This serendipitous finding, which they report in Nature Communications, could aid in the development of therapies for strokes, as well as brain injury that can result from cardiac arrest, among other conditions.

However, this study began with a very different objective, explains senior author Fumito Ichinose, MD, PhD, an attending physician in the Department of Anesthesia, Critical Care and Pain Medicine at MGH, and principal investigator in the Anesthesia Center for Critical Care Research. One area of focus for Ichinose and his team is developing techniques for inducing suspended animation, that is, putting a human's vital functions on temporary hold, with the ability to "reawaken" them later. This state of being would be similar to what bears and other animals experience during hibernation. Ichinose believes that the ability to safely induce suspended animation could have valuable medical applications, such as pausing the life processes of a patient with an incurable disease until an effective therapy is found. It could also allow humans to travel long distances in space (which has frequently been depicted in science fiction).

A 2005 study found that inhaling a gas called hydrogen sulfide caused mice to enter a state of suspended animation. Hydrogen sulfide, which has the odor of rotten eggs, is sometimes called "sewer gas." Oxygen deprivation in a mammal's brain leads to increased production of hydrogen sulfide. As this gas accumulates in the tissue, hydrogen sulfide can halt energy metabolism in neurons and cause them to die. Oxygen deprivation is a hallmark of ischemic stroke, the most common type of stroke, and other injuries to the brain.

In the Nature Communications study, Ichinose and his team initially set out to learn what happens when mice are exposed to hydrogen sulfide repeatedly, over an extended period. At first, the mice entered a suspended-animation-like state -- their body temperatures dropped and they were immobile. "But, to our surprise, the mice very quickly became tolerant to the effects of inhaling hydrogen sulfide," says Ichinose. "By the fifth day, they acted normally and were no longer affected by hydrogen sulfide."

Interestingly, the mice that became tolerant to hydrogen sulfide were also able to tolerate severe hypoxia. What protected these mice from hypoxia? Ichinose's group suspected that enzymes in the brain that metabolize sulfide might be responsible. They found that levels of one enzyme, called sulfide:quinone oxidoreductase (SQOR), rose in the brains of mice when they breathed hydrogen sulfide several days in a row. They hypothesized that SQOR plays a part in resistance to hypoxia.

There was strong evidence for this hypothesis in nature. For example, female mammals are known to be more resistant than males to the effects of hypoxia -- and the former have higher levels of SQOR. When SQOR levels are artificially reduced in females, they become more vulnerable to hypoxia. (Estrogen may be responsible for the observed increase in SQOR, since protection from the adverse effects of hypoxia is lost when a female mammal's estrogen-producing ovaries are removed.) Moreover, some hibernating animals, such as the thirteen-lined ground squirrel, are highly tolerant of hypoxia, which allows them to survive as their bodies' metabolism slows down during the winter. A typical ground squirrel's brain has 100 times more SQOR than that of a similar-sized rat. However, when Ichinose and colleagues "turned off" expression of SQOR in the squirrels' brains, their protection against the effects of hypoxia vanished.

Meanwhile, when Ichinose and colleagues artificially increased SQOR levels in the brains of mice, "they developed a robust defense against hypoxia," explains Ichinose. His team increased the level of SQOR using gene therapy, an approach that is technically complex and not practical at this point. On the other hand, Ichinose and his colleagues demonstrated that "scavenging" sulfide, by using an experimental drug called SS-20, reduced levels of the gas, thereby sparing the brains of mice when they were deprived of oxygen.

Human brains have very low levels of SQOR, meaning that even a modest accumulation of hydrogen sulfide can be harmful, says Ichinose. "We hope that someday we'll have drugs that could work like SQOR in the body," he says, noting that his lab is studying SS-20 and several other candidates. Such medications could be used to treat ischemic strokes, as well as patients who have suffered cardiac arrest, which can lead to hypoxia. Ichinose's lab is also investigating how hydrogen sulfide affects other parts of the body. For example, hydrogen sulfide is known to accumulate in other conditions, such as certain types of Leigh syndrome, a rare but severe neurological disorder that usually leads to early death. "For some patients," says Ichinose, "treatment with a sulfide scavenger might be lifesaving."

The lead author of the study is Eizo Marutani, MD, an investigator in MGH's Department of Anesthesia, Critical Care and Pain Medicine and an instructor at Harvard Medical School (HMS). Ichinose is also the William Thomas Green Morton Professor of Anaesthesia at HMS.

https://www.sciencedaily.com/releases/2021/05/210525160848.htm

Thursday, May 27, 2021

Coming: Cheaper Solar Panels

Study of promising photovoltaic material leads to discovery of a new state of matter

May 26, 2021 -- Researchers at McGill University have gained new insight into the workings of perovskites, a semiconductor material that shows great promise for making high-efficiency, low-cost solar cells and a range of other optical and electronic devices.

Perovskites have drawn attention over the past decade because of their ability to act as semiconductors even when there are defects in the material’s crystal structure. This makes perovskites special because getting most other semiconductors to work well requires stringent and costly manufacturing techniques to produce crystals that are as defect-free as possible. In what amounts to the discovery of a new state of matter, the McGill team has made a step forward in unlocking the mystery of how perovskites pull off this trick.

“Historically, people have been using bulk semiconductors that are perfect crystals. And now, all of a sudden, this imperfect, soft crystal starts to work for semiconductor applications, from photovoltaics to LEDs,” explains senior author Patanjali Kambhampati, an associate professor in the Department of Chemistry at McGill. “That's the starting point for our research: how can something that’s defective work in a perfect way?”

Quantum dots, but not as we know them

In a paper published May 26 in Physical Review Research, the researchers reveal that a phenomenon known as quantum confinement occurs within bulk perovskite crystals. Until now, quantum confinement had only been observed in particles a few nanometres in size – the quantum dots of flatscreen TV fame being one much-vaunted example. When particles are this small, their physical dimensions constrain the movement of electrons in a way that gives the particles distinctly different properties from larger pieces of the same material – properties that can be fine-tuned to produce useful effects such as the emission of light in precise colours.

Using a technique known as state-resolved pump/probe spectroscopy, the researchers have shown that a similar type of confinement occurs in bulk caesium lead bromide perovskite crystals. In other words, their experiments have uncovered quantum dot-like behaviour taking place in pieces of perovskite significantly larger than quantum dots.

Surprising result leads to unexpected discovery

The work builds on earlier research which established that perovskites, while appearing to be a solid substance to the naked eye, have certain characteristics more commonly associated with liquids. At the heart of this liquid-solid duality is an atomic lattice able to distort in response to the presence of free electrons. Kambhampati draws a comparison to a trampoline absorbing the impact of a rock thrown into its centre. Just as the trampoline will eventually bring the rock to a standstill, the distortion of the perovskite crystal lattice – a phenomenon known as polaron formation – is understood to have a stabilizing effect on the electron.

While the trampoline analogy would suggest a gradual dissipation of energy consistent with a system moving from an excited state back to a more stable one, the pump/probe spectroscopy data in fact revealed the opposite. To the researchers’ surprise, their measurements showed an overall increase in energy in the aftermath of polaron formation.

“The fact that the energy was raised shows a new quantum mechanical effect, quantum confinement like a quantum dot,” Kambhampati says, explaining that, at the size scale of electrons, the rock in the trampoline is an exciton, the bound pairing of an electron with the space it leaves behind when it is in an excited state.

“What the polaron does is confine everything into a spatially well-defined area. One of the things our group was able to show is that the polaron mixes with an exciton to form what looks like a quantum dot. In a sense, it’s like a liquid quantum dot, which is something we call a quantum drop. We hope that exploring the behavior of these quantum drops will give rise to a better understanding of how to engineer defect-tolerant optoelectronic materials.”

https://www.mcgill.ca/science/channels/news/study-promising-photovoltaic-material-leads-discovery-new-state-matter-331213

Wednesday, May 26, 2021

Is “Wokeness” Actually “Sleepiness”?

In a devastating opinion essay in today’s New York Times, Thomas B. Edsall dares to posit that the academic and intellectual concept of wokeness is “kryptonite” for the modern Democratic party.

It is not a short essay, but it is profoundly thought provoking.  See https://www.nytimes.com/2021/05/26/opinion/democrats-republicans-wokeness-cancel-culture.html

Tuesday, May 25, 2021

Certain Exercises Help Young Children with Math

Young children who practice visual working memory and reasoning tasks improve their math skills more than children who focus on spatial rotation exercises, according to a large study by researchers at Karolinska Institutet in Sweden.

From: Karolinska Institutet

May 20, 2021 -- The findings support the notion that training spatial cognition can enhance academic performance and that when it comes to math, the type of training matters. The study is published in the journal Nature Human Behaviour.

"In this large, randomized study we found that when it comes to enhancing mathematical learning in young children, the type of cognitive training performed plays a significant role," says corresponding author Torkel Klingberg, professor in the Department of Neuroscience, Karolinska Institutet. "It is an important finding because it provides strong evidence that cognitive training transfers to an ability that is different from the one you practiced."

Numerous studies have linked spatial ability -- that is, the capacity to understand and remember dimensional relations among objects -- to performance in science, technology, engineering and mathematics. As a result, some employers in these fields use spatial ability tests to vet candidates during the hiring process. This has also fueled an interest in spatial cognition training, which focuses on improving one's ability to memorize and manipulate various shapes and objects and spot patterns in recurring sequences. Some schools today include spatial exercises as part of their tutoring.

However, previous studies assessing the effect of spatial training on academic performance have had mixed results, with some showing significant improvement and others no effect at all. Thus, there is a need for large, randomized studies to determine if and to what extent spatial cognition training actually improves performance.

In this study, more than 17,000 Swedish schoolchildren between the ages of six and eight completed cognitive training via an app for either 20 or 33 minutes per day over the course of seven weeks. In the first week, the children were given identical exercises, after which they were randomly split into one of five training plans. In all groups, children spent about half of their time on mathematical number line tasks. The remaining time was randomly allotted to different proportions of cognitive training in the form of rotation tasks (2D mental rotation and tangram puzzle), visual working memory tasks or non-verbal reasoning tasks (see examples below for details). The children's math performance was tested in the first, fifth and seventh week.

The researchers found that all groups improved in mathematical performance, but that reasoning training had the largest positive impact, followed by working memory tasks. Both reasoning and memory training significantly outperformed rotation training when it came to mathematical improvement. They also observed that the benefits of cognitive training could differ threefold between individuals, which could explain the differing results of previous studies, since the individual characteristics of study participants tend to affect outcomes.

The researchers note there were some limitations to the study, including the lack of a passive control group that would allow for an estimation of the absolute effect size. Also, this study did not include a group of students who received math training only.

"While it is likely that for any given test, training on that particular skill is the most time-effective way to improve test results, our study offers a proof of principle that spatial cognitive training transfers to academic abilities," Torkel Klingberg says. "Given the wide range of areas associated with spatial cognition, it is possible that training transfers to multiple areas and we believe this should be included in any calculation by teachers and policymakers of how time-efficient spatial training is relative to training for a particular test."

The researchers received funding from the Swedish Research Council. Torkel Klingberg holds an unpaid position as chief scientific officer for Cognition Matters, the non-profit foundation that owns the cognitive training app Vektor used in this study.

Examples of training tasks in the study

  • In a number line task, a person is asked to identify the correct position of a number on a line bounded by a start and an end point. Difficulty is typically moderated by removing spatial cues, such as the tick marks on the number line, and by progressing to mathematical problems such as addition, subtraction and division.
  • In a visual working memory task, a person is asked to recollect visual objects. In this study, the children reproduced a sequence of dots on a grid by touching the screen. Difficulty was increased by adding more items.
  • In a non-verbal reasoning task, a person is asked to complete sequences of spatial patterns. In this study, the children were asked to choose the correct image to fill a blank space based on previous sequences. Difficulty was increased by adding new dimensions such as colors, shapes and dots.
  • In a rotation task, a person is asked to figure out what an object would look like if rotated. In this study, the children were asked to rotate a 2D object to fit various angles. Difficulty was moderated by increasing the angle of the rotation or the complexity of the object being rotated.
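As a concrete illustration, the number line task above can be scored as the distance between the child's mark and the target number, expressed as a fraction of the line's length. The sketch below is a minimal, hypothetical scoring scheme; the function names and the 5% tolerance are illustrative assumptions, not details of the actual Vektor app.

```python
# Minimal, hypothetical scoring for a number line estimation task.
# The function names and the 5% tolerance are illustrative assumptions,
# not details of the actual Vektor training app.

def number_line_error(target, guess, line_start=0, line_end=100):
    """Estimation error as a fraction of the line's total length."""
    length = line_end - line_start
    return abs(guess - target) / length

def score_trial(target, guess, line_start=0, line_end=100, tolerance=0.05):
    """Count a trial as correct if the mark lands within 5% of the line."""
    return number_line_error(target, guess, line_start, line_end) <= tolerance

# A child asked to place 50 on a 0-100 line marks the spot at 47:
print(number_line_error(50, 47))  # 0.03, i.e. 3% of the line's length
print(score_trial(50, 47))        # True  (within the 5% tolerance)
print(score_trial(50, 62))        # False (12% off)
```

Removing tick marks or widening the line's range (say, 0 to 1000) would make the task harder without changing the scoring, matching the difficulty progression described above.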

       https://www.sciencedaily.com/releases/2021/05/210520133755.htm

Monday, May 24, 2021

Safer Next-generation Nuclear Reactors

Computational researchers develop an advanced model of the pebble-bed reactor

From: Texas A&M College of Engineering

By Laura Simmons

May 20, 2021 -- When one of the largest modern earthquakes struck Japan on March 11, 2011, the nuclear reactors at Fukushima-Daiichi automatically shut down, as designed. The emergency systems, which would have helped maintain the necessary cooling of the core, were destroyed by the subsequent tsunami. Because the reactor could no longer cool itself, the core overheated, resulting in a severe nuclear meltdown, the likes of which haven’t been seen since the Chernobyl disaster in 1986.  

Since then, reactors have improved dramatically in terms of safety, sustainability and efficiency. Unlike the light-water reactors at Fukushima, which used liquid coolant and uranium fuel, the current generation of reactors has a variety of coolant options, including molten-salt mixtures, supercritical water and even gases like helium.

Dr. Jean Ragusa and Dr. Mauricio Eduardo Tano Retamales from the Department of Nuclear Engineering at Texas A&M University have been studying a new fourth-generation reactor design, the pebble-bed reactor. Pebble-bed reactors use spherical fuel elements (known as pebbles) and a fluid coolant (usually a gas).

“There are about 40,000 fuel pebbles in such a reactor,” said Ragusa. “Think of the reactor as a really big bucket with 40,000 tennis balls inside.”

During an accident, as the gas in the reactor core begins to heat up, it rises and cooler gas is drawn in from below, a process known as natural convection cooling. Additionally, the fuel pebbles are made from pyrolytic carbon and tristructural-isotropic particles, making them resistant to temperatures as high as 3,000 degrees Fahrenheit. As a very-high-temperature reactor (VHTR), the pebble-bed reactor can be cooled by passive natural circulation, making an accident like the one at Fukushima theoretically impossible.

However, during normal operation, a high-speed flow cools the pebbles. This flow creates movement around and between the fuel pebbles, similar to the way a gust of wind changes the trajectory of a tennis ball. How do you account for the friction between the pebbles and the influence of that friction in the cooling process?

This is the question that Ragusa and Tano aimed to answer in their most recent publication in the journal Nuclear Technology, titled “Coupled Computational Fluid Dynamics–Discrete Element Method Study of Bypass Flows in a Pebble-Bed Reactor.”

“We solved for the location of these ‘tennis balls’ using the Discrete Element Method, where we account for the flow-induced motion and friction between all the tennis balls,” said Tano. “The coupled model is then tested against thermal measurements in the SANA experiment.” 

The SANA experiment, conducted in the early 1990s, measured how the heat-transfer mechanisms in a reactor interact when transmitting heat from the center of the cylinder to its outer part. This experiment gave Tano and Ragusa a benchmark against which to validate their models.

As a result, their teams developed a coupled Computational Fluid Dynamics–Discrete Element Method model for studying the flow over a pebble bed. This model can now be applied to all high-temperature pebble-bed reactors and is the first computational model of its kind to do so. Very-high-accuracy tools such as this allow vendors to develop better reactors.
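To make the Discrete Element Method half of the coupling concrete: DEM typically treats each pebble as a sphere and resolves contacts with a spring-dashpot force along the line of centers. The sketch below is an illustrative single-contact calculation, not the authors' model; the pebble radius, stiffness and damping constants are arbitrary placeholder values.

```python
# Illustrative spring-dashpot normal contact force between two pebbles,
# the basic ingredient of a DEM pebble-bed simulation. This is NOT the
# authors' coupled CFD-DEM code; all constants are placeholders.

import math

def contact_force(p1, p2, radius=0.03, k=1e5, damping=50.0):
    """Normal contact force on p1 from p2 (zero if they do not touch).

    p1, p2: dicts with 'pos' and 'vel' as (x, y, z) tuples.
    """
    d = [a - b for a, b in zip(p1["pos"], p2["pos"])]
    dist = math.sqrt(sum(c * c for c in d))
    overlap = 2 * radius - dist
    if overlap <= 0 or dist == 0:
        return (0.0, 0.0, 0.0)
    n = [c / dist for c in d]  # unit normal pointing from p2 toward p1
    # Relative velocity along the normal (negative when approaching)
    rel_v = sum((v1 - v2) * nc for v1, v2, nc in zip(p1["vel"], p2["vel"], n))
    mag = k * overlap - damping * rel_v  # spring repulsion plus dashpot
    return tuple(mag * c for c in n)

# Two 3 cm pebbles pressed 5 mm into each other, both at rest:
a = {"pos": (0.0, 0.0, 0.0), "vel": (0.0, 0.0, 0.0)}
b = {"pos": (0.055, 0.0, 0.0), "vel": (0.0, 0.0, 0.0)}
print(contact_force(a, b))  # pushes a in the -x direction, away from b
```

A full DEM step would sum such forces over every contacting pair (plus tangential friction terms), integrate the pebbles' motion, and hand the updated positions to the CFD solver for the coolant flow.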

“The computational models we create help us more accurately assess different physical phenomena in the reactor,” said Tano. “As a result, reactors can operate at a higher margin, theoretically producing more power while increasing the safety of the reactor. We do the same thing with our models for molten-salt reactors for the Department of Energy.” 

As artificial intelligence continues to advance, its applications to computational modeling and simulation grow. “We’re in a very exciting time for the field,” said Ragusa. “And we encourage any prospective students who are interested in computational modeling to reach out, because this field will hopefully be around for a long time.”

https://engineering.tamu.edu/news/2021/05/nuen-computational-researchers-develop-advanced-model-to-improve-safety-of-next-generation-reactors.html

Sunday, May 23, 2021

Challenging the Standard Model of Cancer

New atavistic model shows role of ancient genes in the spread of cancer

From:  ARIZONA STATE UNIVERSITY

May 20, 2021 -- In spite of decades of research, cancer remains an enigma. Conventional wisdom holds that cancer is driven by random mutations that create aberrant cells that run amok in the body.

In a new paper published this week in the journal BioEssays, Arizona and Australian researchers challenge this model by proposing that cancer is a type of genetic throwback that progresses via a series of reversions to ancestral forms of life. In contrast with the conventional model, the distinctive capabilities of cancer cells are not primarily generated by mutations, the researchers claim, but are pre-existent and latent in normal cells.

Regents' Professor Paul Davies, director of Arizona State University's Beyond Center for Fundamental Concepts in Science, and Kimberly Bussey, a cancer geneticist and bioinformatician from the Precision Medicine Program at Midwestern University in Glendale, Ariz., teamed up with Charles Lineweaver and Anneke Blackburn at the Australian National University (ANU) in Canberra to refine what they call the Serial Atavism Model (SAM) of cancer. This model suggests that cancer occurs through multiple steps that resurrect ancient cellular functions.

Such functions are retained by evolution for specific purposes such as embryo development and wound healing, and are usually turned off in the adult form of complex organisms. But they can be turned back on if something compromises the organism's regulatory controls. It is the resulting resurrection steps, or atavistic reversions, that are mostly responsible for the ability of cancer cells to survive, proliferate, resist therapy and metastasize, the researchers said.

Davies and Bussey are also members of ASU's Arizona Cancer Evolution Center (ACE) which seeks to understand cancer, not just in humans, but across all complex species, in the light of evolutionary processes.

"Cancer research has been transformed in recent years by comparing genetic sequences across thousands of species to determine gene ages," Davies said. Just as geologists can date rock strata, so geneticists can date genes, a technique known as phylostratigraphy.

"The atavistic model predicts that the genes needed for cancer's abilities are mostly ancient - in some cases little changed over billions of years," Davies added.

Lineweaver explained, "In biology, nothing makes sense except in the light of evolution, and in the case of cancer nothing makes sense except in the light of the deep evolutionary changes that occurred as we became multicellular organisms."

"The atavistic model of cancer has gained increasing traction around the world," added Bussey. "In part, this is because it makes many predictions that can be tested by phylostratigraphy, unlike the conventional somatic mutation theory."

Blackburn, a cancer biologist in ANU's John Curtin School of Medical Research, agreed.

"Appreciation of the importance of gene ages is growing among oncologists and cancer biologists," she said. "Now we need to use this insight to develop novel therapeutic strategies. A better understanding of cancer can lead to better therapeutic outcomes."

                  https://www.eurekalert.org/pub_releases/2021-05/asu-cts051921.php

Saturday, May 22, 2021

Researchers Find New Form of Carbon

And that new arrangement is not graphene: researchers in Germany and Finland discover new type of atomically thin carbon material

From:  Aalto University

May 20, 2021 -- Carbon exists in various forms. In addition to diamond and graphite, there are recently discovered forms with astonishing properties. For example graphene, with a thickness of just one atomic layer, is the thinnest known material, and its unusual properties make it an extremely exciting candidate for applications like future electronics and high-tech engineering. In graphene, each carbon atom is linked to three neighbours, forming hexagons arranged in a honeycomb network. Theoretical studies have shown that carbon atoms can also arrange in other flat network patterns, while still binding to three neighbours, but none of these predicted networks had been realized until now.

Researchers at the University of Marburg in Germany and Aalto University in Finland have now discovered a new carbon network, which is atomically thin like graphene, but is made up of squares, hexagons, and octagons forming an ordered lattice. They confirmed the unique structure of the network using high-resolution scanning probe microscopy and interestingly found that its electronic properties are very different from those of graphene.

In contrast to graphene and other forms of carbon, the new material, named the biphenylene network, has metallic properties. Narrow stripes of the network, only 21 atoms wide, already behave like a metal, while graphene is a semiconductor at this size. “These stripes could be used as conducting wires in future carbon-based electronic devices,” said Professor Michael Gottfried of the University of Marburg, who leads the team that developed the idea. The lead author of the study, Qitang Fan from Marburg, continues, “This novel carbon network may also serve as a superior anode material in lithium-ion batteries, with a larger lithium storage capacity compared to that of the current graphene-based materials.”

The team at Aalto University helped image the material and decipher its properties. The group of Professor Peter Liljeroth carried out the high-resolution microscopy that showed the structure of the material, while researchers led by Professor Adam Foster used computer simulations and analysis to understand the exciting electrical properties of the material.

The new material is made by assembling carbon-containing molecules on an extremely smooth gold surface. These molecules first form chains, which consist of linked hexagons, and a subsequent reaction connects these chains together to form the squares and octagons. An important feature of the chains is that they are chiral, meaning they exist in two mirror-image forms, like left and right hands. Only chains of the same type aggregate on the gold surface, forming well-ordered assemblies, before they connect. This is critical for the formation of the new carbon material, because the reaction between two different types of chains leads only to graphene. “The new idea is to use molecular precursors that are tweaked to yield biphenylene instead of graphene,” explains Linghao Yan, who carried out the high-resolution microscopy experiments at Aalto University.

For now, the teams are working to produce larger sheets of the material so that its application potential can be further explored. However, “We are confident that this new synthesis method will lead to the discovery of other novel carbon networks,” said Professor Liljeroth.

               https://www.aalto.fi/en/news/a-new-form-of-carbon

Friday, May 21, 2021

Disagreement About Forest Management

‘We can’t plant our way out of the climate crisis’

From: University of Michigan

May 20, 2021 -- Some climate activists advocate large-scale tree-planting campaigns in forests around the world to suck up heat-trapping carbon dioxide and help rein in climate change.

But in a Perspectives article scheduled for publication May 21 in the journal Science, a University of Michigan climate scientist and his University of Arizona colleague say the idea of planting trees as a substitute for the direct reduction of greenhouse gas emissions could be a pipe dream.

“We can’t plant our way out of the climate crisis,” said Arizona’s David Breshears, a top expert on tree mortality and forest die-off in the West. His co-author is Jonathan Overpeck, dean of the U-M School for Environment and Sustainability and an expert on paleoclimate and climate-vegetation interactions.

Instead of wasting money by planting lots of trees in a way that is destined to fail, it makes more sense to focus on keeping existing forests healthy so they can continue to act as carbon “sinks,” removing carbon from the atmosphere through photosynthesis and storing it in trees and soils, according to the researchers. At the same time, emissions must be reduced as much as possible, as quickly as possible.

Overpeck and Breshears say they hope the role of the world’s forests—and specifically the urgent need to protect existing forests and keep them intact—is thoroughly debated when the world’s climate action leaders gather at the COP26 climate change conference in Glasgow this November.

“Policymakers need to enable new science, policy and finance mechanisms optimized for the disturbance and vegetation change that is unstoppable, and also to ensure that the trees and forests we wish to plant or preserve for the carbon they sequester survive in the face of climate change and other human threats,” Overpeck and Breshears wrote.

“Failure to meet this challenge will mean that large terrestrial stores of carbon will be lost to the atmosphere, accelerating climate change and the impacts on vegetation that threaten many more of the ecosystem services on which humans depend.”

Keeping forests healthy will require a new approach to forest management, one that Overpeck and Breshears call managing for change. As a first step, policymakers and land managers need to acknowledge that additional large-scale vegetation changes are inevitable.

Climate change has been implicated in record-setting wildfires in the western United States, Australia and elsewhere, as well as extensive tree die-offs that are largely due to hotter, drier climate extremes. Those disturbing trends are expected to accelerate as the climate warms, according to Overpeck and Breshears.

“Even in a world where climate change is soon halted, global temperature rise will likely reach between 1.5 and 2 C above pre-industrial levels, with all the associated extreme heat waves that brings, and thus global vegetation will face up to double the climate change already experienced,” they wrote.

At the same time, deforestation continues to expand globally and is especially damaging in tropical forests, which hold vast amounts of biodiversity and sequestered carbon.

The next step toward a new managing-for-change paradigm is to manage forests proactively for the vegetation changes that can be anticipated—instead of trying to maintain forests as they were in the 20th century, Overpeck and Breshears say.

Managing for change means, for example, more aggressive thinning of forests to reduce the buildup of fuels that stoke massive wildfires. It also means selectively replacing some trees—after a wildfire, for example—that are no longer in optimal climate zones with new species that will thrive now and in coming decades.

Such activities, where needed, will inevitably increase the costs of forest management, according to the researchers. But such costs should be considered a prudent investment, one that helps preserve an underappreciated service that forests provide to humanity for free: carbon storage, also known as carbon sequestration.

Forests are already managed to preserve the natural resources and ecosystem services they provide. In addition to supplying timber, fuelwood, fiber and other products, forests clean the air, filter the water, and help control erosion and flooding. They preserve biodiversity and promote soil formation and nutrient cycling, while offering recreational opportunities such as hiking, camping, fishing and hunting.

Carbon sequestration should rank high on the list of invaluable services that forests provide, and efforts to preserve and enhance this vital function should be funded accordingly, Overpeck and Breshears say.

For example, there’s a big opportunity to improve the ability of forests to store carbon through increased use of biochar, a form of charcoal produced by exposing organic waste matter—such as wood chips, crop residue or manure—to heat in a low-oxygen environment. Large amounts of wood generated during forest thinning projects could be converted to biochar, then added to forest soils to improve their health and increase the amount of carbon that is locked away, Overpeck says.

“Thinning of forests, conversion of the removed wood to biochar and burial of the biochar in forest soils is a way to bring new jobs to forested rural areas while allowing forests to play a bigger role in keeping carbon out of the atmosphere and thus fighting climate change,” he said. “Forest carbon management could be a boon for rural areas in need of new economic engines.”

In the long run, such projects are likely to benefit forests and enhance their ability to store carbon far more than massive tree-planting campaigns conducted without appropriate management strategies, according to Overpeck and Breshears.

“Tree-planting has great appeal to some climate activists because it is easy and not that expensive,” Breshears said. “But it’s like bailing water with a big hole in the bucket: While adding more trees can help slow ongoing warming, we’re simultaneously losing trees because of that ongoing warming.”

In their Perspectives article, Overpeck and Breshears explore the implications of a new study by Ondřej Mottl et al., also scheduled for publication May 21 in Science, titled “Global acceleration in rates of vegetation change over the past 18,000 years.”

https://news.umich.edu/forests-and-climate-change-we-cant-plant-our-way-out-of-the-climate-crisis/

 

Thursday, May 20, 2021

Spiking Immune System to Fight Cancer

A new study shows how engineered immune cells used in new cancer therapies can overcome physical barriers to allow a patient's own immune system to fight tumors.

From:  University of Minnesota

May 14, 2021 -- The research could improve cancer therapies in the future for millions of people worldwide.

The research is published in Nature Communications, a peer-reviewed, open access, scientific journal published by Nature Research.

Immunotherapy is a type of cancer treatment that, instead of using chemicals or radiation, helps the patient's immune system fight cancer. T cells are a type of white blood cell of key importance to the immune system. Cytotoxic T cells are like soldiers who search out and destroy the targeted invader cells.

While there has been success in using immunotherapy for some types of cancer in the blood or blood-producing organs, a T cell's job is much more difficult in solid tumors.

"The tumor is sort of like an obstacle course, and the T cell has to run the gauntlet to reach the cancer cells," said Paolo Provenzano, the senior author of the study and a biomedical engineering associate professor in the University of Minnesota College of Science and Engineering. "These T cells get into tumors, but they just can't move around well, and they can't go where they need to go before they run out of gas and are exhausted."

In this first-of-its-kind study, the researchers are working to engineer the T cells and develop engineering design criteria to mechanically optimize the cells or make them more "fit" to overcome the barriers. If these immune cells can recognize and get to the cancer cells, then they can destroy the tumor.

In a fibrous mass of a tumor, the stiffness of the tumor causes immune cells to slow down about two-fold -- almost like they are running in quicksand.

"This study is our first publication where we have identified some structural and signaling elements where we can tune these T cells to make them more effective cancer fighters," said Provenzano, a researcher in the University of Minnesota Masonic Cancer Center. "Every 'obstacle course' within a tumor is slightly different, but there are some similarities. After engineering these immune cells, we found that they moved through the tumor almost twice as fast no matter what obstacles were in their way."

To engineer cytotoxic T cells, the authors used advanced gene editing technologies (also called genome editing) to change the DNA of the T cells so they are better able to overcome the tumor's barriers. The ultimate goal is to slow down the cancer cells and speed up the engineered immune cells. The researchers are working to create cells that are good at overcoming different kinds of barriers. When these cells are mixed together, the goal is for groups of immune cells to overcome all the different types of barriers to reach the cancer cells.

Provenzano said the next steps are to continue studying the mechanical properties of the cells to better understand how the immune cells and cancer cells interact. The researchers are currently studying engineered immune cells in rodents and in the future are planning clinical trials in humans.

While initial research has been focused on pancreatic cancer, Provenzano said the techniques they are developing could be used on many types of cancers.

"Using a cell engineering approach to fight cancer is a relatively new field," Provenzano said. "It allows for a very personalized approach with applications for a wide array of cancers. We feel we are expanding a new line of research to look at how our own bodies can fight cancer. This could have a big impact in the future."

          https://www.sciencedaily.com/releases/2021/05/210514134222.htm

Wednesday, May 19, 2021

Certain Proteins Predict Future Dementia

Large Study of Plasma Proteins and Dementia Illuminates the Biology of Dementia and Alzheimer’s and May Help Lead to Treatments

May 17, 2021 -- The development of dementia, often from Alzheimer’s disease, late in life is associated with abnormal blood levels of dozens of proteins up to five years earlier, according to a new study led by researchers at the Johns Hopkins Bloomberg School of Public Health. Most of these proteins were not known to be linked to dementia before, suggesting new targets for prevention therapies.

The findings are based on new analyses of blood samples from over ten thousand middle-aged and elderly people—samples that were taken and stored during large-scale studies decades ago as part of an ongoing study. The researchers linked abnormal blood levels of 38 proteins to higher risks of developing Alzheimer’s within five years. Of those 38 proteins, 16 appeared to predict Alzheimer’s risk two decades in advance.

Although most of these risk markers may be only incidental byproducts of the slow disease process that leads to Alzheimer’s, the analysis pointed to high levels of one protein, SVEP1, as a likely causal contributor to that disease process.

The study was published May 14 in Nature Aging.

“This is the most comprehensive analysis of its kind to date, and it sheds light on multiple biological pathways that are connected to Alzheimer’s,” says study senior author Josef Coresh, MD, PhD, MHS, George W. Comstock Professor in the Department of Epidemiology at the Bloomberg School. “Some of these proteins we uncovered are just indicators that disease might occur, but a subset may be causally relevant, which is exciting because it raises the possibility of targeting these proteins with future treatments.”

More than six million Americans are estimated to have Alzheimer’s, the most common type of dementia, an irreversible fatal condition that leads to loss of cognitive and physical function. Despite decades of intensive study, there are no treatments that can slow the disease process, let alone stop or reverse it. Scientists widely assume that the best time to treat Alzheimer’s is before dementia symptoms develop.

Efforts to gauge people’s Alzheimer’s risk before dementia arises have focused mainly on the two most obvious features of Alzheimer’s brain pathology: clumps of amyloid beta protein known as plaques, and tangles of tau protein. Scientists have shown that brain imaging of plaques, and blood or cerebrospinal fluid levels of amyloid beta or tau, have some value in predicting Alzheimer’s years in advance.

But humans have tens of thousands of other distinct proteins in their cells and blood, and techniques for measuring many of these from a single, small blood sample have advanced in recent years. Would a more comprehensive analysis using such techniques reveal other harbingers of Alzheimer’s? That’s the question Coresh and colleagues sought to answer in this new study.

The researchers’ initial analysis covered blood samples taken during 2011–13 from more than 4,800 late-middle-aged participants in the Atherosclerosis Risk in Communities (ARIC) study, a large epidemiological study of heart disease-related risk factors and outcomes that has been running in four U.S. communities since 1985. Collaborating researchers at a laboratory technology company called SomaLogic used a technology they recently developed, SomaScan, to record levels of nearly 5,000 distinct proteins in the banked ARIC samples.

The researchers analyzed the results and found 38 proteins whose abnormal levels were significantly associated with a higher risk of developing Alzheimer’s in the five years following the blood draw.

They then used SomaScan to measure protein levels from more than 11,000 blood samples taken from much younger ARIC participants in 1993–95. They found that abnormal levels of 16 of the 38 previously identified proteins were associated with the development of Alzheimer’s in the nearly two decades between that blood draw and a follow-up clinical evaluation in 2011–13.

To verify these findings in a different patient population, the scientists reviewed the results of an earlier SomaScan of blood samples taken in 2002–06 during an Icelandic study. That study had assayed proteins including 13 of the 16 proteins identified in the ARIC analyses. Of those 13 proteins, six were again associated with Alzheimer’s risk over a roughly 10-year follow-up period.
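The winnowing across cohorts amounts to a chain of nested subsets: 38 short-term candidates, 16 that also predicted risk two decades out, 13 of those assayed in the Icelandic cohort, and 6 that replicated there. A toy sketch of that filtering logic follows; the protein names are hypothetical placeholders (the article names only SVEP1, which is deliberately left out of these made-up sets).

```python
# Toy sketch of the study's multi-stage winnowing of candidate proteins.
# All protein names are hypothetical placeholders; only the set sizes
# follow the counts reported in the article.

five_year_hits  = {f"P{i:02d}" for i in range(1, 39)}  # 38 candidates (ARIC, 2011-13 draw)
long_term_hits  = {f"P{i:02d}" for i in range(1, 17)}  # 16 held ~2 decades out (1993-95 draw)
assayed_iceland = {f"P{i:02d}" for i in range(1, 14)}  # 13 measured in the Icelandic study
replicated      = {f"P{i:02d}" for i in range(1, 7)}   # 6 replicated over ~10-year follow-up

# Each stage filters the one before it, so the sets are strictly nested:
assert replicated < assayed_iceland < long_term_hits < five_year_hits
print(len(five_year_hits), len(long_term_hits),
      len(assayed_iceland), len(replicated))  # 38 16 13 6
```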

In a further statistical analysis, the researchers compared the identified proteins with data from past studies of genetic links to Alzheimer’s. The comparison suggested strongly that one of the identified proteins, SVEP1, is not just an incidental marker of Alzheimer’s risk but is involved in triggering or driving the disease.

SVEP1 is a protein whose normal functions remain somewhat mysterious, although in a study published earlier this year it was linked to the thickened artery condition, atherosclerosis, which underlies heart attacks and strokes.

Other proteins associated with Alzheimer’s risk in the new study included several key immune proteins—which is consistent with decades of findings linking Alzheimer’s to abnormally intense immune activity in the brain.

The researchers plan to continue using techniques like SomaScan to analyze proteins in banked blood samples from long-term studies to identify potential Alzheimer’s-triggering pathways—a potential strategy to suggest new approaches for Alzheimer’s treatments.

The scientists have also been studying how protein levels in the ARIC samples are linked to other diseases such as vascular (blood vessel-related) disease in the brain, heart and the kidney.

First author Keenan Walker, PhD, worked on this analysis while on faculty at the Johns Hopkins University School of Medicine and the Bloomberg School’s Welch Center for Prevention, Epidemiology and Clinical Research. He is currently an investigator with the National Institute of Aging’s Intramural Research Program.

https://www.jhsph.edu/news/news-releases/2021/researchers-identify-proteins-that-predict-future-dementia-alzheimers-risk.html

Tuesday, May 18, 2021

Simple Surgery Prevents Some Strokes

Extends life for patients with heart arrhythmia

From: McMaster University in Canada

May 15, 2021 -- A simple surgery saves patients with heart arrhythmia from often-lethal strokes, says a large international study led by McMaster University.

Researchers found that removing the left atrial appendage (an unused, finger-like tissue that can trap blood in the heart chamber and increase the risk of clots) cuts the risk of strokes by more than one-third in patients with atrial fibrillation.

Even better, the reduced clotting risk comes on top of any other benefits conferred by blood-thinner medications patients with this condition are usually prescribed.

“If you have atrial fibrillation and are undergoing heart surgery, the surgeon should be removing your left atrial appendage, because it is a set-up for forming clots. Our trial has shown this to be both safe and effective for stroke prevention,” said Richard Whitlock, first author of the study.

“This is going to have a positive impact on tens of thousands of patients globally.”

Whitlock is a scientist at the Population Health Research Institute (PHRI), a joint institute of McMaster University and Hamilton Health Sciences (HHS). He is also a professor of surgery at McMaster, the Canada Research Chair in cardiovascular surgical trials and a cardiac surgeon for HHS, and is supported by a Heart and Stroke Foundation career award.

The co-principal investigator of the study is Stuart Connolly, who has also advanced this field by establishing the efficacy and safety of newer blood thinners. He is a professor emeritus of medicine at McMaster, a PHRI senior scientist and an HHS cardiologist.

“The results of this study will change practice right away because this procedure is simple, quick and safe for the 15 per cent of heart surgery patients who have atrial fibrillation. This will prevent a great burden of suffering due to stroke,” Connolly said.

The study results were fast tracked into publication by The New England Journal of Medicine and presented at the American College of Cardiology conference today.

The study tracked 4,811 people in 27 countries who are living with atrial fibrillation and taking blood thinners. Consenting patients undergoing cardiopulmonary bypass surgery were randomly selected for the additional left atrial appendage occlusion surgery, and their outcomes were compared with those of patients who received medication alone. All were followed for a median of four years.

Whitlock said it had been suspected since the 1940s that blood clots can form in the left atrial appendage of patients with atrial fibrillation, and that it made sense to cut off this unused structure if the heart was exposed during other surgery. This has now been proven true.

Atrial fibrillation is common in elderly people and is responsible for about 25 per cent of ischemic strokes which are caused when blood clots block arteries supplying parts of the brain. The average age of patients in the study was 71.

“In the past all we had was medicine. Now we can treat atrial fibrillation with both medicines and surgery to ensure a much better outcome,” said Whitlock.

He said that the current study tested the procedure during cardiac surgery being undertaken for other reasons, but the procedure can also be done through less invasive methods for patients not having heart surgery. He added that future studies to examine that approach will be important.

Whitlock said the left atrial appendage is a leftover from how a person’s heart forms as an embryo and it has little function later in life.

“This is an inexpensive procedure that is safe, without any long-term adverse effects, and the impact is long-term.”

External funding for the study came from the Canadian Institutes of Health Research and the Heart and Stroke Foundation of Canada.

    Simple surgery prevents strokes in heart patients – Brighter World (mcmaster.ca)