Friday, January 31, 2020

Safe Potassium-Ion Batteries


Nonflammable electrolyte for high-performance potassium batteries

Materials from Wiley, January 31, 2020 – Australian scientists have developed a nonflammable electrolyte for potassium and potassium-ion batteries, for applications in next-generation energy-storage systems beyond lithium technology. Scientists explain that the novel electrolyte based on an organic phosphate makes the batteries safer and also allows for operation at reduced concentrations, which is a necessary condition for large-scale applications.


Lithium-ion technology still dominates energy-storage applications, but it has intrinsic disadvantages, among which are the price, environmental issues, and the flammability of the electrolyte. Therefore, in next-generation technologies, scientists are replacing the lithium ion with more abundant and much cheaper ions, such as the potassium ion. However, potassium and potassium-ion batteries also face safety issues, and nonflammable electrolytes are not yet available for them.


Materials scientist Zaiping Guo and her team from the University of Wollongong, Australia, have found a solution. The researchers developed an electrolyte based on a flame-retardant material and adapted it for use in potassium batteries. Besides being nonflammable, it can be operated in batteries at concentrations suitable for large-scale applications, the scientists write.


The novel electrolyte contains triethyl phosphate, a substance known as a flame retardant, as the sole component of the solvent. Triethyl phosphate has been tested in lithium-ion batteries before, but only very high concentrations provided enough stability for long-term operation, concentrations too high for industrial applications.
The battery industry demands dilute electrolytes, which are cheaper and ensure better performance.

By using potassium ions, however, the concentrations could be reduced, the authors reported. They combined the phosphate solvent with a commonly available potassium salt and obtained an electrolyte that did not burn and allowed stable cycling of the assembled battery at concentrations of 0.9 to 2 moles per liter, a range suitable for larger scales, for example in smart-grid applications.


Key to that performance was the formation of a uniform and stable solid-electrolyte interphase layer, according to the authors. They observed this layer, which ensures operability of the electrodes, only with the phosphate electrolyte; conventional carbonate-based electrolytes were unable to build it up. The authors also reported high cycling stability, whereas under the same conditions the conventional carbonate-based electrolyte decomposed.


Guo and her team have demonstrated that next-generation potassium-ion batteries can be made safe by using a novel organic, phosphate-based electrolyte. They suggest that electrolytes based on flame retardants can be developed further and could be used for the design of other nonflammable battery systems.


https://www.sciencedaily.com/releases/2020/01/200131114739.htm

Thursday, January 30, 2020

Growing Corruption in Academia


The James G. Martin Center for Academic Renewal

The Intellectual and Moral Decline in Academic Research

By Edward Archer


January 29, 2020 -- For most of the past century, the United States was the pre-eminent nation in science and technology.  The evidence for that is beyond dispute: Since 1901, American researchers have won more Nobel prizes in medicine, chemistry and physics than any other nation.  Given our history of discovery, innovation, and success, it is not surprising that across the political landscape Americans consider the funding of scientific research to be both a source of pride and a worthy investment.


Nevertheless, in his 1961 farewell address, President Dwight D. Eisenhower warned that the pursuit of government grants would have a corrupting influence on the scientific community. He feared that while American universities were “historically the fountainhead of free ideas and scientific discovery,” the pursuit of taxpayer monies would become “a substitute for intellectual curiosity” and lead to “domination of the nation’s scholars by Federal employment…and the power of money.” Eisenhower’s fears were well-founded and prescient.


My experiences at four research universities and as a National Institutes of Health (NIH) research fellow taught me that the relentless pursuit of taxpayer funding has eliminated curiosity, basic competence, and scientific integrity in many fields.


Yet, more importantly, training in “science” is now tantamount to grant-writing and learning how to obtain funding. Organized skepticism, critical thinking, and methodological rigor, if present at all, are afterthoughts. Thus, our nation’s institutions no longer perform their role as Eisenhower’s fountainhead of free ideas and discovery. Instead, American universities often produce corrupt, incompetent, or scientifically meaningless research that endangers the public, confounds public policy, and diminishes our nation’s preparedness to meet future challenges.


Nowhere is the intellectual and moral decline more evident than in public health research. From 1970 to 2010, as taxpayer funding for public health research increased 700 percent, the number of retractions of biomedical research articles increased more than 900 percent, with most due to misconduct. Fraud and retractions increased so precipitously from 2010 to 2015 that private foundations created the Center for Scientific Integrity and “Retraction Watch” to alert the public.


One reason non-government organizations lead the battle to improve science is that universities and federal funding agencies lack accountability and often ignore fraud and misconduct. There are numerous examples in which universities refused to hold their faculty accountable until elected officials intervened, and even when found guilty, faculty researchers continued to receive tens of millions of taxpayers’ dollars. Those facts are an open secret: When anonymously surveyed, over 14 percent of researchers report that their colleagues commit fraud and 72 percent report other questionable practices. The problem goes well beyond the known frauds.


The list of elite institutions at which high-profile faculty commit misconduct is growing rapidly.


In 2018, Duke University was the eighth-largest recipient of NIH funding with $475 million, and in 2019, Duke acquired over $570 million. Yet, in 2014, a whistleblower at Duke alleged that $200 million in grants were obtained using falsified data. Despite the retraction of nearly 50 papers, Duke refused to take responsibility and mounted a legal battle. The result was a $112.5 million penalty in which Duke did not have to admit culpability. That was Duke’s second major misconduct debacle in a decade. In each instance, Duke fought to keep the details confidential while continuing to receive hundreds of millions in public funds.


Harvard is the wealthiest university in the world and, despite being a private institution, received almost $600 million in public funds from the NIH and other agencies in 2018. In fact, some faculty received more NIH funding than many states, and these funds are sufficient to pay for tuition, room, board, and books of every undergrad at Harvard. Nevertheless, Harvard’s faculty has an ever-increasing number of retractions due to misconduct or incompetence. In one case, Harvard’s teaching hospital was forced to pay $10 million because its faculty had fraudulently obtained NIH funding. The penalty was only a fraction of the NIH funds acquired by the guilty faculty.


More recently, Cornell opened its third investigation of a researcher who received more than $4.6 million from the United States Department of Agriculture and $3.3 million from the NIH. As is typical, Cornell exonerated its faculty member in the initial investigation and only reinvestigated after intense media scrutiny.


Ubiquitous sexual harassment is also emblematic of the moral decline in academic science. The number of academics found responsible for sexual harassment has skyrocketed. Yet most universities simply “pass the harasser” so that faculty can transfer their grants to another institution. A 2019 headline in The Chronicle of Higher Education read: “‘Pass the Harasser’ Is Higher Ed’s Worst-Kept Secret.” The NIH has been slow to respond and “apologizes for lack of action on sexual harassers.”

Retractions, misconduct, and harassment are only part of the decline. Incompetence is another. An article in The Economist suggested, “[f]raud is very likely second to incompetence in generating erroneous results.”


The widespread inability of publicly funded researchers to generate valid, reproducible findings is a testament to the failure of universities to properly train scientists and instill intellectual and methodologic rigor. That failure means taxpayers are being misled by results that are non-reproducible or demonstrably false.


A number of critics, including John Ioannidis of Stanford University, contend that academic research is often “conducted for no other reason than to give physicians and researchers qualifications for promotion or tenure.” In other words, taxpayers fund studies that are conducted for non-scientific reasons such as career advancement and “policy-based evidence-making.”


Incompetence in concert with a lack of accountability and political or personal agendas has grave consequences: The Economist stated that from 2000 to 2010, nearly 80,000 patients were involved in clinical trials based on research that was later retracted.


Beginning in 2013, my colleagues and I published a series of empirical refutations in top medical and scientific journals showing that no human could survive on the diets used by the U.S. government to create the Dietary Guidelines for Americans. To be precise, we demonstrated that the methods used by government and academic researchers produced data that were physiologically implausible and inadmissible as scientific evidence.


Yet, rather than address the consequences of our refutations, academic researchers simply ignored the evidence. That lack of scientific integrity leads to evermore faculty and students using demonstrably implausible dietary data every year. Given that taxpayers fund thousands of meaningless studies that generate erroneous and often ridiculous conclusions (e.g., eggs cause heart disease or coffee causes cancer), it is unsurprising that policy architects and the public are confused about “healthy eating.”


As Eisenhower feared, the pursuit of government grants corrupted our nation’s scholars and money has now become a substitute for intellectual integrity and curiosity. Nevertheless, reform is possible.


Currently, universities take 52 percent of each NIH grant as “indirect costs” to cover administrative expenses. That revenue incentivizes both a lack of accountability and misconduct while allowing the wealthiest 10 percent of universities to receive 90 percent of NIH funding. Ending the “indirect cost” legerdemain and instituting mandatory penalties against universities for faculty misconduct would effectively double research funding while disincentivizing fraud and harassment.
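To see where the "effectively double" figure comes from, here is a back-of-the-envelope sketch (my own illustration, taking the article's 52 percent figure at face value; the grant amount is hypothetical):

```python
# Back-of-the-envelope sketch of the indirect-cost arithmetic.
# Assumes, per the article, that 52% of each grant goes to overhead.
grant = 1_000_000                  # hypothetical NIH grant, in dollars
indirect_share = 0.52              # share taken as "indirect costs"

direct_research = grant * (1 - indirect_share)
print(f"Dollars reaching research today: ${direct_research:,.0f}")

# If the full grant went to research instead:
multiplier = grant / direct_research
print(f"Effective multiplier: {multiplier:.2f}x")  # ~2.08, roughly double
```

On these numbers, redirecting overhead to research multiplies effective research funding by about 2.08, which is the doubling the paragraph above describes.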


Second, the NIH should limit the number of projects an investigator can simultaneously control and institute a mandatory age limit for grant recipients.  Currently, the NIH gives more money to investigators aged 56-75 than aged 24-40. Since innovation and discovery occur early in a scientist’s career, that policy would stop elderly but well-connected researchers from impeding progress.


Finally, disallow the use of taxpayer monies for publicity. Currently, investigators have free rein to “hype” their research with taxpayer funds. Misleading or exaggerated claims in press releases have contributed significantly to the public’s confusion on nutrition and many other health issues.


Edward Archer PhD, MS, currently serves as chief science officer for EvolvingFX. His research has been profiled in The New York Times, Los Angeles Times, The Atlantic, ABC World News, and numerous online venues.


https://www.jamesgmartin.center/2020/01/the-intellectual-and-moral-decline-in-academic-research/


= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

Afterword



The link for this article leads to the text above and to the comments made about it – some of them in sizzling agreement with what Edward Archer has written.

Wednesday, January 29, 2020

Thinner, Cheaper Solar Cells


Solar panel costs have dropped lately, but slimming down silicon wafers could lead to even lower costs and faster industry expansion.

By David L. Chandler | MIT News Office
January 26, 2020 -- Costs of solar panels have plummeted over the last several years, leading to rates of solar installations far greater than most analysts had expected. But with most of the potential areas for cost savings already pushed to the extreme, further cost reductions are becoming more challenging to find.


Now, researchers at MIT and at the National Renewable Energy Laboratory (NREL) have outlined a pathway to slashing costs further, this time by slimming down the silicon cells themselves.

Thinner silicon cells have been explored before, especially around a dozen years ago when the cost of silicon peaked because of supply shortages. But this approach suffered from some difficulties: The thin silicon wafers were too brittle and fragile, leading to unacceptable levels of losses during the manufacturing process, and they had lower efficiency. The researchers say there are now ways to begin addressing these challenges through the use of better handling equipment and some recent developments in solar cell architecture.


The new findings are detailed in a paper in the journal Energy and Environmental Science, co-authored by MIT postdoc Zhe Liu, professor of mechanical engineering Tonio Buonassisi, and five others at MIT and NREL.


The researchers describe their approach as “technoeconomic,” stressing that at this point economic considerations are as crucial as the technological ones in achieving further improvements in affordability of solar panels.


Currently, 90 percent of the world’s solar panels are made from crystalline silicon, and the industry continues to grow at a rate of about 30 percent per year, the researchers say. Today’s silicon photovoltaic cells, the heart of these solar panels, are made from wafers of silicon that are 160 micrometers thick, but with improved handling methods, the researchers propose this could be shaved down to 100 micrometers —  and eventually as little as 40 micrometers or less, which would only require one-fourth as much silicon for a given size of panel.
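The silicon savings follow directly from those thicknesses. A minimal sketch of the arithmetic (my own illustration, not from the paper; it assumes the standard crystalline-silicon density of 2.33 g/cm^3 and ignores kerf losses from wafer slicing):

```python
# Silicon usage per square meter of cell area at the wafer thicknesses
# discussed in the article. Assumes crystalline-silicon density of
# 2.33 g/cm^3 and ignores kerf (sawing) losses.
SI_DENSITY_G_PER_CM3 = 2.33

def silicon_grams_per_m2(thickness_um: float) -> float:
    thickness_cm = thickness_um * 1e-4              # micrometers -> centimeters
    return SI_DENSITY_G_PER_CM3 * thickness_cm * 100 * 100  # 1 m^2 = 100 x 100 cm

baseline = silicon_grams_per_m2(160)
for t in (160, 100, 40):
    g = silicon_grams_per_m2(t)
    print(f"{t:>3} um: {g:6.1f} g/m^2  ({g / baseline:.2f}x of baseline)")
# A 40 um wafer uses one-fourth the silicon of a 160 um wafer,
# as the article notes.
```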


That could not only reduce the cost of the individual panels, they say, but even more importantly it could allow for rapid expansion of solar panel manufacturing capacity. That’s because the expansion can be constrained by limits on how fast new plants can be built to produce the silicon crystal ingots that are then sliced like salami to make the wafers. These plants, which are generally separate from the solar cell manufacturing plants themselves, tend to be capital-intensive and time-consuming to build, which could lead to a bottleneck in the rate of expansion of solar panel production. Reducing wafer thickness could potentially alleviate that problem, the researchers say.


The study looked at the efficiency levels of four variations of solar cell architecture, including PERC (passivated emitter and rear contact) cells and other advanced high-efficiency technologies, comparing their outputs at different thickness levels. The team found there was in fact little decline in performance down to thicknesses as low as 40 micrometers, using today’s improved manufacturing processes.


“We see that there’s this area (of the graphs of efficiency versus thickness) where the efficiency is flat,” Liu says, “and so that’s the region where you could potentially save some money.” Because of these advances in cell architecture, he says, “we really started to see that it was time to revisit the cost benefits.”


Changing over the huge panel-manufacturing plants to adapt to the thinner wafers will be a time-consuming and expensive process, but the analysis shows the benefits can far outweigh the costs, Liu says. It will take time to develop the necessary equipment and procedures to allow for the thinner material, but with existing technology, he says, “it should be relatively simple to go down to 100 micrometers,” which would already provide some significant savings. Further improvements in technology such as better detection of microcracks before they grow could help reduce thicknesses further.


In the future, the thickness could potentially be reduced to as little as 15 micrometers, he says. New technologies that grow thin wafers of silicon crystal directly rather than slicing them from a larger cylinder could help enable such further thinning, he says.


Development of thin silicon has received little attention in recent years because the price of silicon has declined from its earlier peak. But because of improvements in solar cell efficiency and the cost reductions already achieved in other parts of the solar panel manufacturing process and supply chain, the cost of the silicon itself is once again a factor that can make a difference, he says.


“Efficiency can only go up by a few percent. So if you want to get further improvements, thickness is the way to go,” Buonassisi says. But the conversion will require large capital investments for full-scale deployment.


The purpose of this study, he says, is to provide a roadmap for those who may be planning expansion in solar manufacturing technologies. By making the path “concrete and tangible,” he says, it may help companies incorporate this in their planning. “There is a path,” he says. “It’s not easy, but there is a path. And for the first movers, the advantage is significant.”


What may be required, he says, is for the different key players in the industry to get together and lay out a specific set of steps forward and agreed-upon standards, as the integrated circuit industry did early on to enable the explosive growth of that industry. “That would be truly transformative,” he says.


Andre Augusto, an associate research scientist at Arizona State University who was not connected with this research, says “refining silicon and wafer manufacturing is the most capital-expense (capex) demanding part of the process of manufacturing solar panels. So in a scenario of fast expansion, the wafer supply can become an issue. Going thin solves this problem in part as you can manufacture more wafers per machine without increasing significantly the capex.” He adds that “thinner wafers may deliver performance advantages in certain climates,” performing better in warmer conditions.


Renewable energy analyst Gregory Wilson of Gregory Wilson Consulting, who was not associated with this work, says “The impact of reducing the amount of silicon used in mainstream cells would be very significant, as the paper points out. The most obvious gain is in the total amount of capital required to scale the PV industry to the multi-terawatt scale required by the climate change problem. Another benefit is in the amount of energy required to produce silicon PV panels. This is because the polysilicon production and ingot growth processes that are required for the production of high efficiency cells are very energy intensive.”


Wilson adds “Major PV cell and module manufacturers need to hear from credible groups like Prof. Buonassisi’s at MIT, since they will make this shift when they can clearly see the economic benefits.”

The team also included Sarah Sofia, Hannu Lane, Sarah Wieghold and Marius Peters at MIT and Michael Woodhouse at NREL. The work was partly supported by the U.S. Department of Energy, the Singapore-MIT Alliance for Research and Technology (SMART), and by a Total Energy Fellowship through the MIT Energy Initiative.


http://news.mit.edu/2020/cheaper-solar-cells-thinner-0127

Tuesday, January 28, 2020

Nanoparticles Shrink Vascular Plaques


MSU -- January 27, 2020 -- Michigan State University and Stanford University scientists have invented a nanoparticle that eats away – from the inside out – portions of plaques that cause heart attacks.


Bryan Smith, associate professor of biomedical engineering at MSU, and a team of scientists created a “Trojan Horse” nanoparticle that can be directed to eat debris, reducing and stabilizing plaque. The discovery could be a potential treatment for atherosclerosis, a leading cause of death in the United States.


The results, published in the current issue of Nature Nanotechnology, showcase a nanoparticle that homes in on atherosclerotic plaque due to its high selectivity for a particular immune cell type: monocytes and macrophages. Once inside the macrophages in those plaques, it delivers a drug agent that stimulates the cell to engulf and eat cellular debris; essentially, it removes the diseased and dead cells in the plaque core. By reinvigorating the macrophages, the treatment reduces and stabilizes plaque.

Smith said that in future clinical trials the nanoparticle is expected to reduce the risk of most types of heart attacks, with minimal side effects due to the unprecedented selectivity of the nanodrug.


Smith’s studies focus on intercepting the signaling of the receptors in the macrophages and sending a message via small molecules using nano-immunotherapeutic platforms. Previous studies have acted on the surface of the cells, but this new approach works intracellularly and has been effective in stimulating macrophages.


“We found we could stimulate the macrophages to selectively eat dead and dying cells – these inflammatory cells are precursor cells to atherosclerosis – that are part of the cause of heart attacks,” Smith said. “We could deliver a small molecule inside the macrophages to tell them to begin eating again.”


This approach also has applications beyond atherosclerosis, he added.


"We were able to marry a groundbreaking finding in atherosclerosis by our collaborators with the state-of-the-art selectivity and delivery capabilities of our advanced nanomaterial platform. We demonstrated the nanomaterials were able to selectively seek out and deliver a message to the very cells needed,” Smith said. “It gives a particular energy to our future work, which will include clinical translation of these nanomaterials using large animal models and human tissue tests. We believe it is better than previous methods.”


Smith has filed a provisional patent and will begin marketing it later this year.


https://msutoday.msu.edu/news/2020/nanoparticle-chomps-away-plaques-that-cause-heart-attacks/

Monday, January 27, 2020

Commercial Air Travel Is Safest Ever


The rate of passenger fatalities has declined yet again in the last decade, accelerating a long-term trend.

By Peter Dizikes | MIT News Office



January 23, 2020  -- It has never been safer to fly on commercial airlines, according to a new study by an MIT professor that tracks the continued decrease in passenger fatalities around the globe.


The study finds that between 2008 and 2017, airline passenger fatalities fell significantly compared to the previous decade, as measured per individual passenger boardings — essentially the aggregate number of passengers. Globally, that rate is now one death per 7.9 million passenger boardings, compared to one death per 2.7 million boardings during the period 1998-2007, and one death per 1.3 million boardings during 1988-1997.


Going back further, the commercial airline fatality risk was one death per 750,000 boardings during 1978-1987, and one death per 350,000 boardings during 1968-1977.
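Dividing each decade's boardings-per-death figure by the previous decade's gives the per-decade improvement factor Barnett describes below. A quick check of the arithmetic, using only the figures just quoted:

```python
# Per-decade improvement factors implied by the boardings-per-death
# figures quoted in the article (worldwide rates).
boardings_per_death = {
    "1968-1977": 350_000,
    "1978-1987": 750_000,
    "1988-1997": 1_300_000,
    "1998-2007": 2_700_000,
    "2008-2017": 7_900_000,
}

decades = list(boardings_per_death.items())
for (prev_name, prev), (cur_name, cur) in zip(decades, decades[1:]):
    print(f"{prev_name} -> {cur_name}: improved by a factor of {cur / prev:.2f}")
# The earlier steps hover around 2x; the latest step is ~2.9x,
# matching Barnett's "closer to a factor of three".
```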


“The worldwide risk of being killed had been dropping by a factor of two every decade,” says Arnold Barnett, an MIT scholar who has published a new paper summarizing the study’s results. “Not only has that continued in the last decade, the [latest] improvement is closer to a factor of three. The pace of improvement has not slackened at all even as flying has gotten ever safer and further gains become harder to achieve. That is really quite impressive and is important for people to bear in mind.”


The paper, “Aviation Safety: A Whole New World?” was published online this month in Transportation Science. Barnett is the sole author.


The new research also reveals that there is discernible regional variation in airline safety around the world. The study finds that the nations housing the lowest-risk airlines are the U.S., the members of the European Union, China, Japan, Canada, Australia, New Zealand, and Israel. The aggregate fatality risk among those nations was one death per 33.1 million passenger boardings during 2008-2017. Barnett chose the nation as the unit of measurement in the study because important safety regulations for both airlines and airports are decided at the national level.


For airlines in a second set of countries, which Barnett terms the “advancing” set with an intermediate risk level, the rate is one death per 7.4 million boardings during 2008-2017. This group — comprising countries that are generally rapidly industrializing and have recently achieved high overall life expectancy and GDP per capita — includes many countries in Asia as well as some countries in South America and the Middle East.


For a third and higher-risk set of developing countries, including some in Asia, Africa, and Latin America, the death risk during 2008-2017 was one per 1.2 million passenger boardings — an improvement from one death per 400,000 passenger boardings during 1998-2007.


“The two most conspicuous changes compared to previous decades were sharp improvements in China and in Eastern Europe,” says Barnett, who is the George Eastman Professor of Management at the MIT Sloan School of Management. Those places, he notes, had safety records in the last decade that were strong even by the standards of the lowest-risk group of countries.


Overall, Barnett suggests, the rate of fatalities has declined far faster than public fears about flying.

“Flying has gotten safer and safer,” Barnett says. “It’s a factor of 10 safer than it was 40 years ago, although I bet anxiety levels have not gone down that much. I think it’s good to have the facts.”


Barnett is a long-established expert in the field of aviation safety and risk, whose work has helped contextualize accident and safety statistics. Whatever the absolute numbers of air crashes and fatalities may be — and they fluctuate from year to year — Barnett has sought to measure those numbers against the growth of air travel.


To conduct the current study, Barnett used data from a number of sources, including the Flight Safety Foundation’s Aviation Safety Network Accident Database. He mostly used data from the World Bank, based on information from the International Civil Aviation Organization, to measure the number of passengers carried, which is now roughly 4 billion per year.


In the paper, Barnett discusses the pros and cons of some alternative metrics that could be used to evaluate commercial air safety, including deaths per flight and deaths per passenger miles traveled. 

He prefers to use deaths per boarding because, as he writes in the paper, “it literally reflects the fraction of passengers who perished during air journeys.”


The new paper also includes historical data showing that even in today’s higher-risk areas for commercial aviation, the fatality rate is better, on aggregate, than it was in the leading air-travel countries just a few decades in the past.

“The risk now in the higher-risk countries is basically the risk we used to have 40-50 years ago” in the safest air-travel countries, Barnett notes.


Barnett readily acknowledges that the paper is evaluating the overall numbers, and not providing a causal account of the air-safety trend; he says he welcomes further research attempting to explain the reasons for the continued gains in air safety.


In the paper, Barnett also notes that year-to-year air fatality numbers have notable variation. In 2017, for instance, just 12 people died in the process of air travel, compared to 473 in 2018.


“Even if the overall trendline is [steady], the numbers will bounce up and down,” Barnett says. For that reason, he thinks looking at trends a decade at a time is a better way of grasping the full trajectory of commercial airline safety.


On a personal level, Barnett says he understands the kinds of concerns people have about airline travel. He began studying the subject partly because of his own worries about flying, and quips that he was trying to “sublimate my fears in a way that might be publishable.”


Those kinds of instinctive fears may well be natural, but Barnett says he hopes that his work can at least build public knowledge about the facts and put them into perspective for people who are afraid of airplane accidents.


“The risk is so low that being afraid to fly is a little like being afraid to go into the supermarket because the ceiling might collapse,” Barnett says.


http://news.mit.edu/2020/study-commercial-flights-safer-ever-0124

Sunday, January 26, 2020

The Great Galveston Hurricane


The Great Galveston hurricane, known regionally as the Great Storm of 1900, was the deadliest natural disaster in United States history, one of the deadliest hurricanes (or remnants) to affect Canada, and the fourth-deadliest Atlantic hurricane overall. The hurricane left between 6,000 and 12,000 fatalities in the United States; the number most cited in official reports is 8,000. Most of these deaths occurred in and near Galveston, Texas, after storm surge inundated the coastline with 8 to 12 ft (2.4 to 3.7 m) of water. In addition to the number killed, the storm destroyed about 7,000 buildings of all uses in Galveston, which included 3,636 destroyed homes; every dwelling in the city suffered some degree of damage. The hurricane left approximately 10,000 people in the city homeless, out of a total population of nearly 38,000. The disaster ended the Golden Era of Galveston, as the hurricane alarmed potential investors, who turned to Houston instead. In response to the storm, three engineers designed and oversaw plans to raise the Gulf of Mexico shoreline of Galveston island by 17 ft (5.2 m) and erect a 10 mi (16 km) seawall.


On August 27, 1900, a ship east of the Windward Islands detected a tropical cyclone, the first observed during the annual season. Initially at tropical storm status, it remained mostly stagnant in intensity while moving steadily west-northwestward and entered the northeastern Caribbean Sea on August 30. The storm made landfall in the Dominican Republic as a weak tropical storm on September 2. It weakened slightly while crossing Hispaniola, before re-emerging into the Caribbean Sea later that day. On September 3, the cyclone struck modern day Santiago de Cuba Province and then slowly drifted along the southern coast of Cuba. Upon reaching the Gulf of Mexico on September 6, the storm strengthened into a hurricane. Significant intensification followed and the system peaked as a Category 4 hurricane with maximum sustained winds of 145 mph (230 km/h) on September 8. Early on the next day, it made landfall to the south of Houston, Texas. The cyclone weakened quickly after moving inland and fell to tropical storm intensity late on September 9. The storm turned east-northeastward and became extratropical over Iowa on September 11. The extratropical system strengthened while accelerating across the Midwestern United States, New England, and Eastern Canada before reaching the Gulf of Saint Lawrence on September 13. After striking Newfoundland later that day, the extratropical storm entered the far North Atlantic Ocean and weakened, with the remnants last observed near Iceland on September 15.


The great storm brought flooding and severe thunderstorms to portions of the Caribbean, especially Cuba and Jamaica. It is likely that much of South Florida experienced tropical storm-force winds, though mostly minor damage occurred. Hurricane-force winds and storm surge inundated portions of southern Louisiana, though the cyclone left no significant structural damage or fatalities in the state. 

The hurricane brought strong winds and storm surge to a large portion of east Texas, with Galveston suffering the brunt of the impact. Farther north, the storm and its remnants continued to produce heavy rains and gusty winds, which downed telegraph wires, signs, and trees in several states. 

Fatalities occurred in other states, including fifteen in Ohio, six in Wisconsin, two in Illinois, two in New York, one in Massachusetts, and one in Missouri. Damage from the storm throughout the United States exceeded $34 million. The remnants also brought severe impact to Canada. In Ontario, damage reached about $1.35 million, with $1 million to crops. The remnants of the hurricane caused at least 52 deaths – and possibly as many as 232 deaths – in Canada, mostly due to sunken vessels near Newfoundland and the French territory of Saint-Pierre. Throughout its path, the storm caused more than $35.4 million in damage.


History of Galveston Storms


The city of Galveston, formally founded in 1839, had weathered numerous storms, all of which the city survived with ease. In the late 19th century, Galveston was a booming town, with the population increasing from 29,084 people in 1890 to 37,788 people in 1900. The city was the fourth largest municipality in terms of population in the state of Texas in 1900, and had among the highest per capita income rates in the United States. Galveston had many ornate business buildings in a downtown section called The Strand, which was considered the "Wall Street of the Southwest". The city's position on the natural harbor of Galveston Bay along the Gulf of Mexico made it the center of trade in Texas, and one of the busiest ports in the nation. With this prosperity came a sense of complacency, as residents believed any future storms would be no worse than previous events. In fact, Galveston Weather Bureau section director Isaac Cline wrote an 1891 article in the Galveston Daily News that it would be impossible for a hurricane of significant strength to strike the island.


A quarter of a century earlier, the nearby town of Indianola on Matagorda Bay was undergoing its own boom. Then in 1875, a powerful hurricane blew through, nearly destroying the town. Indianola was rebuilt, though a second hurricane in 1886 caused most of the town's residents to move elsewhere. Many Galveston residents took the destruction of Indianola as an object lesson on the threat posed by hurricanes. Galveston is built on a low, flat island, little more than a large sandbar along the Gulf Coast. These residents proposed a seawall be constructed to protect the city, but the majority of the population and the city's government dismissed their concerns. Cline further argued in his 1891 article in the Galveston Daily News that a seawall was not needed due to his belief that a strong hurricane would not strike the island. As a result, the seawall was not built, and development activities on the island actively increased its vulnerability to storms. Sand dunes along the shore were cut down to fill low areas in the city, removing what little barrier there was to the Gulf of Mexico.


https://en.wikipedia.org/wiki/1900_Galveston_hurricane


Saturday, January 25, 2020

Strongest Origami Structures


Kirigami Designs Hold Thousands of Times Their Own Weight


University of Pennsylvania – January 22, 2020 -- Researchers find a new set of geometric motifs that can create self-locking, lightweight, durable structures out of soft materials. The kirigami-inspired designs can support 14,000 times their weight and, because they don't require adhesives or fasteners, can easily be flattened and re-folded.


The Japanese art of origami (from ori, folding, and kami, paper) transforms flat sheets of paper into complex sculptures. Variations include kirigami (from kiri, to cut), a version of origami that allows materials to be cut and reconnected using tape or glue.


But while both art forms are a source of ideas for science, architecture, and design, each has fundamental limitations. The flat folds required by origami produce structures that cannot lock into place, while kirigami creations can't be unfolded back into their original, flattened states because of the adhesive.


Taking inspiration from both art forms, researchers describe a new set of motifs for creating lightweight, strong, and foldable structures using soft materials. These kirigami structures can support 14,000 times their weight and, because they don't require adhesives or fasteners, can easily be flattened and refolded. Published in Physical Review X, the work was conducted by visiting graduate student Xinyu Wang and professor Randall Kamien of the University of Pennsylvania in collaboration with Simon Guest from the University of Cambridge.


Wang, a Ph.D. student at Southeast University, was interested in studying the mechanical properties of origami and kirigami structures and reached out to Kamien to start a new collaboration. After Wang arrived at the Kamien lab in September 2018, Kamien asked her to try some new designs using his group's set of rules for exploring kirigami structures.


Shortly thereafter, Wang showed Kamien a new design for a kirigami triangle that had tilted walls. Kamien was initially surprised to see that Wang had left the excess flaps from the cuts in place. "The usual kirigami route is to cut that off and tape it," says Kamien. Wang "found that, in this particular geometry, you can get the flaps to fit."


While a single triangle wasn't particularly strong on its own, the researchers noticed that when several were arranged in a repetitive design, the force they could support was much greater than expected. "Here was this structure that didn't require tape, it had cuts, and it was really strong," Kamien says. "Suddenly, we have this system that we hadn't anticipated at all."


To figure out what made this geometry so resilient, Wang made several versions in different "soft" materials, including paper, copper, and plastic. She also made versions where the cut flaps were taped, cut off, or damaged. Using industry-grade tension and compression testing equipment at the Laboratory for Research on the Structure of Matter, the scientists found that the geometric structure could support 14,000 times its own weight. The tilted, triangular design was strongest when the flaps were undamaged and untaped, and it was also stronger than the same design with vertical walls.


With the help of Guest, the researchers realized that two deviations from the group's typical kirigami rules were key to the structure's strength. When the walls of the triangles are angled, any force applied to the top can be translated into horizontal compression within the center of the design. "With the vertical ones, there's no way to turn a downward force into a sideways force without bending the paper," says Kamien. They also found that the paper-to-paper overlap from leaving the cut flaps in place allowed the triangles to press up against their neighbors, which helped distribute the vertical load.
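A toy statics calculation makes the role of the tilt concrete (my own illustration, not the paper's analysis): a wall leaning at angle θ from vertical carries a vertical load F as a compression F/cos θ along the wall, leaving a horizontal component F·tan θ that presses against neighboring cells, while a perfectly vertical wall (θ = 0) produces no horizontal component at all.

```python
# Toy statics sketch: horizontal thrust produced when a tilted wall
# carries a vertical load. Illustrative only; not the paper's model.
import math

def wall_forces(vertical_load_n: float, tilt_deg: float):
    """Resolve a vertical load on a wall tilted `tilt_deg` from vertical."""
    theta = math.radians(tilt_deg)
    along_wall = vertical_load_n / math.cos(theta)   # compression along the wall
    horizontal = vertical_load_n * math.tan(theta)   # sideways thrust on neighbors
    return along_wall, horizontal

for tilt in (0, 15, 30):
    along, horiz = wall_forces(10.0, tilt)
    print(f"tilt {tilt:2d} deg: along-wall {along:5.2f} N, horizontal {horiz:5.2f} N")
# At 0 degrees there is no horizontal component: a vertical wall cannot
# turn a downward force into a sideways force without bending.
```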


This paper is yet another example of how kirigami can be used as a "tool" for scientists and engineers, this time for creating strong, rigid objects out of soft materials. "We figured out how to use materials that can bend and stretch, and we can actually strengthen these materials," says Wang. One possible application could be to make inexpensive, lightweight, and deployable structures, such as temporary shelter tents that are strong and durable but can also be easily assembled and disassembled.


Kamien also pictures this Interleaved Kirigami Extension Assembly as a way to create furniture in the future. "Someday, you'll go to IKEA, you fold the box into the furniture, and the only thing inside is the cushion. You don't need any of those connectors or little screws," says Kamien.


Thanks to Wang's "inspired" design and Kamien's burgeoning collaboration with Wang and her advisors Jianguo Cai and Jian Feng, the possibilities for future ideas and designs are endless. "There were things about this study that are totally outside the scope of what a physicist would know," says Kamien. "It was this perfect blend of what I could do and what she could do."

https://www.sciencedaily.com/releases/2020/01/200122080509.htm

Friday, January 24, 2020

Wuhan Coronavirus Outbreak


An outbreak of a novel coronavirus was initially identified during mid-December 2019 in the city of Wuhan in central China, as an emerging cluster of people with pneumonia with no clear cause. The cluster was linked primarily to stallholders who worked at the Huanan Seafood Wholesale Market, which also sold live animals. Chinese scientists subsequently isolated a new strain of coronavirus – given the initial designation 2019-nCoV – which has been found to be at least 70 percent similar in genome sequence to SARS-CoV. With the development of a specific diagnostic PCR test for detecting the infection, a number of cases were confirmed in people directly linked to the market and in those who were not directly associated with it. Whether this virus is of the same severity or lethality as SARS is unclear.


On 20 January 2020, Chinese premier Li Keqiang urged decisive and effective efforts to prevent and control the pneumonia epidemic caused by a novel coronavirus. As of 24 January 2020, 26 deaths have occurred, all in China, and there is evidence of human-to-human transmission. Extensive testing has revealed over 900 confirmed cases in China, some of whom are healthcare workers. Confirmed cases have also been reported in Thailand, South Korea, Japan, Taiwan, Macau, Hong Kong, the United States, Singapore, Vietnam, France and Nepal.


On 23 January 2020, the WHO decided against declaring the outbreak a public health emergency of international concern. The WHO had previously warned that a wider outbreak was possible, and there were concerns of further transmission during China's peak travel season around the Chinese New Year. Many New Year events have been closed over fear of transmission, including the Forbidden City in Beijing, traditional temple fairs, and other celebratory gatherings. The sudden increase in occurrences of the disease has raised questions relating to its origin, wildlife trade, uncertainties surrounding the virus's ability to spread and cause harm, whether the virus has been circulating for longer than previously thought, and the possibility of the outbreak being a super-spreader event.


The first suspected cases were notified to WHO on 31 December 2019, with the first instances of symptomatic illness appearing just over three weeks earlier on 8 December 2019.  The market was closed off on 1 January 2020, and people who showed signs and symptoms of the coronavirus infection were isolated. Over 700 people, including more than 400 healthcare workers who came into close contact with possibly infected individuals, were initially monitored. After the development of a specific diagnostic PCR test for detecting the infection, the presence of 2019-nCoV was subsequently confirmed in 41 people in the original Wuhan cluster. Of those 41 people, two were later reported to be a married couple, one of whom had not been present in the marketplace, and another three were members of the same family that worked at the marketplace's seafood stalls. The first confirmed death from the coronavirus infection occurred on 9 January 2020.


On 23 January 2020, Wuhan was placed under quarantine, in which all public transport in and out of Wuhan has been suspended. The nearby cities of Huanggang, Ezhou, Chibi, Jingzhou, and Zhijiang were also placed under quarantine from 24 January.


Background


In Wuhan, during December 2019, an initial cluster of cases displaying the symptoms of a "pneumonia of unknown cause" was linked to a wholesale animal and fish market, which had a thousand stalls selling chickens, pheasants, bats, marmots, venomous snakes, spotted deer and the organs of rabbits and other wild animals (ye wei), i.e. bushmeat. The immediate hypothesis was that this was a novel coronavirus from an animal source (a zoonosis).


Coronaviruses mainly circulate among animals, but have been known to evolve and infect humans in the past, as has been seen with SARS and MERS, together with four further coronaviruses found in humans that cause mild respiratory symptoms like the common cold.


All six of those already known coronaviruses can spread from human to human. In 2002, with an origin in horseshoe bats, then via civets from live animal markets, an outbreak of SARS started in mainland China and, with the help of a few super-spreaders and international air travel, reached as far as Canada and the United States, resulting in over 700 deaths worldwide. The last case occurred in 2004. At the time, China was criticised by the WHO for its handling of the epidemic. Ten years after the onset of SARS, the dromedary-camel-related coronavirus, MERS, has resulted in more than 850 deaths in 27 countries. The Wuhan outbreak's association with a large seafood and animal market has led to the presumption of the illness having an animal source. This has resulted in the fear that it would be similar to the previous SARS outbreak, a concern exacerbated by the expectation of a high number of travellers for Chinese New Year, which begins on 25 January 2020.


Wuhan is the capital of Hubei province and is the seventh-largest city in China, with a population of more than 11 million people. It is a major transportation hub of the country, long known as the "Nine Provinces' Thoroughfare" (九省通衢). It is approximately 1,100 km (700 mi) south of Beijing, 800 km (500 mi) west of Shanghai, and 970 km (600 mi) north of Hong Kong. It is considered today as the political, economic, financial, commercial, cultural and educational centre of Central China. Direct flights from Wuhan also connect with Europe: six flights weekly to Paris, three weekly to London, and five weekly to Rome.


Since 2000, the World Health Organization has coordinated international responses to several new diseases such as MERS, SARS (2003–2004), 2009 swine flu, and others.


Epidemiology


Confirmed cases outside of mainland China include four women and one man in Thailand, one man in Japan, one woman in South Korea, one woman and two men in Taiwan, two men in Hong Kong, two men in Vietnam, two men and one woman in Singapore, one man and one woman in the United States, and one man in Macau. The figures are supported by experts including Michael Osterholm.


On 17 January, an Imperial College group in the UK published a Fermi estimate that there had been 1,723 cases (95% confidence interval, 427–4,471) with onset of symptoms by 12 January 2020. This was based on the pattern of the initial spread to Thailand and Japan. They also concluded that "self-sustaining human-to-human transmission should not be ruled out", which has since been confirmed as happening. As further cases came to light, they later recalculated that "4,000 cases of 2019-nCoV in Wuhan City... had onset of symptoms by 18th January 2020". A Hong Kong University group has reached a conclusion similar to the earlier study's, with additional detail on transport within China.
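The logic of the exported-case estimate: if each infected Wuhan resident has a small, calculable chance of flying abroad while infectious, then a handful of detected exports implies a much larger home epidemic. A sketch of the arithmetic (the passenger, detection-window, and catchment figures below are those reported by the Imperial group; treat them as assumptions of this illustration):

```python
# Sketch of the Imperial College exported-case (Fermi) estimate.
# Parameter values are as reported by the Imperial group; treat them
# as assumptions of this illustration.
exported_cases = 3                  # cases detected abroad (Thailand, Japan)
daily_intl_passengers = 3_301       # international departures from Wuhan airport/day
detection_window_days = 10          # mean time from infection to detection
catchment_population = 19_000_000   # people served by Wuhan's airport

# Probability that any single case travels abroad within the window:
p_export = daily_intl_passengers * detection_window_days / catchment_population

estimated_cases = exported_cases / p_export
print(f"p(export) = {p_export:.5f}")
print(f"Estimated cases in Wuhan: {estimated_cases:,.0f}")  # ~1,700,
# consistent with the reported central estimate of 1,723
```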


On 20 January, China reported a sharp rise in cases with nearly 140 new patients, including two people in Beijing and one in Shenzhen. As of 24 January, the number of laboratory-confirmed cases stands at 911, including 889 in Mainland China, 5 in Thailand, 3 in Singapore, 3 in Taiwan, 2 in Hong Kong, 2 in Macau, 2 in Vietnam, 2 in Japan, 2 in South Korea and one in the United States.


https://en.wikipedia.org/wiki/2019–20_Wuhan_coronavirus_outbreak

Thursday, January 23, 2020

Researchers Regrow Damaged Nerves


University of Pittsburgh – January 22, 2020 -- University of Pittsburgh School of Medicine researchers have created a biodegradable nerve guide -- a polymer tube -- filled with growth-promoting protein that can regenerate long sections of damaged nerves, without the need for transplanting stem cells or a donor nerve.

So far, the technology has been tested in monkeys, and the results of those experiments appeared today in Science Translational Medicine.

"We're the first to show a nerve guide without any cells was able to bridge a large, 2-inch gap between the nerve stump and its target muscle," said senior author Kacey Marra, Ph.D., professor of plastic surgery at Pitt and core faculty at the McGowan Institute for Regenerative Medicine. "Our guide was comparable to, and in some ways better than, a nerve graft."

Half of wounded American soldiers return home with injuries to their arms and legs, which aren't well protected by body armor, often resulting in damaged nerves and disability. Among civilians, car crashes, machinery accidents, cancer treatment, diabetes and even birth trauma can cause significant nerve damage, affecting more than 20 million Americans.

Peripheral nerves can regrow up to a third of an inch on their own, but if the damaged section is longer than that, the nerve can't find its target. Often, the disoriented nerve gets knotted into a painful ball called a neuroma.

The most common treatment for longer segments of nerve damage is to remove a skinny sensory nerve at the back of the leg -- which causes numbness in the leg and other complications, but has the least chance of being missed -- chop it into thirds, bundle the pieces together and then sew them to the end of the damaged motor nerve, usually in the arm. But only about 40 to 60% of the motor function typically returns.

"It's like you're replacing a piece of linguini with a bundle of angel hair pasta," Marra said. "It just doesn't work as well."

Marra's nerve guide returned about 80% of fine motor control in the thumbs of four monkeys, each with a 2-inch nerve gap in the forearm.

The guide is made of the same material as dissolvable sutures and peppered with a growth-promoting protein -- the same one delivered to the brain in a recent Parkinson's trial -- which releases slowly over the course of months.

The experiment had two controls: an empty polymer tube and a nerve graft. Since monkeys' legs are relatively short, the usual clinical procedure of removing and dicing a leg nerve wouldn't work. So, the scientists removed a 2-inch segment of nerve from the forearm, flipped it around and sewed it into place, replacing linguini with linguini, and setting a high bar for the nerve guide to match.

Functional recovery was just as good with Marra's guide as it was with this best-case-scenario graft, and the guide outperformed the graft when it came to restoring nerve conduction and replenishing Schwann cells -- the insulating layer around nerves that boosts electrical signals and supports regeneration. In both scenarios, it took a year for the nerve to regrow. The empty guide performed significantly worse all around.

With these promising results in monkeys, Marra wants to bring her nerve guide to human patients. She's working with the Food and Drug Administration (FDA) on a first-in-human clinical trial and spinning out a startup company, AxoMax Technologies Inc.

"There are no hollow tubes on the market that are approved by the FDA for nerve gaps greater than an inch. Once you get past that, no off-the-shelf tube has been shown to work," Marra said. "That's what's amazing here."



Wednesday, January 22, 2020

How Stress Causes Gray Hair


Scientists uncover link between the nervous system and stem cells that regenerate pigment


Harvard University – January 22, 2020 -- Scientists have found evidence to support long-standing anecdotes that stress causes hair graying. Researchers found that in mice, the type of nerve involved in the fight-or-flight response causes permanent damage to the pigment-regenerating stem cells in the hair follicle. The findings advance knowledge of how stress impacts the body, and are a first step toward blocking its negative effects.


When Marie Antoinette was captured during the French Revolution, her hair reportedly turned white overnight. In more recent history, John McCain experienced severe injuries as a prisoner of war during the Vietnam War -- and lost color in his hair.


For a long time, anecdotes have connected stressful experiences with the phenomenon of hair graying. Now, for the first time, Harvard University scientists have discovered exactly how the process plays out: stress activates nerves that are part of the fight-or-flight response, which in turn cause permanent damage to pigment-regenerating stem cells in hair follicles.


The study, published in Nature, advances scientists' knowledge of how stress can impact the body.


"Everyone has an anecdote to share about how stress affects their body, particularly in their skin and hair -- the only tissues we can see from the outside," said senior author Ya-Chieh Hsu, the Alvin and Esta Star Associate Professor of Stem Cell and Regenerative Biology at Harvard. "We wanted to understand if this connection is true, and if so, how stress leads to changes in diverse tissues. Hair pigmentation is such an accessible and tractable system to start with -- and besides, we were genuinely curious to see if stress indeed leads to hair graying. "


Narrowing down the culprit


Because stress affects the whole body, researchers first had to narrow down which body system was responsible for connecting stress to hair color. The team first hypothesized that stress causes an immune attack on pigment-producing cells. However, when mice lacking immune cells still showed hair graying, researchers turned to the hormone cortisol. But once more, it was a dead end.


"Stress always elevates levels of the hormone cortisol in the body, so we thought that cortisol might play a role," Hsu said. "But surprisingly, when we removed the adrenal gland from the mice so that they couldn't produce cortisol-like hormones, their hair still turned gray under stress."


After systematically eliminating different possibilities, researchers homed in on the sympathetic nervous system, which is responsible for the body's fight-or-flight response.


Sympathetic nerves branch out into each hair follicle on the skin. The researchers found that stress causes these nerves to release the chemical norepinephrine, which gets taken up by nearby pigment-regenerating stem cells.


Permanent damage


In the hair follicle, certain stem cells act as a reservoir of pigment-producing cells. When hair regenerates, some of the stem cells convert into pigment-producing cells that color the hair.


Researchers found that the norepinephrine from sympathetic nerves causes the stem cells to activate excessively. The stem cells all convert into pigment-producing cells, prematurely depleting the reservoir.


"When we started to study this, I expected that stress was bad for the body -- but the detrimental impact of stress that we discovered was beyond what I imagined," Hsu said. "After just a few days, all of the pigment-regenerating stem cells were lost. Once they're gone, you can't regenerate pigment anymore. The damage is permanent."


The finding underscores the negative side effects of an otherwise protective evolutionary response, the researchers said.


"Acute stress, particularly the fight-or-flight response, has been traditionally viewed to be beneficial for an animal's survival. But in this case, acute stress causes permanent depletion of stem cells," said postdoctoral fellow Bing Zhang, the lead author of the study.


Answering a fundamental question


To connect stress with hair graying, the researchers started with a whole-body response and progressively zoomed into individual organ systems, cell-to-cell interaction and, eventually, all the way down to molecular dynamics. The process required a variety of research tools along the way, including methods to manipulate organs, nerves, and cell receptors.


"To go from the highest level to the smallest detail, we collaborated with many scientists across a wide range of disciplines, using a combination of different approaches to solve a very fundamental biological question," Zhang said.


The collaborators included Isaac Chiu, assistant professor of immunology at Harvard Medical School, who studies the interplay between the nervous and immune systems.


"We know that peripheral neurons powerfully regulate organ function, blood vessels, and immunity, but less is known about how they regulate stem cells," Chiu said.


"With this study, we now know that neurons can control stem cells and their function, and can explain how they interact at the cellular and molecular level to link stress with hair graying."


The findings can help illuminate the broader effects of stress on various organs and tissues. This understanding will pave the way for new studies that seek to modify or block the damaging effects of stress.


"By understanding precisely how stress affects stem cells that regenerate pigment, we've laid the groundwork for understanding how stress affects other tissues and organs in the body," Hsu said. 

"Understanding how our tissues change under stress is the first critical step towards eventual treatment that can halt or revert the detrimental impact of stress. We still have a lot to learn in this area."


The study was supported by the Smith Family Foundation Odyssey Award, the Pew Charitable Trusts, Harvard Stem Cell Institute, Harvard/MIT Basic Neuroscience Grants Program, Harvard FAS and HMS Dean's Award, American Cancer Society, NIH, the Charles A. King Trust Postdoctoral Fellowship Program, and an HSCI junior faculty grant.



Story Source:

Materials provided by Harvard University. Original written by Jessica Lau.


https://www.sciencedaily.com/releases/2020/01/200122135313.htm