Wednesday, February 29, 2012

Alzheimer's-Like Memory Loss Reversed in Mice

By Randy Dotinga

WEDNESDAY, Feb. 29 (HealthDay News) -- New research in mice suggests that Alzheimer's disease triggers a protein that contributes to the breakdown of the brain's memory.

If the findings are confirmed in humans, they could solve part of the puzzle of how gunk-like substances in the brain cause Alzheimer's disease and lead to memory loss. It's conceivable that a drug could be developed to turn off the process and reverse memory problems -- as the researchers managed to do with mice.

For now, the research is in its early stages and it could take five to 10 years to get to drug experiments in humans, said study author Johannes Graff, a postdoctoral researcher at the Massachusetts Institute of Technology. Even if a drug is developed using this knowledge, it would only treat the symptoms of Alzheimer's and not the root cause, he said.

But being able to turn around the memory problems spawned by Alzheimer's would mark a major advance, Graff said, adding, "We can show that this is potentially reversible."

More than 5 million Americans have been diagnosed with Alzheimer's, according to the U.S. National Institute of Neurological Disorders and Stroke (NINDS).

Researchers believe that Alzheimer's disease begins when the brain becomes clogged by substances known as beta-amyloid plaques and tau tangles. The new research in mice, Graff said, suggests that when a protein known as histone deacetylase 2 (HDAC2) is triggered, it shuts down genes that are crucial to memory. By preventing the buildup of HDAC2 in the brains of mice, the researchers were able to protect against memory loss.

Brain tissue from deceased Alzheimer's patients also showed higher levels of HDAC2 in regions where memory and learning are known to be located, the scientists added, and they theorized that the accumulation of beta amyloid deposits in the brain may be what sends HDAC2 into overdrive.

"If your memory is everything that you know written in a book, then in order to have access, you have to open the book and to turn the pages," Graff said. In Alzheimer's, "this mechanism actually closes your memory book and makes the pages -- the genes -- inaccessible."

The good news is that this latest research suggests that the "blockade" is potentially reversible, Graff said. In other words, the book hasn't been destroyed. "We are proposing to reopen the book and allow it to be more easily read," he said.

There are caveats to the research, said Dr. Brad Dickerson, an associate professor of neurology at Harvard Medical School.

"This is a very early basic science study in mice and requires substantial additional investigation in order to determine whether it is worth pursuing in patients," he said. "The leap from animal studies to human clinical trials is a big one and always takes many years. Drugs in this class are being studied in various types of cancer, which hopefully will provide an indication of their side effects and other important information about how feasible it would be to give these types of medications to patients with Alzheimer's disease if further studies support the potential value of this approach."

Research like this is important, Dickerson added, because "we need studies like this in animals to begin to prove the concept that new drugs of this sort have potential."

The study, which was funded by NINDS, appeared online Feb. 29 in the journal Nature.

http://news.yahoo.com/alzheimers-memory-loss-reversed-mice-190305822.html

= = = = = = = = = = = = = = = = = =  = = = = = = = =

This research is similar to, but not the same as, the research into Alzheimer's using mice reported on the December 20, 2011, Daily Quiddity blog.

Tuesday, February 28, 2012

How to remove your Google Web Data History

Cnet.com February 27, 2012

Do you know if Google is tracking your Web activity? If you have a Google account (for, say, Gmail) and have not specifically located and paused the Web History setting, then the search giant is keeping track of your searches and the sites you visited. This data has been separated from other Google products, but on March 1 it will be shared across all of the Google products you use when Google's new privacy policy goes into effect.

If you'd like to prevent Google from combining this potentially sensitive data with the information it has collected from your YouTube, Google+, and other Google accounts, you can remove your Web History and stop it from being recorded moving forward.

After signing into your Google account, type https://www.google.com/history into your browser.

(Alternatively, you can choose Account Settings from the pull-down menu in the upper-right corner of a Google product such as Gmail, Google+, or Google.com. From the Account Settings page, scroll down to the Services header and click on the "Go to web history" link.) If your Web History is enabled, you'll see a list of recent searches and sites visited. Click the gray Remove all Web History button at the top of the page and a subsequent OK button to clear your Web History.


Just the way I like it, empty and paused.

This action also pauses the Web History feature so that it will no longer track your Web searches and whereabouts. If you'd like to fire it back up, simply click the blue Resume button.

(Source: Electronic Frontier Foundation)

(Via: LifeHacker)

(Credit: Matt Elliott/CNET)

http://news.yahoo.com/how-to-remove-your-google-web-data-history.html

Monday, February 27, 2012

Plants Have Made Earth Unique

Nature Geoscience had a special February 1, 2012, issue, summarized by Scientific American the same day. Mark Fischetti, writing the summary for Scientific American, noted that interest in exoplanets that may be similar to Earth has led to a surprising conclusion: we will probably never find a planet similar to Earth, because of the effect plants have had on this planet's surface.

We know that oceans and land masses formed, mountains arose and rain washed over the surface of our planet. Rivers weathered rock to make soil and plants took root.

But new research shows that the last part -- rocks eroding, then plants growing -- has the order wrong. Vascular plants (those with xylem and phloem, the tissues that conduct water) created the rivers and muds that built the soils leading to forests and farmland.

Vascular plants emerged around 450 million years ago and began soaking up carbon dioxide, more than the oceans could. Temperatures dropped, creating cycles of widespread glaciation and melting that ground down the Earth's surface.


Plants formed the modern rivers we see today, say Martin Gibling of Dalhousie University in Nova Scotia and Neil Davies of the University of Ghent in Belgium in another Nature Geoscience article. Water used to run over land in broad sheets with no defined courses. Vegetation was needed to break down rock and hold mud in place to form river banks. The channeled water led to periodic flooding, which deposited sediment over broad areas, building up rich soil that allowed trees to take root. This caused more channels and more flooding, leading ultimately to forests and fertile plains.

Scientific American summarized this process:

"Sedimentary rocks, before plants, contained almost no mud," explains Gibling, a professor of Earth science at Dalhousie. "But after plants developed, the mud content increased dramatically. Muddy landscapes expanded greatly. A new kind of eco-space was created that wasn't there before."Gibling’s view is that plants "…create the surface system" which becomes incredibly complex because of plant life. The Nature Geoscience edition concludes in an editorial, "Even if there are a number of planets that could support tectonics, running water and the chemical cycles that are essential for life as we know it, it seems unlikely that any of them would look like Earth."

from:

http://www.scientificamerican.com/article.cfm?id=plants-created-earth-landscapel&WT.mc_id=SA_syn_CNET

Sunday, February 26, 2012

U.S. Loses Manufacturing Jobs

MIT has done a particularly thorough and fair-minded analysis of the loss of manufacturing jobs in the U.S. They conclude that consumers save money -- so much money that the loss of jobs is economically worthwhile. But for specific communities, the job losses cause profound economic decline. Workers take the retraining benefits, but the retraining often doesn't result in new employment. Disability status, oddly, skyrockets. And, counter-intuitively, only a small number abandon the distressed community and move to where the jobs are.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

MIT research: The high price of losing manufacturing jobs
Posted On: February 23, 2012

The loss of U.S. manufacturing jobs is a topic that can provoke heated arguments about globalization. But what do the cold, hard numbers reveal? How has the rise in foreign manufacturing competition actually affected the U.S. economy and its workers?

A new study co-authored by MIT economist David Autor shows that the rapid rise in low-wage manufacturing industries overseas has indeed had a significant impact on the United States. The disappearance of U.S. manufacturing jobs frequently leaves former manufacturing workers unemployed for years, if not permanently, while creating a drag on local economies and raising the amount of taxpayer-borne social insurance necessary to keep workers and their families afloat.

Geographically, the research shows, foreign competition has hurt many U.S. metropolitan areas — not necessarily the ones built around heavy manufacturing in the industrial Midwest, but many areas in the South, the West and the Northeast, which once had abundant manual-labor manufacturing jobs, often involving the production of clothing, footwear, luggage, furniture and other household consumer items. Many of these jobs were held by workers without college degrees, who have since found it hard to gain new employment.

"The effects are very concentrated and very visible locally," says Autor, professor and associate head of MIT's Department of Economics. "People drop out of the labor force, and the data strongly suggest that it takes some people a long time to get back on their feet, if they do at all." Moreover, Autor notes, when a large manufacturer closes its doors, "it does not simply affect an industry, but affects a whole locality."

In the study, published as a working paper by the National Bureau of Economic Research, Autor, along with economists David Dorn and Gordon Hanson, examined the effect of overseas manufacturing competition on 722 locales across the United States over the last two decades. This is also a research focus of MIT's ongoing study group about manufacturing, Production in the Innovation Economy (PIE); Autor is one of 20 faculty members on the PIE commission.

The findings highlight the complex effects of globalization on the United States.

"Trade tends to create diffuse beneficiaries and a concentration of losers," Autor says. "All of us get slightly cheaper goods, and we're each a couple hundred dollars a year richer for that." But those losing jobs, he notes, are "a lot worse off." For this reason, Autor adds, policymakers need new responses to the loss of manufacturing jobs: "I'm not anti-trade, but it is important to realize that there are reasons why people worry about this issue."

-- Double trouble: businesses, consumers both spend less when industry leaves
In the paper, Autor, Dorn (of the Center for Monetary and Fiscal Studies in Madrid, Spain) and Hanson (of the University of California at San Diego) specifically study the effects of rising manufacturing competition from China, looking at the years 1990 to 2007. At the start of that period, low-income countries accounted for only about 3 percent of U.S. manufacturing imports; by 2007, that figure had increased to about 12 percent, with China representing 91 percent of the increase.

The types of manufacturing for export that grew most rapidly in China during that time included the production of textiles, clothes, shoes, leather goods, rubber products — and one notable high-tech area, computer assembly. Most of these production activities involve soft materials and hands-on finishing work.

"These are labor-intensive, low-value-added [forms of] production," Autor says. "Certainly the Chinese are moving up the value chain, but basically China has been most active in low-end goods."

In conducting the study, the researchers found more pronounced economic problems in cities most vulnerable to the rise of low-wage Chinese manufacturing; these include San Jose, Calif., Providence, R.I., Manchester, N.H., and a raft of urban areas below the Mason-Dixon line — the leading example being Raleigh, N.C. "The areas that are most exposed to China trade are not the Rust Belt industries," Autor says. "They are places like the South, where manufacturing was rising, not falling, through the 1980s."

All told, as American imports from China grew more than tenfold between 1991 and 2007, roughly a million U.S. workers lost jobs due to increased low-wage competition from China -- about a quarter of all U.S. job losses in manufacturing during the time period.

And as the study shows, when businesses shut down, it hurts the local economy because of two related but distinct "spillover effects," as economists say: The shuttered businesses no longer need goods and services from local non-manufacturing firms, and their former workers have less money to spend locally as well.

A city at the 75th percentile of exposure to Chinese manufacturing, compared to one at the 25th percentile, will have roughly a 5 percent decrease in the number of manufacturing jobs and an increase of about $65 per capita in the amount of social insurance needed, such as unemployment insurance, health care insurance and disability payments.

"People like to think that workers flow freely across sectors, but in reality, they don't," Autor says. At a conservative estimate, that $65 per capita wipes out one-third of the per-capita gains realized by trade with China, in the form of cheaper goods. "Those numbers are really startling," Autor adds.

The study draws on United Nations data on international trade by goods category among developing and developed countries, combined with U.S. economic data from the Census Bureau, the Bureau of Economic Analysis and the Social Security Administration.

-- New policies for a new era?
In Autor's view, the findings mean the United States needs to improve its policy response to the problem of disappearing jobs. "We do not have a good set of policies at present for helping workers adjust to trade or, for that matter, to any kind of technological change," he says.

For one thing, Autor says, "We could have much better adjustment assistance — programs that are less fragmented, and less stingy." The federal government's Trade Adjustment Assistance (TAA) program provides temporary benefits to Americans who have lost jobs as a result of foreign trade. But as Autor, Dorn and Hanson estimate in the paper, in areas affected by new Chinese manufacturing, the increase in disability payments is a whopping 30 times as great as the increase in TAA benefits.

Therefore, Autor thinks, well-designed job-training programs would help the government's assistance efforts become "directed toward helping people reintegrate into the labor market and acquire skills, rather than helping them exit the labor market."

Still, it will likely take more research to get a better idea of what the post-employment experience is like for most people. To this end, Autor, Dorn and Hanson are conducting a new study that follows laid-off manufacturing workers over time, nationally, to get a fine-grained sense of their needs and potential to be re-employed.

"Trade may raise GDP," Autor says, "but it does make some people worse off. Almost all of us share in the gains. We could readily assist the minority of citizens who bear a disproportionate share of the costs and still be better off in the aggregate."

Source: Massachusetts Institute of Technology

http://www.sciencecodex.com/mit_research_the_high_price_of_losing_manufacturing_jobs-86717

Saturday, February 25, 2012

Two Mysterious Blood Types Identified

By Joshua E. Brown, February 22, 2012
University Communications of the University of Vermont

You probably know your blood type: A, B, AB or O. You may even know if you’re Rhesus positive or negative. But how about the Langereis blood type? Or the Junior blood type? Positive or negative? Most people have never even heard of these.

Yet this knowledge could be "a matter of life and death," says University of Vermont biologist Bryan Ballif.


biologist Bryan Ballif (photo by Joshua E. Brown)

While blood transfusion problems due to Langereis and Junior blood types are rare worldwide, several ethnic populations are at risk, Ballif notes. "More than 50,000 Japanese are thought to be Junior negative and may encounter blood transfusion problems or mother-fetus incompatibility," he writes.

But the molecular basis of these two blood types has remained a mystery — until now.

In the February issue of Nature Genetics, Ballif and his colleagues report on their discovery of two proteins on red blood cells responsible for these lesser-known blood types.

Ballif identified the two molecules as specialized transport proteins named ABCB6 and ABCG2.
"Only 30 proteins have previously been identified as responsible for a basic blood type," Ballif notes, "but the count now reaches 32."

The last new blood group proteins to be discovered were nearly a decade ago, Ballif says, "so it’s pretty remarkable to have two identified this year."

Both of the newly identified proteins are also associated with anticancer drug resistance, so the findings may also have implications for improved treatment of breast and other cancers.

Cross-border science

As part of the international effort, Ballif, assistant professor in the biology department, used a mass spectrometer at UVM funded by the Vermont Genetics Network. With this machine, he analyzed proteins purified by his longtime collaborator, Lionel Arnaud at the French National Institute for Blood Transfusion in Paris, France.

Ballif and Arnaud, in turn, relied on antibodies to Langereis and Junior blood antigens developed by Yoshihiko Tani at the Japanese Red Cross Osaka Blood Center and Toru Miyasaki at the Japanese Red Cross Hokkaido Blood Center.

After the protein identification in Vermont, the work returned to France. There Arnaud and his team conducted cellular and genetic tests confirming that these proteins were responsible for the Langereis and Junior blood types. "He was able to test the gene sequence," Ballif says, "and, sure enough, we found mutations in this particular gene for all the people in our sample who have these problems."

Transfusion troubles

Beyond the ABO blood type and the Rhesus (Rh) blood type, the International Society of Blood Transfusion recognizes twenty-eight additional blood types with names like Duffy, Kidd, Diego and Lutheran. But Langereis and Junior had not been on this list. Although the antigens for the Junior and Langereis (or Lan) blood types were identified decades ago in pregnant women having difficulties carrying babies with incompatible blood types, the genetic basis of these antigens was unknown until now. Therefore, "very few people learn if they are Langereis or Junior positive or negative," Ballif says.

"Transfusion support of individuals with an anti-Lan antibody is highly challenging," the research team wrote in Nature Genetics, "partly because of the scarcity of compatible blood donors but mainly because of the lack of reliable reagents for blood screening." And Junior-negative blood donors are extremely rare too. That may soon change.

With the findings from this new research, health care professionals will now be able to more rapidly and confidently screen for these novel blood group proteins, Ballif wrote in a recent news article. "This will leave them better prepared to have blood ready when blood transfusions or other tissue donations are required," he notes.

"Now that we know these proteins, it will become a routine test," he says.

A better match

This science may be especially important to organ transplant patients. "As we get better and better at transplants, we do everything we can to make a good match," Ballif says. But sometimes a tissue or organ transplant that looked like a good match doesn't work — the donated tissue is rejected, which can lead to many problems or death.

"We don’t always know why there is rejection," Ballif says, "but it may have to do with these proteins."

The rejection of donated tissue or blood is caused by the way the immune system distinguishes self from not-self. "If our own blood cells don’t have these proteins, they’re not familiar to our immune system," Ballif says, so the new blood doesn’t "look like self" to the complex cellular defenses of the immune system. "They’ll develop antibodies against it," Ballif says, and try to kill off the perceived invaders. In short, the body starts to attack itself.

"Then you may be out of luck," says Ballif, who notes that in addition to certain Japanese populations, European Gypsies are also at higher risk for not carrying the Langereis and Junior blood type proteins.
"There are people in the United States who have these challenges too," he says, "but it’s more rare."

Other proteins

Ballif and his international colleagues are not done with their search. "We’re following up on more unknown blood types," he says. "There are probably on the order of 10 to 15 more of these unknown blood type systems — where we know there is a problem but we don’t know what the protein is that is causing the problem."

Although these other blood systems are very rare, "if you’re that one individual, and you need a transfusion," Ballif says, "there’s nothing more important for you to know."

http://www.uvm.edu/~uvmpr/?Page=news&storyID=13259

Friday, February 24, 2012

A Riot of Cool Sneaker Seekers

A crowd of hundreds was dispersed from the Florida Mall near Orlando late Thursday night after a riot broke out over the midnight release of the $220 Nike Galaxy Air Foamposite One shoes. More than 100 police officers broke up the crowd. No arrests or injuries were reported.

The trendy Nike shoes were timed for release just before this Sunday’s NBA All-Star game in Orlando.

Early Friday, a similar problem occurred at a mall in Hyattsville, Maryland, near Washington, D.C.  One arrest was made out of a crowd of 100.

Foot Locker announced early Friday that the shoe's release was being cancelled at certain malls and stores in the interest of community safety.

These are not the first mobs going after Nike shoes. In December 2011, the release of Nike's retro Air Jordans caused fights at malls across the country. Seattle police used pepper spray to subdue some unruly shoppers.

Link (with internal link to video):

http://news.yahoo.com/blogs/sideshow/nike-foamposite-galaxy-shoe-release-causes-rioting-florida-161202536.html

Thursday, February 23, 2012

Squeezing Superconducting Crystals

Superconductors are materials that conduct electricity with no resistance, which means no loss of power at all over distance. Most superconductors work only at temperatures near absolute zero (0 kelvin). Zeeya Merali reports in the February 22, 2012, issue of the journal Nature that an iron-based crystal operates as a superconductor when under pressure. When the pressure is increased, the superconducting capacity is lost. When even more pressure is applied, the superconducting quality reappears in the crystal.

The crystal is made of iron selenide, and Liling Sun of the Institute of Physics, Chinese Academy of Sciences in Beijing, has investigated this phenomenon with her colleagues. "Pressure is a way to tune basic electronic and lattice structures by shortening atomic distances, and it can induce a rich variety of phenomena," she says.

Iron selenide acts as a superconductor up to about 30 K. By increasing pressure with a diamond anvil, the researchers found that superconductivity ceased as the pressure approached 10 gigapascals. But increasing the pressure further, above 11.5 gigapascals, caused the crystal to begin superconducting again. At about 12.5 gigapascals, the crystal could superconduct at temperatures up to 48 K, a new record for iron-selenide superconduction.

Practical applications would require that superconducting continue above 77 K, the boiling point of liquid nitrogen and thus the limit for cheap liquid-nitrogen cooling, but a Tsinghua University physicist, Qi-Kun Xue, thinks this is possible. With his colleagues, he grew an iron-selenide compound on a strontium-titanate substrate and is hoping for "a drastic increase in transition temperature," he said.

Subir Sachdev, a Harvard University physicist, notes that 'vacancies' – sites in a crystal that lack any ions – are shuffled about when pressure is applied. These ion-free zones are magnetic but not superconducting. The superconducting zones have no ion vacancies. Sachdev speculates: "Under high pressures, it looks like something happens to push the magnetic behavior out, and let superconductivity take over."

Sun and her colleagues plan neutron-scattering experiments to show how a sample’s structure changes under pressure in an effort to determine whether vacancy order or changes in magnetism or some other effect is behind the change in superconductivity.

Zeeya Merali writes in Nature:

"The findings may confuse, rather than solve, the long-standing mystery of high-temperature superconductivity, says Andrew Green, a condensed-matter physicist at University College London, UK. He notes that although the first phase of superconductivity in iron selenide, seen at lower pressures, is related to the transition seen in other high-temperature superconductors, the re-emergence of superconductivity at higher pressure is probably a new type of phase transition that follows a different mechanism. 'This is a very nice result — but I think it raises more questions than it answers.'"

The article is on-line at:

http://www.nature.com/news/superconductor-breaks-high-temperature-record-1.10081

= = = = = = = = = = = = = = = = = = = = = = = = = =

Note by the Blog Author

High-temperature, particularly room-temperature, superconductors could provide a revolution in energy generation, long-distance transmission, and energy-efficient transportation. Progress in this area, particularly in breaking temperature records for various substances, is good news for the future efficient use of power worldwide.

Wednesday, February 22, 2012

Bad Science that Disses Genetic Engineering

Bad science has been around for a long time and ultimately undoes itself. Bruce Chassy and Henry I. Miller have an article in Forbes.com that recounts how Irving Langmuir gave a speech in 1953 debunking the ESP results of J.B. Rhine at Duke University. Rhine claimed ESP results that couldn't be explained by science. Langmuir uncovered the fault: Rhine was omitting results from subjects he believed were deliberately guessing wrong to humiliate him.

Chassy and Miller go on to describe experiments by Gilles-Eric Seralini in France on genetically modified plants, specifically ones in which genes from a bacterium (Bacillus thuringiensis, or "Bt") are inserted into plant genomes, particularly corn, soybean and cotton, to enhance pest resistance.

There are four central flaws in Seralini's experiment:
  • The test was done in a petri dish on animal tissue, and the chemical was applied to naked cells, not a good enough environment to simulate the effects on an intact organism in the real world.
  • Almost every chemical, including a concentration of table salt, will be toxic to defenseless cells in petri dishes. Bt proteins cannot penetrate animal skin nor the gastrointestinal tract of animals to get to the type of cells in the petri dish.
  • Seralini used doses of Roundup (the herbicide glyphosate) to provide a comparison. But the soybeans and corn actually used for food products contain a tiny amount of Roundup, orders of magnitude smaller than the amount used by Seralini in his experiment.
  • Actual animal feeding experiments show that Bt proteins "do not harm animals at doses a million times higher than humans would encounter in their diets," note Chassy and Miller.
Chassy and Miller also point out that only about 2% of the corn harvest is used to make corn meal products like chips and that baking or frying denatures the Bt proteins, which are also denatured by acid and thus would be digested in the human gut.

The authors point out that consumers face two dangers – an "information cascade" of bad ideas repeated and parroted and thus accepted as true when no persuasive evidence contradicts the information, and "rational ignorance," which "comes into play when the cost of sufficiently informing oneself about an issue to make an informed decision on it outweighs any potential benefit one could reasonably expect from that decision."

"For example, citizens occupied with the concerns of daily living—families, jobs, health—may not consider it to be cost-effective to study the potential risks and benefits of nuclear power plants or of plasticizers in children’s toys. This is unfortunate because free speech and democratic processes can only serve society when citizens are well enough informed to be able to reject pseudoscientific claims like Séralini’s and those of other propagandists and abusers of science."
http://www.forbes.com/sites/henrymiller/2012/02/22/the-science-of-things-that-arent-so/

= = = = = = = = = = = = = = = = = = = = = = = = = = = = =

Comments by the Blog Author

I can’t resist pointing out that the 1953 example of bad science, dealing with Rhine and ESP at Duke University, revolved around Rhine throwing out test data from those whom Rhine thought were out to get him. In other words, in his view, the data was only as good as the motivation of those being tested. This flaw seems to me entirely consistent with the fraudulent analytical philosophy of "deconstructionism" which has been thoroughly criticized elsewhere (and summarized on the companion Quiddity blog).

Secondly, what Chassy and Miller say about "rational ignorance" seems to be vitally important. If you don't know science and don't pay critical attention to it in the 21st century, you are going to be conned.

Tuesday, February 21, 2012

Plants may have a single ancestor

Plastids are a class of organelles that includes chloroplasts. Chloroplasts contain the green pigment chlorophyll, which converts light into useful cell energy in a process known as photosynthesis.

Since the 1960s, biologists have debated how the first plant species arose between 1 billion and 1.5 billion years ago. The most widely held theory is that a host cell took in a blue-green alga (a cyanobacterium), which became the first plastid and gave rise to the first plant.

A research team at the Bhattacharya Laboratory at Rutgers University, led by Dana Price, analyzed the DNA of plastids from the glaucophyte Cyanophora paradoxa, a very primitive alga. This DNA was compared to the DNA from plastids of other algae (both red and green) as well as DNA from various land plants.

The paradoxa plastid DNA retained clues that it evolved from early cyanobacteria, as it includes genes needed for fermentation and starch biosynthesis. The DNA also has genes similar to those of ancient bacteria akin to the Chlamydiae, and these genes allow the products of photosynthesis to be exported to the rest of the cell and the photosynthate to be combined into polysaccharides for storage.

The genetic analysis suggests the incorporation of the cyanobacterium must have occurred just once and that it required cooperation between the host cell, the photosynthesizing cyanobacterium, and a Chlamydia-like bacterium. The cyanobacterium (now called a plastid) provided food from sunlight, and the other bacterium made the products of photosynthesis available to the host.

The article linked below notes:

"Dr Bhattacharya, after whom the laboratory at Rutgers University is named, said that the three components together formed the new organelle, but that genes would also have been recruited from multiple sources before cell walls were developed."
 
This is an important clarification, because the existence of distinctive, firm cellular walls is one of the defining characteristics of plants.

The research was published in the journal Science.

The information provided above is summarized from Lin Edwards' February 17, 2012, write-up in PhysOrg.com news at:

http://www.physorg.com/news/2012-02-ancestor.html

Monday, February 20, 2012

12 Times That Being Cheap Will Cost You

Jill Krasny at Business Insider came up with these dozen examples of false economies. I have summarized them below [and added a few comments of my own in square brackets].

Hitting the vending machine for cheap snacks

It’s expensive food, bad for the waistline, and more likely to harm your health than groceries (which are also more nutritious).

Going the poor man's divorce route to save $20,000

A good lawyer is needed to leave the parties clean of the past, including custody battles, tax filings, care payments, foreclosed homes, and other latent credit risks. Staying married and hating each other is no picnic (and not necessarily a bargain) either.

Upping your insurance deductible to save upfront

An unexpected event can cost you hundreds of thousands of dollars.

[exception by the blog author: As a CPA I contend that owning a cheap used car purchased for less than $4,000, keeping it in tip-top shape, and skipping the liability insurance is a good bargain if you have gone YEARS without any accidents…]

Opting for layaway to feel like a saver

Layaway is a marketing ploy to get you to buy! It’s not a way to manage money.

Choosing an adjustable-rate mortgage (ARM) for the lower monthly payment

The supposed advantage of an adjustable-rate mortgage is a lower initial payment. But the savings come with the risk of higher payments from higher interest rates in the future – a factor that is entirely out of your control. A fixed-rate mortgage bypasses this unnecessary and potentially prohibitive risk.
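
Here is a minimal sketch of that tradeoff, using the standard fixed-payment amortization formula; the loan amount and both interest rates are invented for illustration, not taken from the article:

    # Standard amortization: payment = P * r / (1 - (1 + r)**-n).
    # All numbers below are hypothetical, chosen only to illustrate reset risk.
    def monthly_payment(principal, annual_rate, years):
        r = annual_rate / 12.0   # monthly interest rate
        n = years * 12           # total number of monthly payments
        return principal * r / (1.0 - (1.0 + r) ** -n)

    loan = 250_000  # hypothetical loan amount
    print(monthly_payment(loan, 0.035, 30))  # ARM teaser rate of 3.5%: ~$1,123/month
    print(monthly_payment(loan, 0.060, 30))  # after a reset to 6.0%: ~$1,499/month

The jump of nearly $400 a month is the risk that the lower initial payment is buying.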

Overpaying for extended warranties

The extended warranty wouldn't be offered if the seller didn't make money off it – so it is classically a bad purchase and a waste of money.

However, if you are clumsy with laptops, consider the extended warranty [especially since a laptop is the most notoriously unreliable major appliance widely owned today – as opposed to desktop computers which seldom break down].

Overspending on bulk purchases to cut the grocery bill

It’s tempting to buy more than you can use, but it turns your house into a warehouse, it may go bad before it is used, or it may make you sick of it through repeated use or monotony.

Skipping the dentist (and other health visits) to pocket the co-pay

Teeth get more and more rotten when not treated. If you don’t have the money, look for a visiting or free dental clinic or for a dental or hygienist school within commuting distance.

Splurging on daily deals to feel thrifty

Daily deals aren’t a bargain unless it is something you need and will use right now. That’s unlikely – it is a "daily deal" because not enough people are buying them!

Shirking your pet's health to beat the vet bill

Doing so risks very expensive pet emergency care and a hideous, ghostly guilt trip for you, the pet owner. Your pet's life depends entirely on you.

Not getting a prenup to stave off attorneys' fees

Anytime an outcome can be planned and reasoned out, that process is superior to litigation and court appearances. [See also the recent blog entry on the advisability of prenup planning].

Leasing a car to avoid the mileage penalty

Leasing usually runs for three years and 36,000 miles with penalties for mileage above that figure. If you know you will drive more than that – pay upfront!

Or buy the car rather than leasing it, so you won’t have to worry.

= = = = = = = = = = = = = = = = = = = = = = =

Summarized [with a few additions] from:

-- http://finance.yahoo.com/news/12-times-that-being-cheap-will-cost-you.html

Sunday, February 19, 2012

Negative Quiddity: Obama Care



Although this cartoon is rather good sentimental propaganda, it is on the wrong side of what is actually good for Americans.

ESSENTIAL MISTAKES OF OBAMA CARE

I

It’s unconstitutional. The requirement that every adult American acquire medical insurance or face a dollar penalty on their tax return is a power grab and an unconscionable extension of the commerce clause of the Constitution. The commerce clause does not apply to refraining from economic activity, except with respect to farmers’ federally guaranteed programs to grow or not grow food.

II

It turns the IRS into cops and the annual tax return into a search conducted without probable cause. That’s not the way due process works. Under due process, there should be a reason to ask questions before the government snoops around looking for illegal activity. The revised tax return in accordance with post-2014 Obama Care constitutes a warrantless search shotgunned to every taxpayer with required answers under penalty of perjury. This is a central move toward the state having all power and the citizen having none.

III

It was written by lobbyists who grabbed what power and whatever pieces of the pie they could. Lawyers, pharmaceutical houses and insurance companies in particular wrote Obama Care (check Matt Taibbi’s articles and comments in Rolling Stone in August and September of 2009). Your Obama Care "benefits" in Nebraska are different than they are in any other state because the Democratic senator sat on the fence and insisted on special treatment. Later on in the process, the drafters cunningly double-crossed the pharmaceutical companies. Result: once implemented, there is going to be less money spent on new drugs, and there will be fewer new cures. Tort reform to reduce malpractice cases and expenses was explicitly turned down by Democrats in the Senate. Lawyers are a big part of the Democratic Party – a dishonest ambulance chaser (himself now being prosecuted) was the Democratic vice presidential nominee in 2004.

IV

The result of II and III is an insanely and needlessly complex bill that will cost tens of billions to administer every year. The president said, while it was being considered, that it would cost "not one dime" and be completely paid for by "Medicare savings." Here is a professionally prepared chart of Obama Care – see if you can find any savings and efficiencies in this bill prepared by hell-bound lobbyists:

http://www.house.gov/brady/pdf/Press_Packet.pdf

I got the main graphic printed out in color at 400% of normal size. I then had it framed. I gave it to my private physician for a Christmas present. He was horrified by it but fascinated. He took it home and hung it up, where he showed it to other doctors.

V

The United States has a public debt that exceeds its gross domestic product. This is reversible, normally, only for a nation which has won a war and, in the aftermath of victory, pays down its debts. As a retired CPA, including service as a state auditor and state fraud examiner, I promise you that we cannot afford Obama Care, because we cannot continue to borrow from overseas the money required to fund it for any significant period of time at all.

VI

The level of care given to individuals under Obama Care will suck. I know this because, as a ten-year veteran of the Navy, the first thing I did afterward was go to a dentist and have my teeth saved. I went to a doctor to have my arthritis treated, though it was not recognized as a problem by government doctors. I am now retired and living in a town with a Veterans Affairs hospital. I also have medical care from my civilian tenure with an employer, plus private medical insurance. I pay to see the VA (because I’m not poor enough to get free care), and in situation after situation I retreat to a private doctor and pay for important additional care. Oh, yeah, and additional medicines. There are certain medications that are patented and widely prescribed because they are so effective (Advair, for one). The government won’t prescribe them. That’s called "rationing." The government won’t pay for a root canal and crown either, even if you have a throat ulcer and need to chew well to get food past nine inches of raw throat muscle. Do you get the picture? Government medical care is mediocre – and the VA is the very best of mediocre government medicine. It cuts corners. It pinches. It minimizes. It is the government and therefore can’t be sued, so it doesn’t worry about malpractice. I don’t want to force the cartoon family at the top of this post into mediocre care. It’s inhumane. It’s worse than nothing, especially if you’re under 35.

VII

You’re going to have to wait, and wait, and wait for medical attention under Obama Care because there are going to be fewer doctors. Doctors are begging their children (themselves the group of Americans most likely to become doctors) to consider a technological Ph.D. instead of medicine and, for God’s sake!, stay out of general practice and the responsibilities of being a primary care physician.

VIII

It will push up prices. Medicare has been around since 1965. The government always eventually pays the claims, even most of the fraudulent ones. What does this mean? It is called "cost-push" inflation. Medical care and pharmaceutical prices have met or exceeded the rate of inflation for every single year since the advent of Medicare. What will happen with Obama Care? More inflation, more expensive medical visits, and, therefore, rationing. Guaranteed. The LANGUAGE for RATIONING geriatric care to the elderly is ALREADY part of the boilerplate of the law itself, for example. Again, this is inhumane and worse than nothing.

IX

Medical progress will slow. The pharmaceutical houses and medical equipment-manufacturing firms have already been smeared and singled out by President Obama himself. In the future their profits will be squeezed. They will respond to that by doing less research. There are over 700 rare diseases that a lobbying group in DC looks after and hounds Congress into funding in terms of research. There are another 700 to 800 that are so rare that no research is being funded by private or public funders. With Obama Care, people will die from diseases that could have been cured. This is a certainty.

X

About 50 million Americans are uncovered. A big slice of this group consists of young healthy adults who don’t need insurance. But, yes, there are poor sick Americans with no job or part-time work and no insurance.

What about the 260 million who have insurance? "They like it" and don’t want to switch to the Obama Care rules. It’s being forced down their throat. Do you have compassion for people with the maturity and emotional control to stay where they are as long term employees for the benefits? Only to have those benefits threatened by ambitious politicians? Why should the majority be forced to accept government-mandated mediocrity?

XI

Obama Care is NOT universal. Millions will remain ineligible and uncovered. Additionally, crazy people will stay on the street and avoid the system, even if they are dying from a condition that is curable.

Conclusion

The best possible case scenario that I can come up with is for the Supreme Court to properly defend the Constitution by striking down the mandatory insurance element of the law. The rest will collapse of its own accord.

I would suggest that the new Congress, early in 2013, enact legislation for Medicare requiring computerized questionnaires of all patients. Such a questionnaire should be thorough, complete, and portable. When a patient fills out a questionnaire before seeing a physician, that physician is FREE from MALPRACTICE. So, let’s turn the waiting room into a computer lab. Let’s do the exact reverse of what Obama first proposed – his initial approach was to computerize lab results and images. If he knew what he was doing, if he had listened to the right professionals and asked wise questions, he would have known that the DIAGNOSIS should be computerized and made into a database (subject to revision and to further improvements in the questionnaire).

Questionnaires like this already exist.

You could even build in questions to catch hypochondriacs. This is my second recommendation: on a computer diagnosis of hypochondria, a physician can refuse to treat that patient – period.

The family in the cartoon above could go to a library, access the Internet, answer questions, and get a free, world-class diagnosis. That’s halfway to a cure. With no cost. No bureaucrats. No lying to the IRS. No political postures. Even crazy people would use it, the ones who’d rather be out on the street than ask for help from anyone.

Saturday, February 18, 2012

$6 trillion in Counterfeit U.S. Bonds Seized in Switzerland

It started with local prosecutors in Potenza, Italy, looking into mafia loan-sharking. But telephone and internet intercepts led to evidence of illegal activity involving U.S. Treasury bonds. Italian, Swiss and U.S. authorities coordinated an investigation which, after a year, led to the seizure of three large trunks in a Swiss trust company which contained $6 trillion in forged U.S. Treasury bonds. U.S. experts were called in to identify the certificates as fakes.

Reuters reports that the network responsible for the forgeries appeared to want to use them to secure credit with a Swiss bank. The prosecutor in Potenza said a network involved "in many countries" was behind the scheme. The website edition of the Italian daily newspaper Corriere della Sera said the criminal network was believed to want to acquire plutonium.

American bond traders laughed at the scandal, one calling it "…kind of like fake inflation." But it’s not the first time U.S. bonds have been counterfeited: Italian financial police seized $742 billion of phony U.S. bearer bonds in the town of Chiasso, on Italy’s border with Switzerland, in 2009.

Summarized from a Reuters story at:

http://finance.yahoo.com/news/italy-police-seize-6-trillion-144806660.html

Friday, February 17, 2012

12 Scary Debt Facts for 2012

By Jill Schlesinger, CBS MoneyWatch, February 16, 2012


As President Obama unveiled the 2013 fiscal year budget, the nation's financial situation came back into sharp focus. Experts say partisan gridlock in Washington means the budget will probably go nowhere.

Considering this is an election year, however, expect politicians to harp on facts, figures and terms that most Americans weren't taught in high school. To help out, it's time to dredge up lots of scary facts to make you pay attention.

Before we get going, a quick primer on the number TRILLION:
  • $1 trillion = $1,000 billion or $1,000,000,000,000 (that's 12 zeros)
  • How hard is it to spend a trillion dollars? If you spent one dollar every second, you would have spent a million dollars in 12 days. At that same rate, it would take you 32 years to spend a billion dollars. But it would take you more than 31,000 years to spend a trillion dollars. (A quick check of this arithmetic appears after this list.)

And now, some scary facts about the debt and the deficit -- some basics:
  • Deficit = money the government spends -- money the government takes in
  • 2012 US deficit = $1.33 trillion
  • 2013 proposed budget deficit = $901 billion
  • National debt = total amount borrowed over time to fund the annual deficits
  • Current national debt = $15.3 trillion (or $49,030 for every man, woman and child in the US, or $135,773 per taxpayer)
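
The dollar-a-second arithmetic in the primer checks out; here is a quick verification in Python:

    # Verifying the "spend a dollar every second" arithmetic above.
    SECONDS_PER_DAY = 24 * 60 * 60
    SECONDS_PER_YEAR = 365.25 * SECONDS_PER_DAY

    print(1e6 / SECONDS_PER_DAY)    # ~11.6 days to spend $1 million
    print(1e9 / SECONDS_PER_YEAR)   # ~31.7 years to spend $1 billion
    print(1e12 / SECONDS_PER_YEAR)  # ~31,688 years to spend $1 trillion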


OK, let's get started!

1. The U.S. national debt on Jan. 1, 1791, was just $75 million. Today, the U.S. national debt rises by that amount about once an hour.

2. Our nation began its existence in debt after borrowing money to finance the Revolutionary War. President Andrew Jackson nearly eliminated the debt, calling it a "national curse." Jackson railed against borrowing, spending and even banks, for that matter, and he tried to eliminate all federal debt. By Jan. 1, 1835, under Jackson, the debt was just $33,733.

3. When World War II ended, the debt equaled 122 percent of GDP (GDP is a measure of the entire economy). In the 1950s and 1960s, the economy grew at an average rate of 4.3 percent a year and the debt gradually declined to 38 percent of GDP in 1970. This year, the Office of Management and Budget expects that the debt will equal nearly 100 percent of GDP.

4. Since 1938, the national debt has increased at an average annual rate of 8.5 percent. The only exceptions to the constant annual increase over the last 62 years were during the administrations of Clinton and Johnson. (Note that this is the rate of growth; the national debt still existed under both presidents.) During the Clinton presidency, debt growth was almost zero. Johnson averaged 3 percent growth of debt for the six years he served (1963-69).
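
Fact 4's growth rate can be sanity-checked with compound growth. The 1938 starting figure below (roughly $37 billion) is my assumption for illustration, not a number from the article:

    # Compound-growth check on the "8.5 percent per year since 1938" claim.
    # ASSUMPTION: national debt of about $37 billion in 1938 (not from the article).
    debt_1938 = 37e9
    debt_2012 = 15.3e12   # the current debt figure quoted above
    years = 2012 - 1938

    avg_growth = (debt_2012 / debt_1938) ** (1 / years) - 1
    print(f"{avg_growth:.1%}")  # ~8.5% per year, consistent with fact 4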

5. When Ronald Reagan took office, the U.S. national debt was just under $1 trillion. When he left office, it was $2.6 trillion. During the eight Reagan years, the US moved from being the world's largest international creditor to the largest debtor nation.

6. The U.S. national debt has more than doubled since the year 2000.
  • Under President Bush: At the end of calendar year 2000, the debt stood at $5.629 trillion. Eight years later, the federal debt stood at $9.986 trillion.
  • Under President Obama: The debt started at $9.986 trillion and escalated to $15.3 trillion, a 53 percent increase over three years.

7. The FY 2013 budget projects a deficit of $901 billion in 2013, representing 5.5 percent of GDP, down from a deficit of $1.33 trillion in FY 2012, which was the fourth consecutive year of deficits of more than $1 trillion.

8. The U.S. national debt rises at an average of approximately $3.8 billion per day.

9. The US government now borrows approximately $5 billion every business day.


10. One trillion dollars in $10 bills, if they were taped end to end, would wrap around the globe more than 380 times. That amount of money would still not be enough to pay off the U.S. national debt.
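
A quick check of fact 10, assuming a U.S. bill is about 0.156 meters long and Earth's equatorial circumference is about 40,075 km:

    # Checking the $10-bill fact.
    bills = 1e12 / 10               # number of $10 bills in $1 trillion
    chain_length_m = bills * 0.156  # end-to-end length in meters
    wraps = chain_length_m / 40_075_000
    print(round(wraps))             # ~389, i.e. "more than 380 times"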

11. The debt ceiling is the maximum amount of debt that Congress allows for the government. The current debt ceiling is $16.394 trillion effective Jan. 30, 2012.

12. The U.S. government has to borrow 43 cents of every dollar that it currently spends, four times the rate in 1980.

You can track the national debt on a daily basis here (http://www.treasurydirect.gov/NP/BPDLogin?application=np ).

Link for this CBS MoneyWatch article:
http://finance.yahoo.com/news/12-scary-debt-facts-for-2012.html

Thursday, February 16, 2012

A Less Expensive Malaria Treatment Is Coming

The World Health Organization estimates that approximately 655,000 people die from malaria each year, mostly children under 5 in Africa. The best treatment for the disease is artemisinin, a drug extracted from the sweet wormwood plant, which grows primarily in Vietnam and China. Current processing and refining of wormwood extract into artemisinin makes the drug too expensive for widespread use in the poor areas where malaria is most common.

Chemists at the Max Planck Institute in Germany have taken the waste product from the production of artemisinin, artemisinic acid, and converted it into artemisinin itself through exposure to ultraviolet light. The device for the conversion is about the size of a briefcase, so it could be added to production sites anywhere in the world. Current production technology yields ten times as much of this waste product as usable artemisinin.

A paper discussing the new production technique was published this month in Angewandte Chemie, a chemistry journal.

The Associated Press covered this story and provided this quote:
"Four hundred of these would be enough to make a world supply of artemisinin," said unit director Peter Seeberger, pointing to the machine on a table in his lab in Berlin's Dahlem neighborhood. "The beauty of these things is they're very small and very mobile."Others have tried to convert the artemisinic acid to artemisinin using ultraviolet light, but the process took several steps in a large tank of acid, making the technique expensive and inefficient. The Max Planck chemists created a small machijne that pumps required ingredients through a thin tube wrapped around a UV lamp. The process is continuous and takes about four and one-half minutes in total.
 
Summarized from:

http://news.yahoo.com/malaria-method-could-boost-drug-production-092948236.html

Wednesday, February 15, 2012

New Electric Luxury SUV from Tesla

Tesla's Model X Receives $40 Million in Advance Sales Just One Day after Unveiling

By Tiffany Kaiser, Daily Tech, February 15, 2012

Just last week, Tesla Motors revealed the all-electric Model X crossover, which is the follow-up to its Model S. It has been less than a week since the EV's introduction, and it has already achieved star status with car lovers everywhere.

According to The Detroit News, Tesla received $40 million in pre-sales of the all-electric Model X just one day after unveiling the car. It was also the third most searched term on Google.


Tesla Model X Electric SUV Crossover


"On Thursday evening, the night of the reveal, traffic to teslamotors.com increased 2,800 percent," said Tesla. "Two-thirds of all visitors were new to the website."

The all-electric Model X was introduced for the first time on February 9. The new EV features dual motor all wheel drive, the choice between a 60 or 85 kWh battery, and falcon doors. The Model X can sprint from 0 to 60 in about 4.4 seconds, and offers a rear-mounted 300 HP motor and an optional 150 HP front-mounted motor. The driving range is between 214 and 267 miles.

Price hasn't been announced for the Model X yet, but Tesla said it will be competitively priced with other premium SUVs.

While the Model X has been receiving plenty of attention, it's not the only Tesla doing so. The Model S, Tesla's full-sized battery-electric sedan expected to be delivered in mid-2012, had a 30 percent boost in reservations last week after the Model X was revealed.

Tesla initially entered the electric vehicle arena with the Roadster, which is a $100,000 two-seater that launched in 2008. The Model S is Tesla's second electric vehicle, which features a 40 kWh lithium-ion battery pack (or 85 kWh battery pack in the top-end model), 160-mile range (300 miles on the top-end model), and a $57,400 to $87,400 price tag.

Model X production will begin at the end of 2013, with market launch scheduled for 2014. It is expected to qualify for the $7,500 tax credit, and Tesla hopes to produce 10,000 to 15,000 units annually.

Sources: SlashGear, The Detroit News
http://www.dailytech.com/Teslas+Model+X+Receives+40+Million+in+Advanced+Sales+Just+One+Day+After+Unveiling/article24012.htm

Tuesday, February 14, 2012

Searching for Cheap and Clean Electricity

Lasers plus a crushing magnetic field may make fusion more efficient
By Chris Lee | Published in Ars Technica, February 7, 2012

Ever since I first heard about the idea, I have loved inertial confinement fusion. The basic concept involves blowing stuff up with lasers to get some energy, then doing it again and again as fast as possible. What more could a 38-going-on-5-year-old want? Well, what I might also want is a fusion reaction that generates more energy than you put into it.

One thing that lets me down about inertial confinement fusion is that the implosion that gets the fusion reaction going also acts to stop the fusion. One idea for improving the fusion reaction that has been floating around for a while is to use magnetic fields in place of lasers to increase the efficiency of the fusion burn. But until recently, no one could figure out how to make it work properly.
 

A crash course in inertial confinement fusion


Fusion is the process whereby the atomic nuclei of lighter elements are combined to make heavier elements. So sticking two deuterium atoms together (deuterium is a form of hydrogen with a neutron and a proton) will give you helium and 3 MeV (480×10⁻¹⁵ J) of energy. To put that in perspective, one gram of deuterium will provide 144 billion joules of energy when it is completely burned into helium. One gram of benzene, a common hydrocarbon, releases just 48 kJ when oxidized (burned in the normal sense).
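A quick unit check (mine, not the article's) confirms the per-reaction figure, and shows the per-gram comparison with benzene spans about six orders of magnitude:

EV_TO_JOULE = 1.602e-19             # joules per electron-volt

energy_j = 3e6 * EV_TO_JOULE        # 3 MeV in joules
print(f"3 MeV = {energy_j:.2e} J")  # ~4.8e-13 J, i.e. 480e-15 J

deuterium_j_per_g = 144e9           # from the article
benzene_j_per_g = 48e3              # from the article
print(f"Fusion vs. combustion, per gram: {deuterium_j_per_g / benzene_j_per_g:.0e}x")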

But fusion is not so easy to achieve. Although atoms are electrically neutral, the parts that need to be stuck together—the atomic nuclei—are positively charged and repel each other. The external pressure needs to be high enough that it overcomes the Coulomb forces holding the nuclei apart.
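To get a feel for the scale of that repulsion (a back-of-envelope estimate of my own, assuming the deuterons must approach within roughly 2 femtometers of each other), the Coulomb barrier comes out near 0.7 MeV:

K_COULOMB = 8.988e9      # Coulomb constant, N*m^2/C^2
E_CHARGE = 1.602e-19     # elementary charge, C
SEPARATION_M = 2e-15     # assumed nuclear-scale separation, ~2 fm

barrier_j = K_COULOMB * E_CHARGE ** 2 / SEPARATION_M
barrier_mev = barrier_j / E_CHARGE / 1e6
print(f"Coulomb barrier: {barrier_j:.2e} J (about {barrier_mev:.2f} MeV)")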

In traditional inertial confinement fusion, the compression is driven by lasers (all the really cool stuff involves a laser somewhere). A perfectly spherical droplet of deuterium and tritium (tritium is hydrogen with two neutrons) ice is dropped through a target zone, where it is illuminated from many different directions by a very intense pulse of laser light. The photons are all either reflected or absorbed—either way, they give the deuterium and tritium atoms a kick toward the center of the target area. How hard a kick? The nuclei end up moving at about 30 million meters per second.

Right at the center of the target, the pressure is large enough to initiate fusion. Once that begins, the center of the pellet begins expanding, creating a compressed shell that also begins to fuse. Ideally, the chain reaction proceeds outward to completely burn away the deuterium-tritium pellet.

But a complete burn is usually prevented by the lasers that initiate the fusion process.

There are two critical issues. First, the electrons are stripped away from the nuclei and leave the area. In doing so, they carry away vital energy, reducing the temperature and the initial pressure. This slows down the fusion process, allowing the pressure to drop and preventing the expanding shell of fusion from achieving a complete burn.

The second issue is more technical. The pressure that drives the initial compression needs to be evenly applied—the laser pulses all need to have exactly the same energy, the same spatial beam profile, and arrive at the target at the same time. If they don't, much of the deuterium and tritium sprays out of the pellet and never undergoes fusion.
 

A magnetic field makes everything better


For many years, fusion scientists had thought that if a magnetic field were used to compress the target, then a more complete fusion burn might be possible. The basic idea is that the role of the lasers changes. Instead of being responsible for compressing the pellet, they are only required to pre-heat the deuterium and tritium. Then, before the pellet explodes, the magnetic field is turned on, compressing it and initiating fusion.

The magnetic field acts on all charged particles, so it confines both electrons and nuclei, keeping the energy within the pellet. Furthermore, because everything is confined, the speed at which the nuclei need to be moving is reduced to just 1 million meters per second. If you think that isn't significant, consider that energy is proportional to the square of speed, so we are talking about requiring a thousand times less energy to initiate fusion.
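The arithmetic behind that claim is simple to check: since kinetic energy scales as the square of speed, the reduction factor is (30/1)² = 900, roughly the quoted thousand.

v_laser_only = 30e6      # m/s, required nuclei speed without the field
v_magnetized = 1e6       # m/s, required nuclei speed with the field

reduction = (v_laser_only / v_magnetized) ** 2
print(f"Energy reduction factor: {reduction:.0f}")   # 900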

But the magnetic field itself uses energy, and early calculations showed that it might slow down the expansion of the burn shell, which would also result in an incomplete burn. It would help—the total gain in energy production from magnetically confined inertial fusion was predicted to be a factor of ten. But we need gains on the order of a factor of 50 to make fusion break even. So the entire idea seemed destined for the scrap heap.

This is where this latest bit of research comes in. Slutz and Vesey from Sandia National Laboratories have shown that, if you modify the structure of the pellet, then energy gains between 200 and 1,000 are possible. The major finding is that the pellet and initial heating stage need to be modified. Slutz and Vesey start with a fairly standard pellet: a cylindrical piece of cryogenically cooled deuterium/tritium, surrounded by either aluminum or beryllium (this is the conductor that the magnetic field acts on).

The pellet is fabricated so that the density of the ice is very high just inside the metal shell. And it seems (though the authors never explicitly say) that the whole cylinder is large enough in diameter that only the center of the pellet is heated by the incoming laser beams. The laser beams themselves don't hit it from every direction, but only along the axis of the cylinder.
 

Where the laser meets the hydrogen


The laser pulse heats the material at the very center of the pellet, creating a gas in that location. Before the outside of the pellet can heat up, the magnetic field is turned on, crushing the metal liner and compressing the gas. Fusion initiates, and the expanding shell of fusing material runs right into the layer of dense ice, slamming it into the shell before it can escape outwards. The result is a nearly complete burn.

The researchers calculated the amount of current and the duration of the current pulse required to produce the magnetic fields, and the numbers they came up with are not unreasonable (50-70 MA for ~100 ns). They also looked into the fabrication of the pellet.
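For a sense of scale (my own estimate; the 3 mm liner radius below is a hypothetical number, not one taken from the paper), the field at the surface of a cylindrical liner carrying current I is B = μ₀I/2πr:

import math

MU_0 = 4 * math.pi * 1e-7    # vacuum permeability, T*m/A
current_a = 60e6             # mid-range of the quoted 50-70 MA
liner_radius_m = 3e-3        # hypothetical 3 mm liner radius (assumption)

b_surface_t = MU_0 * current_a / (2 * math.pi * liner_radius_m)
print(f"Field at liner surface: {b_surface_t:.0f} T")   # ~4,000 tesla

Fields of thousands of tesla are far beyond anything a laboratory magnet can sustain, which is why the current pulse and the implosion have to do the work together.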

One critical issue is the smoothness of the inner shell of the surrounding metal layer. They show that they require the surface to be perfect to within about 20 nm, while current technology routinely manages 30 nm. The additional precision should be feasible with current technology.

Where I think the authors may have missed the mark is earlier in their calculations. It seems that they require the laser beam to create a gas with a relatively sharp boundary so that the shell of dense ice is left untouched, even as the interior is vaporized. It is unclear from the paper if they calculate the heating stage explicitly or not. I believe they do, but that leaves unanswered questions about how it's done.

On a computer, it is very easy to create marvelous laser beams with very narrow effects. But in the laboratory, laser beams have strict limitations. Intensities change relatively smoothly, meaning that there is no sharp boundary between where the laser is heating material and where it is not. In addition, laser beams change diameter as they propagate, so the diameter of the heated zone compared to the unheated zone will change depending on where the pellet is hit by the laser.
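The changing diameter follows the standard Gaussian-beam formula w(z) = w₀√(1 + (z/z_R)²), where z_R = πw₀²/λ is the Rayleigh range. A short sketch with hypothetical numbers of my own (a 1.06 µm laser focused to a 50 µm waist radius; neither figure is from the paper):

import math

wavelength_m = 1.06e-6    # assumed near-infrared laser wavelength
waist_m = 50e-6           # assumed 50 micron focal-spot radius

rayleigh_m = math.pi * waist_m ** 2 / wavelength_m
print(f"Rayleigh range: {rayleigh_m * 1e3:.1f} mm")
for z_mm in (0, 2, 5, 10):
    w = waist_m * math.sqrt(1 + (z_mm * 1e-3 / rayleigh_m) ** 2)
    print(f"z = {z_mm:2d} mm: beam radius = {w * 1e6:.0f} um")

Over just a few millimeters, pellet-scale distances, the beam radius already grows by tens of percent, which is exactly the difficulty with heating only a sharply bounded core.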

From what is in the paper, it is hard to say if the initial conditions required for a good burn can be met with a laser. What this really calls for, of course, is a huge experiment where people like me get to blow stuff up.

Physical Review Letters 2012, DOI: 10.1103/PhysRevLett.108.025003

http://arstechnica.com/science/news/2012/02/a-crushing-magnetic-field-combined-with-a-laser-may-make-fusion-more-efficient.ars?comments=1#comments-bar

= = = = = = = = = = = = = = = = = = = = = = = =

A Burning Criticism

(From the comments section at the bottom of the article linked above)
[From:] Maury Markowitz | Tue Feb 07, 2012 6:55 pm
Hi Chris,

I hate to be *that* guy, but this approach is not new, and has been widely studied. Depending on who you ask it goes by a couple of names, but you can read this for an intro:
http://en.wikipedia.org/wiki/Magnetized_target_fusion

Long and short: it's unlikely this will work any better than any other fusion approach using highly non-uniform temperature profiles. Richtmyer–Meshkov instability is a *bitch*, and every time we crank the density, the mixing goes non-linear and we get only one branch closer to the stars.

You have to remember that ICF, and MIF/MTF, is an example of pathological science. It's all based on a mathematical mistake in 1972. Don't believe me? Look it up! So that mistake said we could get fusion with 1 kJ drivers. Oh, but when you fix the mistake, it's 100 kJ. Well, we can probably build that with enough money. So we make Shiva. Then we try it at 100 kJ and it doesn't work. Oh! New issues to factor in. So we factor in the next level of complication and it goes to 1 MJ. Well, we can probably build that with enough money... So now finally we're closing in on that with NIF and LM, but of course there's Centurion suggesting the correct number is 100 MJ.

MTF is certainly better on paper right now, but I suspect its chance of success is about zero. And the "fusion establishment" basically agrees, which is why it isn't getting any money. Now, if money was there to get, who would it go to? Sandia. And where did this paper come from? Sandia. Hmmmmm...

Don't get me wrong, I'm all for fusion research - as compared to white elephants like the LHC, for instance. But I am highly suspicious of any claim of high gain in MIF, as history demonstrates we're likely just forgetting something (Nuckolls' original paper also talks about "high gain"). Then we have spherical tokamaks, fast ignition, the classic ITER approach, and others.

It's highly likely that all of us will be dead before any one of these is in use, if ever, and I say that with more than a little knowledge on the topic. PV should hit grid parity* within the next three years; at that point, why bother with fusion?

= = = = = = = = = = = = = = = = = = = = = = =


* This means "photovoltaics should hit grid parity," i.e., that solar panels should be able to produce electricity at the same price as the current electric grid can. The argument is made in a PDF file at:
http://www.q-cells.com/uploads/tx_abdownloads/files/11_GLOBAL_OVERVIEW_ON_GRID-PARITY_Paper.pdf (though this argument, itself, is subject to criticism as well).

Monday, February 13, 2012

Positive Quiddity: Seven Essential Equations

There’s a very interesting article by Ian Stewart in the February 13, 2012 New Scientist. Stewart explains how seven equations changed the world and are central to modern life. These equations are

  • The wave equation
  • Maxwell’s four equations
  • The Schrödinger equation
  • The Fourier transform

Wave Equation

Building on the ancient Greek Pythagoreans’ study of vibrating strings, Johann Bernoulli modeled the vibrating string using Newton’s laws, and Jean Le Rond d’Alembert subsequently combined these preceding results to produce the wave equation.

Maxwell’s Four Equations for Electromagnetism

The wave equation was central to the work of James Clerk Maxwell, particularly assisted by Michael Faraday’s work on the basic physics of electromagnetism. Faraday framed his theories as geometric structures – lines of magnetic force. Maxwell reformulated these using the mathematics of fluid flow. He developed two equations showing that magnetism and electricity don’t leak away. A third says that an electrical field spinning in a small circle creates a magnetic field, and the fourth says that a spinning magnetic field creates an electrical field.

By manipulating these four equations, Maxwell derived the wave equation and deduced that light itself must be an electromagnetic wave. This was a revolutionary link between light and electricity and magnetism. This approach led to a prediction that electromagnetic waves of all wavelengths should exist, not just those visible to the human eye.

Some of these invisible long-wave forms would become radio waves, as demonstrated in 1887 by Heinrich Hertz and used to carry information by Nikola Tesla and Guglielmo Marconi. This research opened the door to radio, television and microwave towers for cell phones – all derived from four equations and a couple of calculations.

The Schrodinger Equation

In 1927, Erwin Schrödinger wrote an equation for quantum waves, in which electrons were viewed not as particles but as probability clouds. Memory chips and semiconductors are based on this equation. Lasers and laser-based CDs and DVDs are also founded on this equation.

The Fourier Transform

In 1807, Joseph Fourier submitted an equation for heat flow to the French Academy of Sciences. It was rejected. In 1812, the Academy made heat the subject of the annual prize. Fourier submitted a revised paper and won the prize, but the Academy wouldn’t publish it because of the way that Fourier resolved the problem. Fourier assumed the temperature varied like a sine wave along the length of a rod; to handle more complicated profiles, he assembled a combination of sine waves of different wavelengths, one for each component, and added them together. Fourier’s solution added together an infinite number of waves, and the Academy found this solution unrigorous and therefore improper to publish. In 1822, Fourier published his theory himself.

Extending this idea (while excluding highly irregular profiles) results in the Fourier transform, which treats a signal that varies with time as a sum of component sine waves and calculates the amplitudes and frequencies of the series.

The Fourier transform is used today to analyze earthquake signals, remove noise from old sound recordings, prevent unwanted vibration in automobile design, and save modern digital photographs in five-step JPEG compression.
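As a concrete illustration (a minimal sketch of my own, not from Stewart’s article), NumPy’s FFT recovers the frequencies and amplitudes of the sine waves hiding in a noisy signal:

import numpy as np

fs = 1000                              # sample rate, Hz
t = np.arange(0, 1, 1 / fs)            # one second of samples
# Two sine components plus a little noise
signal = 2.0 * np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
signal += 0.1 * np.random.randn(t.size)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
amplitudes = 2 * np.abs(spectrum) / t.size

for f in (50, 120):
    i = np.argmin(np.abs(freqs - f))
    print(f"{freqs[i]:.0f} Hz component: amplitude ~ {amplitudes[i]:.2f}")

The printout recovers the two components (amplitudes near 2.0 and 0.5) despite the noise, which is exactly the decomposition Fourier proposed.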

                                     Here are the Equations
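(The equations appeared as an image in the original post, which has not survived here. In their standard textbook forms, written in LaTeX, they are:)

\[
\frac{\partial^2 u}{\partial t^2} = c^2 \nabla^2 u
\qquad \text{(the wave equation)}
\]
\[
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \quad
\nabla \cdot \mathbf{B} = 0, \quad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \quad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
\qquad \text{(Maxwell's four equations)}
\]
\[
i\hbar \frac{\partial \psi}{\partial t} = \hat{H}\psi
\qquad \text{(the Schr\"odinger equation)}
\]
\[
\hat{f}(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt
\qquad \text{(the Fourier transform)}
\]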



Conclusion

Scientists seek new equations, especially one that unites quantum theory with relativity. Such unifying equations have proven elusive so far.

Summarized from:

http://www.newscientist.com/article/mg21328516.600-seven-equations-that-rule-your-world.html?full=true

Comments by the Blog Author

Without diminishing these accomplishments, they stand on the shoulders of other mathematical knowledge, physics insights and some brilliant chemistry.

Euclidean geometry is central to modern science, even though we now know that other geometries are also valid, such as Riemannian and Lobachevskian geometries.

It is unlikely that the wave equation could have been formulated without Newton’s laws of physics, itself a profound breakthrough in the hard sciences.

We couldn’t make these things (such as semiconductors, Zener diodes, and precise compounds) without the modern Periodic Table [itself a Daily Quiddity post] and stoichiometric chemistry.

Speculation: Einstein won a Nobel Prize for explaining the photoelectric effect: certain materials react to absorbing light by ejecting electrons. This proven insight may ultimately prove more valuable than the Schrödinger equation.

Sunday, February 12, 2012

Positive Quiddity: Roger Boisjoly

Roger Mark Boisjoly (April 25, 1938 – January 6, 2012) was an American mechanical engineer, fluid dynamicist, and aerodynamicist who worked for Morton Thiokol, the manufacturer of the solid rocket boosters (SRBs) for the Space Shuttle program. Prior to his employment at Thiokol, Boisjoly worked for companies in California on lunar module life-support systems and the moon vehicle. He is best known for having raised objections to the launch of the Space Shuttle Challenger the day before the loss of the spacecraft and its crew.
  

O-ring safety concerns

In 1980, he moved to Utah "to deepen his involvement in the Mormon religion and to join Morton Thiokol." Boisjoly wrote a memo in July 1985 to his superiors concerning the faulty design of the solid rocket boosters that, if left unaddressed, could lead to a catastrophic event during launch of a Space Shuttle. Such a catastrophic event did occur less than a year later, resulting in the Space Shuttle Challenger disaster.
This memo followed his investigation of a solid rocket booster (SRB) from a shuttle flight in January 1985. During his investigation, he discovered that the first of a system of two O-rings had failed completely, and that some damage had been caused to the second O-ring.

The O-rings were two rubber rings that formed a seal between two sections of the SRBs. The sections of the boosters were joined using tang and clevis joints, and the rings were intended to seal the joint while allowing for the inevitable movement between the sections under flight conditions. By design, pressure from within the booster was to push a fillet of putty into the joint, forcing the O-ring into its seat. The system never functioned as designed. The rings were supposed to sit in a groove and seal the joint between the sections of the booster. It was found, however, that flight dynamics caused the joints in the SRBs to flex during launch, opening a gap through which rocket exhaust could escape. As the joints flexed, the rings would come out of their grooves and move to a new position in the joint, a process called extrusion. The extruded ring would form a seal in this new position, but during the time it took for the ring to shift, the joint was unsealed and hot gases could escape, a process called blow-by. These hot gases would cause damage to the rings until the seal was achieved.

Boisjoly's investigation showed that the amount of damage to the O-ring depended on the length of time it took for the ring to move out of its groove and make the seal, and that the amount of time depended on the temperature of the rings. Cold weather made the rubber hard and less flexible, meaning that extrusion took more time and more blow-by took place. He determined that if the O-rings were damaged enough they could fail.

If the second O-ring had failed, Boisjoly realized, the results would almost certainly have been catastrophic, with the complete loss of the shuttle and crew seemingly the only outcome. His investigation found that the first O-ring failed because the low temperatures on the night before the flight had compromised the flexibility of the O-ring, reducing its ability to form a seal. The temperature at launch had been only 10 °C – the coldest on record (until January 28, 1986). The first rubber O-ring had formed a partial seal, but not a complete one; the second O-ring had held.

Boisjoly sent a memo, describing the problem to his managers, but was apparently ignored. Morton Thiokol was in discussions with NASA with regards to a new contract (reportedly worth up to $1 billion) and it is possible that the management was concerned that any issues discovered with the solid rocket boosters might compromise the chances of the contract being renewed.

Following several further memos, a task force was set up – including Boisjoly – to investigate the matter, but after a month Boisjoly realized that the task force had no power, no resources and no management support.

In late 1985 Boisjoly advised his managers that if the problem was not fixed, there was a distinct chance that a shuttle mission would end in disaster. No action was taken.
 

Challenger Disaster

Following the announcement that the Challenger mission was confirmed for January 28, 1986, Boisjoly and his colleagues tried to stop the flight. Temperatures were due to be down to −1 °C (30 °F) overnight. Boisjoly felt that this would severely compromise the safety of the O-rings and could potentially cause the loss of the flight.

The matter was discussed with Morton Thiokol managers, who agreed that the issue was serious enough to recommend delaying the flight. They arranged a telephone conference with NASA management and gave their findings. However, after a while, the Morton Thiokol managers asked for a few minutes off the phone to discuss their final position again. Despite the efforts of Boisjoly and others in this off-line briefing, the Morton Thiokol managers decided to advise NASA that their data was inconclusive. NASA asked if there were objections. Hearing none, the decision to fly the ill-fated STS-51L Challenger mission was made.

Boisjoly's concerns proved correct. In the first moments after ignition, the O-rings failed completely and were burned away, resulting in the black puff of smoke visible on films of the launch. This left only a layer of aluminum oxide (a combustion product) to seal the joint. At 59 seconds after launch, buffeted by high-altitude winds, the oxide gave way. Hot gases streamed out of the joint in a visible torch-like plume that burned into the external hydrogen tank. At about 73 seconds, the adjacent SRB strut gave way and the vehicle quickly disintegrated.

Boisjoly was quite relieved when the flight lifted off, as his investigations had predicted that the SRB would explode during the initial take-off. Seventy-three seconds later he witnessed the shuttle disaster on television.

Later career

After President Ronald Reagan ordered a presidential commission to review the disaster, Boisjoly was one of the witnesses called. He gave accounts of how and why he felt the O-rings had failed. After the commission gave its findings, Boisjoly found himself shunned by colleagues and managers and he resigned from the company.

Boisjoly became a speaker on workplace ethics. He argued that the caucus called by Morton Thiokol managers, which resulted in a recommendation to launch, "constituted the unethical decision-making forum resulting from intense customer intimidation."


For his honesty and integrity leading up to and directly following the shuttle disaster, Boisjoly was awarded the Award for Scientific Freedom and Responsibility by the American Association for the Advancement of Science in 1988.


When Boisjoly left Morton Thiokol, he took 14 boxes of every note and paper he received or sent in seven years. On May 13, 2010, he donated his personal memoranda — six boxes of personal papers, including memos and notes from congressional testimony — to Chapman University in Orange, California. Rand Boyd, the special-collections and archival librarian at Chapman's Leatherby Libraries, said the materials will be catalogued and archived. It was to be about six months to a year before library visitors would be able to view the materials.

Boisjoly died on January 6, 2012, of cancer of the colon, kidneys, and liver.

http://en.wikipedia.org/wiki/Roger_Boisjoly

More background on Roger Boisjoly is available at:

http://www.lowellsun.com/local/ci_19902357

Saturday, February 11, 2012

Nearly Everyone Needs a Prenuptial Agreement


A "prenup" is shorthand for a pre-nuptial agreement, a legal document that lays out how couples will divide their assets should they divorce. Kimberly Palmer wrote about these agreements for U.S. News and World Report in the February 8, 2012, issue. She contacts Silvana Raso, maytrimonial attorney for the firm Schepisi & Mlaughlin in Englewood Cliffs, New Jersey. Raso said prenups were for everyone, "…even if you’re going into marriage with little assets," because assets may be accumulated during a marriage, and it is best not to leave those assets for a judge to divide them.

Couples often assume some assets or liabilities are individually owned, such as student loans, but a court might make a different determination. Couples can remove this uncertainty, and the stress it causes, with an agreement, Raso notes.

The article lists five reasons for looking into prenups:

  • Talking about "what ifs" can shed light on your relationship.
  • You can create an agreement even post-marriage if both parties are bringing something to the table (a "bargained-for exchange").
  • Prenups cost around $2,500, around half as much as the average engagement ring.
  • Couples often make the mistake of waiting until the last minute (or delay considering a prenup indefinitely).
  • Community property states make prenups even more important.

Raso warns that do-it-yourself form agreements are usually rejected by courts because they leave legal requirements unmet. Raso adds that each spouse needs a separate lawyer, because using the same attorney creates an ethical conflict that would invalidate the prenup.

"No one can say 'I definitely won't get divorced,'" says Raso.


http://finance.yahoo.com/news/why-almost-everyone-needs-prenup-163015254.html

Friday, February 10, 2012

Detailed Map of Lyme Disease Areas Now Available

The Associated Press reports that after three years of snagging ticks and checking them for disease, a map has been developed showing the areas of the United States subject to Lyme disease. The map is part of a study published in the February American Journal of Tropical Medicine and Hygiene.

The areas at significant risk for Lyme disease include virtually all of Wisconsin, a strip of Illinois bordering Wisconsin, and much of eastern Minnesota. There is some risk in the eastern border area of North Dakota that abuts Minnesota. In the eastern United States, the southernmost area of significant risk is the part of Virginia consisting of the western shore of the Chesapeake Bay, the Northern Neck area, and the Virginia suburbs of Washington, DC. Washington, DC itself is a high-risk area, as are the Maryland suburbs and all states north and east: particularly Delaware, the eastern half of Pennsylvania, all of New Jersey, parts of New York state, virtually all of Connecticut, Rhode Island and Massachusetts, the southern half of Vermont and New Hampshire, and particular areas in Maine.

It is hoped that the map will lead to more accurate diagnosis of the disease in the high risk areas as well as consideration for other diagnoses in the low risk locations.

Lyme disease is named after a town in Connecticut. The only significant early symptom is a red rash. Other indications are vague, usually similar to the onset of the flu. The disease is easily cured by antibiotics, but only if administered in the early stage of the illness. Untreated victims may develop arthritis, meningitis and other serious illnesses.

http://news.yahoo.com/map-pinpoints-lyme-disease-risk-areas-165547027.html

Enlarged map of areas of high human risk for Lyme disease:

http://news.yahoo.com/photos/u-s--1316130479-slideshow/map-released-yale-school-public-health-friday-feb-photo-164843064.html