Sunday, July 31, 2011

Shaky Deal Reached on USA Debt Ceiling

The Effect of Government Debt

In January of 2010, a study on the economic effects of government debt, Growth in a Time of Debt, was drafted by a professor from Harvard and one from the University of Maryland. Here are some quotes (followed by my comments):

"For the five countries with systemic financial crises (Iceland, Ireland, Spain, the United Kingdom, and the United States), average debt levels are up by about 75 percent, well on track to reach or surpass the three year 86 percent benchmark that Reinhart and Rogoff (2009a,b) find for earlier deep post-war financial crises."

-- Growth in a Time of Debt, page 4

"In principle, the manner in which debt builds up can be important. For example, war debts are arguably less problematic for future growth and inflation than large debts that are accumulated in peace time. Postwar growth tends to be high as war-time allocation of manpower and resources funnels to the civilian economy. Moreover, high war-time government spending, typically the cause of the debt buildup, comes to a natural close as peace returns. In contrast, a peacetime debt explosion often reflects unstable underlying political economy dynamics that can persist for very long periods."

-- Growth in a Time of Debt, page 6

Looking at the debt levels of the 20 most advanced countries in the world:

"From the figure, it is evident that there is no obvious link between debt and growth until public debt reaches a threshold of 90 percent. The observations with debt to GDP over 90 percent have median growth roughly 1 percent lower than the lower debt burden groups and mean levels of growth almost 4 percent lower. (Using lagged debt should not dramatically change the picture.) The line in Figure 2 plots the median inflation for the different debt groupings—which makes plain that there is no apparent pattern of simultaneous rising inflation and debt."

-- Growth in a Time of Debt, page 7

"There are exceptions to this inflation result, as Figure 3 makes plain for the United States, where debt levels over 90% of GDP are linked to significantly elevated inflation."

-- Growth in a Time of Debt, page 9

"Interestingly, introducing the longer time series yields remarkably similar conclusions. Over the past two centuries, debt in excess of 90 percent has typically been associated with mean growth of 1.7 percent versus 3.7 percent when debt is low (under 30 percent of GDP), and compared with growth rates of over 3 percent for the two middle categories (debt between 30 and 90 percent of GDP). Of course, there is considerable variation across the countries, with some countries such as Australia and New Zealand experiencing no growth deterioration at very high debt levels. It is noteworthy, however, that those high-growth high-debt observations are clustered in the years following World War II."

-- Growth in a Time of Debt, page 11
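The four-bucket comparison the authors describe is easy to reproduce in spirit. Below is a minimal sketch in Python, assuming a toy list of (debt-to-GDP, growth) observations; the numbers are invented for illustration and are not the Reinhart-Rogoff data:

```python
# Sketch of the four-bucket debt/growth comparison described in the paper.
# The observations below are illustrative only -- NOT the Reinhart-Rogoff dataset.
import statistics

# (debt-to-GDP percent, real GDP growth percent) -- hypothetical country-year data
observations = [
    (25, 4.1), (28, 3.5), (45, 3.2), (55, 3.0),
    (70, 3.1), (85, 2.9), (95, 1.9), (120, 1.5),
]

buckets = {"under 30": [], "30-60": [], "60-90": [], "over 90": []}
for debt, growth in observations:
    if debt < 30:
        buckets["under 30"].append(growth)
    elif debt < 60:
        buckets["30-60"].append(growth)
    elif debt < 90:
        buckets["60-90"].append(growth)
    else:
        buckets["over 90"].append(growth)

for name, rates in buckets.items():
    print(f"debt {name}%: median growth {statistics.median(rates):.1f}%, "
          f"mean growth {statistics.mean(rates):.1f}%")
```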

Post-crisis, private debt tends to shrink (deleveraging), and that deleveraging exacerbates high unemployment:

"Private debt, in contrast to public debt, tends to shrink sharply for an extended period after a financial crisis.  Just as a rapid expansion in private credit fuels the boom phase of the cycle, so does serious deleveraging exacerbate the post-crisis downturn. Just as a rapid expansion in private credit fuels the boom phase of the cycle, so does serious deleveraging exacerbate the post-crisis downturn. This pattern is illustrated in Figure 7, which shows the ratio of private debt to GDP for the United States for 1916-2009. Periods of sharp deleveraging have followed periods of lower growth and coincide with higher unemployment (as shown in the inset to the figure). In varying degrees,
the private sector (households and firms) in many other countries (notably both advanced and emerging Europe) are also unwinding the debt built up during the boom years. Thus, private deleveraging may be another legacy of the financial crisis that may dampen growth in the medium term."

-- Growth in a Time of Debt, page 21

Figure 7 on page 22 of Growth in a Time of Debt shows private debt has reached 270% of US GDP, the highest in a century and perhaps the highest ever.


"…Seldom do countries simply "grow" their way out of deep debt burdens.

"Why are there thresholds in debt, and why 90 percent? This is an important question that merits further research, but we would speculate that the phenomenon is closely linked to logic underlying our earlier analysis of "debt intolerance" in Reinhart, Rogoff, and Savastano (2003). As we argued in that paper, debt thresholds are importantly country-specific and as such the four broad debt groupings presented here merit further sensitivity analysis. A general result of our "debt intolerance" analysis, however, highlights that as debt levels rise towards historical limits, risk premia begin to rise sharply, facing highly indebted governments with difficult tradeoffs. Even countries that are committed to fully repaying their debts are forced to dramatically tighten fiscal policy in order to appear credible to investors and thereby reduce risk premia."

-- Growth in a Time of Debt, page 22

"…countries that choose to rely excessively on short term borrowing to fund growing debt levels are particularly vulnerable to crises in confidence that can provoke very sudden and "unexpected" financial crises. Similar statements could be made about foreign versus domestic debt, as discussed. At the very minimum, this would suggest that traditional debt management issues should be at the forefront of public policy concerns."
-- Growth in a Time of Debt, page 23

Source: Growth in a Time of Debt was prepared by Carmen M. Reinhart (University of Maryland) and Kenneth S. Rogoff (Harvard University) in January, 2010, for the American Economic Review Papers and Proceedings. It is available as a pdf file at:

http://www.economics.harvard.edu/faculty/rogoff/files/Growth_in_Time_Debt.pdf
= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =
Comments by the Blog Author

A "deal" to raise the debt limit achieved on Sunday, July 31, 2011.  The existing debt level is $14.3 trillion in debt for a USA economy with a GDP of $14.8 trillion.  The agreement:

Raises the debt above 100 percent of GDP for the USA
Skips requiring a Congressional vote on a balanced budget amendment
Does not disallow further tax increases
Does not specify a baseline
Reduces spending between now and January of 2013 by only a puny amount
Will probably be funded by short-term debt (inviting another crisis)
Invites higher inflation and a lower growth rate (as debt exceeds 90% of GDP)
Does nothing to reduce the huge level of private debt (whose elevated level continues to exacerbate unemployment)
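A back-of-the-envelope check of the first bullet, assuming for illustration that the deal raises the ceiling by about $2.4 trillion (the widely reported headline figure; treat it as an assumption here):

```python
# Back-of-the-envelope debt-to-GDP arithmetic for the July 2011 deal.
# The $2.4 trillion ceiling increase is an assumption for illustration.
debt = 14.3e12      # current US federal debt, dollars
gdp = 14.8e12       # US GDP, dollars
ceiling_increase = 2.4e12

print(f"current debt/GDP: {debt / gdp:.1%}")                        # ~96.6%
print(f"if fully borrowed: {(debt + ceiling_increase) / gdp:.1%}")  # ~112.8%
```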

Saturday, July 30, 2011

Opinion: The World Is Not Overpopulated

July 20, 2011

The World Is Not Overpopulated

By Alex B. Berezow

An opinion piece in the Los Angeles Times declared the world to be overpopulated and even compared humanity to a cancerous growth. This reasoning is not only disturbing, but is almost certainly incorrect, as well.

The world, indeed, has a lot of people. By the end of 2011, there will be nearly 7 billion people living on the planet. But population growth will not be sustained at those rates. An analysis by The Economist describes how each subsequent billion will take longer and longer to achieve, until population growth eventually plateaus at around 9 billion people by 2050.

A 2003 assessment by the United Nations concurs. The UN projects, under its medium-growth scenario, that the human population will remain relatively stable at 9 billion until the year 2300.

The reason is that birth rates are naturally falling around the world. The current world fertility rate still exceeds the replacement rate of 2.1 births per woman, but there are good reasons to believe that growth will slow down in the future. As countries become more technologically and economically advanced, people naturally choose to have fewer children. Also, there is a link between increasing female education and a declining birth rate.
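To see why sub-replacement fertility eventually means decline, here is a toy generational model; the fertility figures are illustrative assumptions, not projections:

```python
# Toy model: each generation scales by (total fertility rate / replacement rate).
# Illustrative only -- real projections use age-structured cohort models.
REPLACEMENT = 2.1

def generations(population_millions, tfr, n):
    """Scale a starting population through n generations at a fixed TFR."""
    for _ in range(n):
        population_millions *= tfr / REPLACEMENT
    return population_millions

print(f"TFR 1.6, 100M start, 3 generations: {generations(100, 1.6, 3):.0f}M")  # ~44M
print(f"TFR 2.1 holds population flat:      {generations(100, 2.1, 3):.0f}M")  # 100M
```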

Europe is the poster child for this phenomenon, where the total fertility rate is below 2.1 in all 27 EU nations. The problem is so bad in Russia, which may shrink by 25 million people in the next 40 years, that demographers are referring to a population crisis. This will put an enormous strain on Russia's economy as the government struggles to care for its aging population.

The authors also contend that "reproductive freedom" benefits all of humanity. But does it? Research shows that families around the world, particularly in Asia, selectively abort female infants. This "gendercide" distorts natural male-female ratios in the population. In some provinces in China, the ratio is perversely skewed in favor of boys, with 130 male births for every 100 female births. Obviously, this will have dire consequences for society.

If population poses a problem, it is likely due to distribution, not to growth. After all, only so many people can fit on the coasts of China, India, and the United States. There are many wide-open spaces for the population to expand. The trick will be to figure out a way to incentivize responsible growth, not to discourage it entirely.

Finally, the authors claim that poverty results from overpopulation. While this might be partially correct, many other factors contribute to poverty. China, with a population of approximately 1.3 billion, has entered a period of skyrocketing economic growth. India, with a similarly sized populace, is also slowly working its way out of poverty.

Instead of focusing on controlling population growth, a better way to tackle poverty is to help solve humanity's basic problems. Infectious disease, corrupt governance, and lack of access to global markets are Africa's biggest problems. When these devastating issues are corrected, African countries could experience rapid economic growth in the same way as did the Asian Tigers.

When the world becomes a more prosperous place, the "problem" of population growth will largely take care of itself.



Alex B. Berezow is the editor of RealClearScience. He holds a Ph.D. in microbiology.

Friday, July 29, 2011

Positive Quiddity: Richard Rodgers

Biography for Richard Charles Rodgers

Date of Birth
28 June 1902, New York City, New York, USA
Date of Death
30 December 1979, New York City, New York, USA

Birth Name

Richard Charles Rodgers
Nickname
Dick
Spouse
Dorothy Rodgers (1940 – 30 December 1979) (his death); 2 children
 
Trivia
Father of writer Mary Rodgers and Linda Rodgers Melnick.

Grandfather of composer/lyricist/actor Adam Guettel and composer Peter Rodgers Melnick.

Wrote over 1,500 songs (at least 85 regarded as standards), and 42 musicals, 19 of which were transferred to film.

He was one of the pioneers, along with Jerome Kern and Oscar Hammerstein II, in the creation of a less frivolous type of musical theater than had been the case, in which musicals were adapted from well-known plays or novels and featured dialog taken directly from those plays and/or novels, as well as believable, realistic characterizations, with dramatic storylines which sometimes contained tragedy and did not necessarily follow the usual formulas.

His first musical using an unusual storyline and characters was "Pal Joey", written with Lorenz Hart and produced in 1940, based on short stories by John O’Hara, about an unprincipled but sympathetic nightclub hoofer who does not reform. It was not very successful in its first run, but became a hit in 1952 (and was unfortunately watered down in its 1957 film version).

The musicals Rodgers wrote with Oscar Hammerstein II, beginning in 1943, overthrew many of the usual musical theater formulas of the time and continued the path begun by Hammerstein and Kern in their 1927 "Show Boat". The Rodgers and Hammerstein musicals sometimes used ballet sequences to reveal what was happening in the mind of one of the characters; the lead characters were not necessarily admirable people (e.g., "Carousel"'s Billy Bigelow); one or more characters were sometimes killed off in the storylines; the hero and heroine did not necessarily fall in love or have a love scene (as in "The King and I"); and very often subjects which were then considered taboo in musicals were used. The songs in most of the Rodgers and Hammerstein musicals (with the exception of some of the musical numbers in "The Sound of Music") developed the storyline and the characters, and could not be removed from the shows without seriously damaging them.

Family name was originally Rojazinsky, becoming Rodgers when his father changed the family name to a more Americanized one sometime in the 1880s.

One of the first Kennedy Center Honorees in 1978.

In addition to their phenomenal success as writers of their own shows, Rodgers and Hammerstein produced shows by other writers, as well, most notably Irving Berlin’s "Annie Get Your Gun" and John Van Druten’s "I Remember Mama." The latter, incidentally, became the basis for Rodgers' last completed show. Written in collaboration with Martin Charnin and starring Liv Ullmann, Rodgers' "I Remember Mama" was one of the most notorious Broadway flops of the 1970s, running only a few months in 1979, largely on advance ticket sales. Rodgers died shortly after it closed. Most agreed that it was a sad end to an otherwise distinguished career.

"The King and I" was performed at the London Palladium in 2000 and was nominated for Outstanding Musical Production at the Laurence Olivier Theatre Awards in 2001.

His musical, "Oklahoma!" was awarded the 1998 London Evening Standard Theatre Award for Best Musical.

Broadway's 46th Street Theater was renamed the Richard Rodgers Theater in 1980. This was somewhat ironic, in that none of Rodgers' major shows ever played at that theater.

Served as Managing Director of the Music Theater of Lincoln Center, 1964-1969. During this period, the Music Theater, whose productions were given at the New York State Theater, presented three successful revivals of Rodgers and Hammerstein shows: "The King and I" with Risë Stevens, Darren McGavin and Frank Porretta in 1964; "Carousel" with John Raitt and Edward Everett Horton in 1965; and "South Pacific" with Florence Henderson and Giorgio Tozzi in 1967.

Won eleven Tony Awards: three in 1950 for "South Pacific" (for his music as part of the Best Musical win, as Best Composer, and as Best Producers (Musical), shared with Oscar Hammerstein II, Leland Hayward and Joshua Logan); one in 1952, for his music with Hammerstein's book and lyrics as part of a Best Musical win for "The King and I"; two in 1960, for his music as part of a Best Musical win for "The Sound of Music" (in a tie with "Fiorello!") and as Best Composer (in a tie with Jerry Bock for "Fiorello!"); two in 1962, as Best Composer for "No Strings" and a Special Tony Award "for all he has done for young people in the theatre and for taking the men of the orchestra out of the pit and putting them on stage in 'No Strings'"; one in 1972, another Special Tony Award; and one in 1979, the Lawrence Langner Memorial Award for Distinguished Lifetime Achievement in the Theatre.

He was also Tony-nominated five other times: in 1956, for his music and as a co-producer of Best Musical nominee "Pipe Dream"; in 1959, for his music for Best Musical nominee "Flower Drum Song"; in 1962, for his music and lyrics and as co-producer of Best Musical nominee "No Strings"; in 1965, for Best Composer and Lyricist with collaborator Stephen Sondheim for "Do I Hear a Waltz?"; and in 1996, posthumously, for Best Original Musical Score for the stage "State Fair" (music only, for designated songs that were original and not in the previous film version of State Fair (1945)).

Along with his lyricist-partner Oscar Hammerstein II, he made more than ten appearances on "The Ed Sullivan Show" (1948). They are probably the only American musical play creators to have made that many personal appearances on one television series.

Inducted into the Songwriters Hall of Fame in 1970.

Is a member of Pi Lambda Phi fraternity.

Won a 1962 Special Tony Award (New York City) for his work with young people in the theater and for putting the orchestra on the stage in his production "No Strings."

Won a 1972 individual Special Tony Award (New York City).

Won a 1979 Lawrence Langner Tony Award (New York City) for a distinguished life in the American theater.

Richard Rodgers won the 1950 Pulitzer Prize for Drama for the musical "South Pacific", in collaboration with Oscar Hammerstein II and Joshua Logan.

Ex-father-in-law of Daniel Melnick.

Portrayed by Tom Drake in Words and Music (1948).

Brother-in-law of Ben Feiner Jr.


Is one of only 12 people to have won an "EGOT," meaning at least one of each of the four major entertainment awards: Emmy, Grammy, Oscar and Tony. The others, in chronological order, are Barbra Streisand, Helen Hayes, Rita Moreno, Liza Minnelli, John Gielgud, Audrey Hepburn, Marvin Hamlisch, Jonathan Tunick, Mel Brooks, Mike Nichols and Whoopi Goldberg. Barbra Streisand, however, won a Special Tony Award, not a competitive one, and Liza Minnelli won a Special Grammy.

Personal Quotes

Defending his songs and style in a changing world:

"What's wrong with sweetness and light? It's been around quite awhile."

"Whenever I get an idea for a song, even before jotting down the notes, I can hear it in the orchestra, I can smell it in the scenery, I can see the kind of actor who will sing it, and I am aware of an audience listening to it."

Referring to Oscar Hammerstein II and what became the song "It Might As Well Be Spring": "He gave me a lyric. And when he gives me a lyric, I write music."

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =
 
Footnote by the blog author:

An excellent DVD biography of Richard Rodgers is available at Amazon.com and elsewhere. "The Sweetest Sounds" is the title.

Typically, a composer writes a tune for a Broadway show. A lyricist plays around with it for a while and then writes some words to match the melody. But Richard Rodgers could write spectacular, unforgettable tunes merely by being given lyrics that were already written! And this didn’t happen once or twice, but all the time.

Richard Rodgers himself did not like jazz. But jazz performers and improvisers have been entranced by Rodgers’ tunes since the 1930s. Rodgers is over-represented among jazz versions of American standards.

Rodgers also wrote soundtracks (famously, for the television documentary "Victory at Sea"). He and Lorenz Hart had a hit popular tune not associated with any musical or soundtrack, "Blue Moon."

On March 31, 1957, a new version of "Cinderella" was broadcast live on television with music by Richard Rodgers and lyrics by Oscar Hammerstein. Julie Andrews, then 21, had the lead role. They don’t do new live Broadway productions on television anymore.

Thursday, July 28, 2011

Satellite Data Is Inconsistent with Global Warming



Satellite Data Strongly Challenges
Global Warming Models

NASA has a satellite in earth orbit called "Aqua," measuring various attributes of the earth’s surface. A device attached to this satellite, the "Advanced Microwave Scanning Radiometer," is supplying some interesting hard data about radiation of heat from the earth into outer space.

The radiometer data from 2000 to 2011 indicate that far more heat is radiated out into space than the global warming computer models have predicted, according to a peer-reviewed article in the journal Remote Sensing. The results support earlier (pre-2007 UN IPCC) studies showing that carbon dioxide traps far less heat in the atmosphere than the UN has claimed.

"The satellite observations suggest there is much more energy lost to space during and after warming than the climate models show," study co-author Dr. Roy Spencer said on July 26 in a press release from the University of Alabama, where he is a research scientist and U.S. Science Team Leader for the radiometer attached to the Aqua satellite. "There is a huge discrepancy between the data and the forecasts that is especially big over the oceans." [This press release can be found at http://pielkeclimatesci.wordpress.com/2011/07/26/new-paper-on-the-misdiagnosis-of-surface-temperature-feedbacks-from-variations-in-earth%E2%80%99s-radiant-energy-balance-by-spencer-and-braswell-2011/ ].

The NASA satellite data show the atmosphere radiating heat into space long before the UN models predicted this would occur.

A Forbes magazine article dealing with this new data (link below) notes that the primary issue in contention about global warming is whether carbon dioxide emissions will indirectly trap more heat by causing increased humidity and cirrus clouds, both of which trap considerable heat.

The satellite data does not support humidity and cirrus cloud levels that were used for predictions in the UN computer models.

NASA’s ERBS satellite demonstrated that more longwave radiation (heat) escaped into space between 1985 and 1999 than computer models predicted.

Simply, much more heat is escaping into space than was predicted in the global warming models of IPCC 2007.

Wednesday, July 27, 2011

Good News About Melancholy

Sad Music Can Make You Happy
Real Clear Science Newton Blog -- Posted by Ross Pomeroy on Mon, 18 Jul 2011

With my music collection being what it is (dominated by rock and metal, with a smattering of less mainstream rap and classical), I've often fielded this question from friends: "Ross, your music is depressing; doesn't it make you sad?"

To which I've always replied, "No, I actually listen to it because it makes me happy."

"Weird."

I thought it was strange, too. How could it be that sad music would actually make somebody less sad?

Well, thanks to David Huron, a distinguished professor of arts and humanities in the School of Music and the Center for Cognitive Science at the Ohio State University, I am now armed with an explanation.

The answer involves a hormone called prolactin, which is most commonly associated with lactation in women. However, in an interview with The National, Huron insists that prolactin also limits feelings of depression that result from "sad" experiences.

     When a person is in a sad state, this hormone called prolactin is released and it has
     a consoling psychological effect. So if something happens like your pet dog dies,
     or you've lost your job, or you've broken up with your boyfriend or girlfriend, and you
     feel this sad, grief state, the body will respond by releasing prolactin. In most
     cases, this produces a consoling or warming effect. It's as though Mother
     Nature has stepped in and said "We don't want the grief to get too exorbitant."

Huron believes that sad music can put a listener into a "sham state" of sadness. This causes prolactin to be released without the listener actually experiencing a sorrowful event. Thus, without real grief to counterbalance the hormone's effect, the listener receives a net positive "comforting" effect.

However, Huron has found that not everyone can feel this effect (My naysaying friends could be a part of the disadvantaged group that cannot.) One of the professor's previous studies discovered that individuals who score high on "openness" or "neuroticism" in personality tests are more inclined to listen to sad music, perhaps because they can feel the prolactin effect.

Now that I can explain to my friends why sad music makes me happy, I just need to convince them that I'm merely more "open," not neurotic. This point might be a harder sell...

= = = = = = = = = = = = = = = = = = =

Footnote by the blog author: It looks like the Greeks were right when they described nine Muses, being spirits that can inspire humans. These Muses talk to each other all the time when they are trying to inspire us – but we don’t understand their language. Instead we hear music, the language of the Muses.

According to Greek mythology, the god Zeus beguiled the Titaness Mnemosyne, who bore him nine daughters. Thus the Muses are combinations of POWER and MEMORY. Originally they may have been the goddesses of prophetic springs, but they became the source of human inspiration. There is no indication that any one Muse is more powerful than another, although Thalia is at once one of the nine Muses and one of the three Graces. Here is a list of all nine Muses:

Calliope, Muse of epic song
Clio, Muse of history
Euterpe, Muse of lyric song
Thalia, Muse of comedy and bucolic poetry
Melpomene, Muse of tragedy
Terpsichore, Muse of dance
Erato, Muse of erotic poetry
Polyhymnia, Muse of sacred song
Urania, Muse of astronomy.

If the Greeks were right – and they were probably on to something because the story is still being told – then sadness and melancholy, the hallmarks of Melpomene, are gifts of inspiration rather than moods to be avoided.

Tuesday, July 26, 2011

True Story of Long Term Unemployment

There's a website that COLLECTS horrific long term unemployment histories. Here's a lulu of a true story, a victim of George Bush's "compassionate conservatism" and Barack Obama's "hope and change"...
This story and dozens of other wrenching stories are on line at this blog:

 
"I am not stupid but I sure have been made to feel that way

My story won’t be any different from the article and stories I read on your website but it is good to know I am not alone. Everything people wrote about feeling horrible about themselves and not sleeping, not going out, not having money for food is true of me! I have a BS in Education and taught in the public school systems in Texas and Florida for over 15 years. I was on an annual contract in ‘08 in Florida when I was told my position was eliminated. I did apply for unemployment but thought that the proper thing to do was get another job immediately. I found a temp job and lost my unemployment. I also lost that job after working one month. Of course, I was no longer eligible for unemployment.

I found other odd jobs but was forced to sell my house on a short-sale. I attempted suicide when I had no money, no gas for my car, no job, no food, and my electric and water had been shut off. Only by some fluke of nature or whatever you want to call it did I survive. But survive is all I have been doing ever since. Odd jobs and graciousness from my mom keep me going but I have no quality of life. I live in hell on earth each and every day.

I’m not stupid but I sure have been made to feel that way. I have filled out 100s of online applications and sent my resume as well and haven’t even received so much as an email from any of the places I have applied. It now seems like filling out online applications is someone’s idea of a cruel joke. They seem pointless. If I physically go into a place to inquire about jobs, the management always tells me to fill out the online application and then I never hear back.

At this point, I live with my boyfriend because it is a roof over my head. My mother sends me money each month that helps me pay bills and have food. I have no car, no other means of money coming in, no TV, and basically no life. I have no self-esteem and no motivation. Everything seems pointless to me. I go nowhere because I have no transportation and no money. I have been to other parts of the world during better parts of my life, South Korea, Germany, Italy, Barbados, and people in those countries seem to have a better life than I do. Sometimes it seems unreal that I am an American!

At 46, I never dreamed my life would ever be like this.

Linda L., via email

On line at:

Monday, July 25, 2011

Pluto's Fourth Moon Found by Hubble

               Hubble Discovers a New Moon Around Pluto

July 20, 2011: Astronomers using the Hubble Space Telescope have discovered a fourth moon orbiting the icy dwarf planet Pluto. The tiny, new satellite – temporarily designated P4 -- popped up in a Hubble survey searching for rings around the dwarf planet.

The new moon is the smallest discovered around Pluto. It has an estimated diameter of 8 to 21 miles (13 to 34 km). By comparison, Charon, Pluto's largest moon, is 648 miles (1,043 km) across, and the other moons, Nix and Hydra, are in the range of 20 to 70 miles in diameter (32 to 113 km).

"I find it remarkable that Hubble's cameras enabled us to see such a tiny object so clearly from a distance of more than 3 billion miles (5 billion km)," said Mark Showalter of the SETI Institute in Mountain View, Calif., who led this observing program with Hubble.

                   This composite of two Hubble images shows Pluto's four satellites in motion.

The finding is a result of ongoing work to support NASA's New Horizons mission, scheduled to fly through the Pluto system in 2015. The mission is designed to provide new insights about worlds at the edge of our solar system. Hubble's mapping of Pluto's surface and discovery of its satellites have been invaluable to planning for New Horizons' close encounter.

"This is a fantastic discovery," said New Horizons’ principal investigator Alan Stern of the Southwest Research Institute in Boulder, Colo. "Now that we know there's another moon in the Pluto system, we can plan close-up observations of it during our flyby."

The new moon is located between the orbits of Nix and Hydra, which Hubble discovered in 2005. Charon was discovered in 1978 at the U.S. Naval Observatory and first resolved using Hubble in 1990 as a separate body from Pluto.

The dwarf planet’s entire moon system is believed to have formed by a collision between Pluto and another planet-sized body early in the history of the solar system. The smashup flung material that coalesced into the family of satellites observed around Pluto.
 
Lunar rocks returned to Earth from the Apollo missions led to the theory that our moon was the result of a similar collision between Earth and a Mars-sized body 4.4 billion years ago. Scientists believe material blasted off Pluto's moons by micrometeoroid impacts may form rings around the dwarf planet, but the Hubble photographs have not detected any so far.

"This surprising observation is a powerful reminder of Hubble's ability as a general purpose astronomical observatory to make astounding, unintended discoveries," said Jon Morse, astrophysics division director at NASA Headquarters in Washington.

P4 was first seen in a photo taken with Hubble's Wide Field Camera 3 on June 28. It was confirmed in subsequent Hubble pictures taken on July 3 and July 18. The moon was not seen in earlier Hubble images because the exposure times were shorter. There is a chance it appeared as a very faint smudge in 2006 images, but was overlooked because it was obscured.

Wednesday, July 13, 2011

Blog Will Pause until July 25

The blog author will be traveling on vacation, so the blog will pause until July 25.

Have Fun This Summer!

Tuesday, July 12, 2011

Long-lasting Antimatter


UC Riverside Physicists Discover New Way

to Produce Antimatter-containing Atom

New method allows positronium to be produced for the first time
at a wide range of temperatures and in a controllable way
(July 11, 2011)

 
RIVERSIDE, Calif. – Physicists at the University of California, Riverside report that they have discovered a new way to create positronium, an exotic and short-lived atom that could help answer what happened to antimatter in the universe and why nature favored matter over antimatter at the universe’s creation.

Positronium is made up of an electron and its antimatter twin, the positron. It has applications in developing more accurate Positron Emission Tomography or PET scans and in fundamental physics research.

Recently, antimatter made headlines when scientists at CERN, the European Organisation for Nuclear Research, trapped antihydrogen atoms for more than 15 minutes. Until then, the presence of antiatoms was recorded for only fractions of a second.

In the lab at UC Riverside, the physicists first irradiated samples of silicon with laser light. Next they implanted positrons on the surface of the silicon. They found that the laser light frees up silicon electrons that then bind with the positrons to make positronium.

"With this method, a substantial amount of positronium can be produced in a wide temperature range and in a very controllable way," said David Cassidy, an assistant project scientist in the Department of Physics and Astronomy, who performed the research along with colleagues. "Other methods of producing positronium from surfaces require heating the samples to very high temperatures. Our method, on the other hand, works at almost any temperature – including very low temperatures."

Cassidy explained that when positrons are implanted into materials, they can sometimes get stuck on the surface, where they will quickly find electrons and annihilate.

"In this work, we show that irradiating the surface with a laser just before the positrons arrive produces electrons that, ironically, help the positrons to leave the surface and avoid annihilation," said Allen Mills, a professor of physics and astronomy, in whose lab Cassidy works. "They do this by forming positronium, which is spontaneously emitted from the surface. The free positronium lives more than 200 times longer than the surface positrons, so it is easy to detect."

Study results appear in the July 15 issue of Physical Review Letters.

The researchers chose silicon in their experiments because it has wide application in electronics, is robust, cheap and works efficiently.

"Indeed, at very low temperatures, silicon may be the best thing there is for producing positronium, at least in short bursts," Cassidy said.

The researchers’ eventual goal is to perform precision measurements on positronium in order to better understand antimatter and its properties, as well as how it might be isolated for longer periods of time.

Cassidy and Mills were joined in the research by Harry Tom, a professor and the chair of physics and astronomy, and Tomu H. Hisakado, a graduate student in Mills’s lab.

In the near future, this research team hopes to cool the positronium down to lower energy emission levels for other experimental uses, and also to create a "Bose-Einstein condensate" for positronium – a collection of positronium atoms that are in the same quantum state.

"The creation of a Bose-Einstein condensate of positronium would really push the boundaries of what is possible in terms of real precision measurements," Cassidy said. "Such measurements would shed more light on the properties of antimatter and may help us probe further into why there is asymmetry between matter and antimatter in the universe."

Grants from the National Science Foundation and the US Air Force Research Laboratory funded the study.

http://newsroom.ucr.edu/2683

Monday, July 11, 2011

Our Stormy Sun

Dark Fireworks on the Sun
By Dr. Tony Phillips, NASA

July 11, 2011:
On June 7, 2011, Earth-orbiting satellites detected a flash of X-rays coming from the western edge of the solar disk. Registering only "M" (for medium) on the Richter scale of solar flares, the blast at first appeared to be a run-of-the-mill eruption--that is, until researchers looked at the movies.

"We'd never seen anything like it," says Alex Young, a solar physicist at the Goddard Space Flight Center. "Half of the sun appeared to be blowing itself to bits."

NASA has just released new high-resolution videos of the event recorded by the Solar Dynamics Observatory (SDO). The videos [available at the internet link at the bottom of this posting] are large, typically 50 MB to 100 MB, but worth the wait to download.

"In terms of raw power, this really was just a medium-sized eruption," says Young, "but it had a uniquely dramatic appearance caused by all the inky-dark material. We don't usually see that."

Solar physicist Angelos Vourlidas of the Naval Research Lab in Washington DC calls it a case of "dark fireworks."

"The blast was triggered by an unstable magnetic filament near the sun's surface," he explains. "That filament was loaded down with cool1 plasma, which exploded in a spray of dark blobs and streamers."

The plasma blobs were as big as planets, many larger than Earth. They rose and fell ballistically, moving under the influence of the sun's gravity like balls tossed in the air, exploding "like bombs" when they hit the stellar surface.

Some blobs, however, were more like guided missiles. "In the movies we can see material 'grabbed' by magnetic fields and funneled toward sunspot groups hundreds of thousands of kilometers away," notes Young.

SDO also detected a shadowy shock wave issuing from the blast site. The 'solar tsunami' propagated more than halfway across the sun, visibly shaking filaments and loops of magnetism en route.

Long-range action has become a key theme of solar physics since SDO was launched in 2010. The observatory frequently sees explosions in one part of the sun affecting other parts. Sometimes one explosion will trigger another ... and another ... with a domino sequence of flares going off all around the star.

"The June 7th blast didn't seem to trigger any big secondary explosions, but it was certainly felt far and wide," says Young.

It's tempting to look at the movies and conclude that most of the exploded material fell back--but that wouldn't be true, according to Vourlidas. "The blast also propelled a significant coronal mass ejection (CME) out of the sun's atmosphere."

He estimates that the cloud massed about 4.5 x 10^15 grams, placing it in the top 5% of all CMEs recorded in the Space Age. For comparison, the most massive CME ever recorded was 10^16 grams, only a factor of ~2 greater than the June 7th cloud.[2] The amount of material that fell back to the sun on June 7th was approximately equal to the amount that flew away, Vourlidas says.

As remarkable as the June 7th eruption seems to be, Young says it might not be so rare. "In fact," he says, "it might be downright common."

Before SDO, space-based observatories observed the sun with relatively slow cadences and/or limited fields of view. They could have easily missed the majesty of such an explosion, catching only a single off-center snapshot at the beginning or end of the blast to hint at what actually happened.

If Young is right, more dark fireworks could be in the offing. Stay tuned.
 
Footnotes:

1"Cool" has a special meaning on the sun: The plasma blobs registered a temperature of 20,000 K or less. That is relatively cool. Most of the surrounding gas had temperatures between 40,000 K and 1,000,000 K. On June 7, 2011, Earth-orbiting satellites detected a flash of X-rays coming from the western edge of the solar disk. Registering only "M" (for medium) on the Richter scale of solar flares, the blast at first appeared to be a run-of-the-mill eruption--that is, until researchers looked at the movies.

[2] Containing some 10^15 grams of matter, coronal mass ejections aren't as massive as they sound. It would take a hundred of the June 7th CMEs to make a decent-sized comet; e.g., the nucleus of Halley's Comet masses about 2 x 10^17 gm. "Remember that this is just a magnetized cloud of gas leaving from the quite tenuous corona," notes Vourlidas. "The cloud is big, but really not very massive compared to things like comets, moons, and planets."
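A quick sanity check of the mass comparisons quoted above (pure arithmetic on the article's own figures):

```python
# Sanity-checking the mass figures quoted above (all in grams).
june7_cme = 4.5e15       # estimated June 7th CME mass
record_cme = 1e16        # most massive CME on record
halley_nucleus = 2e17    # approximate mass of Halley's Comet nucleus

print(f"record vs June 7th: {record_cme / june7_cme:.1f}x")          # ~2.2 -> "a factor of ~2"
print(f"June 7th CME vs Halley nucleus: {june7_cme / halley_nucleus:.1%}")  # ~2% -> big cloud, small mass
```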

Link (with solar pictures and videos):


http://science.nasa.gov/science-news/science-at-nasa/2011/11jul_darkfireworks/

Sunday, July 10, 2011

Improved Flu Vaccines Are Coming

Discovery of natural antibody brings a

universal flu vaccine a step closer

LA JOLLA, CA – July 7, 2011 – Annually changing flu vaccines with their hit-and-miss effectiveness may soon give way to a single, near-universal flu vaccine, according to a new report from scientists at The Scripps Research Institute and the Dutch biopharmaceutical company Crucell. They describe an antibody that, in animal tests, can prevent or cure infections with a broad variety of influenza viruses, including seasonal and potentially pandemic strains.

The finding, published in the journal Science Express on July 7, 2011, shows the influenza subtypes neutralized with the new antibody include H3N2, strains of which killed an estimated one million people in Asia in the late 1960s.

"Together this antibody and the one we reported in 2009 have the potential to protect people against most influenza viruses," said Ian Wilson, who is the Hansen Professor of Structural Biology and a member of the Skaggs Institute for Chemical Biology at Scripps Research, as well as senior author of the new paper with Crucell's chief scientific officer Jaap Goudsmit.

Tackling a Major Shortcoming

Wilson's laboratory has been working with Crucell scientists since 2008 to help them overcome the major shortcoming of current influenza vaccines: They work only against the narrow set of flu strains that the vaccine makers predict will dominate in a given year, so their effectiveness is temporary. In addition, current influenza vaccines provide little or no protection against unforeseen strains.

These shortcomings reflect a basic flu-virus defense mechanism. The viruses come packaged in spherical or filamentous envelopes that are studded with mushroom-shaped hemagglutinin (HA) proteins, whose more accessible outer structures effectively serve as decoys for a normal antibody response. "The outer loops on the HA head seem to draw most of the antibodies, but in a given strain these loops can mutate to evade an antibody response within months," said Wilson. Antiviral drugs aimed at these and other viral targets also lose effectiveness as flu virus populations evolve.

"The major goal of this research has been to find and attack relatively unvarying and functionally important structures on flu viruses," said Damian Ekiert, a graduate student in the Scripps Research Kellogg School of Science and Technology who is working in the Wilson laboratory. Ekiert and Crucell's Vice President for Antibody Discovery Robert H. E. Friesen are co-first authors of the Science Express report.

By sifting through the blood of people who had been immunized with flu vaccines, Goudsmit and his colleagues several years ago discovered an antibody that bound to one such vulnerable structure. In mice, an injection of the antibody, CR6261, could prevent or cure an otherwise-lethal infection by about half of flu viruses, including H1 viruses such as H1N1, strains of which caused deadly global pandemics in 1918 and 2009.

The Crucell researchers approached Wilson, whose structural biology lab has world-class expertise at characterizing antibodies and their viral targets. Ekiert, Wilson, and their colleagues soon determined the three-dimensional molecular structure of CR6261 and its binding site on HA, as they reported in Science in 2009. That binding site, or "epitope," turned out to be on HA's lower, less-accessible stalk portion. The binding of CR6261 to that region apparently interferes with flu viruses' ability to deliver their genetic material into host cells and start a new infection. That antibody is about to begin tests in human volunteers.

The Missing Piece

Crucell researchers subsequently searched for an antibody that could neutralize some or all of the remaining flu viruses unaffected by CR6261, and recently found one, CR8020, that fits this description. As the team now reports in the Science Express paper, CR8020 powerfully neutralizes a range of human-affecting flu viruses in lab-dish tests and in mice. The affected viruses include H3 and H7, two subtypes of great concern for human health that have already caused a pandemic (H3) or sporadic human infections (H7).

As with the CR6261 project, Ekiert and colleagues were able to grow crystals of the new antibody bound to an HA protein from a deadly strain of H3N2, and to use X-ray crystallography techniques to determine the antibody's structure and its precise epitope on the viral HA protein.

"It's even lower on the HA stalk than the CR6261 epitope; in fact it's closer to the viral envelope than any other influenza antibody epitope we've ever seen," said Ekiert.

Crucell is about to begin initial clinical trials of CR6261 in human volunteers, and the company expects eventually to begin similar trials of CR8020. If those trials succeed, the two antibodies could be combined and used in a "passive immunotherapy" approach, quite apart from any vaccine. "This would mainly be useful as a fast-acting therapy against epidemic or pandemic influenza viruses," said Wilson. "The ultimate goal is an active vaccine that elicits a robust, long-term antibody response against those vulnerable epitopes; but developing that is going to be a challenging task."

Source: Scripps Research Institute
 
http://www.sciencecodex.com/discovery_of_natural_antibody_brings_a_universal_flu_vaccine_a_step_closer

Saturday, July 9, 2011

Hottest Temperatures

Highest Recorded Temperatures on Earth (degrees Fahrenheit)

El Azizia, Libya - 136 degrees (September 13, 1922)

Death Valley, United States - 134 degrees (July 10, 1913)

Ghadames, Libya - 131 degrees

Kebili, Tunisia - 131 degrees

Timbuktu, Mali - 130.1 degrees

Araouane, Mali - 130 degrees (summer of 1945)

Tirat Tsvi, Israel - 129 degrees (June, 1942)

Ahwaz, Iran - 128 degrees

Wadi Halfa, Sudan - 127 degrees (April, 1967)

http://www.ouramazingplanet.com/9-hottest-places-earth-1720/
= = = = = = = = = = = = = = = = = = = = = =
Statistically, it is impressive that the two hottest records have stood for nearly 89 years and 98 years, respectively. Aside from those two locales, the upper limit seems to be 131 degrees Fahrenheit.
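For readers who think in Celsius, here is a minimal conversion of the top records, using the standard formula C = (F - 32) x 5/9:

```python
# Convert the Fahrenheit records above to Celsius: C = (F - 32) * 5/9.
records = {"El Azizia, Libya": 136, "Death Valley, USA": 134, "Ghadames, Libya": 131}

for place, f in records.items():
    print(f"{place}: {f}F = {(f - 32) * 5 / 9:.1f}C")
# El Azizia: 57.8C, Death Valley: 56.7C, Ghadames: 55.0C
```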

-- the blog author

Friday, July 8, 2011

University Science Serves Government

Big Science, Big Government
By Patrick Michaels
Forbes Op-Ed, July 8, 2011

My lobby, the American Association for the Advancement of Science (AAAS), isn’t located in Washington, D.C., because its employees are fond of the city’s heat and humidity in the summer. No, it is here to be close to Congress and the president, to have access to largely sympathetic media, and, by using its influence, to bring home the bucks for its members.

AAAS never saw a big-government science program it didn’t like, which is logical behavior in a place where the public’s monies are chopped up, stuffed into legislative sausages, and cold-delivered to your alma mater.

Programmatic science funding goes to universities in the form of grants and contracts funding research proposals generated by the faculty. Academia has become profoundly dependent upon the "overhead" generated by such money.

For example, the University of Virginia, typical of the "Public Ivies," charges a fringe rate on salaries of 31% (that’s for health and life insurance, disability, institutional contributions to retirement, etc.) over a faculty member’s base salary. Then it tacks on an additional 52% to that total, the overhead. So a beginning Full Professor, who probably makes around $100,000, actually requires a taxpayer outlay of over $199,000 if the feds are to pay him. Such full-time "research faculty," paid on outside funds, are common now (I was one at UVA for 30 years).
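The arithmetic works out as claimed; here is a minimal sketch using the salary and rates from the paragraph above:

```python
# Fringe and overhead arithmetic from the paragraph above.
base_salary = 100_000
fringe_rate = 0.31      # health and life insurance, disability, retirement, etc.
overhead_rate = 0.52    # institutional overhead on salary plus fringe

salary_plus_fringe = base_salary * (1 + fringe_rate)     # 131,000
total_cost = salary_plus_fringe * (1 + overhead_rate)    # 199,120

print(f"total taxpayer outlay: ${total_cost:,.0f}")  # just over $199,000, as stated
```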

Federal domination of science funding has two quite intended consequences: both individual scientists and major universities have become wards of Washington. For decades, academic sociologists have noted that almost all faculty party affiliations are with the Democrats. This is no conspiracy–it is merely like-minded individuals hiring other like minds and voting their best interest. Political economists would shrug at this example of "Public Choice" at work.

The Big Government bias of academic science goes back to Franklin Roosevelt, who knew that the Manhattan Project was going to be an explosive success. Before he died (and before the first ignition in New Mexico), he wrote a letter to Vannevar Bush, director of the wartime Office of Scientific Research and Development, which oversaw building of the atomic bombs. In it he asked Bush to propose a mechanism that would employ, in the postwar years, armies of scientists in federally funded laboratories or university research projects.

Bush’s answer (to President Truman), called "Science, The Endless Frontier", laid out the map for the creation of the National Science Foundation, the National Institutes of Health, and various and sundry military research facilities. Bush proposed, and Congress eventually disposed, with the federalization of science.

Inevitably the large but limited public funds disbursed to research institutes and universities would become bones of political contention, attracting courtesans, lobbyists, and consumers as only a large honeypot of dollars does. Hence the role of AAAS.

So it was not very surprising (though alarming to a naïf like me) when, in the Spring of 2007, AAAS draped a multistory image, described below, from the Eye Street side of its headquarters at 1200 New York Avenue in Washington.

The banner was unfurled precisely when the Senate was considering the 2007 energy bill, which passed, mandating the production of 36 billion gallons of ethanol by 2022. The result of all of this cheerleading is that we are projected to divert approximately 40% of this year’s corn crop to the production of this fuel, which creates more carbon dioxide in its life-cycle than gasoline.

The image was hardly neutral. Backgrounding the corncob/gasoline pump is an image of a wild blue (i.e. pollution-free) ocean. This was propaganda and public relations, not science.

In his 1961 Farewell Address, Dwight Eisenhower famously predicted the rise of a "military-industrial complex," in which he said, "The potential for the disastrous rise of misplaced power exists and will persist". He then went on to speculate as to what Vannevar Bush had wrought.

Few remember the next paragraphs, in which he said that at universities, because of the enormous cost of scientific research, "a government contract becomes virtually a substitute for intellectual curiosity," and that "we must also be alert" to the "danger that public policy itself could become the captive of a scientific-technological elite."

I submit that when the scientists’ lobby hangs a multistory poster in praise of corn ethanol on its Washington building, and when our Senate passes a law that results in our nation burning up over 40% of its corn to please political factions, that we have entered a new phase in our history, where indeed rational policy–like providing cheap and abundant food for the world–has been replaced by the insanity of an unelected scientific-technological elite.

http://blogs.forbes.com/patrickmichaels/2011/07/08/big-science-big-government/

Thursday, July 7, 2011

Phoenix Hit by Dust Storm

A huge dust storm blew through downtown Phoenix, Arizona, at sunset on Tuesday night. Visibility was reduced to nearly zero in Phoenix and its suburbs.

Airline flights were halted and power was knocked out to almost 10,000 people. Swimming pools were made muddy and cars were caked with dirt.

Dust storms are hard to predict, and the large storm hitting Phoenix Tuesday night took everyone by surprise.

The storm, estimated by some to be 100 miles wide, turned the sky nearly black and made the city's downtown skyscrapers invisible. Such storms are often called by the Arabic term "haboobs." Some images are available at the link below.

--summarized by the blog author from this AP story at

http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2011/07/06/MN721K7B6E.DTL

Wednesday, July 6, 2011

Large Asteroid to Be Probed -- Tiny Moon Sought

Does Asteroid Vesta Have a Moon?

July 6, 2011:
You might think of asteroids as isolated bodies tumbling alone through space, but it's entirely possible for these old "loners" to have companions. Indeed, 19-mile-wide Ida, 90-mile-wide Pulcova, 103-mile-wide Kalliope, and 135-mile-wide Eugenia each have a moon. And 175-mile-wide Sylvia has two moons.

Measuring 330 miles across, Vesta is much larger than these other examples, so a "Vesta moon" is entirely possible.

Where do such moons come from?

Marc Rayman, Dawn's chief engineer at NASA's Jet Propulsion Laboratory, suggests one source: "When another large body collides with an asteroid, the resulting debris is sprayed into orbit around the asteroid and can gradually collapse to form a moon."

Another possibility is "gravitational pinball": A moon formed elsewhere in the asteroid belt might, through complicated gravitational interactions with various bodies, end up captured by the gravity of one of them.

Hubble and ground-based telescopes have looked for Vesta moons before, and seen nothing. Dawn is about to be in position for a closer look. This Saturday, July 9th, just one week before Dawn goes into orbit around Vesta, the moon hunt will commence. The cameras will begin taking images of the space surrounding the asteroid, looking for suspicious specks.

"If a moon is there, it will appear as a dot that moves around Vesta in successive images as opposed to remaining fixed, like background stars," says Dawn Co-investigator Mark Sykes, who is also director of the Planetary Science Institute. "We'll be able to use short exposures to detect moons as small as 27 meters in diameter. If our longer exposures aren't washed out by the glare of nearby Vesta, we'll be able to detect moons only a few meters in diameter."

While you won't see "find a moon" among the mission's science goals, a moon-sighting would be a nice feather in Dawn's cap. Not that it will need more feathers. The probe is already primed to build global maps and take detailed images of the asteroid's surface, reveal the fine points of its topography, and catalog the minerals and elements present there.


Besides, Dawn will become a moon itself when it enters orbit around Vesta. And the probe's motions as it circles will provide a lot of information about the rocky relic.

Sykes explains: "We'll use the spacecraft's radio signal to measure its motion around Vesta. This will give us a lot of detailed information about the asteroid's gravitational field. We'll learn about Vesta's mass and interior structure, including its core and potential mascons (lumpy concentrations of mass)."

As you read this, the spacecraft is gently thrusting closer to its target. And with the navigation images alone we're already watching a never-before-seen world grow ever larger and clearer.

"The pictures are beginning to reveal the surface of this battered, alien world," says Rayman. "They're more than enough to tantalize us. We've been in flight for four years, we've been planning the mission for a decade, and people have been looking at Vesta in the night sky for two centuries. Now, finally, we're coming close
up to it, and we'll be getting an intimate view of this place."

This is not only the first time a spacecraft has visited this alien world, it's also the first time a spacecraft has visited a massive body we haven't approached previously. In the past, rocket ships have orbited Earth, the moon, Mars, Venus, Jupiter, Saturn, and Mercury.

"In each case, flyby missions occurred first, providing a good estimate of the target's gravity along with information on other aspects of its physical environment, including whether any moons are present. This time we're much less certain what we'll find."

At a recent press conference, NASA Planetary Science Deputy Director Jim Adams told reporters that Dawn will "paint a face on a world seen only as a 'fuzzy blob' up to now." What does Rayman think Vesta's face will look like?

"Wrinkled, ancient, wizened, with a tremendous amount of character that bears witness to some fascinating episodes in the solar system's history."

If a new moon is among the episodes, Rayman has a name in mind.

"How about 'Dawn'?" 

http://science.nasa.gov/science-news/science-at-nasa/2011/06jul_vestamoon/

Tuesday, July 5, 2011

Twins Can't Fool Well Trained Dogs

Dogs Can Tell Apart Identical Twins

By Charles Q. Choi, LiveScience Contributor – Tue, Jul 5, 2011

An identical twin might seem like a perfect alibi for any crime one wanted to commit, but now it turns out dogs can tell such twins apart by scent, researchers find.

Research investigating whether dogs can distinguish identical twins from one another has gone on for more than a half-century. However, the findings of these studies have proven conflicting and inconclusive over the years.

Scientists in the Czech Republic suggested any discrepancies seen between the results of past experiments might have been due to different levels of training of the dogs used in each. Instead, they suggested using police dogs with the highest level of training, noting that such canines have been used in police lineups to identify people by scent in Europe and the United States.

Researchers started with samples swabbed from the bellies of two sets of identical twins, boys age 5 and girls age 7, as well as two pairs of fraternal twins, 8-year-old girls and 13-year-old boys. Experiments with 10 German shepherds then had each dog go through 12 tests, all of which involved the canines sniffing a swab and then seeking out a matching scent from seven possibilities.

The scientists found all the dogs picked out the correct match every time. They distinguished identical twins from one another without fail, even though each twin lived in the same place and ate the same food as his or her counterpart. In comparison, DNA tests could not tell the identical twins apart, although they could naturally tell apart the fraternal twins, who are only as genetically similar as any regular pair of siblings.

"Dogs can discriminate the odor of identical twins if well-trained," said researcher Lud?k Bartoš, an ethologist at the Czech University of Life Sciences in Prague.

As to how dogs can tell identical twins apart, each twin might be influenced by different outside factors, such as infections, that could alter an individual's odor, the researchers conjectured. "It should be emphasized, however, that we are far from being able to elucidate the reason why the twins differ," Bartoš told LiveScience.

Future research can focus on whether or when the scents of identical twins start to differentiate with age.

The scientists detailed their findings online June 15 in the journal PLoS ONE.
http://news.yahoo.com/dogs-tell-apart-identical-twins-135001904.html

Monday, July 4, 2011

Computers Make Mistakes

Error Free Computers
Are A Myth

By the blog author

You have probably heard some nonsense from computer nerds that "computers don’t make mistakes." Yes, they do. There are two central reasons that computers make mistakes: 1) the method of manufacturing them and 2) programming errors.

                              Manufacturing Errors

An integrated circuit is built up as layers of circuitry on a silicon wafer, usually through lithography. The entire process occurs in a "clean room," because the silicon wafers are quite subject to degraded performance when exposed to impurities of any sort.

On the other hand, specific impurities are deliberately added for the electrical properties they induce in the material, creating doped transistors. The modern technique for this is called ion implantation. The doping is followed by setting and sealing through heat ("furnace anneal") or, for advanced integrated circuits, rapid thermal anneal (RTA). Transistors and capacitors are built into particular layers, the layers are fully interconnected, the circuits are tested for operation (called "the wafer test"), and finally they are packaged and tested again.

Detail available at http://en.wikipedia.org/wiki/Integrated_circuit_fabrication

This is the same programmer from Poland who won a computer chess award as discussed in yesterday’s blog. Note the room fans used to force air past the processors and keep them cool!

I'm going into these details to make a vital point. Integrated circuits are delicate: they won't function properly if impurities are introduced during manufacture; they are baked like cookies; they are tested after the transistor and capacitor layers are added, tested again after the connections are added, and tested yet again once packaged.

How many integrated circuits "die" in the clean room because they fail a test during the manufacturing process? "Thirty to forty percent." The other sixty to seventy percent pass and are put into use. Let's be a little more precise and logical in stating this: sixty to seventy percent of clean room output fails to fail the tests during manufacture. That means sixty to seventy percent are approved for use with no known built-in shortcomings, which is not the same as a guarantee of none at all.

The computer nerds are obviously wrong when they say that computers don't make mistakes. Nerds may counter that there is only a one-in-a-million possibility that a mistake will be made. But computer programs run to hundreds of thousands of lines of code, many iterations may be needed to get a result, and so a computer performs millions and millions of operations during regular processing. The potential for error is rampant.

Usually, when there is a serious error, the computer will lock up, get stuck in a loop, or refuse to function. This is called "crashing," and it can be caused by corrupt data, manufacturing error, equipment operating outside its reliable temperature range, or programming errors (discussed below). But any machine that can crash can also proceed with erroneous data or calculations and keep functioning, producing incorrect results undetectably.
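To make that one-in-a-million arithmetic concrete, here is a back-of-the-envelope sketch in C. It is my own illustration, not from any cited source; the per-operation error rate is the nerds' hypothetical concession, and the operation counts are assumptions chosen only to show how the odds compound.

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double p = 1e-6;   /* assumed: one-in-a-million error rate per operation */
        long n[] = {1000000L, 100000000L, 1000000000L};
        for (int i = 0; i < 3; i++) {
            /* chance of at least one error in n independent operations:
               1 - (1 - p)^n */
            double odds = 1.0 - pow(1.0 - p, (double)n[i]);
            printf("%11ld operations: %5.1f%% chance of at least one error\n",
                   n[i], 100.0 * odds);
        }
        return 0;
    }

Even at one error in a million operations, a million operations already carry about a 63 percent chance of at least one error, and a hundred million operations make an error a near certainty.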

Undetectable errors may most likely result from some weakness or manufacturing problem in the annealing process, where the wafer is "baked like a cookie" as I described above. Moreover, even the sixty to seventy percent of integrated circuits that pass all the factory tests will almost all fail to perform correctly outside the narrow temperature range for which they were designed. Specifically, if you heat any integrated circuit enough, it will certainly malfunction.

The layers of an integrated circuit are themselves getting thinner and thinner. This makes the device faster and able to store more data. Yet this miniaturization also makes it more susceptible to certain sorts of interference, such as surges in voltage or static electricity.

Going back half a century, the Department of Defense used circuit boards mounted with individual transistors. These circuit boards were stacked in expensive, water-cooled cabinets. Errors still occurred. Master computer technicians used to carry rubber mallets, which they would bang on the cabinet to ensure the trays of circuit boards were connected, often fixing an inoperative machine by brute force. Now the cabinet is all on a chip and the chip is tested after its layers are connected together, but such a test is not an absolute guarantee of error-free functioning.

Further, there is an inherent potential for error in solid-state computer devices such as transistors and integrated circuits. Such devices use a semiconductor, usually silicon, properly "doped" with an intentional impurity for specific electrical properties. Current is passed through junctions of material, usually P-N-P junctions or N-P-N junctions. Solid-state devices are much smaller and use much less power than the original computing devices, vacuum tubes. But vacuum tubes had one simple superiority: either the current was passed or it was blocked. Completely. Properly manufactured tubes don't "leak" electrons when set to 0, even when the vacuum tube itself is powered up. P-N-P junctions and N-P-N junctions are inherently capable of leaks, again especially outside the designed operating temperatures. These leaks may be one reason that power surges or AC fluctuations can cause computational errors or loss of data.

= = = = = = = = = = = = = = = = = = = = = = = = =

PNP and NPN Transistor Vulnerabilities (from Wikipedia)

Vulnerabilities
Exposure of the transistor to ionizing radiation causes radiation damage. Radiation causes a buildup of 'defects' in the base region that act as recombination centers. The resulting reduction in minority carrier lifetime causes gradual loss of gain of the transistor.

Power BJTs are subject to a failure mode called secondary breakdown, in which excessive current and normal imperfections in the silicon die cause portions of the silicon inside the device to become disproportionately hotter than the others. The doped silicon has a negative temperature coefficient, meaning that it conducts more current at higher temperatures. Thus, the hottest part of the die conducts the most current, causing its conductivity to increase, which then causes it to become progressively hotter again, until the device fails internally. The thermal runaway process associated with secondary breakdown, once triggered, occurs almost instantly and may catastrophically damage the transistor package.

If the emitter-base junction is reverse biased into avalanche or Zener mode and current flows for a short period of time, the current gain of the BJT will be permanently degraded.

http://en.wikipedia.org/wiki/PNP_transistor#PNP

= = = = = = = = = = = = = = = = = = = = = = = = =

                                Programming Errors

There are very basic errors that are easy to make in programming. A program is a collection of "routines," and a routine contains processes that are performed over and over again ("loops") as well as outside processes brought in from time to time to perform certain functions ("subroutines"). It is the programmer's responsibility to assure that no data dependencies exist between a subroutine and the loop(s) that call it. Local variables in a routine should be declared automatic instead of static, and allocating large local variables on a stack can result in stack overflow.

It’s not so simple. See http://download.oracle.com/docs/cd/E19205-01/819-5262/aeujh/index.html .
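As a minimal sketch of the static-versus-automatic pitfall, consider the following C fragment. It is my own illustration, not an example from the Oracle document: a single careless keyword creates a hidden data dependency between calls.

    #include <stdio.h>

    int sum_static(int x) {
        static int total = 0;   /* wrong here: the value persists across calls */
        total += x;
        return total;
    }

    int sum_automatic(int x) {
        int total = 0;          /* automatic: a fresh variable on every call */
        total += x;
        return total;
    }

    int main(void) {
        for (int i = 0; i < 3; i++)
            printf("static: %2d   automatic: %2d\n",
                   sum_static(5), sum_automatic(5));
        return 0;
    }

The static version prints 5, 10, 15 because earlier calls contaminate later ones; the automatic version prints 5 every time. A test that happens to call the routine only once would never notice the difference.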

Do programmers check all these things meticulously? No, they write a program and then run test data through it. If the test data produce the right answer, they assume the program is error free. But such programming may include needless additional steps, taking longer to compute and possibly introducing errors in carrying the data from one operation to another. A "fix" for this is to use a dense and compact programming language, in which complex functions are reduced to simple coding. APL is such a computer programming language.

The difficulty here is that the language itself is so dense that it is difficult to proofread it for errors.
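To see why test data that "produce the right answer" prove so little, consider a classic illustration. This is my own sketch, not an example from the blog: a midpoint routine that passes every small-number test yet fails on large inputs, because the intermediate addition overflows.

    #include <stdio.h>

    int midpoint(int a, int b) {
        return (a + b) / 2;     /* buggy: a + b can overflow an int */
    }

    int midpoint_safe(int a, int b) {
        return a + (b - a) / 2; /* safe for 0 <= a <= b */
    }

    int main(void) {
        printf("%d\n", midpoint(2, 6));          /* 4: the test data pass */
        printf("%d\n", midpoint(2000000000, 2100000000));      /* nonsense */
        printf("%d\n", midpoint_safe(2000000000, 2100000000)); /* 2050000000 */
        return 0;
    }

A bug of exactly this shape sat unnoticed for years in widely used binary-search routines; the test data simply never reached the inputs that trigger it.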

Computers are data preparation and comparison devices using base 2. The initial purpose of computers was to "crack" the cryptographic codes of enemy messages during World War II. This was a brilliant, perhaps war-winning, accomplishment for British intelligence: a special-use computer ("The Bronze Goddess") cracked messages scrambled by the German Enigma cipher machine. Later, in America, special-use computers were created to develop the "explosive lens" needed to implode radioactive material together so densely that an atomic explosion occurred. The individual central to that accomplishment was John von Neumann, who, shortly after the war, changed and developed the computer by introducing the "central processing unit," or CPU, an architecture that remains the standard to this day.

The establishment of central processing units opened the door for standard instructions, a process greatly accelerated by the development of the FORTRAN computer language in 1956.

Computers remain the most efficient way to crack coded messages and to perform certain mathematical calculations. They can also be twisted into setting up and calculating spreadsheets, performing word processing functions, and managing databases. But it takes a lot of twisting to use computers in these inefficient ways.

It takes hundreds of thousands of lines of computer programming for each such application. The Department of Defense figured out years ago that it is impossible to write code on that scale without making mistakes. It is unwise to claim that a large program has been tested and reviewed to the point of certainty that it is perfect.

In yesterday's blog post, a winning computer chess tournament program was featured. The trophy was taken back because the contestant used programming from someone else, which he had tweaked further. Let's get beyond the judgmental question of whether the contest judges should have taken the trophy away. There is another and more important consideration here. The programmer, rather than write a new subprogram for a certain set of circumstances, cut and pasted existing programming. This saves time and is probably no more error prone than writing it from scratch himself.

We can say that most programs are like this: a wall full of Post-it notes drafted by many different programmers, assembled by an anthologizer who stitches together a program from numerous, usually remote and unrelated, sources.

There's nothing inherently wrong with this. The problem is simpler and deeper. Programming lacks what a biologist would call bench protocols, what a chemist would call lab protocols, or what an engineer would simply call protocols. These are standard procedures everyone adopts to use certain equipment safely. In chemistry, protocols also govern using equipment accurately and correctly (famously including sparklingly clean, uncontaminated glassware for titrations and reactions).

An analog in software engineering would be set pieces of mathematical functions, universally available and uncorrupted, which could be used in programming without worrying about inaccuracies or corruption. These perfect building blocks do not appear to exist yet, even though programming is a skill that has been around for more than half a century.

When a software engineer talks about protocols, this is what the term means:

"In computing, a protocol is a set of rules which is used by computers to communicate with each other across a network. A protocol is a convention or standard that controls or enables the connection, communication, and data transfer between computing endpoints. In its simplest form, a protocol can be defined as the rules governing the syntax, semantics, and synchronization of communication. Protocols may be implemented by hardware, software, or a combination of the two. At the lowest level, a protocol defines the behavior of a hardware connection."

Read more at: http://wiki.answers.com/Q/What_are_software_protocols#ixzz1RCOSaRds
 
That’s syntax, that’s continuity, but it isn’t what a scientist would call a protocol.

Add to this the "dense" computer languages used to save computer time (such as APL), the sheer length and complexity of the coding instructions, and programs written to resist being reconstructed by competitors, and there is an overwhelming tendency toward errors. This is why "beta" versions of new programs are given out to computer nerds and hackers to see whether they can find the shortcomings.

My point is that they seldom find all the bugs. The programs are too large to be error free, and there is no perfect bug-finding set of test data. And there are things that computers simply cannot do. They can't find the shortest route to multiple destinations without an exhaustive search, although bees manage this in nature. They can't truly generate an absolutely random series of numbers, although a thrown pair of dice performs that function effortlessly. Further, and dangerously, computers and their programmers don't fully appreciate the difference between an iteration and a genuine outside scientific lab result. There is a dangerous tendency to equate the two (an error that was central to the financial crash of September 2008).
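The point about random numbers is easy to demonstrate. In this C sketch (my own illustration, not from the post), seeding the standard pseudo-random generator with the same value twice reproduces the "random" sequence exactly, something no honest pair of dice will ever do.

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        for (int run = 1; run <= 2; run++) {
            srand(42);                      /* same seed on both runs */
            printf("run %d:", run);
            for (int i = 0; i < 5; i++)
                printf(" %2d", rand() % 100);
            printf("\n");
        }
        return 0;                           /* both runs print identical numbers */
    }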

Conclusion: Computers make errors. Programmers often make errors. Computer owners (insultingly dismissed as "users") are ignored when they suffer the consequences.

Sunday, July 3, 2011

Cheating at Computer Chess

Computer Chess Champion Caught
Injecting Performance-Enhancing Code

By Jason Mick, July 1, 2011
Code theft -- the performance enhancing drug of the future?

From Barry Bonds to Lance Armstrong, many of the sporting world's legendary figures have been tarnished by "cheating" scandals, in which they were accused of taking performance-enhancing drugs (PEDs). The typical response has always been the same -- deny, deny, deny.

Now one of the world's first computer athletes is facing a scandal that is unfolding in a very similar fashion.

An organization known as the International Computer Games Association (ICGA), which pits computers against each other in tournaments of Chess, Go, and other games, has found by a 5-0 vote that one of its grand champions was a cheater. The vote follows a lengthy investigation that included reverse-engineering of source code, to prove the digital "doping".

Much as the NCAA has continually waged a war against cheaters in the world of United States collegiate sports or Major League Baseball has been forced to contend with its own cheating scandal, the ICGA wiped the record books clean for the World Computer Chess Championship from 2007 to 2010, bumping the runner-up(s) to first place.

The decision came after it was found that the winning engine, Rybka, illegally enhanced its performance by using code from two other engines -- Crafty and Fruit -- without permission.

Fruit was the runner-up for the title in 2005, and its code was published under the GNU General Public License (GPL). The panel presiding over the case concluded that Rybka could legally have made use of the code in question given the GPL, but would have had to credit it -- which it did not. Thus, it was Rybka's refusal to share the spotlight that proved its downfall.

The copied code was first noticed when observers saw Rybka performing series of moves eerily similar to those of Crafty and Fruit.

As a result of the code lifting, Rybka is "banned for life" from the tournament, as is its creator Vasik Rajlich, as per a sentence passed down by a 34-member arbitration panel.

Worse yet, Mr. Rajlich -- an international master level human chess player -- is expected to give back his prize money. From this source [http://www.grappa.univ-lille3.fr/icga/files/rules-2010.pdf ][PDF] we know that he won €1,000 ($1,593 USD) for his 2010 victory -- it is unclear how much his winnings from the other three titles total.

Mr. Rajlich denied the charges, but refused to fully participate or defend himself in the investigation. He merely wrote in a short email:
Rybka does not "include game-playing code written by others", aside from standard exceptions which wouldn't count as 'game-playing'.


The vague phrase "derived from game-playing code written by others" also does not in my view apply to Rybka.

Mr. Rajlich is a graduate of the Massachusetts Institute of Technology (MIT) and currently resides in Warsaw, Poland. Iweta Rajlich, Mr. Rajlich's wife, is regarded to this day as one of the world's top female chess players.

                                       Vasik and Iweta Rajlich

It is unclear how the decision will affect Rybka's other, non-ICGA victories. For example, in 2007, in a match entitled "Everything but a pawn", Rybka took home €11,000 ($15,930 USD) in a fan-sponsored prize for defeating human chess grandmaster Jaan Ehlvest.


As of 2009, Rybka was driven by a cluster of computers containing "one w5580, two Skulltrails, another Harpertown oct, and five i7-920s, for a total of 52 cores." More details on the engine and its 2009 victory can be read here [at http://www.chessbase.com/newsdetail.asp?newsid=5444 ].

                                     Rybka (dethroned) champ with its 52 cores


-- this blog article is available at http://www.dailytech.com/Computer+Chess+Champion+Caught+Injecting+PerformanceEnhancing+Code/article22060.htm

= = = = = = = = = = = = = = = = = = = = = = = = = = = =

A thought-provoking comment to this article, also from the link immediately above:

Coders

I have yet to meet a programmer that didn't copy and paste code from somewhere else. That's how compatibility is ensured with a lot of programs: a common interface with common code, with a slightly different perspective taken to make it a little different. It's like copyrighting the color blue or salt. It's just a different recipe that it takes to make a good program.

–By RedemptionAD, July 1, 2011

= = = = = = = = = = = = = = = = = = = = = = = = = = =
Tomorrow this blog will have some more thoughts on computers based on this story and the comment made immediately above.
 

Saturday, July 2, 2011

Many Users Lax About Computer Security

Study: Computer Security Not Nearly
As Good as Some Users Think

By Shane McGlaun [blog] June 28, 2011

Non-adult sites may be more harmful than adult sites, according to firm
Computer security isn't something that average consumers think a lot about. Many people believe they are visiting friendly sites that pose no risk. The reality, according to a new study, is that many people practice very bad security habits on computers at home and in the office.

The survey, conducted by G Data Software, covered roughly 16,000 users globally, about 5,500 of them inside America. According to the study, 40% of the respondents inside the U.S. think that visiting an adult site is more risky than visiting a hobby site (for example, a site dedicated to horseback riding). According to G Data, however, hobby sites are more likely to have lax security than adult sites.

The report stated, "The level of awareness among Internet users is still inadequate and out-of-date in many respects."

The study found that almost all of the respondents in the U.S. think they would be able to recognize an attack. The respondents think the attack will cause their computer to crash or that popups will appear. The reality according to G Data is that most modern attacks are stealthy so they can remain unnoticed for longer periods of time.

The survey researchers wrote, "The aim of online criminals is to earn as much money as they can, which means that they want to keep infections hidden from users for as long as possible."

The survey also found that about 19% of those in the U.S. would access links to social networks regardless of where the link came from. Those users are classified as easy targets for cyber criminals. The study found that nearly 88% of U.S. participants use security software with about half using free versions. The most popular free security software in the U.S. is from Microsoft.

While most users have security software installed according to survey results, 28% of users don't update their virus definitions daily while 24% have no idea if they update definitions.

http://www.dailytech.com/Study+Computer+Security+Not+Nearly+as+Good+as+Some+Users+Think/article22012.htm

Friday, July 1, 2011

Automobile With The Most Miles Travelled

Volvo Owner Nears 3 Million Miles
July 20, 2010

American Irv Gordon - driver of the highest mileage vehicle on the road, a 1966 Volvo P1800 - has just turned 70. Now he aims to reach an unmatchable record sometime in the next three years.

The two-seater sports coupe P1800 was presented at the Brussels auto show in 1960. Production started one year later, so in 2011 the P1800 celebrates its 50th anniversary. Almost 46,000 units of the model, designed by the famous Swedish sailor Pelle Petterson, were produced in various versions until production ended in 1973.

One of the P1800's icons is the British actor Roger Moore, who drove the car in the TV series "The Saint." Another famous P1800 driver is the American Irv Gordon. With more than 2.8 million miles on his red Volvo sports car, Gordon celebrated his birthday on 15 July by affirming his goal of reaching three million miles before his 73rd birthday - forever enshrining him as an iron man of automotive endurance.

Gordon, a retired science teacher from East Patchogue, N.Y., purchased his Volvo in June 1966, and immediately fell in love, driving 1,500 miles in the first 48 hours. With a 125-mile round-trip daily commute, a fanatical dedication to vehicle maintenance and a passion for driving, Gordon logged 500,000 miles in 10 years.

In the Guinness Book of World Records
In 1998, with 1.69 million miles, he made the Guinness Book of World Records for the most miles driven by a single owner in a non-commercial vehicle. In 2002, he drove the car's two-millionth mile through Times Square to national and international media attention.

Today, Gordon breaks his own record every time he drives, whether it's to Cincinnati for coffee, Rolla, Mo., for lunch or Green River, Wyo., for dinner. Gordon - like any mighty record-holder at the top of his game - is mindful of his legacy, as well as setting a record no one can beat.

"Three million miles by my 73nd birthday sounds right," Gordon said. "But, whether I reach that mark is more up to me than it is the car. The car's parts have long proven they can take it, but I'm not so sure about my own. Either way, it will be a fantastic testament to the engineering genius of Volvo as well as to the resiliency of folks my age.

Like baseball legend DiMaggio
"Three million miles is an iron clad number that I'd like to think sits right up there with DiMaggio's consecutive game hitting streak. Who's going to beat that? No one."

Gordon is unsure what to do with his Volvo after three million miles, though he has considered selling it for no less than one dollar for each mile he's driven.

"I used to think I'd park it in a museum where people will get to enjoy seeing the car that beat the odds - all with the same engine, same radio, same axles, same transmission and of course the same driver,"
Gordon said. "Now I think, ‘no way.' I'll either keep driving it or sell it for $3 million."

And what would he do if he made $3 million off the car?

"I'd spend it on traveling," he said. "Road trips, of course."

http://www.volvocars.com/intl/top/about/news-events/pages/default.aspx?itemid=192