Thursday, June 30, 2011

Vital Careers -- Information Assurance and Forensic Accounting

Information Assurance
Information assurance (IA) is the practice of managing risks related to the use, processing, storage, and transmission of information or data, and to the systems and processes used for those purposes. While focused predominantly on information in digital form, the full range of IA encompasses analog and physical forms as well. Information assurance as a field grew out of the practice of information security, which in turn grew out of the practices and procedures of computer security.
There are three models used in the practice of IA to define assurance requirements and assist in covering all necessary aspects or attributes.

The first is the classic information security model, also called the CIA Triad, which addresses three attributes of information and information systems: confidentiality, integrity, and availability. This C-I-A model is extremely useful for teaching introductory and basic concepts of information security and assurance; the initials make an easy mnemonic and, when properly understood, can prompt systems designers and users to address the most pressing aspects of assurance.

The next most widely known model is the Five Pillars of IA model, promulgated by the U.S. Department of Defense (DoD) in a variety of publications, beginning with the National Information Assurance Glossary, Committee on National Security Systems Instruction CNSSI-4009*. Here is the definition from that publication: "Measures that protect and defend information and information systems by ensuring their availability, integrity, authentication, confidentiality, and non-repudiation. These measures include providing for restoration of information systems by incorporating protection, detection, and reaction capabilities." The Five Pillars model is sometimes criticized because authentication and non-repudiation are not attributes of information or systems; rather, they are procedures or methods useful for assuring the integrity and authenticity of information, and for protecting its confidentiality.

*[available at ]

A third, less widely known IA model is the Parkerian Hexad, first introduced by Donn B. Parker in 1998. Like the Five Pillars, Parker's Hexad begins with the C-I-A model but builds it out by adding authenticity, utility, and possession (or control). It is worth noting that the attribute of authenticity, as described by Parker, is not identical to the pillar of authentication as described by the U.S. DoD.

Information assurance is closely related to information security and the terms are sometimes used interchangeably. However, IA’s broader connotation also includes reliability and emphasizes strategic risk management over tools and tactics. In addition to defending against malicious hackers and code (e.g., viruses), IA includes other corporate governance issues such as privacy, compliance, audits, business continuity, and disaster recovery. Further, while information security draws primarily from computer science, IA is interdisciplinary and draws from multiple fields, including accounting*, fraud examination*, forensic science*, management science, systems engineering, security engineering, and criminology, in addition to computer science. Therefore, IA is best thought of as a superset of information security (i.e., an umbrella term).
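The relationships among the three models described above can be sketched as simple attribute sets. This is a purely illustrative encoding: the attribute names come from the text, but the structure and variable names are ours.

```python
# Illustrative only: the three IA models expressed as Python sets,
# so their shared C-I-A core and their extensions can be compared.

CIA_TRIAD = {"confidentiality", "integrity", "availability"}

# Five Pillars (U.S. DoD): the triad plus two procedural pillars.
FIVE_PILLARS = CIA_TRIAD | {"authentication", "non-repudiation"}

# Parkerian Hexad (Donn B. Parker, 1998): the triad plus three attributes.
PARKERIAN_HEXAD = CIA_TRIAD | {"authenticity", "utility", "possession"}

# Every model contains the classic C-I-A core.
assert CIA_TRIAD < FIVE_PILLARS and CIA_TRIAD < PARKERIAN_HEXAD

print(sorted(FIVE_PILLARS - CIA_TRIAD))     # what the Five Pillars add
print(sorted(PARKERIAN_HEXAD - CIA_TRIAD))  # what the Hexad adds
```

Note that the Hexad's "authenticity" is deliberately kept distinct from the Pillars' "authentication," mirroring Parker's point above.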

= = = = = = = = = = = = = = = = = = = = = = = = =

* a closely related and fast-growing career – forensic accounting:

Five Reasons to Study Forensic Accounting

By J.J. Yong, October 23, 2010
Article Source:

Forensic accounting is the practice of using accounting, auditing, and investigative skills to assist in legal matters, producing accurate results that establish accountability in administrative proceedings.

You may be wondering, why study forensic accounting?

Well, here are the five reasons:
  • The current economic crisis has left many companies facing serious financial trouble that may lead to bankruptcy. Some have stooped to committing frauds and swindles to save themselves, which makes forensic accounting an important job whose demand increases each year.
  • A company's internal auditors often cannot bring the hidden aspects of corporate fraud to light, and are rarely in a position to take proper action at the proper time because they lack forensic accounting skills.
  • Forensic accounting is a new and very exciting field of study. It changes the world's perspective on accounting, which has long been seen as a theoretically dull field.
  • If you are ambitious, fast, observant, creative, and diligent, forensic accounting is a dream job and a great investment. Using computer technology, creative thinking, and careful inspection of financial records, the hidden proof of crimes can be discovered.
  • You will always be equipped with the latest computer software and gadgets. Forensic accounting relies heavily on computer software, including generalized audit software, to aid in the detection and investigation of fraud and white-collar crime. Investigative tools such as data mining, link analysis software, case management software, and the Internet are essential skills as well.
In conclusion, forensic accounting has been stereotyped as a boring and uninteresting job, but that stereotype has been proved wrong. There are many benefits to studying forensic accounting: not only will you be rewarded with a stable job, you will also look forward to going to work every day.

For more information about online forensic accounting and benefits of forensic accounting, visit 

Negative Quiddity: Inhumane Incompetence

American tourist left on Great Barrier Reef
by tour boat company

By Liz Goodwin | The Lookout [a blog] – Wed, Jun 29, 2011

An American tourist fought off panic when an Australian tour boat company accidentally left him stranded near the Great Barrier Reef in his snorkeling gear.

Ian Cole, a 28-year-old from Michigan, says he grew frightened when he surfaced last Saturday and realized the Passions of Paradise tour boat was gone. Tour companies have followed strict head-count rules since the 1990s, when American couple Tom and Eileen Lonergan disappeared after being left behind by a boat in nearby Port Douglas, the AFP reports. You can watch an Australian TV news report on the incident [by going to the link at the end of this article].

"Panic kicks in, your heart rate goes up, and you don't know what's going to happen. I was sucking water back into my snorkel and was really trying hard to stay calm," Cole told the Sydney Morning Herald.

Cole saw another tour boat and says he swam about 15 minutes to reach it. ''When I got to the other boat they looked down like 'what the hell are you doing here?' They said my boat had left 15 minutes ago. I thought they were joking.''

The staff member who failed to do an accurate headcount has been fired, the paper says, and Cole was given a $200 restaurant voucher and a refund for the trip.

But a spokesman for a regional tour operators' association told the local Australian CairnsBlog that he thought Cole was exaggerating the danger he faced in the incident. Col McKenzie of the Association of Marine Park Operators said Cole was ''making a mountain out of a molehill'' to get media attention.

"His response is very upsetting. I engaged in a professional and respectful way and all I sought was for this company to do the right thing following an incident that was totally unacceptable," Cole told CairnsBlog. He had asked for a letter apologizing for the incident and outlining what safety measures would be taken to prevent similar oversights from occurring in the future. Cole told NBC the skipper of the boat called him to apologize.

The horror movie "Open Water" was based on the story of the Lonergan couple, who disappeared after being left by a tour boat on the reef in 1998, and were presumed to have been eaten by sharks. The AP reports that in 2008, a British diver and his American girlfriend were lost nearby when they surfaced from a dive and couldn't find their tour boat. They spent 19 hours in the water awaiting a helicopter rescue.

= = = = = = = = = = = = = = = = = = = = = = = = = =

Moral of this story:  Incompetent People Have No Kindness

-- the blog author

Tuesday, June 28, 2011

Tough Diet Reverses Diabetes

Diabetes Reversed in Patients on Extreme Diet
By Rachael Rettner
MyHealthNewsDaily Staff Writer – Tue, Jun 28, 2011
An extreme diet of just 600 calories a day may reverse diabetes in some patients, a new study finds.

After one week on the diet, diabetes patients saw their blood glucose levels return to normal, indicating their diabetes had gone into remission. Eight of the 11 patients remained diabetes-free three months after they stopped the diet.

Previous studies have found a similar reversal of diabetes immediately after gastric bypass surgery. But the new study, conducted by researchers at Newcastle University in the United Kingdom, shows this quick resolution is possible through diet alone.

However, if the patients gained back all the weight they lost, it's likely their diabetes would come back too, said Dr. Ronald Goldberg, a professor of medicine at the University of Miami School of Medicine's Diabetes Research Institute, who was not involved in the new study. And in people who haven't had gastric bypass, a surgery that makes the stomach smaller, such a calorie-sparse diet would be hard to keep up, Goldberg said.

Extreme dieting

The patients adhered to a liquid-based diet consisting of meal replacement drinks. It also included three portions of non-starchy vegetables per day. The diet lasted for eight weeks.

After seven days, the participants' blood-glucose levels were comparable with those of people who did not have diabetes.

The participants also saw a reduction in the fat content of their liver and pancreas both during and after the diet. It's thought excess fat in the liver and pancreas may be an underlying cause of diabetes.

Participants lost an average of 33 pounds (15 kilograms) during their diet. Twelve weeks after the diet had finished, subjects had put back on an average of 6.5 pounds (3 kg).

Long-term benefits?
The findings may only apply to people who have not had diabetes for a long period of time; the participants had diabetes for less than four years prior to the study.
It's also unclear whether their diabetes will remain in remission for years rather than just months, the researchers said.

The study was published online June 9 in the journal Diabetologia.

Monday, June 27, 2011

Perforin -- a critical protein


Cell-destroying 'death protein'

to fight disease

by Amy Coopes, Agence France-Presse, November 1, 2010
SYDNEY: Australian and British researchers uncovered the structure of a 'death protein' that destroys rogue cells – a two-billion-year-old structure stolen from bacteria – which could be used to fight disease.

The protein, perforin, targets wayward cells and punches a hole in their membranes to let in killer enzymes, said James Whisstock from Monash University in Melbourne, Australia, adding the discovery "answers a really fundamental mystery of immunity".

"Perforin is our body's weapon of cleansing and death," said Whisstock. "It breaks into cells that have been hijacked by viruses or turned into cancer cells and allows toxic enzymes in, to destroy the cell from within," he said. "Without it, our immune system can't destroy these cells."

A structure stolen from bacteria

Using the Australian Synchrotron in Melbourne and powerful electron microscopes at London's Birkbeck College allowed scientists to examine perforin's structure and function, Whisstock said, revealing a "powerful molecule" that targets malignant or infected cells.

Fellow researcher Joe Trapani said the 10-year study, published in Nature, found that perforin's structure was similar to bacterial toxins like anthrax and listeria, suggesting that the human body had learned its tactics from diseases themselves.

"Quite remarkably that mechanism is conserved all the way back to bacteria ... we've actually pinched it off bacteria at some point [in human evolution] and turned it back against them," said Trapani, from Melbourne's Peter MacCallum Cancer Centre.

"Two billion year old blueprint"

"It's a war conducted between our immune system and bacteria and we're actually fighting using similar weapons," he said, explaining that it was a two-billion-year-old blueprint.

Without perforin – released by 'killer' cells designed to destroy harmful invaders – the body was unable to fight infections. Studies with mice had linked defective perforin to leukaemia and heightened cell malignancy.

The discovery also had implications for autoimmune diseases such as juvenile type 1 diabetes and for transplant patients, with the protein linked to both the elimination of healthy cells and tissue rejection, added Whisstock.

Sunday, June 26, 2011

Positive Quiddity: Evolution not Creationism

Christian Faith Requires Accepting Evolution
By Jonathan Dudley
Huffington Post editorial, June 18, 2011

In the evangelical community, the year 2011 has brought a resurgence of debate over evolution. The current issue of Christianity Today asks if genetic discoveries preclude an historical Adam. While BioLogos, the brainchild of NIH director Francis Collins, is seeking to promote theistic evolution among evangelicals, the president of the Southern Baptist Theological Seminary recently argued that true Christians should believe the Earth is only a few thousand years old.

As someone raised evangelical, I realize anti-evolutionists believe they are defending the Christian tradition. But as a seminary graduate now training to be a medical scientist, I can say that, in reality, they've abandoned it.

In theory, if not always in practice, past Christian theologians valued science out of the belief that God created the world scientists study. Augustine castigated those who made the Bible teach bad science, John Calvin argued that Genesis reflects a commoner's view of the physical world, and the Belgic confession likened scripture and nature to two books written by the same author.

These beliefs encouraged past Christians to accept the best science of their day, and these beliefs persisted even into the evangelical tradition. As Princeton Seminary's Charles Hodge, widely considered the father of modern evangelical theology, put it in 1859: "Nature is as truly a revelation of God as the Bible; and we only interpret the Word of God by the Word of God when we interpret the Bible by science."

In this analysis, Christians must accept sound science, not because they don't believe God created the world, but precisely because they do.

Of course, anti-evolutionists claim their rejection of evolution is not a rejection of science. Phillip Johnson, widely considered the leader of the Intelligent Design movement, states that all he's rejecting is the atheistic lens through which evolutionary scientists view the world. Evolution, he argues, is "based not upon any incontrovertible empirical evidence, but upon a highly philosophical presupposition."

And to a certain extent, this line of argument makes sense. Science is not a neutral enterprise. Prior beliefs undoubtedly influence interpretation. If one believes God created vertebrates with a similar design plan, one can acknowledge their structural similarities without believing in common descent. No amount of radiocarbon dating evidence will convince someone the Earth is 4.5 billion years old if that person believes God created the world to look old, with the appearance of age.

But beyond a certain point, this reasoning breaks down. Because no amount of talk about "worldviews" and "presuppositions" can change a simple fact: creationism has failed to provide an alternative explanation for the vast majority of evidence explained by evolution.

It has failed to explain why birds still carry genes to make teeth, whales to make legs, and humans to make tails.

It has failed to explain why the fossil record proposed by modern scientists can be used to make precise and accurate predictions about the location of transition fossils.

It has failed to explain why the fossil record demonstrates a precise order, with simple organisms in the deepest rocks and more complex ones toward the surface.

It has failed to explain why today's animals live in the same geographical area as fossils of similar species.

It has failed to explain why, if carnivorous dinosaurs lived at the same time as modern animals, we don't find the fossils of modern animals in the stomachs of fossilized dinosaurs.

It has failed to explain the broken genes that litter the DNA of humans and apes but are functional in lower vertebrates.

It has failed to explain how the genetic diversity we observe among humans could have arisen in a few thousand years from two biological ancestors.

Those who believe God created the world scientists study, even while ignoring most of the data compiled by those who study it, might as well rip dozens of pages out of their Bibles. Because if "nature is as truly a revelation of God as the Bible," it's basically the same thing.

Many think the widespread rejection of evolution doesn't really matter. Evolution is about what happened in the past, the argument goes, so rejecting it doesn't have an impact on policies we make today. And aside from school curricula, they may be right.

But the belief that scientists can discover truth, and that, once sufficiently debated, challenged and modified, it should be accepted even if it creates tensions for familiar belief systems, has an obvious impact on decisions that are made everyday. And it is that belief Christians reject when they reject evolution.

In doing so, they've not only led America astray on questions ranging from the value of stem cell research to the etiology of homosexuality to the causes of global warming. They've also abandoned a central commitment of orthodox Christianity.

= = = = = = = = = = = = = = = = = = = = = = = = = = = =

Jonathan Dudley is the author of Broken Words: The Abuse of Science and Faith in American Politics 

Saturday, June 25, 2011

Babies Think Clearly

Nine Brainy Baby Abilities
By Jeanna Bryner and Stephanie Pappas
May 26, 2011

They may be small and unable to hold an adult conversation, but babies are proving their collective cleverness. As soon as scientists figured out smarter ways to uncover the wee ones' abilities, they began finding that infants' skills are greater than they're given credit for.

Know Who’s Boss

From as early as 10 months of age, babies figure out that might makes right. When shown scenes of big and small cartoon blocks interacting, infants stare longer (indicating more surprise) when the big one yields the right-of-way to the small one than they do when the small one is subservient. The finding, published in January 2011 in the journal Science, suggests that babies understand social hierarchies and know that size can determine who's in charge.

The results suggest that the blueprints of social interaction are built into the human brain.

"When you're showing these kind of fairly sophisticated or rich concepts are in place before infants get language and before they really participate extensively in social interactions with the world, that is telling you: What are the basic building blocks of the mind?" study researcher Lotte Thomsen of the University of Copenhagen told LiveScience. "These are really the basics of how we think."

Grasp a Dog’s Emotions
Even before they can say more than "mama" and "dada," babies can decipher emotions … of dogs. A 2009 study showed that 6-month-olds could match the sounds of an angry snarl or friendly yap with photos of dogs showing the corresponding body language. "Emotion is one of the first things babies pick up on in their social world," said lead researcher Ross Flom, a psychology professor at Brigham Young University in Utah. The study was published in the journal Developmental Psychology.

Understand Moods and Emotions
While your infant still might not speak, he or she likely knows when you’re feeling down. As young as 5 months of age, babies can accurately pick out an upbeat tune from a gloomy one, according to a study published in 2010 in the journal Neuron. And by 9 months, babies could also identify the sorrowful sound of Beethoven's Seventh Symphony from a lineup of more joyful songs.
Born to Dance
Speaking of music, babies can't seem to resist it. Not only are their ears tuned to the beats, babies can actually dance in time with them, according to a study published in 2010.
To test babies’ dancing disposition, the researchers played recordings of classical music, rhythmic beats and speech to infants, and videotaped the results. They also recruited professional ballet dancers to analyze how well the babies matched their movements to the music.

The babies moved their arms, hands, legs, feet, torsos and heads in response to the music, much more than to speech. The findings, published in the journal Proceedings of the National Academy of Sciences, suggest this dancing ability is innate in humans, though the researchers aren’t sure why it evolved.

Mirror Actions of Others
No surprise that the look of amazement in a baby’s eyes as he or she explores is matched by churning activity inside the noggin. A 2009 study revealed that when 9-month-olds watch an adult reach for an object, the motor region in their brain gets activated as if they are actually doing the reaching. The study researchers suggest mirror neurons are at play.

These brain cells fire for both observed and real actions and have only been directly measured in monkeys. In one of the study’s experiments, once the babies had observed the experimenter grabbing for a toy, the "mirror" brain activity also occurred just prior to the action. Having that predictive capability could help little ones respond to another's actions to, say, intercept the movements and take the toy. The brain finding could also be an example of baby's first steps into the social world, the researchers suggest in the journal Biology Letters.

Learn Quickly While Sleeping
Babies can apparently learn even while asleep, according to a 2010 study. In experiments with 26 sleeping infants, each just 1 to 2 days old, scientists played a musical tone followed by a puff of air to their eyes 200 times over the course of a half-hour. A network of 124 electrodes stuck on the scalp and face of each baby also recorded brain activity during the experiments. The babies rapidly learned to anticipate a puff of air upon hearing the tone, showing a fourfold increase on average in the chances of tightening their eyelids in response to the sound by the end of each session.

As newborns spend most of their time asleep, this newfound ability might be crucial to rapidly adapt to the world around them and help to ensure their survival, researchers said. The study was published in the journal Proceedings of the National Academy of Sciences.

Are Math Whizzes – sort of
Babies know their 2s and 3s. In a 2006 study, 7-month-olds were presented with voices of two or three women saying "Look," and they had to choose between looking at a video image of two women saying the word or three women saying it. The babies spent significantly more time looking at the image matching the number of women speaking. The study was published in the journal Proceedings of the National Academy of Sciences.

Know Their P’s and Q’s
When we say babies do this and that even "before they learn to talk," we're obviously not including baby talk and other language smarts.

In a 2007 study published in the journal Science, researchers had 36 infants watch silent videos of three bilingual French-English speakers reciting sentences. After being trained to become comfortable with a speaker reciting a sentence in one language, babies ages 4 months and 6 months spent more time looking at a speaker reciting a sentence in a different language —demonstrating that they could tell the difference between the two.

"Newborns can be said to be 'intelligent' in that they have the ability to almost effortlessly learn any of the world's languages," psychologist George Hollich of Purdue University told LiveScience in 2007. Some of Hollich's research shows that babies start to understand grammar by the age of 15 months, processing grammar and words simultaneously.

Judge Character Well
Pegging another person as helpful or harmful is crucial when choosing friends. And that ability starts early. Kiley Hamlin of Yale University showed both 6- and 10-month-olds a puppet show of sorts with anthropomorphized shapes, in which one shape helped another climb a hill. In another scenario a third shape pushed the climber down. The little ones then got to choose which shape they preferred. For both age groups, most babies chose the helper shapes. This character-judging ability could be baby’s first step in the formation of morals, Hamlin speculated. The work was published in 2007 in the journal Nature.

Friday, June 24, 2011

Static Electricity Remains A Mystery

Static Electricity: How Does It Work?
By John Timmer, Ars Technica
June 23, 2011

For many of us, static electricity is one of the earliest encounters we have with electromagnetism, and it's a staple of high school physics. Typically, it's explained as a product of electrons transferred in one direction between unlike substances, like glass and wool, or a balloon and a cotton T-shirt (depending on whether the demo is in a high school class or a kids' party). Different substances have a tendency to pick up either positive or negative charges, we're often told, and the process doesn't transfer a lot of charge, but it's enough to cause a balloon to stick to the ceiling, or to give someone a shock on a cold, dry day.

Nearly all of that is wrong, according to a paper published in today's issue of Science. Charges can be transferred between identical materials, all materials behave roughly the same, the charges are the product of chemical reactions, and each surface becomes a patchwork of positive and negative charges, which reach levels a thousand times higher than the surfaces' average charge.

Where to begin? The authors start about 2,500 years ago, noting that the study of static began with a Greek named Thales of Miletus, who generated it using amber and wool. But it wasn't until last year that some of the authors of the new paper published a surprising result: contact electrification (as this phenomenon is known among its technically oriented fans) can occur between two sheets of the same substance, even when they're simply allowed to lie flat against each other. "According to the conventional view of contact electrification," they note, "this should not happen since the chemical potentials of the two surfaces/materials are identical and there is apparently no thermodynamic force to drive charge transfer."

One possible explanation for this is that a material's surface, instead of being uniform from the static perspective, is a mosaic of charge-donating and charge-receiving areas. To find out, they performed contact electrification using insulators (polycarbonate and other polymers), a semiconductor (silicon), and a conductor (aluminum). The charged surfaces were then scanned at very high resolution using Kelvin force microscopy, a variant of atomic force microscopy that is able to read the amount of charge in a surface.

The Kelvin force microscopy scans showed that the resulting surfaces were mosaics, with areas of positive and negative charges on the order of a micrometer or less across. All materials they tested, no matter what overall charge they had picked up, showed this mosaic pattern. The charges will dissipate over time, and the authors found that this process doesn't seem to occur by transferring electrons between neighboring areas of different charge—instead of blurring into the surroundings, peaks and valleys of charge remain distinct, but slowly decrease in size. The authors estimate that each one of these areas contains about 500 elementary charges (that's ±500 electrons), or about one charge for each 10 nm².

The reason that this produces a relatively weak charge isn't because these peaks and valleys are small; the charge difference between them is on the order of 1,000 times larger than the average charge of the whole material. It's just that the total area of sites with positive and negative charges are roughly equal (the two are typically within a fraction of a percent of each other). The distribution appears to be completely random, as the authors were able to produce similar patterns with a white noise generator that fluctuated on two length scales: 450 nm and 44 nm.

So, what causes these charges to build up? It's not, apparently, the transfer of electrons between the surfaces. Detailed spectroscopy of one of the polymers (PDMS) suggests that chemical reactions may be involved, as many oxidized derivatives of the polymer were detected. In addition, there is evidence that some material is transferred from one surface to another. Using separate pieces of fluorine- and silicon-containing polymers allowed the authors to show that signals consistent with the presence of fluorine were detected in the silicon sample after contact.

The exact relationship between the charge transfer and the processes seen here—chemical reactions and the transfer of materials between the surfaces—isn't clear at this point. But there are plausible mechanisms by which these processes could build up charges, and the authors very clearly intend to follow up on these findings.

In the meantime, you can be duly impressed with how much charge you can shuffle around when you build up static. Each square inch is equivalent to about 6.5 × 10^14 square nanometers, so based on the authors' numbers, that's a lot of electrons.
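That closing arithmetic can be checked directly. The figures (one elementary charge per ~10 nm², about 6.5 × 10^14 nm² per square inch) are from the article; the back-of-the-envelope calculation below is our own.

```python
# Verify the article's area figure and estimate charges per square inch.
NM_PER_INCH = 2.54e7                    # 1 inch = 2.54 cm = 2.54e7 nm

area_nm2 = NM_PER_INCH ** 2             # ~6.45e14 nm^2 per square inch
charges_per_sq_inch = area_nm2 / 10.0   # one elementary charge per ~10 nm^2

print(f"{area_nm2:.2e} nm^2 per square inch")
print(f"~{charges_per_sq_inch:.1e} elementary charges per square inch")
```

That works out to tens of trillions of elementary charges per square inch, which is indeed "a lot of electrons."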


Thursday, June 23, 2011

Absolute Heat -- the quark-gluon plasma

Matter Melts in Superhot Particle Collisions
by Clara Moskowitz, LiveScience Senior Writer
June 23, 2011

By creating a soup of subatomic particles similar to what the Big Bang produced, scientists have discovered the temperature boundary where ordinary matter dissolves.

Normal atoms will be converted into another state of matter — a plasma of quarks and gluons — at a temperature about 125,000 times hotter than the center of the sun, physicists said after smashing the nuclei of gold atoms together and measuring the results.

While this extreme state of matter is far from anything that occurs naturally on Earth, scientists think the whole universe consisted of a similar soup for a few microseconds after the Big Bang about 13.7 billion years ago.

An ordinary proton or neutron (foreground) is formed of three quarks bound together by gluons, carriers of the color force. Above a critical temperature, protons and neutrons and other forms of hadronic matter "melt" into a hot, dense soup of free quarks and gluons (background), the quark-gluon plasma.
CREDIT: Lawrence Berkeley National Laboratory

Physicists could re-create it only inside powerful atom smashers like the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory on Long Island, which has a 2.4-mile-long (3.8 km) ring. Researchers there accelerated the nuclei of gold atoms to incredible speeds, then crashed them into each other. The inferno created in this explosion was enough to give rise, briefly, to particle soup.

Quark-gluon plasma

"Normal matter like we are, nuclear matter, is called hadronic matter. If you excite the system to a very high temperature, normal matter will transform into a different type of matter called quark-gluon plasma," said physicist Nu Xu of the U.S. Department of Energy's Lawrence Berkeley National Laboratory in Berkeley, Calif.

Xu and his colleagues created quark-gluon plasma by crashing together gold nuclei inside the STAR experiment (Solenoidal Tracker at RHIC), which is inside the ring of the RHIC accelerator.
The nuclei of gold atoms consist of 79 protons and 118 neutrons. Both protons and neutrons are made of quarks, held together by massless, chargeless particles called gluons. (Protons contain two "up" quarks and one "down," while neutrons have two "down" quarks and an "up.")

When two of these gold nuclei slammed into each other head-on, they melted down into their constituent parts, an incoherent swarm of quarks and gluons. The researchers found that this occurred when the particles reached an energy of 175 million electron volts (MeV).

This corresponds to about 3.7 trillion degrees Fahrenheit (2 trillion degrees Celsius), which is about 125,000 times hotter than the center of the sun.
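The conversion from the 175 MeV figure to those temperatures can be reproduced with the Boltzmann constant, which relates a characteristic particle energy to a temperature (a rough back-of-the-envelope sketch; the 175 MeV value is taken from the article):

```python
# Convert the 175 MeV transition energy to a temperature via E = k_B * T.
K_B_EV_PER_K = 8.617e-5  # Boltzmann constant in eV per kelvin

energy_ev = 175e6                  # 175 MeV expressed in eV
temp_k = energy_ev / K_B_EV_PER_K  # kelvin
temp_f = temp_k * 9 / 5            # at these magnitudes the 32-degree offset is negligible

print(f"{temp_k:.2e} K, or {temp_f:.2e} F")  # ~2.0e12 K, ~3.7e12 F
```

The result lands on the article's "2 trillion degrees Celsius / 3.7 trillion degrees Fahrenheit" figures.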

"If you can heat the system to that temperature, any hadron will be melted into quarks and gluons," Xu told LiveScience.

A new breakthrough

This wasn't the first time physicists had created quark-gluon plasma. The first hints that RHIC had produced the extreme state of matter came in 2005, and firm evidence that it had been achieved was announced in 2010.

But until now, scientists had never been able to precisely measure the temperature at which the nuclei transitioned into the quark-gluon plasma state.

The discovery allows researchers to compare hard measurements with predictions from a theory called quantum chromodynamics (QCD), which describes how matter is fundamentally put together, including how quarks assemble to form protons and neutrons. The interactions involved in quark-gluon plasma are governed by a framework called lattice gauge theory.

"This is the first time we compare the experimentally measured quantities with that of QCD lattice gauge calculations," said Xu, who is the spokesman for the STAR experiment. "It is the start of the era of precision measurements in high-energy nuclear collisions. It is very exciting."

Xu and his colleagues, led by Sourendu Gupta of India's Tata Institute of Fundamental Research, published their findings in the June 24 issue of the journal Science.

Soupy caldron

By creating the soupy caldron of quarks and gluons, researchers hope to learn not just about how matter is put together, but how our whole universe began.

According to the Big Bang theory, the universe began extremely hot and dense, then cooled and expanded. A few microseconds after the Big Bang, scientists think, matter was still hot enough that it existed in a quark-gluon plasma state; it was only after the quarks cooled enough that they could bind together with gluons and form the protons and neutrons that make up the matter we see today.

Through studies like the one at RHIC, as well as at the world's largest particle accelerator, CERN's Large Hadron Collider near Geneva, Switzerland, researchers hope to create more of this extreme matter to probe just how this happened.

"With many more results expected from the RHIC experiments in the near future, additional insights into the details of the transition from ordinary matter to quark matter are within reach," wrote physicist Berndt Müller of Duke University in an essay published in the same issue of Science. Müller was not involved in the new study.

Wednesday, June 22, 2011

Positive Quiddity: Jack Benny

Jack Benny (February 14, 1894 -- December 26, 1974) was an American comedian, vaudevillian, and actor for radio, television, and film. Widely recognized as one of the leading American entertainers of the 20th century, Benny played the role of the comic penny-pinching miser, insisting on remaining 39 years old on stage despite his actual age, and often playing the violin badly.

Benny was known for his comic timing and his ability to get laughs with either a pregnant pause or a single expression, such as his signature exasperated "Well!" His radio and television programs, tremendously popular from the 1930s to the 1960s, were a foundational influence on the situation comedy genre.

Benny was born Benjamin Kubelsky on February 14, 1894, in Chicago, Illinois, and grew up in neighboring Waukegan, Illinois.  He was the son of Meyer Kubelsky and Emma Sachs Kubelsky. Meyer was a Jewish saloon owner, later to become a haberdasher, who had emigrated to America from Poland.

At age 17, he began playing the violin in local vaudeville theaters for $7.50 a week. He was joined by Ned Miller, a young composer and singer, on the vaudeville circuit. They became lifelong friends, and Miller eventually joined the cast of The Jack Benny Program in the 1960s.

In 1911, Benny was playing in the same theater as the young Marx Brothers, whose mother Minnie was so enchanted with Benny's musicianship that she invited him to be their permanent accompanist. The plan was foiled by Benny's parents, who refused to let their son, then 17, go on the road, but it was the beginning of his long friendship with Zeppo Marx.  Benny's future wife Mary Livingstone was a distant cousin of the Marx Brothers.

The following year, Benny formed a vaudeville musical duo with pianist Cora Salisbury, a buxom 45-year-old widow who needed a partner for her act. This provoked famous violinist Jan Kubelik, who thought that the young vaudeville entertainer with a similar name (Kubelsky) would damage his reputation. Under pressure from Kubelik's lawyer, Benjamin Kubelsky agreed to change his name to Ben K. Benny (sometimes spelled Bennie). When Salisbury left the act, Benny found a new pianist, Lyman Woods, and re-named the act "From Grand Opera to Ragtime". They worked together for five years and slowly added comedy elements to the show. They even reached the Palace Theater, the "Mecca of Vaudeville", but bombed. Benny left show business briefly in 1917 to join the U.S. Navy during World War I, and he often entertained the troops with his violin playing. One evening, his violin performance was booed by the troops, so with prompting from fellow sailor and actor Pat O'Brien, he ad-libbed his way out of the jam and left them laughing. He got more comedy spots in the revues and was a big hit, and earned himself a reputation as a comedian as well as a musician.

In 1922, Jack accompanied Zeppo Marx to a Passover seder where he met Sadye (Sadie) Marks, whom he married in 1927 after meeting her again on a double date. She was working in the hosiery section of the Hollywood Boulevard branch of the May Company, and Benny would court her there. Called on to fill in for the "dumb girl" part in one of Benny's routines, Sadie proved a natural comedienne and a big hit. Adopting Mary Livingstone as her stage name, Sadie became Benny's collaborator throughout most of his career. They later adopted a daughter, Joan.

In 1929, Benny's agent Sam Lyons convinced MGM's Irving Thalberg to catch Benny's act at the Orpheum Theatre in Los Angeles. Benny was signed to a five-year contract, and his first film role was in The Hollywood Revue of 1929. His next movie, Chasing Rainbows, was a flop, and after several months Benny was released from his contract and returned to Broadway in Earl Carroll's Vanities. Though at first dubious about the viability of radio, Benny was eager to break into the new medium. In 1932, after a four-week nightclub run, he was invited onto Ed Sullivan's radio program, uttering his first radio spiel: "This is Jack Benny talking. There will be a slight pause while you say, 'Who cares?'..."

Benny had been only a minor vaudeville performer, but he became a national figure with The Jack Benny Program, a weekly radio show which ran from 1932 to 1948 on NBC and from 1949 to 1955 on CBS.  It was consistently among the most highly rated programs during most of that run.

Benny's stage character was just about everything the actual Jack Benny was not: cheap, petty, vain, and self-congratulatory. His comic rendering of these traits became the linchpin to the Benny show's success. Benny set himself up as the comedic foil, allowing his supporting characters to draw laughs at the expense of his character's flaws. By allowing such a character to be seen as human and vulnerable, in an era where few male characters were allowed such obvious vulnerability, Benny made what might have been a despicable character into a lovable Everyman character. Benny himself said on several occasions: "I don't care who gets the laughs on my show, as long as the show is funny." In her book, Benny's daughter Joan said her father always said it doesn't matter who gets laughs, because come the next day they will say, "Remember the Jack Benny Show, last night, it was good, or it was bad." Jack felt he got the credit or blame either way, not the actor saying the lines, so it had better be funny.

The supporting characters who amplified that vulnerability only too gladly included wife Mary Livingstone as his wisecracking and not especially deferential female friend (not quite his girlfriend, since Benny would often try to date movie stars like Barbara Stanwyck, and occasionally had stage girlfriends such as "Gladys Zybisco"); rotund announcer Don Wilson (who also served as announcer for Fanny Brice's hit, Baby Snooks); bandleader Phil Harris as a jive-talking, wine-and-women type whose repartee was rather risqué for its time; boy tenor Dennis Day, who was cast as a sheltered, naïve youth who still got the better of his boss as often as not (this character was originated by Kenny Baker, but perfected by Day); and, especially, Eddie Anderson as valet-chauffeur Rochester van Jones, who was as popular as Benny himself.

And that was itself a radical proposition for the era: unlike the protagonists of Amos 'n' Andy, Rochester was a black man allowed to one-up his vain, skinflint boss. In more ways than one, with his mock-befuddled one-liners and his sharp retorts, he broke a comedic racial barrier. Unlike many black supporting characters of the time, Rochester was depicted and treated as a regular member of Benny's fictional household. Benny, in character, tended if anything to treat Rochester more like an equal partner than as a hired domestic, even though gags about Rochester's flimsy salary were a regular part of the show. (Frederick W. Slater, a newsman of St. Joseph, Missouri, recalled when Benny and his staff stayed at the restricted Robidoux Hotel during their visit to that town. When the desk staff told Benny that "Rochester" could not stay at the hotel, Benny replied, "If he doesn't stay here, neither do I." The hotel's staff eventually relented.)

Rochester seemed to see right through his boss's vanities and knew how to prick them without overdoing it, often with his famous line "Oh, Boss, come now!" Benny deserves credit for allowing this character and the actor who played him (it is difficult, if not impossible, to picture any other performer giving Rochester what Anderson gave him) to transcend the era's racial stereotype and for not discouraging his near-equal popularity. A New Year's Eve episode, in particular, shows the love each performer had for the other, quietly toasting each other with champagne.  That this attention to Rochester's race was no accident became clearer during World War II, when Benny would frequently pay tribute to the diversity of Americans who had been drafted into service.

After the war, once the depths of Nazi race hatred had been revealed, Benny made a conscious effort to remove the most stereotypical aspects of Rochester's character. In 1948, it became apparent to Benny how much the times had changed when a 1941 script for The Jack Benny Program was re-used for one week's show. The script included several African-American stereotypes (e.g., a reference to Rochester carrying a razor) and prompted a number of listeners, who didn't know the script was an old one, to send in angry letters protesting the stereotypes. Thereafter, Benny insisted that his writers make sure no racial jokes or references were heard on his show. Benny also often gave key guest-star appearances to African-American performers such as Louis Armstrong and the Ink Spots.

The Jack Benny Program evolved from a variety show blending sketch comedy and musical interludes into the situation comedy form familiar today, crafting particular situations and scenarios from the fictionalization of Benny the radio star. Any situation, from hosting a party to income tax time to a night on the town, was good for a Benny show, and somehow the writers and star would find the right ways and places to insert musical interludes from Phil Harris and Dennis Day. With Day, invariably, it would be a brief sketch that ended with Benny ordering Day to sing the song he planned to do on that week's show.
[Photo: Jack and Mary Benny]
One extremely popular scenario that became an annual tradition on The Jack Benny Program was the "Christmas Shopping" episode, in which Benny would head to a local department store. Each year, Benny would buy a ridiculously cheap Christmas gift for Don Wilson from a store clerk played by Mel Blanc. Benny would then have second (then third, and even fourth) thoughts about his gift choice, driving Blanc (or, in two other cases, his wife and his psychiatrist as well) to hilarious insanity by exchanging the gift and pestering him about the Christmas card or wrapping paper countless times throughout the episode; in many cases, the clerk would commit suicide, or attempt and fail to commit suicide ("Look what you done! You made me so nervous, I missed!"), as a result.

In the 1946 Christmas episode, for example, Benny buys shoelaces for Don, and then is unable to make up his mind whether to give Wilson shoelaces with plastic tips or shoelaces with metal tips. After Benny exchanges the shoelaces repeatedly, Mel Blanc is heard screaming insanely, "Plastic tips! Metal tips! I can't stand it anymore!" A variation in 1948 concerned Benny buying an expensive wallet for Don, but repeatedly changing the greeting card inserted—prompting Blanc to shout: "I haven't run into anyone like you in 20 years! Oh, why did the governor have to give me that pardon!?" – until Benny realizes that he should have gotten Don a wallet for $1.98, whereupon the put-upon clerk immediately responds by committing suicide. Over the years, in these Christmas episodes, Benny bought and repeatedly exchanged cuff links, golf tees, a box of dates, a paint set, and even a gopher trap.

In 1936, after a few years broadcasting from New York, Benny moved the show to Los Angeles, allowing him to bring in guests from among his show business friends — such as Frank Sinatra, James Stewart, Judy Garland, Barbara Stanwyck, Bing Crosby, Burns and Allen (George Burns was Benny's closest friend), and many others. Burns and Allen and Orson Welles guest hosted several episodes in March and April 1943 when Benny was seriously ill with pneumonia, while Ronald Colman and his wife Benita Hume appeared frequently in the 1940s as Benny's long-suffering neighbors.
Benny was notable for employing a small group of writers, most of whom stayed with him for many years. This was very much in contrast to other successful radio or television comedians, such as Bob Hope, who would change writers frequently. Historical accounts (like those by longtime Benny writer Milt Josefsberg) indicate that Benny's role was essentially that of both head writer and director of his radio programs, though he was not credited in either capacity. In contrast to Fred Allen, who initially personally wrote all of his own radio scripts (and continued to extensively rewrite scripts produced in later years by a writing staff), Jack Benny was often described by his writers as a consummate comedy editor rather than a writer per se. He would often debate the precise wording of jokes, and always sought to ensure the preservation of the integrity of the show's characters above easy gags.

A master of the carefully timed pregnant pause, Benny and his writers used it to set up what is popularly (but incorrectly) believed to be the longest laugh in radio history. It climaxed an episode (broadcast March 28, 1948) in which Benny borrowed neighbor Ronald Colman's Oscar and was returning home when accosted by a mugger (voiced by comedian Eddie Marr).  After asking for a match to light a cigarette, the mugger demanded, "Don't make a move, this is a stickup. Now, come on. Your money or your life." Benny paused, and the studio audience—knowing his skinflint character—laughed. The robber then repeated his demand: "Look, bud! I said your money or your life!" And that's when Benny snapped back, without a break, "I'm thinking it over!" This time, the audience laughed louder and longer than they had during the pause. 

The television version of The Jack Benny Program ran from October 28, 1950 to 1965. Initially scheduled as a series of five "specials" during the 1950–1951 season, the show appeared every six weeks for the 1951–1952 season, every four weeks for the 1952–1953 season and every three weeks in 1953–1954. For the 1953–1954 season, half the episodes were live and half were filmed during the summer, to allow Benny to continue doing his radio show. From the fall of 1954 to 1960, it appeared every other week, and from 1960 to 1965 it was seen weekly.

Benny also acted in movies, including the Academy Award-winning The Hollywood Revue of 1929, Broadway Melody of 1936 (as a benign nemesis for Eleanor Powell and Robert Taylor), George Washington Slept Here (1942), and notably, Charley's Aunt and To Be or Not To Be. He and Livingstone also appeared in Ed Sullivan's Mr. Broadway (1933) as themselves. Benny often parodied contemporary movies and movie genres on the radio program, and the 1940 film Buck Benny Rides Again features all the main radio characters in a funny Western parody adapted from program skits. The failure of one Benny vehicle, The Horn Blows at Midnight, became a running gag on his radio program, although contemporary viewers may not find the film as disappointing as the jokes suggest.

In trying to explain his successful life, Benny summed it up by stating "Everything good that happened to me happened by accident. I was not filled with ambition nor fired by a drive toward a clear-cut goal. I never knew exactly where I was going."

Upon his death, his family donated to UCLA his personal, professional, and business papers, as well as a collection of his television shows. The university established the Jack Benny Award in his honor in 1977 to recognize outstanding people in the field of comedy.  Johnny Carson was the first award recipient. Benny also donated a Stradivarius violin purchased in 1957 to the Los Angeles Philharmonic Orchestra.  Benny had commented, "If it isn't a $30,000 Strad, I'm out $120."


Jack Benny has one star each on the Hollywood Walk of Fame for TV, film, and radio.

Posthumous Tributes

When the price of a standard first-class US postal stamp was increased to 39 cents in 2006, fans petitioned for a Jack Benny stamp to honor his stage persona’s perpetual age. The U.S. Postal Service had issued a stamp depicting Jack Benny in 1991, as part of a booklet of stamps honoring comedians; however, the stamp was issued at the then-current rate of 29 cents.

Jack Benny Middle School in Waukegan, Illinois, is named after the famous comedian. Its motto, "Home of the '39ers," echoes his running claim to be perpetually 39 years old.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = =

The blog author recommends this book on Benny, written by his long-time agent:
Fein, Irving, Jack Benny: An Intimate Biography, Putnam, ISBN 978-0671809171, OCLC 3694842.

A notable anecdote about Jack Benny’s personality and values:

Benny was asked for advice on just how to be as successful as he was.

He replied that for his entire career he kept a note taped to his dressing room mirror, no matter where he went. It read:
"Someone in the audience had to do without to buy a ticket to see you. You had better be that good and deserving."

Tuesday, June 21, 2011

Supernaturalism vs Apathist Agnostics

The Nature of the Supernatural
Posted to a blog on: June 20, 2011 11:13 PM, by Josh Rosenau

Astronomer Phil Plait notes a webcomic addressing testability and the supernatural, and makes an odd endorsement of this position:

there's no such thing as the supernatural.  Either something is natural -- that is, part of the Universe -- or else it doesn't exist.
If you posit some thing that has no perceivable or measurable effect, then it may as well not exist. And as soon as you claim it does have an effect -- it can be seen, heard, recorded, felt -- then it must be in some way testable, and therefore subject to science.
Not quite. The issue with the supernatural is not whether it's part of the universe, but whether it is bound by the same laws as all the other elements of the universe. The bizarre claim about ghosts is that they somehow obey some laws but not others, for no obvious reasons.

Something supernatural could, in principle, interact with the universe sometimes but not at others. If it is operating outside of natural laws, that doesn't obviously preclude it from sometimes interacting with things that do obey those laws, either by its own choice to obey those laws ("186,000 miles per second, it's not just a good idea, it's the law"), or by accident in the course of some random fluctuation of its supernatural nature.

Plait's point, and the webcomic's, is more about the transcendent than the supernatural. The explicitly miraculous could, in principle, exist and interact with nature without being scientifically explainable or otherwise within the scope of science. Of course, if these miracles were common enough, we might detect some lawlike nature to them. If they could be reliably induced through prayer or other actions, that would itself be a law of nature and would help naturalize those supernatural phenomena. That path has been followed before: phenomena like gravity went from being seen as supernatural ("spooky action at a distance") to fully natural and lawlike.

The comic concludes with the thought that, if the supernatural exists but doesn't interact with the world, "then why do we care?" That's a fair question, but not one that serves atheists. It's the reply of an apathist agnostic, who acknowledges that god(s) or other supernatural entities might exist, but holds that they are so uninteresting and irrelevant as to be unworthy of the time and energy so many people invest in either affirming or rejecting their existence.

Monday, June 20, 2011

Summer Starts Tomorrow

Why Summer Begins Tuesday
LiveScience Staff – Mon Jun 20, 2011

The steamy temperatures would make it seem summer had already begun, but according to the astronomical calendar summer officially begins Tuesday.

The summer solstice occurs at 1:16 p.m. EDT (17:16 UTC), when the sun will be as high in the sky as possible, and it will be up a fraction of a second longer than the day prior or the day after. Though it's the longest day of the year in the Northern Hemisphere, the length of the full day, including night, doesn't change, of course.

Here's how it works: Earth is tilted on its axis 23.5 degrees, so it leans one way as it spins around its axis while orbiting the sun. On June 21 this year (some years it's June 20), the North Pole is pointing toward the sun as much as is possible. (The winter solstice occurs when the top half of our planet, everything north of the equator, faces directly away from the sun, leaving the North Pole in complete darkness.)
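The effect of that 23.5-degree tilt on day length can be sketched with the standard sunrise-equation approximation (a simplification that ignores atmospheric refraction and the sun's angular size):

```python
import math

def daylight_hours(latitude_deg, declination_deg):
    """Approximate hours of daylight from the sunrise equation:
    cos(H) = -tan(latitude) * tan(declination)."""
    lat = math.radians(latitude_deg)
    dec = math.radians(declination_deg)
    cos_h = -math.tan(lat) * math.tan(dec)
    cos_h = max(-1.0, min(1.0, cos_h))  # clamp handles polar day / polar night
    half_day_deg = math.degrees(math.acos(cos_h))
    return 2 * half_day_deg / 15.0      # the sky turns 15 degrees per hour

# At the June solstice the sun's declination is about +23.5 degrees.
print(daylight_hours(40.0, 23.5))   # ~14.9 hours at 40 degrees north
print(daylight_hours(40.0, -23.5))  # ~9.1 hours at the December solstice
```

At the equator this gives exactly 12 hours year-round, and above the Arctic Circle the clamp kicks in, yielding 24 hours of daylight at the June solstice.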

Tuesday may mark the sun's peak, but it doesn't typically mark summer's peak heat. That's because the oceans take time to heat up (or cool down). By mid-June the oceans of the Northern Hemisphere are still cool from winter's chill, delaying the peak air temperatures by a month and a half, according to NASA.

Earth is actually farther away from the sun during the summer than it is during the winter months [in the Northern Hemisphere], because our planet's orbit is elliptical, a squished circle of sorts. The difference is about 3 million miles (5 million kilometers), and it makes a difference in radiant heat received by the entire Earth of nearly 7 percent. But the difference is more than made up for by the longer days in the Northern Hemisphere summer with the sun higher in the sky.
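That "nearly 7 percent" figure follows from the inverse-square law applied to the two extreme distances (a back-of-the-envelope sketch using the article's round numbers):

```python
# Sunlight intensity falls off as 1/distance^2, so the flux ratio between
# perihelion (closest approach) and aphelion (farthest) is (d_far / d_near)^2.
MEAN_DIST_MILES = 93e6  # average Earth-sun distance
DELTA_MILES = 3e6       # difference between farthest and closest approach

d_near = MEAN_DIST_MILES - DELTA_MILES / 2
d_far = MEAN_DIST_MILES + DELTA_MILES / 2
flux_ratio = (d_far / d_near) ** 2

print(f"{(flux_ratio - 1) * 100:.1f}% more sunlight at closest approach")  # ~6.7%
```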

If you're a night owl, not to worry: while days are longest in summer, what really shrinks is the period of full darkness. Before sunrise and after sunset, some light gets scattered over the horizon by the atmosphere. This light, called twilight, lasts longest during this time of year.

There are three types of twilight:
  • Civil twilight occurs when the sun is less than 6 degrees below the horizon (roughly the half hour before sunrise or after sunset at mid-latitudes).
  • Nautical twilight occurs when the sun is 6 to 12 degrees below the horizon and the horizon can still be used for navigation.
  • Astronomical twilight occurs when the sun is 12 to 18 degrees below the horizon.
And once the sun is more than 18 degrees below the horizon, the sky is dark enough to view the stars. Some places at high latitude don't experience all flavors of twilight. Saint Petersburg, Russia, is famous for its "white nights," when civil twilight reigns through the summer.
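Those thresholds map naturally onto a small classification function (a sketch; the degree boundaries are the standard ones listed above):

```python
def twilight_phase(sun_elevation_deg):
    """Classify the sky by the sun's elevation angle in degrees
    (negative values mean the sun is below the horizon)."""
    if sun_elevation_deg >= 0:
        return "day"
    depth = -sun_elevation_deg  # degrees below the horizon
    if depth < 6:
        return "civil twilight"
    if depth < 12:
        return "nautical twilight"
    if depth < 18:
        return "astronomical twilight"
    return "night"

print(twilight_phase(-4))   # civil twilight
print(twilight_phase(-20))  # night
```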

Sunday, June 19, 2011

Positive Quiddity: McIlroy Sets Record for US Open

Rory McIlroy Wins US Open at 16 Under Par
Jack Nicklaus Approves
By EDDIE PELLS, AP National Writer

BETHESDA, Md. June 19, 2011 – Rory McIlroy has the U.S. Open trophy. And if he needs any more affirmation, he's also got the seal of approval from Jack Nicklaus.

Nicklaus and McIlroy struck up a friendship over the last couple years when the Northern Irishman went to see The Golden Bear for tips on the game, notably, how to win at the majors.

The Bear, who knows a little something about good golf swings, said the 22-year-old has the kind of swing that can hold up over time. It certainly held up Sunday, with a closing round of 2-under 69 that helped McIlroy break the U.S. Open scoring record that Nicklaus shared with Tiger Woods and two others.

"His rhythm is so beautiful, his tempo, it just stays the same all the time," Nicklaus said in an interview on NBC. "He doesn't try to kill it, doesn't try to do anything unusual with the golf ball. He hits it a little harder at times when he wants to put a little power in it and I think that's fine."

Nicklaus was 22 when he won the first of his 18 majors in 1962 — three months older than McIlroy is now.

Their most recent discussion came last month, while McIlroy was still reeling from his collapse at the Masters, where he took a four-shot lead into the final round but shot 80 and finished 15th. Nicklaus said he didn't really give McIlroy advice, but instead asked if he'd learned anything from his mistakes.

"He said, `Yeah, I think I did,'" Nicklaus said. "I said, `Well, make sure you did, because the next time you get yourself in that position, you've got to remember what you did and what you didn't do and remember what you want to do and go do it.' "

As the week at Congressional progressed, McIlroy never shied away from talking about his mistakes at Augusta. In fact, he embraced them as a learning experience. After taking a big lead going into the weekend, he never looked back.

"You sit down with the most successful player that's ever lived and for him to say that he expects big things from you, that you should embrace the pressure, those are great things to hear from someone like him," McIlroy said. "To be able to come out this week, and after what he said to me, and put a little bit of that into practice so early is a nice feeling."

Even before he had closed out his win, McIlroy was answering questions about becoming a multiple major winner. Padraig Harrington was predicting he might someday eclipse Nicklaus' record.

McIlroy is the youngest U.S. Open champion since Bobby Jones in 1923, and Nicklaus says there's some pressure taken off by winning the first one at such a young age.

"I hope he understands he is a golfer first," Nicklaus said. "He will be and already is a celebrity, but he is a golfer first. Right now he is a very good golfer, but if he wants to be a great golfer, he needs to learn how to deal with it and learn how to handle all the things on the side."

= = = = = = = = = = = = = = = = = = = = = = = = = = = = =

Footnote by the blog author: According to Aristotle, excellence (arete) and a beautiful mind (eunoia) are associated with practical wisdom (phronesis).

Saturday, June 18, 2011

Imagination Affects Perception

Blog author’s translation: What we imagine
filters what we choose to see
= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

June 16, 2011
Divya Menon
Association for Psychological Science

Imagination Can Influence Perception

Imagining something with our mind’s eye is a task we engage in frequently, whether we’re daydreaming, conjuring up the face of a childhood friend, or trying to figure out exactly where we might have parked the car. But how can we tell whether our own mental images are accurate or vivid when we have no direct comparison? That is, how do we come to know and judge the contents of our own minds?

Mental imagery is typically thought to be a private phenomenon, which makes it difficult to test people's metacognition of – or knowledge about – their own mental imagery. But a novel study, to be published in a forthcoming issue of Psychological Science, a journal of the Association for Psychological Science, capitalizes on the visual phenomenon of binocular rivalry as a way to test this kind of metacognition.

The study’s authors, Joel Pearson of the University of New South Wales, Rosanne Rademaker of Maastricht University, and Frank Tong of Vanderbilt University, wanted to find out if people have accurate knowledge about their own imagery performance. Participants were asked to imagine a particular pattern – a green circle with vertical lines or a red circle with horizontal lines – and rate how vivid the circle was for them and the amount of effort it took to imagine the circle.

To test the accuracy of the vividness and effort ratings, participants were presented with a binocular rivalry display so that participants’ left and right eyes were exposed to different patterns. As a result of binocular rivalry, one pattern becomes more dominant, and participants report seeing only this dominant pattern. Pearson and his co-authors theorized that if participants have accurate knowledge about their own mental imagery, then the imagined patterns that participants reported as being most vivid should emerge as the dominant patterns during the binocular rivalry display.

Results of the study confirmed the authors’ suspicions, suggesting that imagined experiences are not merely epiphenomenal – that is, our evaluations of mental imagery bear a direct relationship to our performance on perceptual and cognitive tasks in the real world. The authors used control conditions in order to rule out the influence of other factors, like whether participants might have paid attention to one pattern more than the other or simply chose one pattern more than another. Results from these control conditions indicated that neither attention nor decisional bias could account for the findings from the binocular rivalry condition.

According to Pearson, "our ability to consciously experience the world around us has been dubbed one of the most amazing yet enigmatic processes under scientific investigation today." But, he argues, "if we stop for a moment and think about it, our ability to imagine the world around us in the absence of stimulation from that world is perhaps even more amazing." With mental imagery, we can ‘see’ how things might have been or could be in the future. It is perhaps not surprising, then, that strong mental imagery is associated with creativity.

Mental imagery is also critical when organizing our lives on a day-to-day basis. Being able to imagine objects and scenarios is "one of the fundamental abilities that allows us to successfully think about and plan future events," says Pearson. Mental imagery "allows us to, in a sense, run through a dress rehearsal in our mind’s eye."

It’s clear that mental imagery contributes to our everyday functioning. There are some instances, however, when incredibly vivid mental imagery may not be a good thing, such as in the case of visual hallucinations. According to Pearson, future research on our experiences of mental imagery will not only help to reveal the inner workings of this fundamental ability, but it may also help in research and treatment in cases of hallucination, when mental imagery becomes disruptive.

Friday, June 17, 2011

Major Glitches of Today's Electric Cars

Introduction by the blog author:  Want to buy an electric car?  "Wait until they fix these shortcomings."

= = = = = = = = = = = = = = = = = = = = = = = = = = = = =

7 Reasons Your Electric

Car Won't Go So Far

By Wil Milan, TechNewsDaily Contributor
27 September 2010

The biggest concern buyers have about electric vehicles is how far the battery will take them. For manufacturers, therefore, it's tempting to be optimistic in quoting the vehicle's range. Thus the claims almost always include the qualifier "up to," as in "range up to 100 miles" (or 40 miles, or 200 miles, or whatever the manufacturer is claiming as maximum range).

And yes, it's probably true that at least one time, under optimum conditions, very carefully driven by a factory driver, with a topped-off, brand-new battery pack, that model of electric vehicle did reach that maximum range before the wheels absolutely stopped turning.

And while some manufacturers state that their range numbers are computed by EPA standards, you still shouldn't count on reaching that range in everyday use, perhaps not even coming close. Keep reading to learn why...

EPA Estimates Are Not Realistic

As former Tesla Motors spokesman Darryl Siry has revealed, the current EPA standard for electric vehicle range measurement is very optimistic and manufacturers like Tesla take advantage of that. That has allowed electric-car makers like Tesla to publish "EPA test" range numbers that few will ever get. EPA test rules allow procedures that a real-life driver should never do and the manufacturer may even forbid.

For example, the EPA standard allows the battery to be topped off to its absolute maximum charge (harmful to the battery in the long term, therefore a bad idea) and to run the battery completely dead (a very bad thing because that too reduces the life of the battery). In practice, to maximize the useful life of the battery pack, most manufacturers design their vehicle chargers so they don't top it off to absolute maximum, and the cars themselves are often designed not to allow the battery to be run completely flat in normal operation.

Those factors alone will account for a significant difference between the "EPA tested" mileage claims and what a real-life driver would achieve.

Battery Packs Lose Capacity

Just like the battery in your cell phone, an electric car's battery pack will work best when brand new, but its capacity will decrease significantly over time. Factory range tests will, of course, be done with brand-new batteries, but over the years and miles the battery pack in your car will drop to 90 percent capacity, then 80 percent, 70 percent, and even lower until it's finally replaced.

Weather and Other Conditions Shorten Range

Depending on where you live and where you drive, you may encounter other conditions that reduce your range significantly. One such condition is hot weather: While on a gasoline vehicle heavy use of air conditioning may reduce range by 5 percent or so, on an electric car the A/C may reduce the range by 25 percent or more. In hot-weather urban driving the loss could approach 40 percent.

Ironically, very cold weather will also significantly reduce the range, in part because of the need to run the heater, but even more so because of the lower performance and capacity of batteries when they're very cold. (A heated garage or plug-in battery heater can help to some extent.)

You Might Use the Headlights

At night you'll need to run the car's lights, which causes another significant drain on the batteries. In some locations and some weather conditions (such as fog or rain) you may need to run the car's lights in daytime, which exacerbates the problem.

The result is that when it's hot, cold, dark, foggy, rainy, or otherwise less than optimal, an electric car's range will be significantly shorter than the manufacturer claims, at times drastically shorter.
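These losses compound multiplicatively rather than simply adding up. As a rough sketch only — the individual percentages below are hypothetical, loosely drawn from the figures quoted above, not measured values:

```python
def effective_range(nominal_miles, losses):
    """Apply each fractional range loss multiplicatively to the nominal range."""
    remaining = nominal_miles
    for loss in losses:
        remaining *= (1.0 - loss)
    return remaining

# Hypothetical hot, dark, underinflated day: 30% A/C loss,
# 10% for lights, 5% for low tire pressure.
print(round(effective_range(100, [0.30, 0.10, 0.05])))  # about 60 miles
```

Stacking just three of the factors above cuts a nominal 100-mile range down to roughly 60 miles, before driver caution is even considered.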

Radios and Entertainment Systems

Most people will want to be able to run the car's audio and entertainment systems while they drive, but like everything else they too drain the battery and reduce the range.

Tire Pressures

Most people don't pay much attention to tire pressures, and as a result it's common for cars to drive around with low tire pressures. That reduces the efficiency of any car, but due to an electric car's higher dependence on very low-friction operation, low tire pressure has a disproportionate effect on the range of an electric car vs. its effect on a gas-powered vehicle.

How Drivers Think

Driver psychology shortens effective range. In theory an electric car with a "true" 100-mile range could be operated for 100 miles. But because the actual range varies with so many factors, and because no one wants to be stuck with a dead electric car far from the nearest charging station, prudent drivers always avoid testing the limits. Unlike a gas-powered car, there's no quick fill-up or "gallon of gas" for a dead or near-dead electric car. Some electric cars do hold back some of the battery capacity as an emergency reserve, but once that's gone, a long session at a charging station is the only thing that will bring the car back to life.

Example: The Nissan Leaf requires 20 hours for a full recharge using its built-in power cord and a standard 120-volt, 15-amp outlet. If a 220/240-volt, 30-amp outlet is available, the charge time can be reduced to eight hours. For commercial establishments or custom home installations Nissan makes a special 480-volt, 125-amp charging station that can provide a 30-minute quick-charge to 80% capacity, but it costs US $17,000 and Nissan warns that frequent use of the quick-charger will reduce battery life.
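The relationship between outlet power and charge time in those figures comes down to simple arithmetic. The sketch below is illustrative only: the 24 kWh pack size and the charging efficiency are assumptions, and real-world charge times are further limited by the car's onboard charger, which usually cannot draw an outlet's full power:

```python
def outlet_power_kw(volts, amps):
    """Continuous power available from an outlet, in kilowatts."""
    return volts * amps / 1000.0

def naive_charge_hours(pack_kwh, volts, amps, efficiency=0.85):
    """Lower-bound charge time if the car could draw the outlet's full power."""
    return pack_kwh / (outlet_power_kw(volts, amps) * efficiency)

# A 240 V / 30 A outlet delivers 4x the power of a 120 V / 15 A outlet,
# which is why the quoted charge time drops from 20 hours toward 8.
print(outlet_power_kw(120, 15))  # 1.8 kW
print(outlet_power_kw(240, 30))  # 7.2 kW
```

The naive estimate comes out shorter than Nissan's quoted times, which is consistent with the onboard charger, not the outlet, being the bottleneck.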

The result is that if the car is capable of an "honest" 100 miles most drivers will turn around less than 40 miles from home and plan to be home (or at a known charging station) at well under 100 miles total.


Should those factors – and the reduced range they imply – keep you from buying an electric? No, not necessarily. If your planned uses for an electric car will never tax its range, then you should only be concerned about unplanned situations (could you ever have unexpected side trips from work that lengthen the distance home?) and evaluate how you would deal with those.

Over time the range of electrics is likely to increase and charging stations will likely become more available, though even then fill-ups will still take hours. But in thinking about an electric car purchase, consider what its real range might be for you, under the conditions you might have to face (heat, cold, night, fog, etc.) and evaluate the car accordingly.

Thursday, June 16, 2011

Source of Prolonged Gamma Ray Is Studied

An enormous black hole has torn apart a star similar to our own sun. Normally such an event produces a short burst of gamma-ray radiation; astronomers regard these bursts as signs of a star collapsing at the end of its life.
But these gamma rays have lasted for months and continue even now. Initially detected on March 28th, the radiation from the constellation Draco is unusual. The black hole was apparently dormant until this happened. Joshua Bloom of the University of California at Berkeley figures this kind of intense gamma-ray event may happen once per black hole in a million years.
The gravitational pull of the black hole on this particular star was so enormous that it exerted what is termed a tidal disruption on the star.

Bloom said this event could be used to learn more about how black holes grow. "We still don’t understand how black holes in the universe grow. We think most black holes start off as being no more than the mass of our Sun… How they go from 10 solar masses to a billion solar masses is critical."
-- summarized by the blog author from

Wednesday, June 15, 2011

Sunspots Are Shrinking

Sunspots Are Shrinking -- Yet -- Global
Warming Is Not Being Recalculated

Sunspots are disappearing -- fast!  This hasn't happened for centuries. The reason the Earth isn't a cold, dead rock with no atmosphere is its nearly circular orbit of a star at a distance that keeps most of the water on this planet in a liquid rather than a gaseous or solid state. We may have a couple of decades of global cooling -- but -- the IPCC 2007 study is holy and global warming is a certainty even if it doesn't really get the earth's surface temperature cooking for another century! At least that's what they say at the end of this article from the French Press Agency (AFP):

               = = = = = there’s a problem: IPCC 2007 ACCOUNTED for sunspots = = = = =

"Sunspots: Small dark areas on the Sun. The number of sunspots is higher during periods of high solar activity, and varies in particular with the solar cycle."
                        --Climate Change 2007: Working Group I: The Physical Science Basis (located at: )

                                          = = = = = conclusion = = = = =

There is a major unscientific difficulty here. IPCC 2007 predicted substantial global warming throughout the 21st century. Sunspots were known about and included in the model. Now we know the sunspot projections were wrong. There is no correction of the model forthcoming from IPCC.

Therefore, on this basis alone, we can state definitively that the UN's 2007 global warming consensus is now out of date.
Further, there are overwhelming scientific and statistical problems with IPCC 2007 that have been addressed at the "Liberals – Faith in Global Warming 104" posting of the companion Quiddity blog of December 15, 2010.

Tuesday, June 14, 2011

Theory of a Possible Superconductor

Introduction by the blog author: Superconductors (materials that conduct electricity without any resistance, so no energy is lost as heat) are useful for transmitting electric power to distant locations as well as for driving the powerful magnets used in mag-lev train transportation and many other applications. Existing superconductors all perform at extremely cold temperatures near absolute zero.

A superconductor able to retain its efficient properties at room temperature would revolutionize electric power production worldwide and change the world’s modes of transportation. Hydrogen is predicted to act as a superconductor when compressed into a metal at extreme pressure. The news release below speculates that a high-pressure amalgam of hydrogen with sodium or lithium might provide superconducting qualities at significantly lower pressures than pure hydrogen.
= = = = = = = = = = = = = = = = = = = = = = = = = = = =

Under Pressure, Sodium and Hydrogen Could Undergo a Metamorphosis, Emerging As a Superconductor

June 13, 2011
BUFFALO, N.Y. -- In the search for superconductors, finding ways to compress hydrogen into a metal has been a point of focus ever since scientists predicted many years ago that electricity would flow, uninhibited, through such a material.

Liquid metallic hydrogen is thought to exist in the high-gravity interiors of Jupiter and Saturn. But so far, on Earth, researchers have been unable to use static compression techniques to squeeze hydrogen under high enough pressures to convert it into a metal. Shock-wave methods have been successful, but as experiments with diamond anvil cells have shown, hydrogen remains an insulator even under pressures equivalent to those found in the Earth's core.

To circumvent the problem, a pair of University at Buffalo [UB] chemists has proposed an alternative solution for metallizing hydrogen: Add sodium to hydrogen, they say, and it just might be possible to convert the compound into a superconducting metal under significantly lower pressures.

The research, published June 10 in Physical Review Letters, details the findings of UB Assistant Professor Eva Zurek and UB postdoctoral associate Pio Baettig.

Using an open-source computer program that UB PhD student David Lonie designed, Zurek and Baettig looked for sodium polyhydrides that, under pressure, would be viable superconductor candidates. The program, XtalOpt, is an evolutionary algorithm that incorporates quantum mechanical calculations to determine the most stable geometries or crystal structures of solids.

In analyzing the results, Baettig and Zurek found that NaH9, which contains one sodium atom for every nine hydrogen atoms, is predicted to become metallic at an experimentally achievable pressure of about 250 gigapascals -- about 2.5 million times the Earth's standard atmospheric pressure, but less than the pressure at the Earth's core (about 3.5 million atmospheres).
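As a quick sanity check on the unit conversion quoted above (a simple arithmetic sketch, not part of the published calculation):

```python
# Convert the predicted metallization pressure of NaH9 to atmospheres.
GPA_TO_PA = 1e9      # pascals per gigapascal
ATM_IN_PA = 101_325  # one standard atmosphere, in pascals

pressure_atm = 250 * GPA_TO_PA / ATM_IN_PA
print(round(pressure_atm / 1e6, 2))  # ~2.47 million atmospheres
```

That works out to about 2.47 million atmospheres, matching the article's "about 2.5 million times the Earth's standard atmospheric pressure."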

"It is very basic research," says Zurek, a theoretical chemist. "But if one could potentially metallize hydrogen using the addition of sodium, it could ultimately help us better understand superconductors and lead to new approaches to designing a room-temperature superconductor."

By permitting electricity to travel freely, without resistance, such a superconductor could dramatically improve the efficiency of power transmission technologies.

Zurek, who joined UB in 2009, conducted research at Cornell University as a postdoctoral associate under Roald Hoffmann, a Nobel Prize-winning theoretical chemist whose research interests include the behavior of matter under high pressure.

In October 2009, Zurek co-authored a paper with Hoffman and other colleagues in the Proceedings of the National Academy of Sciences predicting that LiH6 -- a compound containing one lithium atom for every six hydrogen atoms -- could form as a stable metal at a pressure of around 1 million atmospheres.
Neither LiH6 nor NaH9 exists naturally as a stable compound on Earth, but under high pressure their structures are predicted to be stable.

"One of the things that I always like to emphasize is that chemistry is very different under high pressures," Zurek says. "Our chemical intuition is based upon our experience at one atmosphere. Under pressure, elements that do not usually combine on the Earth's surface may mix, or mix in different proportions. The insulator iodine becomes a metal, and sodium becomes insulating. Our aim is to use the results of computational experiments in order to help develop a chemical intuition under pressure, and to predict new materials with unusual properties."

The University at Buffalo is a premier research-intensive public university, a flagship institution in the State University of New York system and its largest and most comprehensive campus. UB's more than 28,000 students pursue their academic interests through more than 300 undergraduate, graduate and professional degree programs. Founded in 1846, the University at Buffalo is a member of the Association of American Universities.

Monday, June 13, 2011

A Fierce Critique of Cloud Computing

A fiery criticism of cloud computing has been posted to the Liticism website. The author of the Liticism website, himself employed in troubleshooting and repairing internet connections, has a stormy and angry series of objections to cloud computing in a piece titled "Cloud Computing Is a Dangerous Proposition." The objections offer significant additional depth to what this Daily Quiddity site has had to say about it recently. Computer owners should pay heed to this particular critic.

He starts by listing data breaches from Healthnet, Dell, and Sony, firms that use cloud computing and have been sending out notices about their problems. He points out that a cloud has two central elements – remote data storage and remote software (software installed within the cloud rather than on the user's machine or in-house computers -- "software as a service," also called "SaaS").

The Liticism article goes on to point out that remote data storage isn’t new, and that the glut of it around 2008 created "cloud computing" as a marketing ploy to do something with it. The article doubts that off-site data storage is safer. The data center may be climate controlled, monitored, equipped with fire-suppressant systems and using data-grade hardware, but there is no standard for reliability, no government or third party regulation and no accountability. "That data should be kept internal, off the Internet, and within arm’s reach," the article says. Remotely, off-site and probably out-of-state, "…it’s not a matter of if you’ll have a breach, it is when…"

The article also scoffs at software as a service (SaaS). It asks the reader to remember how slowly a large Excel spreadsheet loads on a PC. Now imagine that both the spreadsheet data and the Excel program are off-site and have to be loaded. The writer’s conjecture is that the money "saved" by not buying or upgrading software is blown on idle employees waiting for the cloud to load data into a cloudy application.

The writer’s experience is that internet connections are fickle and unreliable, and this technology is the heart and soul of cloud computing. So the iffy internet can, at any moment, for an unpredictable period of time, wind up costing employee productivity and customers. "To move important company data offsite is the biggest mistake a business can make…Furthermore, how can you exact the same confidentiality and privacy on an outsourced company thousands of miles away?"

The conclusion of the article warns investors "who have fallen for this shiny nickel" that cloud computing "is the biggest farce of the decade."

--the above review is a summarization by the Daily Quiddity blog author of this link:

Sunday, June 12, 2011

Tiny Rockets Race Through Blood Samples

Summary and Introduction
by the blog author

Gold and platinum microtubes use oxygen from hydrogen peroxide to zoom around in a blood sample. A special coating on the outer surface catches cancer cells rather than normal cells. The microtubes are then retrieved and the attached cells analyzed. It’s quicker and more definitive than separating the cancerous cells from the rest of the blood through chemical and biological processes.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =
Tiny Rockets Zoom Into Blood
to Capture Cancer Cells
By Charles Q. Choi, InnovationNewsDaily Contributor
April 29, 2011

Blasting through blood samples like a missile roaring through the sky, a new breed of microscopic rocket searches out cancer, propelled by chemical boosters. Designed by scientists to speed up the identification of chemicals, these tiny rockets take the power of the space age down to the nanoscale.

The micro-rockets consist of tapered metal tubes 60 microns long, or about two-thirds the width of a human hair. Their interiors are lined with platinum, a metal that breaks down hydrogen peroxide, a common ingredient of bleach. This generates oxygen bubbles that naturally rush out each tube's wider end – jet propulsion for speeds of roughly 85 microns per second.

"This is the first time we showed we can propel micro-rockets in biological media like blood," said Joseph Wang, a nanoengineer at the University of California, San Diego, "In the future, we can think of putting other receptor molecules on these micro-rockets to capture other targets of interest."

An iron layer sandwiched between the platinum interior and gold exteriors of the roughly 5-micron-wide tubes lets the researchers steer the rockets using magnetic fields.

The outer surfaces of these microtubes are platforms onto which the scientists can chemically attach antibodies, the molecules our immune system employs to latch onto and tag invaders. The researchers use these antibodies to bind the microtubes onto specific targets, such as a protein known as carcinoembryonic antigen, which is found in overly large amounts in approximately 95 percent of colorectal, gastric and pancreatic cancers.

In tests with human blood serum where scientists added hydrogen peroxide, pancreatic cancer cells and noncancerous immature white blood cells, the micro-rockets latched onto their cancerous targets 70 percent of the time, zipping around even while carrying their prizes with only a slight drop in speed. The microtubes did not bind the same way to noncancerous cells when rubbed against them.

These micro-rockets could serve as a novel way to capture relatively rare circulating tumor cells for cancer diagnosis tests without the substantial preparation of blood samples needed for other techniques that exist for isolating and counting these cells.

This article was provided by InnovationNewsDaily, a sister site of TechNewsDaily.

[this link has some nice graphics and microscopic photographs of the rockets in action]