Thursday, July 31, 2014

Master Clown "Red" Skelton

Richard Bernard "Red" Skelton (July 18, 1913 – September 17, 1997) was an American entertainer best known for his national radio and television acts between 1937 and 1971 and as host of the television program The Red Skelton Show.  Skelton, who has stars on the Hollywood Walk of Fame for his work in radio and television, also appeared in vaudeville, films, nightclubs, and casinos, all while he pursued an entirely separate career as an artist, especially as a painter of clowns.

Skelton began developing his comedic and pantomime skills from the age of 10, when he became part of a traveling medicine show. He then spent time on a showboat, worked the burlesque circuit, and entered vaudeville in 1934. The Doughnut Dunkers, a pantomime sketch of how different people ate doughnuts written by Skelton and his wife, launched a career for him in vaudeville, in radio and in films. Skelton's radio career began in 1937 with a guest appearance on The Fleischmann's Yeast Hour, which led to his becoming the host of Avalon Time in 1938. He became the host of The Raleigh Cigarettes Program in 1941, where many of his comedy characters were created, and he had a regularly scheduled radio program until 1957. Skelton made his film debut in 1938 alongside Ginger Rogers and Douglas Fairbanks, Jr. in Alfred Santell's Having Wonderful Time, and he went on to appear in numerous musical and comedy films throughout the 1940s and early 1950s, such as Whistling in the Dark (1941), Lady Be Good (1941), I Dood It (1943), Ziegfeld Follies (1946), Texas Carnival (1951) and The Clown (1953). However, he was never comfortable in films and was most eager to work in television, even when the medium was in its infancy. The Red Skelton Show made its television premiere on September 30, 1951, on NBC. By 1954, Skelton's program had moved to CBS, where it was expanded to one hour and renamed The Red Skelton Hour in 1962. Despite high ratings, his television show was canceled by CBS in 1970 as the network believed more youth-oriented programs were needed to attract younger viewers and their spending power. Skelton moved his program to NBC, where he completed his last year with a regularly scheduled television show in 1971. After he no longer had a television program, Skelton spent his time making up to 125 personal appearances a year and working on his artwork.

Skelton's artwork of clowns remained a hobby until 1964, when his wife, Georgia, convinced him to have a showing of his work at the Sands Hotel in Las Vegas while he was performing there. Sales of his originals were successful and Skelton also sold prints and lithographs of them, earning $2.5 million yearly on lithograph sales. At the time of his death, his art dealer believed that Skelton had earned more money through his paintings than from his television work.

Skelton believed his life's work was to make people laugh and wanted to be known as a clown, because he defined it as being able to do everything. He had a 70-year career as a performer and entertained three generations of Americans during this time. Many of Skelton's personal and professional effects, including prints of his artwork, were donated to Vincennes University by his widow, where they are part of the Red Skelton Museum of American Comedy.

Legacy and Tributes

Skelton preferred to be described as a clown rather than a comic: "A comedian goes out and hits people right on. A clown uses pathos. He can be funny, then turn right around and reach people and touch them with what life is like."  "I just want to be known as a clown," he said, "because to me that's the height of my profession. It means you can do everything—sing, dance and above all, make people laugh."  His purpose in life, he believed, was to make people laugh.

In Groucho and Me, Groucho Marx called Skelton "the most unacclaimed clown in show business", and "the logical successor to [Charlie] Chaplin", largely because of his ability to play a multitude of characters with minimal use of dialogue and props. "With one prop, a soft battered hat," Groucho wrote, describing a performance he had witnessed, "he successfully converted himself into an idiot boy, a peevish old lady, a teetering-tottering drunk, an overstuffed clubwoman, a tramp, and any other character that seemed to suit his fancy. No grotesque make-up, no funny clothes, just Red." He added that Skelton also "plays a dramatic scene about as effectively as any of the dramatic actors."  In late 1965 ventriloquist Edgar Bergen, reminiscing about the entertainment business, singled out Skelton for high praise. "It's all so very different today. The whole business of comedy has changed — from 15 minutes of quality to quantity. We had a lot of very funny people around, from Charley Chase to Charlie Chaplin and Laurel and Hardy.  The last one of that breed is Red Skelton." Harry Cohn of Columbia Pictures also praised Skelton, saying, "He's a clown in the old tradition. He doesn't need punch lines. He's got heart."

Skelton and Marcel Marceau shared a long friendship and admiration of each other's work. Marceau appeared on Skelton's CBS television show three times, including one turn as the host in 1961 as Skelton recovered from surgery.  He was also a guest on the three Funny Faces specials that Skelton produced for HBO.  In a TV Guide interview after Skelton's death, Marceau said, "Red, you are eternal for me and the millions of people you made laugh and cry. May God bless you forever, my great and precious companion. I will never forget that silent world we created together."  CBS issued the following statement upon his death: "Red's audience had no age limits. He was the consummate family entertainer—a winsome clown, a storyteller without peer, a superb mime, a singer and a dancer."

The Red Skelton Performing Arts Center was dedicated in February 2006 on the campus of Vincennes University, one block from the home in Vincennes where Skelton was born. The building includes an 850-seat theater, classrooms, rehearsal rooms, and dressing rooms. Its grand foyer is a gallery for Skelton's paintings, statues, and film posters.  The theater hosts theatrical and musical productions by Vincennes University, as well as special events, convocations and conventions.  The adjacent Red Skelton Museum of American Comedy opened on July 18, 2013, on what would have been Skelton's 100th birthday.  It houses his personal and professional materials, which he had collected since the age of ten, in accordance with his wishes that they be made available in his hometown for the public's enjoyment. Skelton's widow, Lothian, noted that he expressed no interest in any sort of Hollywood memorial.  The museum is funded jointly by the Red Skelton Museum Foundation and the Indiana Historical Society.  Other Foundation projects include a fund that provides new clothes to Vincennes children from low-income families.  The Foundation also purchased Skelton's birthplace and continues to finance its restoration.  Restoration continues as well at the historic Vincennes Pantheon Theatre, where Skelton performed during his youth.

The town of Vincennes has held an annual Red Skelton Festival since 2005. A "Parade of a Thousand Clowns", billed as the largest clown parade in the Midwest, is presented, followed by family-oriented activities and live music performances.

Wednesday, July 30, 2014

New, Accurate Cancer Diagnosis

Potential Universal Blood Test for Cancer
University of Bradford, July 28, 2014

Researchers from the University of Bradford have devised a simple blood test that can be used to diagnose whether people have cancer or not.
The test will enable doctors to rule out cancer in patients presenting with certain symptoms, saving time and preventing costly and unnecessary invasive procedures, such as colonoscopies and biopsies, from being carried out. Alternatively, it could be a useful aid for investigating patients who are suspected of having a cancer that is currently hard to diagnose.
Early results have shown the method gives a high degree of accuracy diagnosing cancer and pre-cancerous conditions from the blood of patients with melanoma, colon cancer and lung cancer.  The research is published online in FASEB Journal, the US journal of the Federation of American Societies for Experimental Biology.

The Lymphocyte Genome Sensitivity (LGS) test looks at white blood cells and measures the damage caused to their DNA when subjected to different intensities of ultraviolet light (UVA), which is known to damage DNA. The results of the empirical study show a clear distinction between the damage to the white blood cells of patients with cancer, patients with pre-cancerous conditions, and healthy patients.

Professor Diana Anderson, from the University's School of Life Sciences, led the research. She said: “White blood cells are part of the body's natural defence system. We know that they are under stress when they are fighting cancer or other diseases, so I wondered whether anything measurable could be seen if we put them under further stress with UVA light. We found that people with cancer have DNA which is more easily damaged by ultraviolet light than other people, so the test shows the sensitivity to damage of all the DNA – the genome – in a cell.”

The study looked at blood samples taken from 208 individuals. Ninety-four healthy individuals were recruited from staff and students at the University of Bradford and 114 blood samples were collected from patients referred to specialist clinics within Bradford Royal Infirmary prior to diagnosis and treatment. The samples were coded, anonymised, randomised and then exposed to UVA light through five different depths of agar. 

The UVA damage was observed in the form of pieces of DNA being pulled in an electric field towards the positive end of the field, causing a comet-like tail. In the LGS test, the longer the tail the more DNA damage, and the measurements correlated to those patients who were ultimately diagnosed with cancer (58), those with pre-cancerous conditions (56) and those who were healthy (94).  
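The comet-tail measurement described above is the basis of the standard comet assay. As an illustrative sketch only (not the study's actual analysis pipeline), two common damage metrics, percent tail DNA and tail moment, can be computed from fluorescence readings like this; the function names and all intensity values are hypothetical:

```python
# Illustrative comet-assay scoring. In the assay, fragmented DNA migrates
# toward the anode, forming a "tail"; the longer and brighter the tail,
# the more DNA damage. Numbers below are made up for demonstration.

def percent_tail_dna(head_intensity, tail_intensity):
    """Percentage of total DNA fluorescence found in the comet tail."""
    total = head_intensity + tail_intensity
    return 100.0 * tail_intensity / total

def tail_moment(tail_length_um, head_intensity, tail_intensity):
    """Tail moment = tail length x fraction of DNA in the tail."""
    return tail_length_um * percent_tail_dna(head_intensity, tail_intensity) / 100.0

# Hypothetical readings: a cell with little damage vs. one with heavy damage
healthy = tail_moment(tail_length_um=5.0, head_intensity=950, tail_intensity=50)
damaged = tail_moment(tail_length_um=40.0, head_intensity=400, tail_intensity=600)

print(f"low-damage tail moment:  {healthy:.2f}")   # prints 0.25
print(f"high-damage tail moment: {damaged:.2f}")   # prints 24.00
```

In the study's terms, samples from cancer patients would cluster toward the high end of such a damage metric, pre-cancerous samples in between, and healthy samples at the low end.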

“These are early results completed on three different types of cancer and we accept that more research needs to be done; but these results so far are remarkable,” said Professor Anderson. "Whilst the numbers of people we tested are, in epidemiological terms, quite small, in molecular epidemiological terms, the results are powerful. We’ve identified significant differences between the healthy volunteers, suspected cancer patients and confirmed cancer patients of mixed ages at a statistically significant level of P<0.001. This means that the possibility of these results happening by chance is 1 in 1000. We believe that this confirms the test’s potential as a diagnostic tool.”

Professor Anderson believes that if the LGS proves to be a useful cancer diagnostic test, it would be a highly valuable addition to the more traditional investigative procedures for detecting cancer.

A clinical trial is currently underway at Bradford Royal Infirmary. This will investigate the effectiveness of the LGS test in correctly predicting which patients referred by their GPs with suspected colorectal cancer would, or would not, benefit from a colonoscopy – currently the preferred investigation method.  

The University of Bradford has filed patents for the technology and a spin-out company, Oncascan, has been established to commercialise the research.

Tuesday, July 29, 2014

Attrition Warfare

Attrition warfare is a military strategy in which a belligerent side attempts to win a war by wearing down the enemy to the point of collapse through continuous losses in personnel and materiel. The war will usually be won by the side with the greater such resources.  The word attrition comes from the Latin root atterere, "to rub against", echoing the "grinding down" of the opponent's forces in attrition warfare.

Strategic Considerations

Military theorists and strategists like Sun Tzu have viewed attrition warfare as something to be avoided. In the sense that attrition warfare represents an attempt to grind down an opponent through superior numbers, it represents the opposite of the usual principles of war, where one attempts to achieve decisive victories by using minimal necessary resources and in a minimal amount of time, through maneuver, concentration of force, surprise, and the like.

On the other hand, a side which perceives itself to be at a marked disadvantage in maneuver warfare or unit tactics may deliberately seek out attrition warfare to neutralize its opponent's advantages. If the sides are nearly evenly matched, the outcome of a war of attrition is likely to be a Pyrrhic victory.

The difference between a war of attrition and other forms of war is somewhat artificial, since war always contains an element of attrition. However, one can be said to pursue a strategy of attrition when one's main goal is to inflict losses on the opponent that gradually mount to unacceptable or unsustainable levels, while limiting one's own losses to acceptable and sustainable levels. This stands in contrast to other main goals, such as the conquest of some resource or territory, or an attempt to cause the enemy great losses in a single stroke (e.g., by encirclement and capture).

Historically, attritional methods are tried when other methods have failed or are obviously not feasible. Typically, when attritional methods have worn down the enemy sufficiently to make other methods feasible, attritional methods are abandoned in favor of other strategies. In World War I, improvements in firepower but not communications and mobility forced military commanders to rely on attrition, with terrible loss of life.

Attritional methods are in themselves usually sufficient to cause a nation to give up a non-vital ambition, but other methods are generally necessary to achieve unconditional surrender.


It is often argued that the best-known example of attrition warfare was during World War I on the Western Front.  Both military forces found themselves in static defensive positions in trenches running from Switzerland to the English Channel. For years, without any opportunity for maneuvers, the only way the commanders thought they could defeat the enemy was to repeatedly attack head on, to grind the other down.

One of the most enduring examples of attrition warfare on the Western Front is the Battle of Verdun.  Erich von Falkenhayn later claimed that his tactics at Verdun were designed not to take the city, but rather to destroy the French Army in its defense. In practice, the German offensive was intended to advance as far as possible, with no obvious design to minimize German casualties or maximize French ones. Falkenhayn is described as wanting to "bleed France white", and thus attrition tactics were employed in the battle.

Attritional warfare in World War I has been shown by historians such as Hew Strachan to have been used as a post hoc excuse for failed offensives. The decision to launch a war of attrition at Verdun was also used to mask Falkenhayn's original tactical failure.

Attrition to the enemy was easy to assert and difficult to refute, and thus may have been a convenient face-saving exercise in the wake of many indecisive battles. It is in many cases hard to see the logic of warfare by attrition because of the obvious uncertainty of the level of damage to the enemy, and of the damage that the attacking force may sustain to its own limited and expensive resources, while trying to achieve that damage.

That is not to say that a general will not be prepared to sustain high casualties while trying to reach an objective. An example in which one side used attrition warfare to neutralize the other side's advantage in maneuverability and unit tactics occurred during the latter part of the American Civil War when Ulysses S. Grant pushed the Confederate Army continually, in spite of losses, confident that the Union’s supplies and manpower would overwhelm the Confederacy even if the casualty ratio was unfavorable; this indeed proved to be the case.

Monday, July 28, 2014

Stanford Improves Lithium Anodes

Stanford Team Achieves 'Holy Grail' of Battery Design

Stanford School of Engineering, July 28, 2014

Engineers use carbon nanospheres to protect lithium from the reactive and expansive problems that have restricted its use as an anode
Engineers across the globe have been racing to design smaller, cheaper and more efficient rechargeable batteries to meet the power storage needs of everything from handheld gadgets to electric cars.

In a paper published today in the journal Nature Nanotechnology, researchers at Stanford University report that they have taken a big step toward accomplishing what battery designers have been trying to do for decades – design a pure lithium anode.

All batteries have three basic components: an electrolyte that carries charged ions between the electrodes, an anode that discharges electrons into the external circuit, and a cathode that receives them.

Today, we say we have lithium batteries, but that is only partly true. What we have are lithium ion batteries. The lithium is in the electrolyte, but not in the anode. An anode of pure lithium would be a huge boost to battery efficiency.

"Of all the materials that one might use in an anode, lithium has the greatest potential. Some call it the Holy Grail," said Yi Cui, a professor of Material Science and Engineering and leader of the research team. "It is very lightweight and it has the highest energy density. You get more power per volume and weight, leading to lighter, smaller batteries with more power."

But engineers have long tried and failed to reach this Holy Grail.

"Lithium has major challenges that have made its use in anodes difficult. Many engineers had given up the search, but we found a way to protect the lithium from the problems that have plagued it for so long," said Guangyuan Zheng, a doctoral candidate in Cui's lab and first author of the paper.

In addition to Zheng, the research team includes Steven Chu, the former U.S. Secretary of Energy and Nobel Laureate who recently resumed his professorship at Stanford.

"In practical terms, if we can improve the capacity of batteries to, say, four times today's, that would be exciting. You might be able to have a cell phone with double or triple the battery life, or an electric car with a range of 300 miles that costs only $25,000—competitive with an internal combustion engine getting 40 mpg," Chu said.

The engineering challenge

In the paper, the authors explain how they are overcoming the problems posed by lithium.

Most lithium ion batteries, like those you might find in your smart phone or hybrid car, work similarly. The key components include an anode, the negative pole from which electrons flow out and into a power-hungry device, and the cathode, where the electrons re-enter the battery once they have traveled through the circuit. Separating them is an electrolyte, a solid or liquid loaded with positively charged lithium ions that travel between the anode and cathode.

During charging, the positively charged lithium ions in the electrolyte are attracted to the negatively charged anode and the lithium accumulates on the anode. Today, the anode in a lithium ion battery is actually made of graphite or silicon.

Engineers would like to use lithium for the anode, but so far they have been unable to do so. That's because the lithium ions expand as they gather on the anode during charging.

All anode materials, including graphite and silicon, expand somewhat during charging, but not like lithium. Researchers say that lithium's expansion during charging is "virtually infinite" relative to the other materials. Its expansion is also uneven, causing pits and cracks to form in the outer surface, like paint on the exterior of a balloon that is being inflated.

The resulting fissures on the surface of the anode allow the precious lithium ions to escape, forming hair-like or mossy growths, called dendrites. Dendrites, in turn, short circuit the battery and shorten its life.

Preventing this buildup is the first challenge of using lithium for the battery's anode.

The second engineering challenge is that a lithium anode is highly chemically reactive with the electrolyte. It uses up the electrolyte and reduces battery life.

An additional problem is that the anode and electrolyte produce heat when they come into contact. Lithium batteries, including those in use today, can overheat to the point of fire, or even explosion, and are, therefore, a serious safety concern. The recent battery fires in Tesla cars and on Boeing's Dreamliner are prominent examples of the challenges of lithium ion batteries.

Building the nanospheres

To solve these problems, the Stanford researchers built a protective layer of interconnected carbon domes on top of their lithium anode. This layer is what the team has called nanospheres.

The Stanford team's nanosphere layer resembles a honeycomb: it creates a flexible, uniform and non-reactive film that protects the unstable lithium from the drawbacks that have made it such a challenge. The carbon nanosphere wall is just 20 nanometers thick. It would take some 5,000 layers stacked one atop another to equal the width of a single human hair.
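The scale comparison is easy to check. Assuming a human hair width of roughly 100 micrometers, a common ballpark figure (real hairs vary considerably):

```python
# How many 20 nm carbon nanosphere layers span one hair width?
layer_nm = 20                # thickness of one nanosphere wall, from the text
hair_nm = 100 * 1000         # ~100 micrometers, expressed in nanometers

print(hair_nm // layer_nm)   # prints 5000
```

This matches the "some 5,000 layers" figure quoted above.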

"The ideal protective layer for a lithium metal anode needs to be chemically stable to protect against the chemical reactions with the electrolyte and mechanically strong to withstand the expansion of the lithium during charge," Cui said.

The Stanford nanosphere layer is just that. It is made of amorphous carbon, which is chemically stable, yet strong and flexible so as to move freely up and down with the lithium as it expands and contracts during the battery's normal charge-discharge cycle.

Ideal within reach

In technical terms, the nanospheres improve the coulombic efficiency of the battery—a ratio of the amount of lithium that can be extracted from the anode when the battery is in use compared to the amount put in during charging. A single round of this give-and-take process is called a cycle.

Generally, to be commercially viable, a battery must have a coulombic efficiency of 99.9 percent or more, ideally over as many cycles as possible. Previous anodes of unprotected lithium metal achieved approximately 96 percent efficiency, which dropped to less than 50 percent in just 100 cycles—not nearly good enough. The Stanford team's new lithium metal anode achieves 99 percent efficiency even at 150 cycles.

"The difference between 99 percent and 96 percent, in battery terms, is huge. So, while we're not quite to that 99.9 percent threshold, where we need to be, we're close and this is a significant improvement over any previous design," Cui said. "With some additional engineering and new electrolytes, we believe we can realize a practical and stable lithium metal anode that could power the next generation of rechargeable batteries."
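A rough way to see why a few percentage points of coulombic efficiency matter so much: if a fixed fraction of the cyclable lithium is lost on every cycle, the lithium remaining after n cycles falls off roughly as CE raised to the nth power. This is a deliberate simplification (real cells degrade in more complicated ways, and it is not the paper's model), but it shows the compounding effect:

```python
# Compounding effect of per-cycle lithium loss (simplified model).
# remaining fraction after n cycles ~= CE ** n

def lithium_remaining(ce, cycles):
    """Fraction of cyclable lithium left if a (1 - ce) share is lost each cycle."""
    return ce ** cycles

for ce in (0.96, 0.99, 0.999):
    left = lithium_remaining(ce, 100)
    print(f"CE {ce:.1%}: about {left:.1%} of the lithium left after 100 cycles")
```

Under this toy model, a 96% cell retains under 2% of its lithium after 100 cycles, a 99% cell retains over a third, and a 99.9% cell retains about 90%, which is why the commercial threshold sits near 99.9%.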

Saturday, July 26, 2014

Rhodium -- A Rare Element

Rhodium is a chemical element that is a rare, silvery-white, hard, and chemically inert transition metal and a member of the platinum group.  It has the chemical symbol Rh and atomic number 45. It has only one naturally occurring isotope, ¹⁰³Rh. Naturally occurring rhodium is usually found as the free metal or alloyed with similar metals, and rarely as a chemical compound in minerals such as bowieite and rhodplumsite. It is one of the rarest and most valuable precious metals.

Rhodium is a so-called noble metal, resistant to corrosion, found in platinum or nickel ores together with the other members of the platinum group metals.  It was discovered in 1803 by William Hyde Wollaston in one such ore, and named for the rose color of one of its chlorine compounds, produced after it reacted with the powerful acid mixture aqua regia.

The element's major use (approximately 80% of world rhodium production) is as one of the catalysts in the three-way catalytic converters in automobiles. Because rhodium metal is inert against corrosion and most aggressive chemicals, and because of its rarity, rhodium is usually alloyed with platinum or palladium and applied in high-temperature and corrosion-resistant coatings.  White gold is often plated with a thin rhodium layer to improve its appearance, while sterling silver is often rhodium-plated for tarnish resistance.


Rhodium is a hard, silvery, durable metal that has a high reflectance.  Rhodium metal does not normally form an oxide, even when heated.  Oxygen is absorbed from the atmosphere only at the melting point of rhodium, but is released on solidification.  Rhodium has both a higher melting point and lower density than platinum.  It is not attacked by most acids: it is completely insoluble in nitric acid and dissolves slightly in aqua regia.
Mining and Price

The industrial extraction of rhodium is complex, as the metal occurs in ores mixed with other metals such as palladium, silver, platinum, and gold. It is found in platinum ores and extracted as a white inert metal which is very difficult to fuse. Principal sources are located in South Africa; in river sands of the Ural Mountains; and in North America, including the copper-nickel sulfide mining area of the Sudbury, Ontario, region. Although the quantity at Sudbury is very small, the large amount of processed nickel ore makes rhodium recovery cost-effective.

The main exporter of rhodium is South Africa (approximately 80% in 2010) followed by Russia.  The annual world production of this element is 30 tonnes and there are very few rhodium-bearing minerals. The price of rhodium has historically been highly variable. In 2007, rhodium cost approximately eight times more than gold, 450 times more than silver, and 27,250 times more than copper by weight. In 2008, the price briefly rose above $10,000 per ounce ($350,000 per kilogram). The economic slowdown of the 3rd quarter of 2008 pushed rhodium prices sharply back below $1,000 per ounce ($35,000 per kilogram); they rebounded to $2,750 by early 2010 ($97,000 per kilogram) (over twice the gold price), but in late 2013 the price was a bit lower than $1,000 per ounce.
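The rounded per-kilogram figures quoted above line up with ordinary (avoirdupois) ounces of about 28.35 g. Note that precious metals are more commonly quoted per troy ounce (about 31.1 g), which would give somewhat lower per-kilogram numbers; the sketch below simply reproduces the conversion the text appears to use:

```python
# Converting the quoted per-ounce rhodium prices to per-kilogram figures,
# using avoirdupois ounces (28.3495 g), which matches the text's rounding.
OZ_PER_KG = 1000 / 28.3495   # ~35.27 oz per kg

for usd_per_oz in (10_000, 1_000, 2_750):
    print(f"${usd_per_oz:>6}/oz  ≈  ${usd_per_oz * OZ_PER_KG:,.0f}/kg")
```

This reproduces the text's roughly $350,000, $35,000, and $97,000 per kilogram.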


The primary use of this element is in automobiles, as one of the catalysts in the three-way catalytic converter, which changes harmful unburned hydrocarbons, carbon monoxide, and nitrogen oxide emissions from the engine into less noxious gases. Of the 30,000 kg of rhodium consumed worldwide in 2012, some 24,300 kg (81%) went into this application, and 8,060 kg was recovered from it. About 964 kg of rhodium was used in the glass industry, mostly for production of fiberglass and flat-panel glass, and 2,520 kg in the chemical industry.
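A quick cross-check of the consumption figures quoted above (all numbers are taken directly from the text):

```python
# Rhodium demand breakdown for 2012, from the figures in the text.
total_kg = 30_000       # worldwide consumption
autocat_kg = 24_300     # into catalytic converters
recovered_kg = 8_060    # recovered from that application

print(f"catalytic converters: {autocat_kg / total_kg:.0%} of demand")  # prints 81%
print(f"net new rhodium into converters: {autocat_kg - recovered_kg:,} kg")
```

The 81% share matches the text, and the net figure shows how much recycling offsets fresh demand.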


Being a noble metal, pure rhodium is inert. However, chemical complexes of rhodium can be reactive. The median lethal dose (LD50) for rats is 198 mg of rhodium chloride (RhCl₃) per kilogram of body weight. Rhodium compounds can strongly stain human skin.  Like the other noble metals, all of which are too inert to occur as chemical compounds in nature, rhodium has not been found to play, or suspected to play, any biological role. If used in elemental form rather than as compounds, the metal is harmless.

Book Describes Fight Among Psychiatrists

The Book of Woe: The DSM and the Unmaking of Psychiatry by Gary Greenberg, published May 3, 2013

Summary from

For more than two years, author and psychotherapist Gary Greenberg has embedded himself in the war that broke out over the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders—the DSM—the American Psychiatric Association’s compendium of mental illnesses and what Greenberg calls “the book of woe.”

Since its debut in 1952, the book has been frequently revised, and with each revision the “official” view of which psychological problems constitute mental illness has shifted. Homosexuality, for instance, was a mental illness until 1973, and Asperger’s gained recognition in 1994 only to see its status challenged nearly twenty years later. Each revision has created controversy, but the DSM-5, the newest iteration, has shaken psychiatry to its foundations. The APA has taken fire from patients, mental health practitioners, and former members for extending the reach of psychiatry into daily life by encouraging doctors to diagnose more illnesses and prescribe more therapies—often medications whose efficacy is unknown and whose side effects are severe. Critics—including Greenberg—argue that the APA should not have the naming rights to psychological pain or to the hundreds of millions of dollars the organization earns, especially when even the DSM’s staunchest defenders acknowledge that the disorders listed in the book are not real illnesses.
Greenberg’s account of the history behind the DSM, which has grown from pamphlet-sized to encyclopedic since it was first published, and his behind-the-scenes reporting of the deeply flawed process by which the DSM-5 has been revised, is both riveting and disturbing. Anyone who has received a diagnosis of mental disorder, filed a claim with an insurer, or just wondered whether daily troubles qualify as true illness should know how the DSM turns suffering into a commodity, and the APA into its own biggest beneficiary. Invaluable and informative, The Book of Woe is bound to spark intense debate among expert and casual readers alike.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

Editorial Reviews

From Publishers Weekly

Greenberg delivers a detailed and critical account of the history of the Diagnostic and Statistical Manual of Mental Disorders (DSM), the go-to resource for psychiatrists and other professionals in the mental health field. In particular, the author identifies the various problems, shortcomings, and questionable motivations (such as the overwhelming influence of the pharmaceutical industry) that have plagued the DSM up through its most recent edition. Throughout the book, Greenberg draws upon interviews and research within the field of psychiatry and beyond to make a comprehensive case that will have many raising an eyebrow at psychiatric diagnoses. Narrator David Drummond's deep—almost foreboding—voice matches up well with Greenberg's prose. And Drummond's tone, timing, and emphasis help clarify points for listeners and keep them engaged during the more information-heavy sections of text. A Blue Rider hardcover. (May)

Review Comments

“[I]ndustrious and perfervid... Mr. Greenberg [argues] that the [DSM] and its authors, the American Psychiatric Association, wield their power arbitrarily and often unwisely, encouraging the diagnosis of too many bogus mental illnesses in patients (binge eating disorder, for example) and too much medication to treat them....Mr. Greenberg argues that psychiatry needs to become more humble, not more certain and aggressive....Greenberg is a fresher, funnier writer. He paces the psychiatric stage as if he were part George Carlin, part Gregory House.”
—Dwight Garner, The New York Times

“Greenberg’s documentation of the DSM-5 revision process is an essential read for practicing and in-training psychotherapists and psychiatrists and is an important contribution to the history of psychiatry.”
—Library Journal

“The rewriting of the bible of psychiatry shakes the field to its foundations in this savvy, searching exposé.  Deploying wised-up, droll reportage from the trenches of psychiatric policy-making and caustic profiles of the discipline’s luminaries, Greenberg subjects the practices of the mental health industry—his own included—to a withering critique. The result is a compelling insider’s challenge to psychiatry’s scientific pretensions—and a plea to return it to its humanistic roots.”—Publishers Weekly, starred review

“Greenberg is an entertaining guide through the treacheries and valuable instances of the DSM, interviewing members on both sides of the divide and keeping the proceedings conversational even when discussing the manual’s pretensions toward epistemic iteration. He also brings his own practice into [The Book of Woe], with examples of the DSM falling woefully short in capturing the complexity of personality. Bright, humorous and seriously thoroughgoing, Greenberg takes all the DSMs for a spin as revealing as the emperor’s new clothes.”
—Kirkus Reviews

“[A] brilliant look at the making of DSM-5...entertaining, biting and essential...Greenberg builds a splendid and horrifying read....[he] shows us vividly that psychiatry’s biggest problem may be a stubborn reluctance to admit its immaturity.”
—David Dobbs,

“Gary Greenberg is a thoughtful comedian and a cranky philosopher and a humble pest of a reporter, equal parts Woody Allen, Kierkegaard, and Columbo. The Book of Woe is a profound, and profoundly entertaining, riff on malady, power, and truth. This book is for those of us (i.e., all of us) who've ever wondered what it means, and what's at stake, when we try to distinguish the suffering of the ill from the suffering of the human.”
—Gideon Lewis-Kraus, author of A Sense of Direction

“This could be titled The Book of ... Whoa! An eye-popping look at the unnerving, often tawdry politics of psychiatry.”
—Gene Weingarten, two-time Pulitzer Prize winning author of The Fiddler in the Subway

“Bringing the full force of his wit, warmth, and tenacity to this accessible inside account of the latest revision of psychiatry’s diagnostic bible, Gary Greenberg has written a book to rival the importance of its subject. Keenly researched and vividly reported, The Book of Woe is frank, impassioned, on fire for the truth—and best of all, vigorously, beautifully alive to its story’s human stakes.”
—Michelle Orange, author of This Is Running for Your Life

“Gary Greenberg has become the Dante of our psychiatric age, and the DSM-5 is his Inferno. He guides us through the not-so-divine comedy that results when psychiatrists attempt to reduce our hopelessly complex inner worlds to an arbitrary taxonomy that provides a disorder for everybody. Greenberg leads us into depths that Dante never dreamed of. The Book of Woe is a mad chronicle of so-called madness.”
—Errol Morris, Academy Award–winning director, and author of A Wilderness of Error

“In this gripping, devastating account of psychiatric hubris, Gary Greenberg shows that the process of revising the DSM remains as haphazard and chaotic as ever. His meticulous research into the many failures of DSM-5 will spark concern, even alarm, but in doing so will rule out complacency. The Book of Woe deserves a very wide readership.”
—Christopher Lane, author of Shyness: How Normal Behavior Became a Sickness

“Gary Greenberg’s The Book of Woe is about the DSM in the way that Moby-Dick is about a whale—big-time, but only in part. An engaging history of a profession’s virtual bible, The Book of Woe is also a probing consideration of those psychic depths we cannot know and those social realities we pretend not to know, memorably rendered by a seasoned journalist who parses the complexities with a pickpocket’s eye and a mensch’s heart.  If I wanted a therapist, and especially if I wanted to clear my mind of cant, I’d make an appointment with Dr. Greenberg as soon as he could fit me in.”
—Garret Keizer, author of Privacy and The Unwanted Sound of Everything We Want

“The Book of Woe is a brilliant, ballsy excursion into the minefield of modern psychiatry. Greenberg has wit, energy, and a wonderfully skeptical mind. If you want to understand how we think of mental suffering today—and why, and to what effect—read this book.”
—Daniel Smith, author of Monkey Mind

“[Greenberg’s] fascinating history of the Diagnostic and Statistical Manual of Mental Disorders (the DSM) [show]s just how muddled the boundaries of mental health truly are.”
—Chloë Schama, Smithsonian

“Greenberg argues persuasively that the current DSM encourages psychiatrists to reach beyond their competence....I’m impressed by Greenberg’s reporting, his subtlety of thought, his dedication to honesty, and his literacy....a very good book.”
—Benjamin Nugent,

“The process of assembling [DSM-5] has been anything but smooth, as The Book of Woe relates....Greenberg argues—persuasively—that this fifth edition of the DSM arises not out of any new scientific understanding but from one of the periodic crises of psychiatry....invaluable.”
—Laura Miller,

“In The Book of Woe, Greenberg takes the lay reader through a history of the DSM, which is really a history of psychiatry....[a] fascinating and well-researched account.”
—Suzanne Koven, The Boston Globe

“[E]ngaging, radical and generally delectable...Greenberg is a practicing psychotherapist who writes with the insight of a professional and the panache of a literary journalist....[a] brilliant take-down of the psychiatric profession...The Book of Woe offers a lucid and useful history.”
—Julia M. Klein, The Chicago Tribune

“This is a landmark book about a landmark book....Greenberg paints a picture so compelling and bleak that it could easily send the vulnerable reader into therapy....takes the reader deep inside the secretive world of the panels and personalities that have spent years arguing about which disorders and symptoms they would keep and which they would discard in the new DSM.”
—Robert Epstein, Scientific American

Thursday, July 24, 2014

An Unknown X-Ray Signature

Mystery in the Perseus Cluster
By Dr. Tony Phillips

NASA, July 24, 2014:  The Universe is a big place, full of unknowns.  Astronomers using NASA's Chandra X-ray Observatory have just catalogued a new one.

"I couldn't believe my eyes," says Esra Bulbul of the Harvard Center for Astrophysics.  "What we found, at first glance, could not be explained by known physics."

Together with a team of more than a half-dozen colleagues, Bulbul has been using Chandra to explore the Perseus Cluster, a swarm of galaxies approximately 250 million light years from Earth.   Imagine a cloud of gas in which each atom is a whole galaxy—that's a bit like the Perseus cluster.  It is one of the most massive known objects in the Universe.

The cluster itself is immersed in an enormous 'atmosphere' of superheated plasma—and it is there that the mystery resides. 

Bulbul explains:  "The cluster's atmosphere is full of ions such as Fe XXV,  Si XIV, and S XV.  Each one produces a 'bump' or 'line' in the x-ray spectrum, which we can map using Chandra. These spectral lines are at well-known x-ray energies."

Yet in 2012, when Bulbul added together 17 days' worth of Chandra data, a new line popped up where no line should be.

"A line appeared at 3.56 keV (kilo-electron volts) which does not correspond to any known atomic transition," she says.  "It was a great surprise."
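For readers who want to relate that line energy to a wavelength, here is a short sketch (my own illustration, not from the article) applying the standard photon relation E = hc/λ, with the conventional value hc ≈ 1239.84 eV·nm:

```python
# Convert an X-ray emission-line energy to a photon wavelength using
# E = h*c / lambda. The constant below is hc expressed in eV·nm.
HC_EV_NM = 1239.84198  # Planck constant times speed of light, in eV·nm

def energy_kev_to_wavelength_nm(energy_kev: float) -> float:
    """Return the photon wavelength in nanometers for an energy in keV."""
    return HC_EV_NM / (energy_kev * 1000.0)

wavelength = energy_kev_to_wavelength_nm(3.56)
print(f"3.56 keV ~ {wavelength:.3f} nm")  # roughly 0.348 nm
```

At about 0.35 nanometers, the mystery line sits squarely in the soft X-ray band that instruments like Chandra and XMM-Newton are built to observe.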

At first, Bulbul herself did not believe it. "It took a long time to convince myself that this line is neither a detector artifact, nor a known atomic line," she says. "I have done very careful checks.  I have re-analyzed the data; split the data set into different sub groups; and checked the data from four other detectors on board two different observatories. None of these efforts made the line disappear."

In short, it appears to be real.  The reality of the line was further confirmed when Bulbul's team found the same spectral signature in X-ray emissions from 73 other galaxy clusters.  Those data were gathered by Europe's XMM-Newton, a completely independent X-ray telescope.

Moreover, about a week after Bulbul's team posted their paper online, a different group led by Alexey Boyarsky of Leiden University in the Netherlands reported evidence for the same spectral line in XMM-Newton observations of the Andromeda galaxy.  They also confirmed the line in the outskirts of the Perseus cluster.

The spectral line appears not to come from any known type of matter, which shifts suspicion to the unknown: dark matter.

"After we submitted the paper, theoreticians came up with about 60 different dark matter types which could explain this line. Some particle physicists have jokingly called this particle a 'bulbulon'," she laughs.

The menagerie of dark matter candidates that might produce this kind of line include axions, sterile neutrinos, and "moduli dark matter" that may result from the curling up of extra dimensions in string theory.

Solving the mystery could require a whole new observatory.  In 2015, the Japanese space agency is planning to launch an advanced X-ray telescope called "Astro-H." It has a new type of X-ray detector, developed collaboratively by NASA and University of Wisconsin scientists, which will be able to measure the mystery line with more precision than currently possible.

"Maybe then," says Bulbul, "we'll get to the bottom of this."

Wednesday, July 23, 2014

What Ignited the 2003 Iraq War

Iraq and weapons of mass destruction

The fifth president of Iraq, Saddam Hussein, was internationally known for his use of chemical weapons in the 1980s against Iranian and Kurdish civilians during and after the Iran-Iraq War. In the 1980s he pursued an extensive biological weapons program and a nuclear weapons program, though no nuclear bomb was built.

After the 1990-1991 Persian Gulf War, the United Nations located and destroyed large quantities of Iraqi chemical weapons and related equipment and materials throughout the early 1990s, with varying degrees of Iraqi cooperation and obstruction. In response to diminishing Iraqi cooperation with UNSCOM, the United States called for withdrawal of all UN and IAEA inspectors in 1998, resulting in Operation Desert Fox. The United States and the UK asserted that Saddam Hussein still possessed large hidden stockpiles of WMD in 2003, and that he was clandestinely procuring and producing more. UN inspections to resolve outstanding disarmament questions restarted from November 2002 until March 2003, under UN Security Council Resolution 1441, which demanded Saddam give "immediate, unconditional and active cooperation" with UN and IAEA inspections, shortly before his country was attacked.

During the lead-up to war in March 2003, United Nations weapons inspector Hans Blix said that Iraq made significant progress toward resolving open issues of disarmament noting the "proactive" but not always "immediate" cooperation as called for by UN Security Council Resolution 1441. He concluded that it would take “but months” to resolve the key remaining disarmament tasks. The United States asserted this was a breach of Resolution 1441 but failed to convince the UN Security Council to pass a new resolution authorizing the use of force due to lack of evidence.

Despite being unable to get a new resolution authorizing force and citing section 3 of the Joint Resolution passed by the U.S. Congress, President George W. Bush asserted peaceful measures could not disarm Iraq of the weapons he alleged it to have and launched a second Gulf War, despite multiple dissenting opinions and questions of integrity about the underlying intelligence. Later U.S.-led inspections agreed that Iraq had earlier abandoned its WMD programs, but asserted Iraq had an intention to pursue those programs if UN sanctions were ever lifted.

Bush later said that the biggest regret of his presidency was "the intelligence failure" in Iraq, while the Senate Intelligence Committee found in 2008 that his administration "misrepresented the intelligence and the threat from Iraq". A key CIA informant in Iraq admitted that he lied about his allegations, "then watched in shock as it was used to justify the war".

= = = = = = = = = = = = = = = = = = = = = = = = = = = = =

A formative element in the second Iraq war was the effort in mid-1998 by the United States to halt surprise inspection of Iraqi sites which might have contained weapons of mass destruction or facilities for manufacturing such weapons.  The Clinton White House and then-Secretary of State Madeleine Albright leaned on the head UN inspector, Richard Butler of Australia, not to conduct the inspections.  The efforts to halt the surprise inspections were successful.

An excellent article by Barton Gellman of the Washington Post covered this story on page 1 of the Friday, August 14, 1998 Washington Post (continued on page A29).

The Blog Author hypothesizes that Saddam Hussein, having dealt with American bluffing about these inspections before, and not having the goods to implicate himself, assumed that the Bush administration was just posturing with its tough talk in 2003.  In that atmosphere, it took only one lying informant to light the fuse of another war.  But this does not excuse the American actions in August of 1998.  In throwing away the inspections that were central to the cease-fire that ended the first Gulf War, William Clinton and Madeleine Albright guaranteed a second war for whoever succeeded them in power.

Tuesday, July 22, 2014

Climbing Out of Poverty

“We did live in dire poverty. And one of the things that I hated was poverty. Some people hate spiders. Some people hate snakes. I hated poverty. I couldn't stand it. My mother couldn't stand the fact that we were doing poorly in school, and she prayed and she asked God to give her wisdom. What could she do to get her young sons to understand the importance of developing their minds so that they control their own lives? God gave her the wisdom. At least in her opinion. My brother and I didn't think it was that wise. Turn off the TV, let us watch only two or three TV programs during the week. And with all that spare time read two books a piece from the Detroit Public Libraries and submit to her written book reports, which she couldn't read but we didn't know that. I just hated this. My friends were out having a good time. Her friends would criticize her. My mother didn't care. But after a while I actually began to enjoy reading those books. Because we were very poor, but between the covers of those books I could go anywhere. I could be anybody. I could do anything. I began to read about people of great accomplishment. And as I read those stories, I began to see a connecting thread. I began to see that the person who has the most to do with you, and what happens to you in life, is you. You make decisions. You decide how much energy you want to put behind that decision. And I came to understand that I had control of my own destiny. And at that point I didn't hate poverty anymore, because I knew it was only temporary. I knew I could change that. It was incredibly liberating for me. Made all the difference.”
― Ben Carson, M.D.
Retired pediatric neurosurgeon at Johns Hopkins

Monday, July 21, 2014

Twisted Tales from Shakespeare

Introduction by the Blog Author

Richard Armour wrote Twisted Tales from Shakespeare back around 1957.  I found it on my best high school English teacher’s personal bookshelf and started laughing at it uproariously as soon as I opened it.  Armour takes the plot summary or cheat-by-using-Cliff-Notes approach to literature and adds first-rate comedy to the notion of reducing Shakespeare’s best plays to précis-like summaries the length of a short story.  If you are bored and annoyed by Shakespeare, try this humorous approach.  If you love Shakespeare, then, well, you will fully understand the insider approach of Armour and laugh even harder!

Below are three reader reviews from Amazon.  The average of all the reviews on Amazon is five stars.

 = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =
5 Stars
As You Like It
By Bridgette on  November 18, 2002

This book is a must read for all those who've read Shakespeare. One word of caution though: Unless you've read and understood the bard's plays vis a vis Hamlet, Macbeth, Romeo and Juliet etc, you won't understand the humor that comes through in Armour's witty dialogues. It helps if you knew a bit of Shakespeare's life/background as well. Chuckles at every few lines are inevitable, so get ready for some curious glances if you read this in public. Although Armour gives a comical interpretation to Shakespeare's plays, it’s obvious that he's "not come to bury Shakespeare but to honor him." :-)

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

5 Stars
Making Sense of Shakespeare
By Jessica on August 26, 2000

How many times have you burst out laughing when forced to read Shakespeare for an English assignment? And how many times would you actually pick up a book with the dreaded word 'Shakespeare' on it willingly? While staying at a bed and breakfast in Newburyport, I came across the last scene of "Midsummer Night's Dream" being performed in the park. When I returned to the b&b, I was browsing through the shelves of the library when I found this book. I read the previously mentioned play and Romeo and Juliet. Armour makes them easy to understand and also hilarious! He has the rare talent of being able to take the untouchable classics, edit out the unnecessary, and add some unique pizzazz. This book receives my highest recommendation. I have also read his book of light verse, and it is just as enjoyable. This book, as well as his others, is worth the effort to find them out of print.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

5 Stars
Love Twisted Tales
By EJB on January 29, 2005

I read Twisted Tales many years ago and loved it. Richard Armour has made the Shakespeare plays a hilarious read. When my daughter was in the second grade, she mentioned something about Shakespeare and I said I had a very funny book about his plays which I would give to her when she was older. She insisted on reading the book then anyway, loved it and goes back to it frequently. She is now 16 and recently asked for the book again!

I actually came to the Amazon website to look for more books by Richard Armour. We definitely recommend this book to anyone who loves puns, jokes and great humor, all at the expense of the great Shakespeare plays. You can even follow all the plot twists and characters in Midsummer Nights Dream.

A Multiverse of Bubbles?

Is the Universe a Bubble?  Let’s Check
Perimeter Institute for Theoretical Physics, July 16, 2014

Perimeter Associate Faculty member Matthew Johnson and his colleagues are working to bring the multiverse hypothesis, which to some sounds like a fanciful tale, firmly into the realm of testable science.

Never mind the big bang; in the beginning was the vacuum. The vacuum simmered with energy (variously called dark energy, vacuum energy, the inflaton field, or the Higgs field). Like water in a pot, this high energy began to evaporate – bubbles formed.

Each bubble contained another vacuum, whose energy was lower, but still not nothing. This energy drove the bubbles to expand. Inevitably, some bubbles bumped into each other. It’s possible some produced secondary bubbles. Maybe the bubbles were rare and far apart; maybe they were packed close as foam.

But here’s the thing: each of these bubbles was a universe. In this picture, our universe is one bubble in a frothy sea of bubble universes.

That’s the multiverse hypothesis in a bubbly nutshell.

It’s not a bad story. It is, as scientists say, physically motivated – not just made up, but rather arising from what we think we know about cosmic inflation.

Cosmic inflation isn’t universally accepted – most cyclical models of the universe reject the idea. Nevertheless, inflation is a leading theory of the universe’s very early development, and there is some observational evidence to support it.

Inflation holds that in the instant after the big bang, the universe expanded rapidly – so rapidly that an area of space once a nanometer square ended up more than a quarter-billion light years across in just a trillionth of a trillionth of a trillionth of a second. It’s an amazing idea, but it would explain some otherwise puzzling astrophysical observations.
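The expansion quoted above implies a staggering growth factor. As a rough back-of-envelope check (my own arithmetic, not part of the original article), stretching one nanometer to more than a quarter-billion light-years corresponds to a linear expansion factor on the order of 10^33:

```python
# Back-of-envelope check of inflation's quoted expansion:
# a nanometer-sized patch growing to a quarter-billion light-years across.
LIGHT_YEAR_M = 9.4607e15          # meters in one light-year

initial_m = 1e-9                  # one nanometer, in meters
final_m = 0.25e9 * LIGHT_YEAR_M   # a quarter-billion light-years, in meters

expansion_factor = final_m / initial_m
print(f"linear expansion factor ~ {expansion_factor:.1e}")
```

The result, roughly 2 × 10^33, conveys why inflation is described as so violently rapid: that growth is supposed to have happened in a trillionth of a trillionth of a trillionth of a second.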

Inflation is thought to have been driven by an inflaton field – which is vacuum energy by another name. Once you postulate that the inflaton field exists, it’s hard to avoid an “in the beginning was the vacuum” kind of story. This is where the theory of inflation becomes controversial – when it starts to postulate multiple universes.

Proponents of the multiverse theory argue that it’s the next logical step in the inflation story. Detractors argue that it is not physics, but metaphysics – that it is not science because it cannot be tested. After all, physics lives or dies by data that can be gathered and predictions that can be checked.

That’s where Perimeter Associate Faculty member Matthew Johnson (cross-appointed at York University) comes in. Working with a small team that also includes Perimeter Faculty member Luis Lehner, Johnson is working to bring the multiverse hypothesis firmly into the realm of testable science.

“That’s what this research program is all about,” he says. “We’re trying to find out what the testable predictions of this picture would be, and then going out and looking for them.”

Specifically, Johnson has been considering the rare cases in which our bubble universe might collide with another bubble universe. He lays out the steps: “We simulate the whole universe. We start with a multiverse that has two bubbles in it, we collide the bubbles on a computer to figure out what happens, and then we stick a virtual observer in various places and ask what that observer would see from there.”

Simulating the whole universe – or more than one – seems like a tall order, but apparently that’s not so.

“Simulating the universe is easy,” says Johnson. Simulations, he explains, are not accounting for every atom, every star, or every galaxy – in fact, they account for none of them.

“We’re simulating things only on the largest scales,” he says. “All I need is gravity and the stuff that makes these bubbles up. We’re now at the point where if you have a favourite model of the multiverse, I can stick it on a computer and tell you what you should see.”

That’s a small step for a computer simulation program, but a giant leap for the field of multiverse cosmology. By producing testable predictions, the multiverse model has crossed the line between appealing story and real science.

In fact, Johnson says, the program has reached the point where it can rule out certain models of the multiverse: “We’re now able to say that some models predict something that we should be able to see, and since we don’t in fact see it, we can rule those models out.”

For instance, collisions of one bubble universe with another would leave what Johnson calls “a disk on the sky” – a circular bruise in the cosmic microwave background. That the search for such a disk has so far come up empty makes certain collision-filled models less likely.

Meanwhile, the team is at work figuring out what other kinds of evidence a bubble collision might leave behind. It’s the first time, the team writes in their paper, that anyone has produced a direct quantitative set of predictions for the observable signatures of bubble collisions. And though none of those signatures has so far been found, some of them are possible to look for.

The real significance of this work is as a proof of principle: it shows that the multiverse can be testable. In other words, if we are living in a bubble universe, we might actually be able to tell.

     -- Erin Bow

Sunday, July 20, 2014

Domestic Animals and Genetics

Domestication syndrome: White patches,
baby faces and tameness
Neural crest hypothesis could explain why
domestic mammals share characteristic traits
by Cristy Gelling, The GSA Journals, July 14, 2014      

More than 140 years ago, Charles Darwin noticed something peculiar about domesticated mammals. Compared to their wild ancestors, domestic species are more tame, and they also tend to display a suite of other characteristic features, including floppier ears, patches of white fur, and more juvenile faces with smaller jaws. Since Darwin's observations, the explanation for this pattern has proved elusive, but now, in a Perspectives article published in the journal GENETICS, a new hypothesis has been proposed that could explain why breeding for tameness causes changes in such diverse traits.

The underlying link between these features could be the group of embryonic stem cells called the neural crest, suggest the authors. Although this proposal has not yet been tested, it is the first unified hypothesis that connects several components of the "domestication syndrome." It not only applies to mammals like dogs, foxes, pigs, horses, sheep and rabbits, but it may even explain similar changes in domesticated birds and fish.

"Because Darwin made his observations just as the science of genetics was beginning, the domestication syndrome is one of the oldest problems in the field. So it was tremendously exciting when we realized that the neural crest hypothesis neatly ties together this hodge-podge of traits," says Adam Wilkins, from the Humboldt University of Berlin. Wilkins is an editor at GENETICS and one of the paper's authors.

Neural crest cells are formed near the developing spinal cord of early vertebrate embryos. As the embryo matures, the cells migrate to different parts of the body and give rise to many tissue types. These tissues include pigment cells and parts of the skull, jaws, teeth, and ears—as well as the adrenal glands, which are the center of the "fight-or-flight" response. Neural crest cells also indirectly affect brain development.

In the hypothesis proposed by Wilkins and co-authors Richard Wrangham of Harvard University and Tecumseh Fitch of the University of Vienna, domesticated mammals may show impaired development or migration of neural crest cells compared to their wild ancestors.

"When humans bred these animals for tameness, they may have inadvertently selected those with mild neural crest deficits, resulting in smaller or slow-maturing adrenal glands," Wilkins says. "So, these animals were less fearful."

But the neural crest influences more than adrenal glands. Among other effects, neural crest deficits can cause depigmentation in some areas of skin (e.g. white patches), malformed ear cartilage, tooth anomalies, and jaw development changes, all of which are seen in the domestication syndrome. The authors also suggest that the reduced forebrain size of most domestic mammals could be an indirect effect of neural crest changes, because a chemical signal sent by these cells is critical for proper brain development.

"This interesting idea based in developmental biology brings us closer to solving a riddle that's been with us a long time. It provides a unifying hypothesis to test and brings valuable insight into the biology of domestication," says Mark Johnston, Editor-in-Chief of GENETICS.

Tests of the neural crest hypothesis may not be far off, as other scientists are rapidly mapping the genes that have been altered by domestication in the rat, fox, and dog. The hypothesis predicts that some of these genes will influence neural crest cell biology.

If so, we will have a much deeper understanding of the biology underlying a significant evolutionary event, Wilkins says. "Animal domestication was a crucial step in the development of human civilizations. Without these animals, it's hard to imagine that human societies would have thrived in the way they have."

Friday, July 18, 2014

The Opium of the Intellectuals

Introduction by the Blog Author

Yesterday’s entry dealt with biographical details of 20th century French intellectual Raymond Aron.  Today’s entry follows that on with reviews of Aron’s signature book, The Opium of the Intellectuals.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

The Opium of the Intellectuals
By Raymond Aron

Book Description

Raymond Aron's 1955 masterpiece The Opium of the Intellectuals is one of the great works of twentieth-century political reflection. Aron shows how noble ideas can slide into the tyranny of "secular religion" and emphasizes how political thought has the profound responsibility of telling the truth about social and political reality, in all its mundane imperfections and tragic complexities.

Aron explodes the three "myths" of radical thought: the Left, the Revolution, and the Proletariat. Each of these ideas, Aron shows, is ideological, mystifying rather than illuminating. He also provides a fascinating sociology of intellectual life and a powerful critique of historical determinism in the classically restrained prose for which he is justly famous.

For this new edition, prepared by Daniel J. Mahoney and Brian C. Anderson as part of Transaction's ongoing "Aron Project," political scientist Harvey Mansfield provides a luminous introduction that underscores the permanent relevance of Aron's work. The new edition also includes as an appendix "Fanaticism, Prudence, and Faith," a remarkable essay that Aron wrote to defend Opium from its critics and to explain further his view of the proper role of political thinking. The book will be of interest to all students of political theory, history, and sociology.

Editorial Review

"Raymond Aron's analysis of French intellectual culture of the 1940s and 1950s retains its relevance into the 21st century, helping to illuminate the minds of intellectuals so that we can understand their penchant for irrational utopianism. Although the particular controversies have changed somewhat, our modern intellectuals partake of the same opium."

Ideas on Liberty

Customer Reviews
Most Helpful Customer Reviews

200 of 212 people found the following review helpful

5.0 out of 5 stars One of the most profound books of the 20th century! November 23, 1999
By T. Rosati

Aron's book deserves recognition as one of the classic works of 20th century intellectual history.
Written 40 years ago during the battle of ideas between communism and liberal democracy, "The Opium of the Intellectuals" provided profound insight into the mind of the communist intellectual. Aron, a renowned French historian and philosopher, wrote this devastating critique of French radicals (such as Jean-Paul Sartre) during the height of the Cold War. Unlike Albert Camus in his famous book "The Rebel", Aron fires his guns without mercy and exposes these intellectuals' penchant for irrationalism and extremism.
The book's title was derived from Marx's famous quote "Religion is the opium of the people". Marx's belief was that religion diverted people's attention from misery on earth by promising a glorious afterlife. Aron explains that communism served this role for radical intellectuals, who eloquently rationalized and apologized for communism's barbarism because of its promise to deliver utopia on earth. In a nutshell, communism replaced Christianity and other established religions as a new faith, but one grounded in the secular world, not in the heavens. As in all religions, faith is paramount, not reason. Communism's monstrous crimes and wholesale destruction of the individual did not bother these radicals because they believed in the ultimate "means / ends" justification. Since only communism could deliver humanity to the promised land, it was privileged by its goal; thus any crime could be rationalized as part of the twisted path to salvation.
This masterpiece illustrates the dangers of radical intellectuals who take a wild leap into political fantasy for the sake of an idea. Friedrich Hayek, the famous Austrian economist, summarized it best 50 years ago when he stated "The distance between a single-minded idealist and a fanatic is just one step".

 = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

46 of 57 people found the following review helpful

5.0 out of 5 stars Continuing relevance of Aron's classic March 18, 2000

Although Aron's treatise was published many decades ago as a brilliant and unsurpassed analysis of French intellectual culture, it has direct relevance for contemporary fads and foibles of Western cultural and intellectual life. Much of what goes on in the academy today becomes lucid when read within Aron's analytical framework. This book should be read by all who care about the education of their children.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

73 of 103 people found the following review helpful

5.0 out of 5 stars a book for our time August 8, 2003
By Inna Tysoe VINE VOICE

Raymond Aron wrote this brilliant book as a devastating attack on intellectuals who were then much taken with communism, revolution and the proletariat. But it has, as he recognized, obvious relevance to the intellectual climate in the post Cold War world.
Consider, for example, his profound insight that the intellectual "protests against police brutality, the inhuman rhythm of industrial production, the severity of bourgeois courts, the execution of prisoners whose guilt has not been proved beyond doubt.... But as soon as he decides to give his allegiance to a party which is as implacably hostile as he is himself to the established order, we find him forgiving... everything he has hitherto relentlessly denounced. The revolutionary myth bridges the gap between moral intransigence and terrorism."
Hence today's intellectuals no longer defend Stalin; they applaud Arafat. They do not question whether Western democratic values are superior to those of the USSR but whether Western values are superior to those of the Mid-East and Africa. It is no longer fashionable to question whether the way we see communist countries is culturally apt; instead we question whether the way we see terrorists is culturally apt. After all, should we not look for "root causes" of terrorism? That question is still asked. It's just that before 1989 intellectuals asked it about communist terrorism; today they ask it about Mid-East terrorism.
The myth of the revolution has moved to a new continent and intellectuals, the Revolution's devotees, have a new Pope.
What a difference a few years make!

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =
16 of 22 people found the following review helpful

By brian komyathy VINE VOICE


Intellectuals "are always inclined to judge their country and its institutions by comparing present realities with theoretical ideals rather than with other realities." It's evocative of "...the monumental impatience of intellectuals with human complexity and imperfection," and the inability to (quoting from the intro by H. Mansfield) "appreciate the inevitability of partisanship; hence they do not understand politics. But one does not understand politics until one sees that it is a permanent feature of human life, and that it defines human imperfection as the striving for perfection of beings incapable of it." After all, man (as in people-kind) is not God. But for those without any faith in a higher being it is far easier to imagine the feasibility of utopian perfection on earth. Marxism is a 'religion' that supplants religion (& faith in a higher being), and "Marxism is in itself a synthesis of all the principal themes of progressive thought." So it does not strain credulity to see why intellectuals (inclined to "progressive" beliefs & less inclined to be religious---in a traditional sense) imagine anything to be possible. And it explains why many such people excused away the "terrorism erected into a system of government" in the Soviet Union, believing that any "progressive" government must be given the benefit of any possible doubts: such is the faith of progressives, easily inebriated by great collective utopian schemes. People are imperfect, though, are they not? Yet intellectuals (far more prone than workers or oppressed minorities) are "merciless toward the failings of democracies but ready to tolerate the worst crimes as long as they are committed in the name of the proper doctrines." It's an inclination that betrays a penchant for aristocratic values and contempt for average people.
A little historical context: After the fall of the aristocratic French monarchy, "the revolutionary fervour...split into two separate channels, nationalist and socialist," but "the nationalist ideology is nonetheless condemned in Western Europe." After all, how can European 'powers' "get excited about the temporal grandeur of a collectivity which is incapable of manufacturing its own arms," or unwilling to do so? It is much more economical to free-ride off 'nationalist' America, bemoan such a self-imposed predicament, and channel their frustrations into socialist collectivity. Progressive intellectuals in America (who see the proverbial glass as more than half-empty) are no different and "are more pained than simpler mortals by the hegemony of the United States." To quote General Wesley Clark, of all people (though many have said this): "people have an instinctive need to feel a part of something greater than themselves." Citizenship in "a second-class nation state [i.e., any European state] is not an adequate framework for full human expression," Aron posits. Hence the increased level of anti-Americanism since the defeat of the USSR (a campaign to which Europe was very much an important contributor) & European intellectual elites' frantic drive to federally collectivise their continent despite their citizens' mixed inclinations on the matter. So much for the here-and-now ideals of democracy when judged against the 'limitless' possibilities of the future.
Fair warning PS: This very worthy book, like many philosophical works, is heavy reading at times. An example of the author's writing style: "A pseudo-unity is obtained by subordinating the specific meaning of each spiritual universe to the social function which is assigned to it, by setting up equivocal or false propositions as the basis of a doctrine which is alleged to be at once scientific and philosophical." Cheers!