Saturday, January 31, 2015

Mixed Forests

Mixed forests are a temperate and humid biome. The typical structure of these forests includes four layers. The uppermost layer is the canopy, composed of tall mature trees ranging from 30 to 61 m (100 to 200 ft) high. Below the canopy is the three-layered, shade-tolerant understory, which is roughly 9 to 15 m (30 to 50 ft) shorter than the canopy. The top layer of the understory is the sub-canopy, composed of smaller mature trees, saplings, and suppressed juvenile canopy trees awaiting an opening in the canopy. Below the sub-canopy is the shrub layer, composed of low-growing woody plants. Typically, the lowest-growing (and most diverse) layer is the ground cover, or herbaceous layer.


In the Northern Hemisphere, characteristic dominant broadleaf trees in this biome include oaks, beeches, maples, and birches. The term "mixed forest" comes from the inclusion of coniferous trees as a canopy component of these forests. Typical coniferous trees include pines, firs, and spruces. In some areas of this biome the conifers may be more important canopy species than the broadleaves. In the Southern Hemisphere, endemic genera such as Nothofagus and Eucalyptus occupy this biome.


Temperate broadleaf and mixed forests occur in areas with distinct warm and cool seasons, which give them a moderate annual average temperature of 3 to 15.6 °C (37 to 60 °F). These forests occur in relatively warm and rainy climates, sometimes also with a distinct dry season. A dry season occurs in the winter in East Asia and in the summer on the wet fringe of the Mediterranean climate zones. Other areas have a fairly even distribution of rainfall; annual rainfall is typically over 600 mm (24 in) and often over 1,500 mm (59 in). Temperatures are typically moderate except in parts of Asia such as Ussuriland, where temperate forests can occur despite very harsh conditions and very cold winters.

Friday, January 30, 2015

Parallel Computing -- the Basics

Parallel computing is a form of computation in which many calculations are carried out simultaneously, operating on the principle that large problems can often be divided into smaller ones, which are then solved concurrently ("in parallel"). There are several different forms of parallel computing: bit-level, instruction-level, data, and task parallelism. Parallelism has been employed for many years, mainly in high-performance computing, but interest in it has grown lately due to the physical constraints preventing frequency scaling. As power consumption (and consequently heat generation) by computers has become a concern in recent years, parallel computing has become the dominant paradigm in computer architecture, mainly in the form of multi-core processors.

Parallel computers can be roughly classified according to the level at which the hardware supports parallelism, with multi-core and multi-processor computers having multiple processing elements within a single machine, while clusters, MPPs, and grids use multiple computers to work on the same task. Specialized parallel computer architectures are sometimes used alongside traditional processors, for accelerating specific tasks.

Parallel computer programs are more difficult to write than sequential ones, because concurrency introduces several new classes of potential software bugs, of which race conditions are the most common. Communication and synchronization between the different subtasks are typically some of the greatest obstacles to getting good parallel program performance.

The maximum possible speed-up of a single program as a result of parallelization is given by Amdahl’s law.


Traditionally, computer software has been written for serial computation. To solve a problem, an algorithm is constructed and implemented as a serial stream of instructions. These instructions are executed on a central processing unit on one computer. Only one instruction may execute at a time—after that instruction is finished, the next is executed.

Parallel computing, on the other hand, uses multiple processing elements simultaneously to solve a problem. This is accomplished by breaking the problem into independent parts so that each processing element can execute its part of the algorithm simultaneously with the others. The processing elements can be diverse and include resources such as a single computer with multiple processors, several networked computers, specialized hardware, or any combination of the above.
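This decomposition can be sketched in a few lines of Python. The chunked sum below is an illustrative toy, not something from the text above; note too that in CPython, threads show the structure of the approach even though the interpreter's global lock limits real speed-up for pure-Python arithmetic.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker handles one independent part of the problem.
    return sum(chunk)

def parallel_sum(numbers, workers=4):
    # Break the problem into independent parts so each processing
    # element can execute its part simultaneously with the others.
    size = max(1, len(numbers) // workers)
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Combine the partial results into the final answer.
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum(list(range(1000))))  # same answer as sum(range(1000)): 499500
```

The same skeleton works with process pools or networked workers; only the executor changes, which is exactly the point about processing elements being diverse.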

Frequency scaling was the dominant reason for improvements in computer performance from the mid-1980s until 2004. The runtime of a program is equal to the number of instructions multiplied by the average time per instruction. Holding everything else constant, increasing the clock frequency decreases the average time it takes to execute an instruction. An increase in frequency thus decreases runtime for all compute-bound programs.
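That relationship is simple enough to compute directly. In the sketch below the instruction count and cycles-per-instruction figures are made-up round numbers, chosen only to show that doubling the clock frequency halves the runtime when everything else is held constant.

```python
def runtime_seconds(instructions, cycles_per_instruction, frequency_hz):
    # runtime = instruction count × average time per instruction, where the
    # average time per instruction is cycles-per-instruction / clock frequency.
    return instructions * cycles_per_instruction / frequency_hz

base = runtime_seconds(1e9, 1.0, 2e9)    # 10⁹ instructions at 2 GHz → 0.5 s
faster = runtime_seconds(1e9, 1.0, 4e9)  # same work at 4 GHz → 0.25 s
```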

However, power consumption by a chip is given by the equation P = C × V² × F, where P is power, C is the capacitance being switched per clock cycle (proportional to the number of transistors whose inputs change), V is voltage, and F is the processor frequency (cycles per second). Increases in frequency increase the amount of power used in a processor. Increasing processor power consumption led ultimately to Intel's May 2004 cancellation of its Tejas and Jayhawk processors, which is generally cited as the end of frequency scaling as the dominant computer architecture paradigm.
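The equation can be checked with a toy calculation; the capacitance and voltage figures below are hypothetical.

```python
def dynamic_power(capacitance, voltage, frequency):
    # P = C × V² × F
    return capacitance * voltage ** 2 * frequency

p_base = dynamic_power(1e-9, 1.2, 2e9)     # hypothetical chip at 2 GHz
p_doubled = dynamic_power(1e-9, 1.2, 4e9)  # doubling F alone doubles P
```

In practice, raising the frequency usually also requires raising the voltage, and since voltage enters the equation squared, power grows faster than linearly with clock speed; this is one reason frequency scaling hit a wall.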

Moore’s Law is the empirical observation that transistor density in a microprocessor doubles every 18 to 24 months.  Despite power consumption issues, and repeated predictions of its end, Moore's law is still in effect. With the end of frequency scaling, these additional transistors (which are no longer used for frequency scaling) can be used to add extra hardware for parallel computing.

Amdahl’s Law

Optimally, the speed-up from parallelization would be linear—doubling the number of processing elements should halve the runtime, and doubling it a second time should again halve the runtime. However, very few parallel algorithms achieve optimal speed-up. Most of them have a near-linear speed-up for small numbers of processing elements, which flattens out into a constant value for large numbers of processing elements.

The potential speed-up of an algorithm on a parallel computing platform is given by Amdahl’s law, originally formulated by Gene Amdahl in the 1960s.  It states that a small portion of the program which cannot be parallelized will limit the overall speed-up available from parallelization.
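The law itself is a one-line formula. Here is a small sketch using the standard formulation, where p is the parallelizable fraction of the program and N the number of processing elements:

```python
def amdahl_speedup(parallel_fraction, n_processors):
    # S(N) = 1 / ((1 - p) + p / N): the serial fraction (1 - p)
    # bounds the overall speed-up no matter how large N grows.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_processors)

# With 95% of the work parallelizable, the speed-up approaches but can
# never exceed 1 / 0.05 = 20×, regardless of processor count.
limit = amdahl_speedup(0.95, 1_000_000)  # ≈ 20
```

This is exactly the flattening described above: near-linear gains for small N, then diminishing returns as the serial portion dominates.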

Thursday, January 29, 2015

Easter Island Mystery

A UCSB Geographer Teams up with Archeologists to Clarify Factors that Contributed to the Demise of Early Rapa Nui Society
By Julie Cohen, The UC Santa Barbara Current, January 27, 2015

Long before the Europeans arrived on Easter Island in 1722, the native Polynesian culture known as Rapa Nui showed signs of demographic decline. However, the catalyst has long been debated in the scientific community. Was environmental degradation the cause, or could a political revolution or an epidemic of disease be to blame?

A new study by a group of international researchers, including UC Santa Barbara’s Oliver Chadwick, offers a different explanation and helps to clarify the chronological framework. The investigators expected to find that changes coincided with the arrival of the Europeans, but their work shows instead that the demise of the Rapa Nui culture began prior to that. Their findings are published in the Proceedings of the National Academy of Sciences.

“In the current Easter Island debate, one side says the Rapa Nui decimated their environment and killed themselves off,” said Chadwick, a professor in UC Santa Barbara’s Department of Geography and the Environmental Studies Program. “The other side says it had nothing to do with cultural behavior, that it was the Europeans who brought disease that killed the Rapa Nui. Our results show that there is some of both going on, but the important point is that we show evidence of some communities being abandoned prior to European contact.”

Chadwick joined archaeologists Christopher Stevenson of Virginia Commonwealth University, Cedric Puleston of UC Davis and Thegn Ladefoged of the University of Auckland in examining six agriculture sites used by the island’s statue-building inhabitants. Their research focused mainly on the three sites for which they had information on climate, soil chemistry and land use trends as determined by an analysis of obsidian spear points.

The team used flakes of obsidian, a natural glass, as a dating tool. Measuring the amount of water that had penetrated the obsidian’s surface allowed them to gauge how long it had been exposed and to determine its age.
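The arithmetic behind this dating technique can be sketched as follows. The square-root growth model is the standard assumption behind obsidian hydration dating, but the rate constant used here is a made-up illustrative value; real rates vary with the obsidian's chemistry and the site's temperature and humidity.

```python
def hydration_age_years(rind_microns, rate_um2_per_1000yr):
    # The hydration rind thickness grows roughly with the square root of
    # exposure time, x ≈ sqrt(k · t), so the age is t ≈ x² / k.
    # The rate constant k below is hypothetical, not a measured value.
    return 1000.0 * rind_microns ** 2 / rate_um2_per_1000yr

age = hydration_age_years(2.0, 5.0)  # a 2 µm rind at k = 5 µm²/kyr → 800 years
```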

The study sites reflected the environmental diversity of the 63-square-mile island situated nearly 2,300 miles off the west coast of Chile. The soil nutrient supply on Easter Island is less than that of the younger Hawaiian Islands, which were also settled by the Polynesians around the same time, 1200 A.D.

The first site the researchers analyzed was near the northwest coast. Lying in the rain shadow of a volcano, it had low rainfall and relatively high soil nutrient availability. The second study site, on the interior side of the volcanic mountain, experienced high rainfall but had a low nutrient supply; the third, another near-coastal area in the northeast, was characterized by intermediate amounts of rainfall and relatively high soil nutrients.

“When we evaluate the length of time that the land was used based on the age distribution of each site’s obsidian flakes, which we used as an index of human habitation, we find that the very dry area and the very wet area were abandoned before European contact,” Chadwick said. “The area that had relatively high nutrients and intermediate rainfall maintained a robust population well after European contact.”

These results suggest that the Rapa Nui reacted to regional variations and natural environmental barriers to producing sufficient crops rather than degrading the environment themselves. In the nutrient-rich center where they could produce food well, they were able to maintain a viable culture even under the threat of external factors, including European diseases such as smallpox, syphilis and tuberculosis.

“The pullback from the marginal areas suggests that the Rapa Nui couldn’t continue to maintain the food resources necessary to keep the statue builders in business,” Chadwick concluded. “So we see the story as one of pushing against constraints and having to pull back rather than one of violent collapse.”


Wednesday, January 28, 2015

Last Boss -- Carmine DeSapio

Carmine Gerard DeSapio (December 10, 1908 – July 27, 2004) was an American politician from New York City. He was the last head of the Tammany Hall political machine to dominate municipal politics.


DeSapio was born in lower Manhattan. His father was an Italian immigrant, while his mother was a second-generation Italian-American. He graduated from Fordham University in 1931.

He started his career in the Tammany Hall organization as an errand boy and messenger for precinct captains. Tammany Hall was the name given to the Democratic political machine that dominated New York City politics from the mayoral victory of Fernando Wood in 1854 through the election of Fiorello H. La Guardia in 1933.  He was first elected a district captain in 1939, but was rejected by the leadership in the struggle between Irish and Italian interests for control of the organization.  In 1943 he was accepted as district leader for lower Greenwich Village.

In 1949, DeSapio became the youngest boss in the history of Tammany Hall, succeeding Hugo Rogers. His Italian heritage signaled the end of Tammany's longtime dominance by Irish-American politicians, and DeSapio became the first nationally prominent Italian-American political boss. Throughout his political life, DeSapio gained notoriety from alleged involvement with organized crime and allegations of corruption, even though he fought to distance the organization from the unsavory days of Boss Tweed. In 1951, Senator Estes Kefauver of Tennessee concluded that DeSapio was feeding the interests of New York's most powerful mobster, Frank Costello, and that Costello had become the lead person influencing decisions made by the Tammany Hall council. DeSapio admitted to having met Costello several times, "but insisted that politics was never discussed".

Unlike many of the previous "bosses," however, DeSapio always made his decisions known to the public and promoted himself as a reformer. As boss of Tammany, he demonstrated liberal credentials when he diversified Tammany's leadership by naming the first Puerto Rican Manhattan district leader, Anthony Mandez, and backed Hulan Jack as Manhattan's first African-American Borough President. His ties with Costello also failed to halt his rise to power in the local political scene.

In 1953, he earned new respect and public admiration when he turned against the other Democratic leaders in New York City and used the power of Tammany Hall to help lead the defeat of highly unpopular incumbent mayor Vincent R. Impellitteri in the Democratic Party primary by Robert F. Wagner, Jr., a highly outspoken pro-reform Democrat, and then helped assure Wagner's victory in the general election. Following Wagner's election, DeSapio became a powerful and well-respected kingmaker in the New York political scene.

DeSapio always seemed a personally modest man. Even though he operated out of four lavish offices, he lived for fifty years in a middle-class apartment on Washington Square with his wife Theresa Natale ("Natalie") and daughter Geraldine. As leader of Tammany Hall, DeSapio reveled in the limelight, attending charitable fund-raising events, making himself available to the press and delivering speeches in highbrow venues that were thought off-limits to political bosses. In wielding his enormous political clout, he usually preferred extensive consultations and consensus-building to unilateral decision-making. His 16-to-18-hour workday began with pre-breakfast phone calls at home, where, still dressed in pajamas and bathrobe, he received a stream of political associates. DeSapio would then visit his various offices for further meetings and cram in a half-dozen public functions, including radio and television appearances and a late-night political dinner.

DeSapio succeeded in shucking Tammany's notoriety and fashioning himself as a sophisticated, enlightened and modern political boss. He favored well-tailored, dark suits and striped ties, and always looked as if he had just stepped out of a barber's chair. The only incongruity was the dark glasses he was forced to always wear because of chronic iritis.

However, it later became apparent that he was also selling out to benefit local mobsters such as Costello. DeSapio was accused of staffing New York City's government with clubhouse hacks. He followed the Tammany custom of selling judicial nominations, though he did cut the fee that would-be judges were required to pay. He steered valuable city contracts for streetlights and parking meters to the Broadway Maintenance Corporation, a company that, according to the State Investigation Commission, cheated taxpayers out of millions of dollars.

His leadership ended in 1961, and with it the dynasty that was Tammany Hall. It took several years of work by Eleanor Roosevelt to bring this about. She had vowed revenge because she felt DeSapio had derailed her son's (Franklin D. Roosevelt, Jr.) political ambitions by persuading him to abandon his run for Governor of New York in 1954 and instead run for New York Attorney General. After Roosevelt dropped out, DeSapio then got the local Democratic Party officials to accept former banker and diplomat W. Averell Harriman as the Democratic Party's nominee for Governor in the New York state election.  Harriman barely managed to secure victory as Governor of New York and Roosevelt would lose his bid to become the New York Attorney General. Following Harriman's victory, DeSapio served in Harriman's cabinet as Secretary of State of New York.

In 1958, DeSapio's image was severely damaged after he foisted his own candidate for the Senate, Manhattan District Attorney Frank Hogan, on the party. New Yorkers now saw DeSapio as an old-time Tammany Hall boss; Hogan lost the Senate election to Republican Kenneth Keating, and Republican Nelson Rockefeller was elected Governor that same year. Democrats who once praised DeSapio now found it expedient to excoriate him. In 1961, Wagner won re-election by running a reformist campaign that denounced his former patron, DeSapio, as an undemocratic practitioner of Tammany machine politics. The same year, DeSapio lost the district leadership of his native Greenwich Village, a post he had held for two decades, to an upstart reform Democrat, James Lanigan, who was backed by nationally known liberal Democrats such as Wagner, Eleanor Roosevelt and former Senator Herbert H. Lehman. Following his loss, Eleanor Roosevelt told local journalist Murray Kempton, who published her remarks many years later in 1991 when he was a columnist for Newsday, "I told Carmine I would get him for what he did to Franklin, and get him I did."

In 1963 and 1965, after Lanigan stepped down, DeSapio tried to retake his position as Greenwich Village district leader, but was twice defeated by another reform candidate, Edward I. Koch, who would later go on to become mayor. DeSapio reached a low point in 1969 when he was convicted in federal court of conspiracy and bribery, after it was established that he conspired to bribe the former New York City water commissioner, James L. Marcus, and extort contracts from Consolidated Edison that would result in kickbacks. He served two years in federal prison (1971–1973). After his release, he never re-entered politics, but did support many community, charitable, and civic causes. He regained some of his former popularity through his skill as a speaker. In 1992, former Mayor Ed Koch, his opponent in 1963 and 1965, with whom he had since become friendly and met on occasion, said of him: "He is a crook, but I like him.... Most politicians still like DeSapio. He always gets the most applause when he is introduced at Democratic dinners".

Among his accomplishments were support of the Fair Employment Practices Law, the New York City rent control laws, and the lowering of the voting age to 18.  DeSapio died at age 95 on July 27, 2004, at St. Vincent’s Hospital in Manhattan.  He was interred in a private mausoleum at Calvary Cemetery in Woodside, Queens. He was survived by his daughter Geraldine A. DeSapio.

Tuesday, January 27, 2015

Five Good Emperors

The rulers commonly known as the "Five Good Emperors" were Nerva, Trajan, Hadrian, Antoninus Pius and Marcus Aurelius. The term Five Good Emperors was coined by the political philosopher Niccolò Machiavelli in 1503:

From the study of this history we may also learn how a good government is to be established; for while all the emperors who succeeded to the throne by birth, except Titus, were bad, all were good who succeeded by adoption, as in the case of the five from Nerva to Marcus. But as soon as the empire fell once more to the heirs by birth, its ruin recommenced.

Machiavelli argued that these adopted emperors, through good rule, earned the respect of those around them:

Titus, Nerva, Trajan, Hadrian, Antoninus, and Marcus had no need of praetorian cohorts, or of countless legions to guard them, but were defended by their own good lives, the good-will of their subjects, and the attachment of the senate.

The 18th-century historian Edward Gibbon, in his work The History of the Decline and Fall of the Roman Empire, opined that their rule was a time when "the Roman Empire was governed by absolute power, under the guidance of wisdom and virtue".  Gibbon believed these benevolent dictators and their moderate policies were unusual and contrasted with their more tyrannical and oppressive successors (their predecessors are not covered by Gibbon).

Gibbon went so far as to state:

If a man were called to fix the period in the history of the world during which the condition of the human race was most happy and prosperous, he would, without hesitation, name that which elapsed from the death of Domitian to the accession of Commodus. The vast extent of the Roman Empire was governed by absolute power, under the guidance of virtue and wisdom. The armies were restrained by the firm but gentle hand of four successive emperors, whose characters and authority commanded respect. The forms of the civil administration were carefully preserved by Nerva, Trajan, Hadrian and the Antonines, who delighted in the image of liberty, and were pleased with considering themselves as the accountable ministers of the laws. Such princes deserved the honour of restoring the republic, had the Romans of their days been capable of enjoying a rational freedom.

Alternative hypothesis

This hypothesis posits that adoptive succession arose because of a lack of biological heirs. All but the last of the adoptive emperors had no legitimate biological sons to succeed them. They were thus obliged to pick a successor elsewhere; as soon as an emperor could look to a biological son to succeed him, adoptive succession was set aside.

The dynasty may be broken up into the Nerva-Trajan dynasty (also called the Ulpian dynasty after Trajan's nomen gentile 'Ulpius') and Antonine dynasty (after their common name Antoninus).

Monday, January 26, 2015

1995 Near-Nuclear Incident

The firing of a scientific rocket from a small Norwegian island nearly started a nuclear exchange in 1995. Details are found in yesterday’s news story from the Boston Globe at:

This scary incident has three parallels in past near-misses also chronicled on this blog: one posted November 30, 2013 regarding Vasili Arkhipov, another on December 1, 2013 regarding Stanislav Petrov, and a third on June 12, 2014 regarding a near-nuclear detonation in North Carolina in 1961.

Sunday, January 25, 2015

Investing through Software

Two and one-half years ago, investing professional Nick Shalek wrote a wonderful piece on individual investing. He said don’t do it unless you have the time to commit to it. He also recommended using computer software to make your investing decisions.

The reasoning is very sound, clever and respectable.  The article is still available on-line at:

Saturday, January 24, 2015

Bad Man Yet Great Dad

STORY NUMBER ONE            

Many years ago, Al Capone virtually owned Chicago. Capone wasn't famous for anything heroic. He was notorious for enmeshing the Windy City in everything from bootlegged booze and prostitution to murder.

Capone had a lawyer nicknamed "Easy Eddie." He was Capone's lawyer for a good reason. Eddie was very good! In fact, Eddie's skill at legal maneuvering kept Big Al out of jail for a long time.

To show his appreciation, Capone paid him very well. Not only was the money big, but Eddie got special dividends as well. For instance, he and his family occupied a fenced-in mansion with live-in help and all of the conveniences of the day. The estate was so large that it filled an entire Chicago city block.

Eddie lived the high life of the Chicago mob and gave little consideration to the atrocity that went on around him.

Eddie did have one soft spot, however. He had a son that he loved dearly. Eddie saw to it that his young son had clothes, cars, and a good education. Nothing was withheld. Price was no object.            

And, despite his involvement with organized crime, Eddie even tried to teach him right from wrong. Eddie wanted his son to be a better man than he was.            

Yet, with all his wealth and influence, there were two things he couldn't give his son; he couldn't pass on a good name or a good example.

One day, Easy Eddie reached a difficult decision. Easy Eddie wanted to rectify wrongs he had done.            

He decided he would go to the authorities and tell the truth about Al "Scarface" Capone, clean up his tarnished name, and offer his son some semblance of integrity. To do this, he would have to testify against The Mob, and he knew that the cost would be great. So, he testified.

Within the year, Easy Eddie's life ended in a blaze of gunfire on a lonely Chicago street. But in his eyes, he had given his son the greatest gift he had to offer, at the greatest price he could ever pay. Police removed from his pockets a rosary, a crucifix, a religious medallion, and a clipping from a magazine. It read:

"The clock of life is wound but once, and no man has the power to tell just when the hands will stop, at late or early hour. Now is the only time you own. Live, love, toil with a will. Place no faith in time. For the clock may soon be still."

STORY NUMBER TWO            

World War II produced many heroes. One such man was Lieutenant Commander Butch O'Hare.

He was a fighter pilot assigned to the aircraft carrier Lexington in the South Pacific.

One day his entire squadron was sent on a mission. After he was airborne, he looked at his fuel gauge and realized that someone had forgotten to top off his fuel tank.

He would not have enough fuel to complete his mission and get back to his ship.

His flight leader told him to return to the carrier. Reluctantly, he dropped out of formation and headed back to the fleet.

As he was returning to the mother ship, he saw something that turned his blood cold; a squadron of Japanese aircraft was speeding its way toward the American fleet.

The American fighters were gone on a sortie, and the fleet was all but defenseless. He couldn't reach his squadron and bring them back in time to save the fleet. Nor could he warn the fleet of the approaching danger. There was only one thing to do. He must somehow divert them from the fleet.

Laying aside all thoughts of personal safety, he dove into the formation of Japanese planes. Wing-mounted .50 calibers blazed as he charged in, attacking one surprised enemy plane and then another. Butch wove in and out of the now-broken formation and fired at as many planes as possible until all his ammunition was finally spent.

Undaunted, he continued the assault. He dove at the planes, trying to clip a wing or tail in hopes of damaging as many enemy planes as possible, rendering them unfit to fly.

Finally, the exasperated Japanese squadron took off in another direction.

Deeply relieved, Butch O'Hare and his tattered fighter limped back to the carrier.            

Upon arrival, he reported in and related the events surrounding his return. The film from the gun-camera mounted on his plane told the tale. It showed the extent of Butch's daring attempt to protect his fleet. He had, in fact, destroyed five enemy aircraft.

This took place on February 20, 1942, and for that action Butch became the Navy's first ace of WWII and the first naval aviator to win the Medal of Honor.

A year later Butch was killed in aerial combat at the age of 29. His hometown would not allow the memory of this WWII hero to fade, and today O'Hare Airport in Chicago is named in tribute to the courage of this great man.

So, the next time you find yourself at O'Hare International, give some thought to visiting Butch's memorial displaying his statue and his Medal of Honor. It's located between Terminals 1 and 2.


Butch O'Hare was "Easy Eddie's" son.
= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

See also:


Friday, January 23, 2015

Karmarkar's Algorithm

Karmarkar's algorithm is an algorithm introduced by Narendra Karmarkar in 1984 for solving linear programming problems. It was the first reasonably efficient algorithm that solves these problems in polynomial time.  The ellipsoid method is also polynomial time but proved to be inefficient in practice.

Karmarkar's algorithm falls within the class of interior point methods: the current guess for the solution does not follow the boundary of the feasible set as in the simplex method, but it moves through the interior of the feasible region, improving the approximation of the optimal solution by a definite fraction with every iteration, and converging to an optimal solution with rational data.
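A minimal sketch of the interior-point idea, in Python: this is a toy logarithmic-barrier example on a one-dimensional problem, not Karmarkar's actual projective method. It maximizes f(x) = x on the interval 0 < x < 1 by adding log-barrier terms that keep each iterate strictly inside the feasible region.

```python
import math

def barrier_maximizer(mu, steps=20000, lr=1e-4):
    # Maximize f(x) = x on (0, 1) by instead maximizing
    # x + mu * (log(x) + log(1 - x)): the log terms blow up near the
    # boundary, so the iterate moves through the interior of the
    # feasible region rather than along its edges.
    x = 0.5  # start strictly inside the feasible region
    for _ in range(steps):
        grad = 1.0 + mu * (1.0 / x - 1.0 / (1.0 - x))
        x += lr * grad  # plain gradient ascent on the barrier objective
    return x

x = barrier_maximizer(0.1)  # stops near x ≈ 0.91, held off the boundary by the barrier
```

As the barrier weight mu shrinks, the iterate approaches the true optimum x = 1; interior point methods exploit this by following that "central path" toward the solution, in contrast to the simplex method's walk along the boundary.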

Patent Controversy – “Can Mathematics Be Patented?”

At the time he invented the algorithm, Narendra Karmarkar was employed by AT&T. After applying the algorithm to optimizing AT&T 's telephone network, they realized that his invention could be of practical importance. In April 1985, AT&T promptly applied for a patent on Karmarkar's algorithm and that became more fuel for the ongoing controversy over the issue of software patents. This left many mathematicians uneasy, such as Ronald Rivest (himself one of the holders of the patent on the RSA algorithm), who expressed the opinion that research proceeded on the basis that algorithms should be free. Even before the patent was actually granted, it seemed that there was prior art that might have been applicable. Mathematicians who specialize in numerical analysis such as Philip Gill and others claimed that Karmarkar's algorithm is equivalent to a projected Newton barrier method with a logarithmic barrier function, if the parameters are chosen suitably.  However, some say Gill's argument is flawed, insofar as the method they describe does not even qualify as an "algorithm", since it requires choices of parameters that don't follow from the internal logic of the method, but rely on external guidance, essentially from Karmarkar's algorithm. Furthermore, Karmarkar's contributions are considered far from obvious in light of all prior work, including Fiacco-McCormick, Gill and others cited by Saltzman. The patent was debated in the U.S. Senate and granted in recognition of the essential originality of Karmarkar's work, as U.S. Patent 4,744,026: "Methods and apparatus for efficient resource allocation" in May 1988. AT&T supplied the KORBX system based on this patent to the Pentagon, which enabled them to solve mathematical programming problems which were previously considered unsolvable. 
Opponents of software patents have further alleged that the patent ruined the positive cycle of interaction that had previously characterized the relationship between researchers in linear programming and industry, and specifically that it isolated Karmarkar himself from the network of mathematical researchers in his field.

The patent itself expired in April 2006, and the algorithm is presently in the public domain.

Thursday, January 22, 2015

Oil Shows the Folly of Forecasts

By Charles Rotblut, CFA, AAII Journal Editor, January 22, 2015  [AAII is the American Association of Individual Investors, which has its own monthly publication]

…Whenever Mr. Market wants to change something, he can turn the dials pretty swiftly and cause prices to move significantly. The speed and the magnitude of the changes are often greater than many investors realize while the price adjustments are occurring.

Oil provides a good example. At the end of last June, oil traded at $105.37 per barrel. Over the next three months, the price declined to $91.16. Then, in the fourth quarter, oil plunged by more than 41% to $53.27 on December 31, 2014. Oil has continued to fall this month, trading at $46.50 per barrel today. Put another way, oil has fallen by 56% since the end of June and 49% during the 16-week span starting at the end of September.
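The percentage declines quoted above follow directly from the prices; as a quick sanity check of the arithmetic, using the article's own figures:

```python
def pct_drop(start, end):
    """Percentage decline from a starting price to an ending price."""
    return (start - end) / start * 100

# Prices per barrel from the article: end of June, end of September,
# December 31, and "today" (January 22, 2015).
print(round(pct_drop(105.37, 46.50), 1))  # since end of June   -> 55.9 ("56%")
print(round(pct_drop(91.16, 46.50), 1))   # 16-week span        -> 49.0 ("49%")
print(round(pct_drop(91.16, 53.27), 1))   # fourth quarter      -> 41.6 ("more than 41%")
```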

As a driver, I’m happy. It was great to be able to fill up my wife’s SUV for $25 last week. (She drives a Hyundai Santa Fe.) As an investor, I’m cognizant of the impact the drop has had on shares of energy companies and energy funds. I’m even more cognizant of how this move reveals the folly of forecasters.

Let’s go back to last June. Twice a year, Barron’s holds a roundtable discussion of Wall Street experts to discuss their outlook for the financial markets and investment ideas. During last summer’s roundtable, fund manager Mario Gabelli recommended oil service company Weatherford International (WFT) and money manager Scott Black listed exploration and production company PDC Energy (PDCE) among his picks. In January 2014, Goldman Sachs’ Abby Joseph Cohen responded to a question about her firm’s 2014 oil forecast by saying, “We're not big bulls. Our analysts forecast $90 a barrel for West Texas Intermediate at year end.”

I’m not purposely singling out the Barron’s roundtable participants; they are smart people and I always look forward to the edited transcripts of their semiannual conversations. But the fact that none of the roundtable participants called for a big drop in oil prices last year—and even went in the opposite direction by recommending oil stocks—shows how difficult it is to predict what will happen in the future. As Gabelli pointed out during the latest roundtable discussion (published in this week’s issue of Barron’s), “A year ago, nobody thought about oil, or Putin invading Crimea, or the spread of Ebola.”

It’s not just oil. Yields on the benchmark 10-year Treasury note have been making significant moves as well. After falling to 1.63% on May 2, 2013, yields spiked to 2.90% over a period of less than four months (May 2 to August 22, 2013) before ending 2013 at 3.03%. Last year (2014), yields reversed course and fell, leading to today's close of 1.90%. It likely would not take much effort to find forecasts predicting yields would be higher now than they actually are.

Then there is monetary policy. Last week, I attended a presentation by St. Louis Federal Reserve president James Bullard. He discussed how the Federal Open Market Committee (FOMC) has been pulled in different directions by better-than-forecast employment and weaker-than-forecast inflation. He added that below-target inflation and the possibility of lower inflation in the future is weighing on the committee’s decision process. If the people responsible for setting monetary policy are unsure about when to start raising rates, ask yourself how much you want to rely on someone else’s forecast about when rates will start to be raised.

There are people who get forecasts right. On Wall Street, people make careers out of getting one or two big forecasts correct. That doesn’t mean they possess soothsaying abilities; rather, it means that they simply got lucky. The role of luck is vastly underestimated. It can make a good strategy seem bad, and a bad call seem brilliant. It also means you should never forget the advice James Montier of GMO once gave: “If you don’t know what is going to happen, don’t structure your portfolio as though you do!”

I fully understand the desire for certainty. It’s a human trait. But Mr. Market doesn’t reward certainty; he rewards you for taking risks. When you take risks, there will be costly market moves. There will also be profitable moves. Which brings me back to reiterating something I’ve said before and will say again in the future: Go with the historical odds about what has worked over the long term and don’t worry about what might happen tomorrow, next week, next month or next year. The only forecast we can make with any accuracy is that something we weren’t anticipating to happen will happen.

Wednesday, January 21, 2015

Good Marriage = More Money

Want to Make More Money? Marry the Right Person, Science Says
By Jeff Haden, Ghostwriter, Speaker, Inc Magazine Contributing Editor
January 19, 2015

Lots of things matter where your job satisfaction, earning power, and the success of your career are concerned. Your boss matters. So does your education, the industry you've chosen, and macroeconomics.

And luck. Luck definitely plays a part.

But while those are all important factors in your career -- and your earning power -- here's one factor you probably haven't considered:
Your spouse.

Researchers at Washington University in St. Louis found that people with relatively prudent and reliable partners tend to perform better at work, earning more promotions, making more money, and feeling more satisfied with their jobs.

That's true for men and women. "Partner conscientiousness" predicted future job satisfaction, income, and likelihood of promotion (even after factoring in the participants' level of conscientiousness) for both sexes.
According to the researchers, "conscientious" partners perform more household tasks, exhibit more pragmatic behaviors that their spouses are likely to emulate, and promote a more satisfying home life... all of which enables their spouse to focus more on work.
As one researcher said, "These results demonstrate that the dispositional characteristics of the person one marries influence important aspects of one's professional life." (Or, in non-researcher lingo, a good partner sets a good example and makes it possible for you to be an even better you.)

I know that's true for me. My wife is the most organized person I know. She juggles family, multiple jobs, multiple interests... she's a goal-achieving machine.
For a while her "conscientiousness" got on my nerves, until I realized it bugged me because her level of focus and drive implicitly challenged my inherent laziness. I finally realized the best way to get more done (something we all want to do) is to actually get more done, and she definitely helps me do that.

And I try to do the same for her. Since my daily commute is two flights of stairs, I take care of most of the household stuff: laundry, groceries, cleaning (I don't do all the cleaning, but I make sure it gets done), etc., so when she comes home she can just be home.
So, while she's still considerably more conscientious and organized than I am, she's definitely rubbed off on me in a very positive way.

Which of course makes sense: as Jim Rohn (and others) likes to say, we are the average of the five people we spend the most time with -- and that's particularly true where our significant others are concerned. Bad habits rub off. Poor tendencies rub off. We all know that.
But great habits and great tendencies rub off too.

Plus, if one person is extremely organized and keeps the household trains running on time, that frees the other up to focus more on work. (In a perfect world, both would more or less equally share train-engineer duties so that both can better focus on their careers, whether those careers are inside or outside the home.)
Of course I'm not recommending you choose your significant other solely on the basis of criteria like conscientiousness and prudence. As the researchers say, "Marrying a conscientious partner could at first sound like a recipe for a rigid and lackluster lifestyle."

Nor am I suggesting you end your relationship if you feel your partner is lacking in those areas. But it does appear that having a conscientious and prudent partner is part of the recipe for a better and more rewarding career.
So here's what you can do. Instead of expecting your partner to change, think about what you can do to be more supportive of your significant other. Maybe you can take on managing finances, or take on more household chores or schedules.

Since the best way to lead is to lead by example, in time you may find that you and your significant other make a great, mutually supportive team, each of you genuinely, and actively, supporting each other... and supporting each other's goals and dreams.
You don't need research to tell you that kind of relationship would be awesome.

Now it's your turn. How has your significant other affected (positively or negatively) your career? Better yet, how have you helped your significant other?

Tuesday, January 20, 2015

Exercise Postpones Death

Lack of Exercise Responsible for Twice as Many Deaths as Obesity
Cambridge University, January 14, 2015

A brisk 20 minute walk each day could be enough to reduce an individual’s risk of early death, according to new research published today. The study of over 334,000 European men and women found that twice as many deaths may be attributable to lack of physical activity compared with the number of deaths attributable to obesity, but that just a modest increase in physical activity could have significant health benefits.

Physical inactivity has been consistently associated with an increased risk of early death, as well as being associated with a greater risk of diseases such as heart disease and cancer. Although it may also contribute to an increased body mass index (BMI) and obesity, the association with early death is independent of an individual’s BMI.

To measure the link between physical inactivity and premature death, and its interaction with obesity, researchers analysed data from 334,161 men and women across Europe participating in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study. Between 1992 and 2000, the researchers measured height, weight and waist circumference, and used self-assessment to measure levels of physical activity. The participants were then followed up over 12 years, during which 21,438 participants died. The results are published today in the American Journal of Clinical Nutrition.

The researchers found that the greatest reduction in risk of premature death occurred in the comparison between inactive and moderately inactive groups, judged by combining activity at work with recreational activity; just under a quarter (22.7%) of participants were categorised as inactive, reporting no recreational activity in combination with a sedentary occupation. The authors estimate that doing exercise equivalent to just a 20 minute brisk walk each day – burning between 90 and 110 kcal (‘calories’) – would take an individual from the inactive to the moderately inactive group and reduce their risk of premature death by between 16% and 30%. The impact was greatest amongst normal weight individuals, but even those with higher BMI saw a benefit.

Using the most recent available data on deaths in Europe the researchers estimate that 337,000 of the 9.2 million deaths amongst European men and women were attributable to obesity (classed as a BMI greater than 30): however, double this number of deaths (676,000) could be attributed to physical inactivity.

Professor Ulf Ekelund from the Medical Research Council (MRC) Epidemiology Unit at the University of Cambridge, who led the study, says: “This is a simple message: just a small amount of physical activity each day could have substantial health benefits for people who are physically inactive. Although we found that just 20 minutes would make a difference, we should really be looking to do more than this – physical activity has many proven health benefits and should be an important part of our daily life.”

Professor Nick Wareham, Director of the MRC Unit, adds: “Helping people to lose weight can be a real challenge, and whilst we should continue to aim at reducing population levels of obesity, public health interventions that encourage people to make small but achievable changes in physical activity can have significant health benefits and may be easier to achieve and maintain.”

Reference: Ekelund, U. et al. Physical activity and all-cause mortality across levels of overall and abdominal adiposity in European men and women: the European Prospective Investigation into Cancer and Nutrition Study (EPIC). American Journal of Clinical Nutrition; 14 Jan 2015.

Monday, January 19, 2015

de Montfort's Parliament

Brief Introduction by the Blog Author

A radically different and more modern English parliament convened exactly 750 years ago (on January 20, 1265), during a century of unrest and civil wars that followed the signing of Magna Carta on June 15, 1215. Below are some historical details about the new parliament.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

De Montfort's Parliament was an English parliament held from 20 January 1265 until mid March the same year, instigated by Simon de Montfort, a baronial rebel leader. De Montfort had seized power in England following his victory over Henry III at the Battle of Lewes during the Second Barons' War, but his grip on the country was under threat. In an attempt to gather more support, he summoned representatives from not only the barons and the county knights, as had occurred in previous parliaments, but also burgesses from the major towns. The resulting parliament in London discussed radical reforms and temporarily stabilised de Montfort's political situation. De Montfort was killed at the Battle of Evesham later that year, but the idea of inviting both knights and burgesses to parliaments became more popular under the reign of Edward I and by the 14th century had become the norm, the assembly becoming known as the House of Commons. As a result, this parliament is sometimes referred to as the first English parliament, and de Montfort himself is often termed the founder of the Commons.

Although de Montfort claimed to be ruling in the King's name through a council of officials, he had effective political control over the government, the first time this had happened in English history.  De Montfort successfully held a parliament in London in June 1264 to confirm new constitutional arrangements for England; four knights were summoned from each county, chosen by the county court, and were allowed to comment on general matters of state – the first time this had occurred.  De Montfort was unable to consolidate his victory at Lewes, however, and widespread disorder persisted across the country.  In France, Eleanor made plans for an invasion of England with the support of Louis.

In response, and hoping to win wider support for his government, de Montfort summoned a new parliament for 20 January 1265, which continued until mid March that year.  It was held at short notice, with the summons being issued on 14 December, leaving little time for attendees to respond.  He summoned not only the barons, senior churchmen and two knights from each county, but also two burgesses from each of the major towns such as York, Lincoln, Sandwich, and the Cinque Ports, the first time this had been done.  Due to the lack of support for de Montfort among the barons, only 23 of them were summoned to parliament, in comparison to the summons issued to 120 churchmen, who largely supported the new government. It is unknown how many burgesses were called, but crucially they included London, the largest city in England, whose continuing loyalty was essential to de Montfort's cause.  The event was overseen by King Henry, and held in the Palace of Westminster.

This parliament was a populist, tactical move by de Montfort in an attempt to gather support from the regions, and the historian Jeffrey Hamilton characterises it as "a very partisan assembly, not some sort of proto-democratic representative body". Once again the representatives were allowed to comment on wider political matters than just the usual issues of taxation.  The business of the parliament focused on enforcing the Provisions of Westminster, in particular its restrictions on the major nobles, and promising judicial help to those who felt they were suffering from unfair feudal lordship.

Sunday, January 18, 2015

Positive Quiddity: Eidetic Memory

Eidetic memory (/aɪˈdɛtɪk/) is the ability to recall images, sounds, or objects from memory after only a few instants of exposure, with high precision for some time after exposure, without using mnemonics. It occurs in a small number of children and generally is not found in adults. The word eidetic comes from the Greek word εἶδος (pronounced [êːdos], eidos, "seen").


Eidetic memory is the ability to recall images in great detail after only a few minutes of exposure. It is found in early childhood (in between 2 percent and 10 percent of that age group) and is unconnected with the person’s intelligence level. Like other memories, eidetic images are often subject to unintended alterations, usually because of outside influences (such as the way an adult phrases a question about the memory). If the ability is not nurtured, it usually begins to fade after the age of 6, perhaps as growing verbal skills alter the memory process.

Eidetic images are only available to a small percentage of children 6 through 12 years old and are virtually nonexistent in adults. However, extensive research has failed to demonstrate consistent correlates between the presence of eidetic imagery and any cognitive, intellectual, neurological, or emotional measure.

The popular culture concept of “photographic memory,” where someone can briefly look at a page of text and then recite it perfectly from memory, is not the same as seeing eidetic images, and photographic memory has never been demonstrated to exist.

A few adults have had phenomenal memories (not necessarily of images), but their abilities are also unconnected with their intelligence levels and tend to be highly specialized. In extreme cases, like those of Solomon Shereshevsky and Kim Peek, memory skills can reportedly hinder social skills.  Shereshevsky was a trained mnemonist, not a photographic memorizer, and there are no studies that confirm whether Kim Peek had true photographic memory.

Persons identified as having a condition known as hyperthymesia (a.k.a. highly superior autobiographical memory, or HSAM) are able to remember very intricate details of their own personal life, but this ability seems not to extend to other, nonautobiographical information.  People with hyperthymesia have vivid recollections of such minutiae as what shoes a stranger wore or what they ate and how they felt on a specific date many years in the past. In cases where HSAM has been identified and studied, patients under study may show significantly different patterns of MRI brain activity from other individuals, or even have differences in physical brain structure. Possibly because of these extraordinary abilities, certain individuals have difficulties in social interactions with others who have normal memories (only 2 of 55 in the United States have successful marriages), and may additionally suffer from depression stemming from the inability to forget unpleasant memories and experiences from the past.  It has also been proposed that HSAM can be explained as a result of obsessive–compulsive thoughts about memories rather than “photographic memory.”

Saturday, January 17, 2015

Positive Quiddity: Thomas Nast

Thomas Nast (September 27, 1840 – December 7, 1902) was a German-born American caricaturist and editorial cartoonist who was the "Father of the American Cartoon".  He was the scourge of Boss Tweed and the Tammany Hall political machine.  Among his notable works were the creation of the modern version of Santa Claus and the political symbol of the elephant for the Republican Party. Contrary to popular belief, Nast did not create Uncle Sam (the male personification of the American people), Columbia (the female personification of American values), or the Democratic donkey, though he did popularize these symbols through his art. Nast was associated with the magazine Harper's Weekly from 1859 to 1860 and from 1862 until 1886.

Albert Boime argues that:

As a political cartoonist, Thomas Nast wielded more influence than any other artist of the 19th century. He not only enthralled a vast audience with boldness and wit, but swayed it time and again to his personal position on the strength of his visual imagination. Both Lincoln and Grant acknowledged his effectiveness in their behalf, and as a crusading civil reformer he helped destroy the corrupt Tweed Ring that swindled New York City of millions of dollars. Indeed, his impact on American public life was formidable enough to profoundly affect the outcome of every presidential election during the period 1864 to 1884.

In February 1860, he went to England for the New York Illustrated News to depict one of the major sporting events of the era, the prize fight between the American John C. Heenan and the English Thomas Sayers sponsored by George Wilkes, publisher of Wilkes' Spirit of the Times. A few months later, as artist for The Illustrated London News, he joined Garibaldi in Italy. Nast's cartoons and articles about the Garibaldi military campaign to unify Italy captured the popular imagination in the U.S.  In February 1861, he arrived back in New York.  In September of that year, he married Sarah Edwards, whom he had met two years earlier.

He left the New York Illustrated News to work again, briefly, for Frank Leslie's Illustrated News.  In 1862, he became a staff illustrator for Harper's Weekly. In his first years with Harper's, Nast became known especially for compositions that appealed to the sentiment of the viewer. An example is "Christmas Eve" (1862), in which a wreath frames a scene of a soldier's praying wife and sleeping children at home; a second wreath frames the soldier seated by a campfire, gazing longingly at small pictures of his loved ones.  One of his most celebrated cartoons was "Compromise with the South" (1864), directed against those in the North who opposed the prosecution of the American Civil War.  He was known for drawing battlefields in border and southern states.  These attracted great attention, and Nast was called by President Abraham Lincoln "our best recruiting sergeant".

After the war, Nast strongly opposed the Reconstruction policy of President Andrew Johnson, whom he depicted in a series of trenchant cartoons that marked "Nast's great beginning in the field of caricature".

Friday, January 16, 2015

Phlogiston = chemical quackery

The phlogiston theory is an obsolete scientific theory that postulated a fire-like element called phlogiston, contained within combustible bodies and released during combustion. The name comes from the Ancient Greek φλογιστόν phlogistón (burning up), from φλόξ phlóx (flame). It was first stated in 1667 by Johann Joachim Becher.  The theory attempted to explain burning processes such as combustion and rusting, which are now collectively known as oxidation.


Phlogisticated substances are substances that contain phlogiston and are dephlogisticated when burned.

In general, substances that burned in air were said to be rich in phlogiston; the fact that combustion soon ceased in an enclosed space was taken as clear-cut evidence that air had the capacity to absorb only a finite amount of phlogiston. When air had become completely phlogisticated it would no longer serve to support combustion of any material, nor would a metal heated in it yield a calx; nor could phlogisticated air support life. Breathing was thought to take phlogiston out of the body.

Thus, the theory explained combustion as a process essentially the reverse of what we now call oxidation: where modern chemistry says a burning substance combines with oxygen, phlogiston theory said it released phlogiston.

Joseph Black’s student Daniel Rutherford discovered nitrogen in 1772 and the pair used the theory to explain his results. The residue of air left after burning, in fact a mixture of nitrogen and carbon dioxide, was sometimes referred to as phlogisticated air, having taken up all of the phlogiston. Conversely, when oxygen was first discovered, it was thought to be dephlogisticated air, capable of combining with more phlogiston and thus supporting combustion for longer than ordinary air.

History of the theory

In 1667, Johann Joachim Becher published his book Physica Subterranea, which contained the first mention of what would become the phlogiston theory. Traditionally, alchemists considered that there were four classical elements: fire, water, air, and earth. In his book, Becher eliminated fire and air from the classical element model and replaced them with three forms of earth: terra lapidea, terra fluida, and terra pinguis. Terra pinguis was the element that imparted oily, sulphurous, or combustible properties. Becher believed that terra pinguis was a key feature of combustion and was released when combustible substances were burned. In 1703 Georg Ernst Stahl, professor of medicine and chemistry at Halle, proposed a variant of the theory in which he renamed Becher's terra pinguis to phlogiston, and it was in this form that the theory probably had its greatest influence.

Challenge and Demise

Eventually, quantitative experiments revealed problems, including the fact that some metals, such as magnesium, gained mass when they burned, even though they were supposed to have lost phlogiston. Some phlogiston proponents explained this by concluding that phlogiston had negative mass; others, such as Louis-Bernard Guyton de Morveau, gave the more conventional argument that it was lighter than air. However, a more detailed analysis based on the Archimedean principle and the densities of magnesium and its combustion product shows that just being lighter than air cannot account for the increase in mass.

During the eighteenth century, as it became clear that metals gained mass when they were oxidized, phlogiston was increasingly regarded as a principle rather than a material substance. By the end of the eighteenth century, for the few chemists who still used the term, the concept of phlogiston was linked to hydrogen.  Joseph Priestley, for example, in referring to the reaction of steam on iron, fully acknowledged that the iron gains mass as it binds with oxygen to form a calx (iron oxide), but maintained that iron also loses “the basis of inflammable air (hydrogen), and this is the substance or principle, to which we give the name phlogiston.” Following Lavoisier’s description of oxygen as the oxidizing principle (hence its name, from Ancient Greek oksús, “sharp”, and génos, “birth”, referring to oxygen's role in the formation of acids), Priestley described phlogiston as the alkaline principle.

Phlogiston remained the dominant theory until the 1780s when Antoine-Laurent Lavoisier showed that combustion requires a gas that has mass (oxygen) and could be measured by means of weighing closed vessels. The use of closed vessels also negated the buoyancy that had disguised the mass of the gases of combustion. These observations solved the mass paradox and set the stage for the new caloric theory of combustion.

Enduring Aspects

Phlogiston theory permitted chemists to bring apparently different phenomena into a coherent structure: combustion, metabolism, and the formation of rust. The recognition of the relation between combustion and metabolism was a forerunner of the recognition that the metabolism of living organisms and combustion can be understood in terms of fundamentally related chemical processes.

Thursday, January 15, 2015

Clever Chemist George de Hevesy

George Charles de Hevesy (German: Georg Karl von Hevesy) (1 August 1885 – 5 July 1966) was a Hungarian radiochemist and Nobel laureate, recognized in 1943 for his key role in the development of radioactive tracers to study chemical processes such as in the metabolism of animals. He also co-discovered the element hafnium.

Early Years

Hevesy György was born in Budapest, Hungary, to a wealthy and ennobled Roman Catholic family of Hungarian Jewish descent, the fifth of eight children of his parents Lajos (Louis) Bischitz and Baroness Eugenia (Jenny) Schossberger (ennobled as "De Tornya"). Grandparents from both sides of the family had provided the presidents of the Jewish community of Pest.  George grew up in Budapest and graduated from high school in 1903 at Piarista Gimnázium. The family changed its name to Hevesy-Bischitz in 1904, and Hevesy later changed his own.

De Hevesy began his studies in chemistry at the University of Budapest for one year, and at the Technical University of Berlin for several months, before transferring to the University of Freiburg. There he came in contact with Ludwig Gattermann. In 1906 he started his Ph.D. thesis with Georg Franz Julius Meyer, acquiring his doctorate in physics in 1908, the same year he took a position at the ETH (Eidgenössische Technische Hochschule) in Zurich, Switzerland.


In 1922 de Hevesy co-discovered hafnium (72Hf) with Dirk Coster; the name comes from the Latin Hafnia, for "Copenhagen", the home town of Niels Bohr.  Mendeleev’s periodic table of 1869 put the chemical elements into a logical system, but it lacked an element with 72 protons. On the basis of Bohr's atomic model, Hevesy concluded that a chemical element must exist to fill that place. The mineralogical museum of Norway and Greenland in Copenhagen furnished the material for the research. Characteristic X-ray spectra recorded from the sample indicated that a new element was present. (His 1943 Nobel Prize in Chemistry, however, was awarded for his work on radioactive tracers rather than for the discovery of hafnium.)

Hevesy was offered and accepted a job from the University of Freiburg. Supported financially by the Rockefeller Foundation, he had a very productive year. He developed the X-ray fluorescence analytical method, and discovered the samarium alpha-ray. It was here he began the use of radioactive isotopes in studying the metabolic processes of plants and animals, tracing chemicals in the body by replacing a portion of a stable isotope with small quantities of its radioactive isotope. In 1923, Hevesy published the first study on the use of the naturally radioactive 212Pb as a radioactive tracer to follow absorption and translocation in the roots, stems and leaves of Vicia faba, also known as the broad bean.

World War II and Beyond

When Nazi Germany occupied Denmark from April 1940, during World War II, de Hevesy dissolved the gold Nobel Prize medals of Max von Laue and James Franck in aqua regia; it was illegal at the time to send gold out of Germany, and had it been discovered that Laue and Franck had sent their medals abroad to prevent them from being stolen, they could have faced prosecution in Germany. He placed the resulting solution on a shelf in his laboratory at the Niels Bohr Institute. After the war, he returned to find the solution undisturbed and precipitated the gold out of the acid. The Nobel Foundation then recast the Nobel Prize medals using the original gold.

In 1943 Copenhagen was no longer seen as safe for a Jewish scientist and de Hevesy fled to Sweden, where he worked at the Stockholm University College until 1961. In Stockholm de Hevesy was received at the department of German by the Swedish professor and Nobel Prize winner Hans von Euler-Chelpin, who remained strongly pro-German throughout the war. Despite this, de Hevesy and von Euler-Chelpin collaborated on many scientific papers during and after the war.

During his time in Stockholm, de Hevesy received the Nobel Prize in chemistry. He later was inducted as a member of the Royal Society and received the Copley Medal, of which he was particularly proud. De Hevesy stated: "The public thinks of the Nobel Prize in chemistry as the highest honor that a scientist can receive, but it is not so. Forty or fifty have received Nobel chemistry prizes, but only ten foreign members of the Royal Society and two (Bohr and Hevesy) have received the Copley Medal." George de Hevesy was elected a foreign member of the Royal Swedish Academy of Sciences in 1942, and his status was later changed to Swedish member. He received the Atoms for Peace Award in 1958 for his peaceful use of radioactive isotopes.
