Monday, February 29, 2016

February 29th -- "Leap Day"

February 29, also known as leap day or leap year day, is a date added to most years that are divisible by 4, such as 2008, 2012, 2016, 2020, and 2024. A leap day is added in various solar calendars (calendars based on the Earth's revolution around the Sun), including the Gregorian calendar standard in most of the world. Lunisolar calendars (calendars that also track the phases of the Moon) instead add a leap or intercalary month.

In the Gregorian calendar, years that are divisible by 100, but not by 400, do not contain a leap day. Thus 1700, 1800, and 1900 did not contain a leap day; 2100, 2200, and 2300 will not; while 1600 and 2000 did, and 2400 will. Years containing a leap day are called leap years. February 29 is the 60th day of the Gregorian calendar in such a year, with 306 days remaining until the end of the year. In the Chinese calendar, this day will only occur in years of the monkey, dragon, and rat.

A leap day is observed because a complete revolution around the Sun takes approximately 6 hours longer than 365 days (8,760 hours). It compensates for this lag, realigning the calendar with the Earth's position in the Solar System; otherwise, the seasons would occur at a different time than intended in the calendar year.

Leap Years

Although most modern calendar years have 365 days, a complete revolution around the Sun (one solar year) takes approximately 365 days and 6 hours. An extra 24 hours thus accumulates every four years, requiring that an extra calendar day be added to align the calendar with the Sun's apparent position. Without the added day, in future years the seasons would occur later in the calendar, eventually leading to confusion about when to undertake activities dependent on weather, ecology, or hours of daylight.

A solar year is actually slightly shorter than 365 days and 6 hours (365.25 days). As early as the 13th century it was recognized that the year is shorter than the 365.25 days assumed by the Julian calendar: the Earth's orbital period around the Sun was derived from the medieval Alfonsine tables as 365 days, 5 hours, 49 minutes, and 16 seconds (365.2425 days). The currently accepted modern figure is 365 days, 5 hours, 48 minutes, 45 seconds. Adding a calendar day every four years, therefore, results in an excess of around 44 minutes for those four years, or about 3 days every 400 years. To compensate for this, three days are removed every 400 years. The Gregorian calendar reform implements this adjustment by making an exception to the general rule that there is a leap year every four years. Instead, a year divisible by 100 is not a leap year unless that year is also exactly divisible by 400. This means that the years 1600, 2000, and 2400 are leap years, while the years 1700, 1800, 1900, 2100, 2200, 2300, and 2500 are not leap years.
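The divisibility rule above is easy to express in code. Here is a minimal sketch in Python (the function name `is_leap` is our own; Python's standard library offers the equivalent `calendar.isleap`):

```python
from datetime import date

def is_leap(year: int) -> bool:
    """Gregorian rule: leap years are divisible by 4,
    except century years, which must also be divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The century years named above behave as described:
# 1600, 2000, and 2400 keep their leap day; the others do not.
assert all(is_leap(y) for y in (1600, 2000, 2400))
assert not any(is_leap(y) for y in (1700, 1800, 1900, 2100, 2200, 2300, 2500))

# Removing 3 leap days per 400 years leaves 97, giving the
# Gregorian mean year of 365 + 97/400 = 365.2425 days.
print(sum(is_leap(y) for y in range(2000, 2400)))  # 97

# In a leap year, February 29 is indeed the 60th day of the year.
print(date(2016, 2, 29).timetuple().tm_yday)  # 60
```

The 97/400 ratio is why the Gregorian mean year (365.2425 days) sits so close to the measured solar year of 365 days, 5 hours, 48 minutes, 45 seconds.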

Leap Second

The concepts of the leap year and leap day are distinct from the leap second, which results from changes in the Earth's rotational speed. But the basic problem is the same: the quotient of the larger measure of time by the smaller is a non-integer. There is no way to perfectly fit a whole number of days/months into a year, nor is there a way to perfectly fit a whole number of seconds into a day. Leap seconds and leap years are used to correct the resulting drift.


A person who is born on February 29 may be called a "leapling" or a "leap-year baby". In non-leap years, some leaplings celebrate their birthday on either February 28 or March 1, while others only observe birthdays on the authentic intercalary date, February 29.

Folk Traditions

There is a popular tradition known as Bachelor's Day in some countries allowing a woman to propose marriage to a man on February 29. If the man refuses, he is then obliged to give the woman money or buy her a dress. In upper-class societies in Europe, if the man refuses marriage, he must then purchase 12 pairs of gloves for the woman, suggesting that the gloves are to hide the woman's embarrassment at not having an engagement ring. In Ireland, the tradition is supposed to originate from a deal that Saint Brigid struck with Saint Patrick.

In the town of Aurora, Illinois, single women are deputized and may arrest single men, subject to a four-dollar fine, every February 29.

In Greece, it is considered unlucky to marry on a leap day.

Sunday, February 28, 2016

Slower American Growth


American economic growth is very likely to slow for decades into the future, in large part because the period of spectacular gains has already occurred, with industrialization and technology from 1870 to 1970. Robert J. Gordon details his case for inevitable slow growth in a new book, The Rise and Fall of American Growth, published last month.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

The Rise and Fall of American Growth: The U.S. Standard of Living since the Civil War (The Princeton Economic History of the Western World) Hardcover – January 12, 2016

by Robert J. Gordon

= = = = = = customer review: = = = = = = =

5 Stars
A profoundly important book and a great read
By Mark Witte on January 6, 2016

I think of Rashomon; a gripping story told from a series of different angles, each reinforcing but also changing our perspective on what we learned in the others. Gordon breaks America's unprecedented rise in quality of life into two periods, 1870-1940 and 1940-2015. For the first period and then the second, he lays out a series of chapters, each documenting one area of the technological change that re-shaped the lives Americans lived. In the 1870-1940 period, these include food, clothes, lighting, communication, entertainment, transportation, healthcare, and finance. In the 1940-2015 period, the parallel stories of improvement are some repeats (food and clothes, transportation, entertainment, and healthcare), but also the way computers have worked their way into our lives and how a large share of modern people have to plan for and live lives that extend far beyond the end of their working years. This is all presented with a readable mix of clear data analysis and wonderful supporting anecdotes.

This book is something of a spin-off from Gordon’s influential work on the slowing of growth in the US. His “headwinds” argument is roughly that even if the US were to keep up the same rate of technological progress going forward as we saw over the decades from 1987-2007, the US would still see markedly slower income growth for average households due to headwinds: an aging population, rising inequality, limited further educational gains, reductions in the amount of CO2 emissions we will allow, etc. The objection commonly raised to his pessimism is that maybe technological growth going forward will vastly outshine what the US enjoyed from 1987-2007 (which, aside from the rise of the commercial internet, home computing, cellphones, and advances in pharma, one might argue wasn’t that great a period of change, right?). So, sure, technological growth going forward might be much greater than what we’ve seen so far (Joel Mokyr makes this case eloquently), but the 1987-2007 benchmark is a hard one to beat. Supporting this view, Gordon points out how the changes that brought the huge improvements in quality of life in the past will be difficult to approach in the future. Sure, Skype is great and cellphones are convenient, but their introduction did far less for human happiness than the introduction of the phone. Granted, it would be nice to be able to get from New York to Tokyo in eight hours, but that gain would be nowhere near the boon we gained from moving from ships to airplanes. Certainly most of us would be glad to live to 120, but the gain in those years won’t bring us nearly the joy that the rise in life expectancy we saw, from 42 in 1890 to 62 in 1940, did.

Nevertheless, we will always continue to hope for breakthroughs, and this book is an example of that. Gordon has written a timeless classic.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

Margaret Wente wrote a noteworthy review of The Rise and Fall of American Growth for the [Toronto] Globe and Mail at:

Saturday, February 27, 2016

The Kid from Kokomo

The Kid from Kokomo is a 1939 American comedy film directed by Lewis Seiler and written by Richard Macaulay and Jerry Wald. The film stars Pat O'Brien, Wayne Morris, Joan Blondell, May Robson, Jane Wyman and Stanley Fields. The film was released by Warner Bros. on May 23, 1939.

In need of a new prizefighter, manager Billy Murphy and his sweetheart Doris Harvey come across one in Kokomo, Indiana, a kid called Homer Baston who's got great potential. The kid's a little dim, however, explaining how he can't leave Kokomo because his mother abandoned him as a baby but promised to come back.

Billy and Doris convince him to go on the road, where Homer will have a better chance of finding his long-missing mother. Homer gets homesick, so Billy pays the bail of a thief, Maggie Manell, hiring her to pretend to be Homer's ma. She begins spending most of Homer's money, and Billy's scheme to bring in her pal Muscles Malone backfires when Homer's led to believe Muscles is his dad.

While falling for Marian Bronson, a reporter, Homer trains for a title fight against Curley Bender, but is convinced by "Ma" to lose on purpose because she owes money to gamblers. In the ring, Curley insults his mother, so Homer knocks him out. Billy and Doris look on as a double wedding is held, Homer marrying Marian while a reluctant Maggie and Muscles do likewise, becoming his new foster parents.

Afterword by the Blog Author

This is a great, well-acted, short, stunningly hilarious movie. The screwball comedy of mistaken identity is a good form, and the dialog is snappy, energizing and so witty it verges on shock humor. Highly recommended.

Friday, February 26, 2016

Age Means Sensory Loss

Common Problem for Older Adults: Losing the 5 Key Senses
American Geriatrics Society, February 19, 2016

It's a well-known fact that aging can lead to losing one's senses: vision, smell, hearing, touch, and taste. In previous studies, researchers have learned about the consequences of experiencing a decline in a single sense. For example, losing senses of smell, vision, and hearing have all been linked to cognitive decline, poor mental health, and increased mortality. Losing the sense of taste can lead to poor nutrition and even death in certain instances. However, until now little has been known about losing multiple senses. In a new study, researchers examined how often multisensory losses occur and what their impact on older adults might be.

In a study published in the Journal of the American Geriatrics Society, University of Chicago researchers analyzed data from the National Social Life, Health, and Aging Project (NSHAP), a population-based study of adults ages 57-85. The study collected information about the participants' senses of vision, touch, smell, hearing, and taste. The participants were also asked to rate their physical health.

The researchers reported several key findings:

  • 94 percent of the participants experienced loss in at least one of their senses; 67 percent had two or more sensory losses. Of those with multisensory losses, 65 percent had substantial loss in at least one of their senses, and 22 percent experienced substantial loss in two or more senses.
  • 74 percent of participants suffered impairment in their ability to taste, which was the most common sensory loss.
  • 38 percent of participants had a sense of touch that was "fair;" 32 percent said it was "poor."
  • 22 percent had smell impairment (19 percent fair/3 percent poor function).
  • 14 percent had corrected distance vision that was "fair;" 6 percent said it was "poor."
  • 13 percent rated their corrected hearing as "fair;" 5 percent said it was "poor."

Older age was linked to poorer function in all five senses; the largest differences were in hearing, vision, and smell. What's more, men had worse functioning for hearing, smell, and taste than did women--although men had better corrected vision than women. African Americans and Hispanics tended to have worse sensory function than Caucasians in all senses except hearing. Hispanics tended to have better function in taste than those from other groups.

The researchers said that losing more than one sense might explain why older adults report having a poorer quality of life and face challenges in interacting with other people and the world around them. The researchers suggested that further studies into multisensory loss hold promise for designing better programs to prevent or treat loss and to ease the suffering such losses cause.

Thursday, February 25, 2016

Pronoia -- the useful conspiracy theory

Pronoia is a neologism that is defined as the opposite state of mind to paranoia: having the sense that there is a conspiracy that exists to help the person. It is also used to describe a philosophy that the world is set up to secretly benefit people.

In 2008 the writer and Electronic Frontier Foundation co-founder John Perry Barlow defined pronoia as "the suspicion the Universe is a conspiracy on your behalf".


The concept may have first appeared in 1982, when the academic journal Social Problems published an article entitled "Pronoia" by Dr. Fred H. Goldner of Queens College describing a phenomenon opposite to paranoia and providing numerous examples of specific persons who displayed such characteristics. It received a good deal of publicity at the time, including references in Psychology Today, The New York Daily News, The Wall Street Journal, etc.

It was subsequently picked up in England and written about as described below. Wired Magazine published an article in issue 2.05 (May 1994) titled "Zippie!". The cover of the magazine featured a psychedelic image of a smiling young man with wild hair, a funny hat, and crazy eyeglasses. Written by Jules Marshall, the article announced an organized cultural response to Thatcherism in the United Kingdom. The opening paragraphs of the article describe "a new and contagious cultural virus" and refer to pronoia as "the sneaking feeling one has that others are conspiring behind your back to help you". The article announces a cultural and musical invasion of the United States to rival the British Invasion of 1964-1966, culminating with a "Woodstock Revival" to be staged at the Grand Canyon in August 1994. The spokesperson for the Zippies, Fraser Clark, dubbed this movement the "Zippy Pronoia Tour".

A New York Times article published August 7, 1994, titled "For Peace and Love, Try Raving Till Dawn" also described the Zippies and their efforts. It contained two references to pronoia.

Pronoia Philosophy

Pronoia can be defined as the opposite of paranoia. A person suffering from paranoia suspects that persons or entities (e.g. governments/deities) conspire against them. A person experiencing pronoia feels that the world around them conspires to do them good.

The writer Philip K. Dick referred to pronoia as an antidote to paranoia in his private work, Exegesis, in which it is mentioned in relation to his perceived protection by an entity he called V.A.L.I.S., an acronym for Vast Active Living Intelligence System. Published posthumously in 2011, the Exegesis first uses the word 'pronoia' in January 1980. Dick suggested his own pronoia was based on an 'intelligent analysis' of his mystical experiences, and was not 'reflexive or mechanical' in its nature.

Pronoia is also a prevalent theme in the 1988 novel The Alchemist, by Paulo Coelho. In it, the protagonist, a young boy, is told by an older man to pursue his dreams. He tells the boy, "When you want something, all the universe conspires in helping you to achieve it." The book also deals with omens, signs that the universe wants the boy to follow a specific path, which will lead to his goal of fulfilling a dream.

In the 1997 movie Fierce Creatures, Jamie Lee Curtis's character describes Kevin Kline's character as excessively pronoid: "It means that despite all the available evidence, you actually think that people like you. Your perception of life is that it is one long benefit dinner in your honor, with everyone cheering you on and wanting you to win everything. You think you're the prince, Vince."

Pronoia philosophy has been proposed by the astrologer Rob Brezsny. Brezsny's book, Pronoia Is the Antidote for Paranoia: How the Whole World Is Conspiring to Shower You with Blessings, published in 2005, explores the philosophy of pronoia.

Pronoia Affliction

Dr. Fred H. Goldner, writing at Queens College in October 1982, published a paper in Social Problems (V.30, N.1:82-91) in which he used the term pronoia to describe a psychological affliction. He characterized pronoia as a mirror image of paranoia, the affliction that leads the sufferer to unrealistically believe that persons or entities conspire against them.

Pronoia is the positive counterpart of paranoia. It is the delusion that others think well of one. Actions and the products of one's efforts are thought to be well received and praised by others. Mere acquaintances are thought to be close friends; politeness and the exchange of pleasantries are taken as expressions of deep attachment and the promise of future support. Pronoia appears rooted in the social complexity and cultural ambiguity of our lives: we have become increasingly dependent on the opinions of others based on uncertain criteria.

Long before the term was coined, J.D. Salinger referred to the concept in his 1955 novella, Raise High the Roof Beam, Carpenters. In it, the character Seymour Glass writes in his diary, "Oh, God, if I'm anything by a clinical name, I'm a kind of paranoiac in reverse. I suspect people of plotting to make me happy."

Other Voices

The Outsourced Zippy and Pronoia Page claims that the Zippie/Pronoia phenomenon is simply a hoax and that the Zippy Pronoia Tour was nothing more than a publicity stunt for Wired Magazine. It seems that pronoia, aside from its older sense as a system of land provisions, took on a practical psychological and linguistic role during the mid-twentieth century. Various people, such as John Perry Barlow (Grateful Dead) and Jules Marshall (Mediamatic), attributed pronoia to "an open conspiracy" involving Wired Magazine. In this modern sense, pronoia is no longer an ideology about land and property but the opposite of paranoia: "the belief that others are conspiring behind your back to help you", or "the idea that the universe is a conspiracy on your behalf".

Actress Susan Sarandon, speaking about the Cloud Atlas movie, referenced pronoia, the "universe being for you", as the opposite of a noted line from Catch-22 by Joseph Heller: "Just because you're paranoid doesn't mean they aren't after you."

Wednesday, February 24, 2016

James Webb Space Telescope

The James Webb Space Telescope (JWST), previously known as Next Generation Space Telescope (NGST), is a flagship-class space observatory under construction and scheduled to launch in October 2018. The JWST will offer unprecedented resolution and sensitivity from long-wavelength (orange-red) visible light, through near-infrared to the mid-infrared, and is a successor instrument to the Hubble Space Telescope and the Spitzer Space Telescope. While Hubble has a 2.4-meter (7.9 ft) mirror, the JWST features a larger and segmented 6.5-meter (21 ft) diameter primary mirror and will be located near the Earth–Sun L2 point. A large sunshield will keep its mirror and four science instruments below 50 K (−220 °C; −370 °F).

JWST's capabilities will enable a broad range of investigations across the fields of astronomy and cosmology. One particular goal involves observing some of the most distant objects in the Universe, beyond the reach of current ground and space based instruments. This includes the very first stars, the epoch of reionization, and the formation of the first galaxies. Another goal is understanding the formation of stars and planets. This will include imaging molecular clouds and star-forming clusters, studying the debris disks around stars, direct imaging of exoplanets, and spectroscopic examination of planetary transits.

In gestation since 1996, the project represents an international collaboration of about 17 countries led by NASA, and with significant contributions from the European Space Agency and the Canadian Space Agency. It is named after James E. Webb, the second administrator of NASA, who played an integral role in the Apollo program.

The JWST has a history of major cost overruns and delays. The first realistic budget estimates were that the observatory would cost $1.6 billion and launch in 2011. NASA has now scheduled the telescope for a 2018 launch. In 2011, the United States House of Representatives voted to terminate funding, after about $3 billion had been spent and 75% of its hardware was in production.  Funding was restored in compromise legislation with the US Senate, and spending on the program was capped at $8 billion.  As of December 2014, the telescope remained on schedule and within budget, but at risk of further delays.


The JWST originated in 1996 as the Next Generation Space Telescope (NGST). In 2002 it was renamed after NASA's second administrator (1961–1968) James E. Webb (1906–1992), noted for playing a key role in the Apollo program and establishing scientific research as a core NASA activity.  The JWST is a project of the National Aeronautics and Space Administration, the United States space agency, with international collaboration from the European Space Agency and the Canadian Space Agency.

The telescope has an expected mass about half of Hubble Space Telescope's, but its primary mirror (a 6.5 meter diameter gold-coated beryllium reflector) will have a collecting area about five times as large (25 m² vs. 4.5 m²). The JWST is oriented towards near-infrared astronomy, but can also see orange and red visible light, as well as the mid-infrared region, depending on the instrument. The telescope will focus on the near to mid-infrared for three main reasons: high-redshift objects have their visible emissions shifted into the infrared, cold objects such as debris disks and planets emit most strongly in the infrared, and this band is difficult to study from the ground or by existing space telescopes such as Hubble.

The JWST will operate near the Earth-Sun L2 Lagrange point, approximately 1,500,000 kilometres (930,000 mi) beyond the Earth. Objects near this point can orbit the Sun in synchrony with the Earth, allowing the telescope to remain at a roughly constant distance and use a single sunshield to block heat and light from the Sun and Earth. This will keep the temperature of the spacecraft below 50 K (−220 °C; −370 °F), necessary for infrared observations.

Launch is scheduled for October 2018 on an Ariane 5 rocket. Its nominal mission time is five years, with a goal of ten years.  The prime contractor is Northrop Grumman.

See also this seven minute video at:

Tuesday, February 23, 2016

Sleep Locks in Memories

Brain Activity Patterns During Sleep Consolidate Memory
University of Bristol, February 19, 2016

Why does sleeping on it help? This is the question tackled by new research at the University of Bristol, which reveals how brain activity during sleep sorts through the huge number of experiences we encounter every day, filing only the important information in memory.

The new discoveries, made by researchers from Bristol’s Centre for Synaptic Plasticity, provide further evidence for the benefits of a good night’s sleep. This is important because bad nights of sleep, often experienced both by the healthy population and by people with schizophrenia or Alzheimer’s disease, lead to impaired mental function.

The findings, published today in the journal Cell Reports, and put into context in an article in Trends in Neuroscience, show that patterns of brain activity that occur during the day are replayed at fast-forward speed during sleep.

This replayed activity happens in part of the brain called the hippocampus, which is our central filing system for memories.  The key new finding is that sleep replay strengthens the microscopic connections between nerve cells that are active – a process deemed critical for consolidating memories. Therefore, by selecting which daytime activity patterns are replayed, sleep can sort and retain important information. 

Lead researcher Dr Jack Mellor, from the School of Physiology, Pharmacology and Neuroscience, said: ‘These findings are about the fundamental processes that occur in the brain during the consolidation of memory during sleep. It also seems that the successful replay of brain activity during sleep is dependent on the emotional state of the person when they are learning. This has major implications for how we teach and enable people to learn effectively.’

The research team involved the University of Bristol’s Centre for Synaptic Plasticity within the School of Physiology, Pharmacology & Neuroscience and was supported by the MRC, Wellcome Trust, EPSRC and Eli Lilly & Co.

Monday, February 22, 2016

Body Organs May Have a Sexual Identity

A Deeper Take on our Sexual Nature
Date:     February 17, 2016
Source: MRC Clinical Sciences Centre/Institute of Clinical Sciences (ICS), Faculty of Medicine, Imperial College London

The organs in our body may have a sexual identity of their own, new research suggests. The idea that our organs could be "male" or "female" raises the possibility that women and men may need different treatments as a result. The findings could also shed light on why it is that some cancers are more common in women, and others in men.

The study, published today in Nature, was carried out in fruit flies by a team at the MRC Clinical Sciences Centre (CSC), based at Imperial College London.

"We wanted to ask a very basic question: whether it is just the cells of the sex organs of a fully developed organism that 'know' their sexual identity, or whether this is true of cells in other organs too -- and whether that matters," said Irene Miguel-Aliaga, who led the research and heads the Gut Signalling and Metabolism group at the CSC.

To do this, the CSC team examined stem cells in flies' intestines. They used genetic tools that allow them to turn genes "on" and "off" specifically in these cells. This allowed them to tailor the cells to be more "female" or more "male." When the team feminised or masculinised the flies' gut stem cells this changed the extent to which the cells multiplied. Female, or feminised cells were better able to proliferate.

This enhanced ability appears to allow the intestine in female flies to grow during reproduction. Miguel-Aliaga has previously shown that after mating, the female fly gut is re-sized and metabolically remodelled to sustain reproduction. She speculated at the time that if such enlargement also helps to ensure optimum nutrition for a developing fetus in humans, this may help to explain why women don't need to "eat for two" during pregnancy.

In the current study, the team found that the effect of feminising adult gut stem cells was reversible. "If we take a female fly and then in the adult we masculinise the stem cells in the intestine and wait, within three weeks the gut shrinks to the smaller, male-like size," said CSC and EMBO* postdoc Bruno Hudry, first author on today's paper.

The team also found that the female intestine was more prone to tumours. "We find it's a lot easier to create genetically-induced tumours in females than in males. So we suspect there is a trade-off going on. Females need this increased plasticity to cope with reproduction, but in certain circumstances that can be deleterious and make the female gut more prone to tumours."

It was known that the gonads, or sex organs, of vertebrates retain considerable plasticity: adult ovary and adult testis cells in mice can trans-differentiate into their counterparts following just a single genetic change. So cells in the gonads must have their sexual identity continuously reinforced throughout their postnatal life.

The team believes this to be the first time, however, that such plasticity has been demonstrated in adult cells outside the gonads. "If we now take the fly in which we've masculinised the stem cells, and mate this fly, we see that its gut is not re-sized in response to reproduction. So we think that what the stem cell sexual identity is doing is to confer differential plasticity on the female gut," Hudry said.

It may be obvious that males and females develop sex-specific organs as a fetus develops, but according to Miguel-Aliaga there has been an assumption that organs that are the same in both sexes function differently only because of different circulating hormones -- for example, the oestrogen in women and testosterone in men that kick-in at puberty.

What's surprising is that in an adult organ found in both sexes, such as the intestine, differences remain, and that these are not due to either developmental history or circulating hormones. The benefit of using a fly model is that the team can masculinise or feminise individual adult stem cells and explore the outcome without changing the fly's developmental history or its circulating hormones.

In the course of this research, the team identified a potentially important new mechanism behind this sex switching, which they suggest may be operating in more of the organs and tissues of our bodies than has previously been recognised. Miguel-Aliaga sees this as an important finding of today's paper.

The formation of female or male characteristics involves a cascade of genetic events. "We saw that the sex determinants at the top of this cascade were active, but parts of the cascade that had previously been shown to be active, for example in the sex organs or during development, did not function in these stem cells. This told us a new branch of the sex determination pathway is at play."

Until now, it has been assumed that the only cells with a sexual identity are those in which this recognised cascade is active. "We have found a new mechanism that is independent of this, which potentially means that every cell in the fly has a sexual identity."

According to Miguel-Aliaga this raises the intriguing possibility that cells in many more fly organs than previously assumed may have their own sexual identity, and that this might be the case in people too.

"We want to know what this new branch is all about. We have found three genes that are important in the gut stem cells, and are intrigued to know whether these three genes play a similar role in cells outside of the gut in other body tissues as well." The team is keen to examine these three genes further, not least because people carry genes that are analogous to the three they've now highlighted as important.

Dr Des Walsh, head of the population and systems medicine board at the MRC, said: "This study is an interesting piece of biological research that extends our understanding of why male and female physiology is different, beyond the obvious.

"Further research is now needed to see how this finding translates to humans. If this intrinsic knowledge held by stem cells is indeed driving the way our organs behave, it could also influence the way these same organs respond to treatment."

The research was supported by the CSC Genomics and Bioinformatics facilities, and funded by the European Research Council and the Medical Research Council.

               *European Molecular Biology Organisation.

Sunday, February 21, 2016

11 American Cultures

A study from Tufts University written by Colin Woodard gives excellent coverage of the eleven cultural regions in America. My only criticism is that there is a tiny twelfth region: federal districts entirely dependent on the central government and obsequiously seeking funding. These include Washington, D.C., Hawaii, and federal territories such as Guam and the Pacific islands held in trust.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =


YANKEEDOM. Founded on the shores of Massachusetts Bay by radical Calvinists as a new Zion, Yankeedom has, since the outset, put great emphasis on perfecting earthly civilization through social engineering, denial of self for the common good, and assimilation of outsiders. It has prized education, intellectual achievement, communal empowerment, and broad citizen participation in politics and government, the latter seen as the public’s shield against the machinations of grasping aristocrats and other would-be tyrants. Since the early Puritans, it has been more comfortable with government regulation and public-sector social projects than many of the other nations, who regard the Yankee utopian streak with trepidation.

NEW NETHERLAND. Established by the Dutch at a time when the Netherlands was the most sophisticated society in the Western world, New Netherland has always been a global commercial culture—materialistic, with a profound tolerance for ethnic and religious diversity and an unflinching commitment to the freedom of inquiry and conscience. Like seventeenth-century Amsterdam, it emerged as a center of publishing, trade, and finance, a magnet for immigrants, and a refuge for those persecuted by other regional cultures, from Sephardim in the seventeenth century to gays, feminists, and bohemians in the early twentieth. Unconcerned with great moral questions, it nonetheless has found itself in alliance with Yankeedom to defend public institutions and reject evangelical prescriptions for individual behavior.

THE MIDLANDS. America’s great swing region was founded by English Quakers, who believed in humans’ inherent goodness and welcomed people of many nations and creeds to their utopian colonies like Pennsylvania on the shores of Delaware Bay. Pluralistic and organized around the middle class, the Midlands spawned the culture of Middle America and the Heartland, where ethnic and ideological purity have never been a priority, government has been seen as an unwelcome intrusion, and political opinion has been moderate. An ethnic mosaic from the start—it had a German, rather than British, majority at the time of the Revolution—it shares the Yankee belief that society should be organized to benefit ordinary people, though it rejects top-down government intervention.

TIDEWATER. Built by the younger sons of southern English gentry in the Chesapeake country and neighboring sections of Delaware and North Carolina, Tidewater was meant to reproduce the semifeudal society of the countryside they’d left behind. Standing in for the peasantry were indentured servants and, later, slaves. Tidewater places a high value on respect for authority and tradition, and very little on equality or public participation in politics. It was the most powerful of the American nations in the eighteenth century, but today it is in decline, partly because it was cut off from westward expansion by its boisterous Appalachian neighbors and, more recently, because it has been eaten away by the expanding federal halos around D.C. and Norfolk.

GREATER APPALACHIA. Founded in the early eighteenth century by wave upon wave of settlers from the war-ravaged borderlands of Northern Ireland, northern England, and the Scottish lowlands, Appalachia has been lampooned by writers and screenwriters as the home of hillbillies and rednecks. It transplanted a culture formed in a state of near constant danger and upheaval, characterized by a warrior ethic and a commitment to personal sovereignty and individual liberty. Intensely suspicious of lowland aristocrats and Yankee social engineers alike, Greater Appalachia has shifted alliances depending on who appeared to be the greatest threat to their freedom. It was with the Union in the Civil War. Since Reconstruction, and especially since the upheavals of the 1960s, it has joined with Deep South to counter federal overrides of local preference.

DEEP SOUTH. Established by English slave lords from Barbados, Deep South was meant as a West Indies–style slave society. This nation offered a version of classical Republicanism modeled on the slave states of the ancient world, where democracy was the privilege of the few and enslavement the natural lot of the many. Its caste systems smashed by outside intervention, it continues to fight against expanded federal powers, taxes on capital and the wealthy, and environmental, labor, and consumer regulations.

EL NORTE. The oldest of the American nations, El Norte consists of the borderlands of the Spanish American empire, which were so far from the seats of power in Mexico City and Madrid that they evolved their own characteristics. Most Americans are aware of El Norte as a place apart, where Hispanic language, culture, and societal norms dominate. But few realize that among Mexicans, norteños have a reputation for being exceptionally independent, self-sufficient, adaptable, and focused on work. Long a hotbed of democratic reform and revolutionary settlement, the region encompasses parts of Mexico that have tried to secede in order to form independent buffer states between their mother country and the United States.

THE LEFT COAST. A Chile-shaped nation wedged between the Pacific Ocean and the Cascade and Coast mountains, the Left Coast was originally colonized by two groups: New Englanders (merchants, missionaries, and woodsmen who arrived by sea and dominated the towns) and Appalachian midwesterners (farmers, prospectors, and fur traders who generally arrived by wagon and controlled the countryside). Yankee missionaries tried to make it a “New England on the Pacific,” but were only partially successful. Left Coast culture is a hybrid of Yankee utopianism and Appalachian self-expression and exploration—traits recognizable in its cultural production, from the Summer of Love to the iPad. The staunchest ally of Yankeedom, it clashes with Far Western sections in the interior of its home states.

THE FAR WEST. The other “second-generation” nation, the Far West occupies the one part of the continent shaped more by environmental factors than ethnographic ones. High, dry, and remote, the Far West stopped migrating easterners in their tracks, and most of it could be made habitable only with the deployment of vast industrial resources: railroads, heavy mining equipment, ore smelters, dams, and irrigation systems. As a result, settlement was largely directed by corporations headquartered in distant New York, Boston, Chicago, or San Francisco, or by the federal government, which controlled much of the land. The Far West’s people are often resentful of their dependent status, feeling that they have been exploited as an internal colony for the benefit of the seaboard nations. Their senators led the fight against trusts in the mid-twentieth century. Of late, Far Westerners have focused their anger on the federal government, rather than their corporate masters.

NEW FRANCE. Occupying the New Orleans area and southeastern Canada, New France blends the folkways of ancien régime northern French peasantry with the traditions and values of the aboriginal people they encountered in northeastern North America. After a long history of imperial oppression, its people have emerged as down-to-earth, egalitarian, and consensus driven, among the most liberal on the continent, with unusually tolerant attitudes toward gays and people of all races and a ready acceptance of government involvement in the economy. The New French influence is manifest in Canada, where multiculturalism and negotiated consensus are treasured.

FIRST NATION. First Nation is populated by Native American groups that generally never gave up their land by treaty and have largely retained cultural practices and knowledge that allow them to survive in this hostile region on their own terms. The nation is now reclaiming its sovereignty, having won considerable autonomy in Alaska and Nunavut and a self-governing nation state in Greenland that stands on the threshold of full independence. Its territory is huge—far larger than the continental United States—but its population is less than 300,000, most of whom live in Canada.

Saturday, February 20, 2016

Fine-Structure Constant


With a relative uncertainty of about 0.23 parts per billion, the fine-structure constant is approximately 7.2973525664(17) × 10⁻³.  It is often stated as its reciprocal, which is 137.035999139(31).  These figures come from experimental observation and calculation; the fine-structure constant has never been derived from a mathematical formula, so it retains an element of theoretical mystery for physicists.  This is compounded by the contention of some physicists, based on observations made with powerful current telescopes, that the constant was slightly different billions of years ago.
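As a rough illustration of where these figures come from, α can be computed from measured fundamental constants via α = e²/(4πε₀ħc). The numbers below are 2014 CODATA values taken as given inputs, not results derived here:

```python
import math

e    = 1.6021766208e-19   # elementary charge, C (CODATA 2014)
hbar = 1.054571800e-34    # reduced Planck constant, J*s
c    = 299792458.0        # speed of light, m/s (exact by definition)
eps0 = 8.854187817e-12    # vacuum permittivity, F/m

# Fine-structure constant: alpha = e^2 / (4*pi*eps0*hbar*c)
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)

print(f"alpha   = {alpha:.10e}")    # ~7.2973525e-3
print(f"1/alpha = {1/alpha:.6f}")   # ~137.036
```

Note that this is circular as a matter of physics: e, ħ, and ε₀ are themselves determined in part from measurements of α, which is why α has no independent theoretical derivation.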

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =


Arnold Sommerfeld introduced the fine-structure constant in 1916, as part of his theory of the relativistic deviations of atomic spectral lines from the predictions of the Bohr model. The first physical interpretation of the fine-structure constant α was as the ratio of the velocity of the electron in the first circular orbit of the relativistic Bohr atom to the speed of light in vacuum. Equivalently, it was the quotient between the minimum angular momentum allowed by relativity for a closed orbit and the minimum angular momentum allowed for it by quantum mechanics. It appears naturally in Sommerfeld's analysis and determines the size of the splitting, or fine structure, of the hydrogenic spectral lines.
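Sommerfeld's interpretation is easy to put in numbers: in the Bohr model the electron's speed in the first orbit is v₁ = αc, so α directly measures how relativistic the hydrogen atom is. A quick sketch, taking the modern value of α as given:

```python
# Sommerfeld's interpretation: the electron's speed in the first Bohr orbit
# is v1 = alpha * c, so alpha sets the scale of relativistic corrections.
alpha = 7.2973525664e-3   # CODATA 2014 value, taken as given
c     = 299792458.0       # speed of light, m/s

v1 = alpha * c
print(f"v1 = {v1:.4e} m/s")               # ~2.19e6 m/s
print(f"v1 is {100 * alpha:.2f}% of c")   # under 1% of light speed
```

Because v₁/c is less than 1%, relativistic effects in hydrogen are small, which is exactly why they show up only as the fine splitting of spectral lines rather than gross shifts.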

Is the Fine-Structure Constant Actually Constant?

Physicists have pondered whether the fine-structure constant is in fact constant, or whether its value differs by location and over time. A varying α has been proposed as a way of solving problems in cosmology and astrophysics.  String theory and other proposals for going beyond the Standard Model of particle physics have led to theoretical interest in whether the accepted physical constants (not just α) actually vary.

Feynman on the Fine-Structure Constant

Richard Feynman, one of the originators and early developers of the theory of quantum electrodynamics (QED), referred to the fine-structure constant in these terms:

There is a most profound and beautiful question associated with the observed coupling constant, e – the amplitude for a real electron to emit or absorb a real photon. It is a simple number that has been experimentally determined to be close to 0.08542455. (My physicist friends won't recognize this number, because they like to remember it as the inverse of its square: about 137.03597 with an uncertainty of about 2 in the last decimal place. It has been a mystery ever since it was discovered more than fifty years ago, and all good theoretical physicists put this number up on their wall and worry about it.) Immediately you would like to know where this number for a coupling comes from: is it related to pi or perhaps to the base of natural logarithms? Nobody knows. It's one of the greatest damn mysteries of physics: a magic number that comes to us with no understanding by man. You might say the "hand of God" wrote that number, and "we don't know how He pushed his pencil." We know what kind of a dance to do experimentally to measure this number very accurately, but we don't know what kind of dance to do on the computer to make this number come out, without putting it in secretly!

—  Richard P. Feynman (1985). QED: The Strange Theory of Light and Matter. Princeton University Press. p. 129. ISBN 0-691-08388-6.
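Feynman's "simple number" 0.08542455 is the square root of α: squaring it and taking the reciprocal recovers the familiar 137.036. A quick sanity check, not a derivation:

```python
# Feynman quotes the coupling amplitude, which is sqrt(alpha);
# the inverse of its square should give back ~137.036.
feynman_number = 0.08542455
inverse_alpha  = 1 / feynman_number**2

print(f"1 / {feynman_number}^2 = {inverse_alpha:.5f}")  # close to Feynman's 137.03597
```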

Friday, February 19, 2016

Awesome Computer Game

Not just programming, not just exceptional programming – very great programming on a massive scale.  From programmers who both know and admit that computers can’t generate real random numbers.  This is an awesome example of futuristic game theory.  Make thou some fresh coffee and then read this article…
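The point about computers and randomness is easy to demonstrate: a software "random" number generator is a deterministic algorithm, so two generators started from the same seed emit identical sequences. A minimal sketch in Python:

```python
import random

# Two generators seeded identically produce identical "random" sequences --
# the output is fully determined by the seed, not truly random.
a = random.Random(42)
b = random.Random(42)

seq_a = [a.randint(0, 99) for _ in range(5)]
seq_b = [b.randint(0, 99) for _ in range(5)]

print(seq_a == seq_b)  # True
```

Genuinely unpredictable values require an external entropy source (hardware noise, OS entropy pools such as `os.urandom`), which is why programmers say computers can't generate real random numbers on their own.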

Thursday, February 18, 2016

Atheism Is Ancient -- and Sturdy

Disbelieve It or Not, Ancient History Suggests that Atheism Is as Natural to Humans as Religion
Tom Kirk, University of Cambridge, February 16, 2016

People in the ancient world did not always believe in the gods, a new study suggests – casting doubt on the idea that religious belief is a “default setting” for humans.

Despite being written out of large parts of history, atheists thrived in the polytheistic societies of the ancient world – raising considerable doubts about whether humans really are “wired” for religion – a new study suggests.

The claim is the central proposition of a new book by Tim Whitmarsh, Professor of Greek Culture and a Fellow of St John’s College, University of Cambridge. In it, he suggests that atheism – which is typically seen as a modern phenomenon – was not just common in ancient Greece and pre-Christian Rome, but probably flourished more in those societies than in most civilisations since.

As a result, the study challenges two assumptions that prop up current debates between atheists and believers: first, the idea that atheism is a modern point of view, and second, the idea of "religious universalism" – that humans are naturally predisposed, or "wired", to believe in gods.

The book, entitled Battling The Gods, is being launched in Cambridge on Tuesday (February 16).

“We tend to see atheism as an idea that has only recently emerged in secular Western societies,” Whitmarsh said. “The rhetoric used to describe it is hyper-modern. In fact, early societies were far more capable than many since of containing atheism within the spectrum of what they considered normal.”

“Rather than making judgements based on scientific reason, these early atheists were making what seem to be universal objections about the paradoxical nature of religion – the fact that it asks you to accept things that aren’t intuitively there in your world. The fact that this was happening thousands of years ago suggests that forms of disbelief can exist in all cultures, and probably always have.”

The book argues that disbelief is actually “as old as the hills”. Early examples, such as the atheistic writings of Xenophanes of Colophon (c.570-475 BCE) are contemporary with Second Temple-era Judaism, and significantly predate Christianity and Islam. Even Plato, writing in the 4th Century BCE, said that contemporary non-believers were “not the first to have had this view about the gods.”

Because atheism’s ancient history has largely gone unwritten, however, Whitmarsh suggests that it is also absent from both sides of the current monotheist/atheist debate.  While atheists depict religion as something from an earlier, more primitive stage of human development, the idea of religious universalism is also built partly on the notion that early societies were religious by nature because to believe in god is an inherent, “default setting” for humans.

Neither perspective is true, Whitmarsh suggests: “Believers talk about atheism as if it’s a pathology of a particularly odd phase of modern Western culture that will pass, but if you ask someone to think hard, clearly people also thought this way in antiquity.”

His book surveys one thousand years of ancient history to prove the point, teasing out the various forms of disbelief expressed by philosophical movements, writers and public figures.

These were made possible in particular by the fundamental diversity of polytheistic Greek societies. Between 650 and 323 BCE, Greece had an estimated 1,200 separate city states, each with its own customs, traditions and governance. Religion expressed this variety, as a matter of private cults, village rituals and city festivals dedicated to numerous divine entities.

This meant that there was no such thing as religious orthodoxy. The closest the Greeks got to a unifying sacred text were Homer’s epics, which offered no coherent moral vision of the gods, and indeed often portrayed them as immoral. Similarly, there was no specialised clergy telling people how to live: “The idea of a priest telling you what to do was alien to the Greek world,” Whitmarsh said.

As a result, while some people viewed atheism as mistaken, it was rarely seen as morally wrong. In fact, it was usually tolerated as one of a number of viewpoints that people could adopt on the subject of the gods. Only occasionally was it actively legislated against, such as in Athens during the 5th Century BCE, when Socrates was executed for “not recognising the gods of the city.”

While atheism came in various shapes and sizes, Whitmarsh also argues that there were strong continuities across the generations. Ancient atheists struggled with fundamentals that many people still question today – such as how to deal with the problem of evil, and how to explain aspects of religion which seem implausible.

These themes extend from the work of early thinkers – like Anaximander and Anaximenes, who tried to explain why phenomena such as thunder and earthquakes actually had nothing to do with the gods – through to famous writers like Euripides, whose plays openly criticised divine causality. Perhaps the most famous group of atheists in the ancient world, the Epicureans, argued that there was no such thing as predestination and rejected the idea that the gods had any control over human life.

The age of ancient atheism ended, Whitmarsh suggests, because the polytheistic societies that generally tolerated it were replaced by monotheistic imperial forces that demanded an acceptance of one, “true” God. Rome’s adoption of Christianity in the 4th Century CE was, he says, “seismic”, because it used religious absolutism to hold the Empire together.

Most of the later Roman Empire’s ideological energy was expended fighting supposedly heretical beliefs – often other forms of Christianity. In a decree of 380, Emperor Theodosius I even drew a distinction between Catholics, and everyone else – whom he classed as dementes vesanosque (“demented lunatics”). Such rulings left no room for disbelief.

Whitmarsh stresses that his study is not designed to prove, or disprove, the truth of atheism itself. On the book’s first page, however, he adds: “I do, however, have a strong conviction – that has hardened in the course of researching and writing this book – that cultural and religious pluralism, and free debate, are indispensable to the good life.”

Battling The Gods is published by Faber and Faber. Tim Whitmarsh is A G Leventis Professor of Greek Culture and a Fellow of St John’s College, University of Cambridge.

Wednesday, February 17, 2016

Grimm's Fairy Tales

Children's and Household Tales (German: Kinder- und Hausmärchen) is a collection of German fairy tales first published in 1812 by the Grimm brothers, Jacob and Wilhelm. The collection is commonly known in English as Grimm's Fairy Tales.


The first volume of the first edition was published in 1812, containing 86 stories; the second volume of 70 stories followed in 1815. For the second edition, two volumes were issued in 1819 and a third in 1822, totalling 170 tales. The third edition appeared in 1837; fourth edition, 1840; fifth edition, 1843; sixth edition, 1850; seventh edition, 1857. Stories were added, and also subtracted, from one edition to the next, until the seventh held 211 tales. All editions were extensively illustrated, first by Philipp Grot Johann and, after his death in 1892, by German illustrator Robert Leinweber.

The first volumes were much criticized because, although they were called "Children's Tales", they were not regarded as suitable for children, both for the scholarly information included and for the subject matter.  Many changes through the editions – such as turning the wicked mother of the first edition in Snow White and Hansel and Gretel (rendered in the original Grimm stories as Hänsel and Grethel) into a stepmother – were probably made with an eye to such suitability. They removed sexual references – such as Rapunzel's innocently asking why her dress was getting tight around her belly, and thus naively revealing to her stepmother her pregnancy and the prince's visits – but, in many respects, violence, particularly when punishing villains, was increased.

In 1825, the Brothers published their Kleine Ausgabe or "small edition", a selection of 50 tales designed for child readers. This children's version went through ten editions between 1825 and 1858.


The influence of these books was widespread. W. H. Auden praised the collection during World War II as one of the founding works of Western culture. The tales themselves have been put to many uses. Hitler praised them as folkish tales showing children with sound racial instincts seeking racially pure marriage partners – so strongly that the Allied forces warned against them; he read Cinderella, for instance, as the story of a racially pure heroine, an alien stepmother, and a prince whose unspoiled instinct allows him to distinguish between the two. Writers who have written about the Holocaust have combined the tales with their memoirs, as Jane Yolen does in her Briar Rose.

The work of the Brothers Grimm influenced other collectors, both inspiring them to collect tales and leading them to similarly believe, in a spirit of romantic nationalism, that the fairy tales of a country were particularly representative of it, to the neglect of cross-cultural influence. Among those influenced were the Russian Alexander Afanasyev, the Norwegians Peter Christen Asbjørnsen and Jørgen Moe, the English Joseph Jacobs, and Jeremiah Curtin, an American who collected Irish tales.  The reaction to their collection was not always pleased, however: Joseph Jacobs was in part inspired by his complaint that English children did not read English fairy tales; in his own words, "What Perrault began, the Grimms completed".

Three individual works of Wilhelm Grimm include Altdänische Heldenlieder, Balladen und Märchen ('Old Danish Heroic Songs, Ballads, and Folktales') in 1811, Über deutsche Runen ('On German Runes') in 1821, and Die deutsche Heldensage ('The German Heroic Saga') in 1829.

Tuesday, February 16, 2016

Deep Earth Microbes

New Microbe Found Thriving Deep in the Earth
Uppsala University, February 15, 2016

They live several kilometers under the surface of the earth, need no light or oxygen and can only be seen in a microscope. By sequencing genomes of a newly discovered group of microbes, the Hadesarchaea, an international team of researchers has found out how these microorganisms make a living in the deep subsurface biosphere of our planet.

Microorganisms that live below the surface of the earth remain one of the last great frontiers of exploration. Organisms that live there have not been grown in the laboratory, and therefore their lifestyles are unknown. An international team led by microbiologists Brett Baker, Assistant Professor at The University of Texas, and Thijs Ettema, senior lecturer at Uppsala University, along with scientists from UNC Chapel Hill and the University of Bremen, have discovered how microorganisms, first discovered in a South African gold mine at a depth of two miles, are able to make a living in the absence of oxygen and light. The study is published in Nature Microbiology.

Baker and Ettema found these microbes in vastly different aquatic and terrestrial environments: the deep mud of a temperate estuary in North Carolina and underneath hot springs at Yellowstone National Park.

'This new class of microbes are specialized for survival beneath the surface, so we called them “Hadesarchaea”, after the ancient Greek god of the underworld', says Brett Baker, lead author of the study.

As its name suggests, the Hadesarchaea belong to a relatively unknown group of microorganisms, the archaea. Like bacteria, archaea are single-celled and microscopically small, but from an evolutionary perspective the two groups differ from each other more than a human does from a tree.

Archaea were discovered only some 40 years ago, by the acclaimed American biologist Carl Woese. To date, archaea remain poorly studied in comparison to bacteria and more complex life forms, such as animals and plants.

'The discovery of the Hadesarchaea will help us increase our understanding of the biology and lifestyle of archaea that thrive in the deep biosphere', says Thijs Ettema.

In order to understand these elusive organisms, Baker and Ettema sequenced the genomes of several Hadesarchaea. They were able to determine how these microbes should be classified and what physiologies they use to survive under these extreme conditions. Hadesarchaea have the ability to live in areas devoid of oxygen, and the scientists suggest that they survive there by using carbon monoxide to gain energy. Interestingly, the chemical pathways the Hadesarchaea use to metabolize carbon monoxide are unlike any seen before.

'Before this essentially nothing was known about the Hadesarchaea’s ecological role and what makes them so prominent throughout the world. The new discovery expands our knowledge of how these organisms may have adapted to the extreme conditions of the deep biosphere', says Jimmy Saw, researcher at Uppsala University and co-author of the paper.

Monday, February 15, 2016

"Good" Bosses Go "Bad"

When the Boss’s Ethical Behavior Snaps
Michigan State University, February 12, 2016

Is your boss ethical? Does he or she do what’s right, as opposed to what’s profitable?

If so, they may turn downright abusive the next day.

New research on leader behavior by Russell Johnson, associate professor of management at Michigan State University, suggests ethical conduct leads to mental exhaustion and the “moral licensing” to lash out at employees.

The study, online in the Journal of Applied Psychology, is called “When ethical leader behavior breaks bad: How ethical behavior can turn abusive via ego depletion and moral licensing.” Moral licensing is a phenomenon in which people, after doing something good, feel they have earned the right to act in a negative manner.

“Ironically, when leaders felt mentally fatigued and morally licensed after displays of ethical behavior, they were more likely to be abusive toward their subordinates on the next day,” said Johnson, an expert on the psychology of the workplace.

Johnson and MSU students Szu-Han Lin and Jingjing Ma surveyed 172 supervisors in various industries, including retail, education, manufacturing and health care, over a several-day period. The goal: examine the consequences of ethical behavior for the leaders who exhibited it.

Johnson said it’s not easy to be ethical, as it turns out. “Being ethical means leaders often have to suppress their own self-interest (they must do ‘what’s right’ as opposed to ‘what’s profitable’), and they have to monitor not only the performance outcomes of subordinates but also the means (to ensure that ethical/appropriate practices were followed).”

Ethical behavior led to mental fatigue and moral licensing, and this led to leaders being more abusive to their workers. The abuse included ridiculing, insulting and expressing anger toward employees, giving them the silent treatment and reminding them of past mistakes or failures.

To combat mental fatigue, Johnson said managers should build in time for breaks during the workday; get sufficient sleep; eat healthily and exercise; and unplug from work outside of the office (which includes shutting off the smartphone at night).

Dealing with moral licensing is trickier, as there is not much research on the subject. However, Johnson suggested companies could consider formally requiring ethical behavior. “If such behavior is required, then it’s more difficult for people to feel they’ve earned credit for performing something that is mandatory,” he said. “A sense of moral license is more likely when people feel they voluntarily or freely exhibited the behavior.”

Ethical behavior could also be formally rewarded with social praise or money. But the praise or bonus should come relatively soon after the ethical behavior in order to counteract the moral licensing, Johnson said.

Sunday, February 14, 2016

Regent and Regency

A regent (from the Latin regens, "[one] ruling") is "a person appointed to administer a state because the monarch is a minor, is absent or is incapacitated." The rule of a regent or regents is called a regency. A regent or regency council may be formed ad hoc or in accordance with a constitutional rule. "Regent" is sometimes a formal title. If the regent holds the position by virtue of his place in the line of succession, the compound term prince regent is often used; if the regent of a minor is his mother, she is often referred to as "queen regent".

If the formally appointed regent is unavailable or cannot serve on a temporary basis, a Regent ad interim may be appointed to fill the gap.

In a monarchy, a regent usually governs due to one of these reasons, but may also be elected to rule during the interregnum when the royal line has died out. This was the case in the Kingdom of Finland and the Kingdom of Hungary, where the royal line was considered extinct in the aftermath of World War I. In Iceland, the regent represented the King of Denmark as sovereign of Iceland until the country became a republic in 1944. In the Polish-Lithuanian Commonwealth (1569–1795), kings were elective, which often led to a fairly long interregnum. In the interim, it was the Roman Catholic Primate (the Archbishop of Gniezno) who served as the regent, termed the "interrex" (Latin: ruler "between kings" as in ancient Rome). In the small republic of San Marino, the two Captains Regent, or Capitani Reggenti, are elected semi-annually (they serve a six-month term) as joint heads of state and of government.

Famous regency periods include that of the Prince Regent, later George IV of the United Kingdom, giving rise to many terms such as Regency era and Regency architecture. Strictly this period lasted from 1811 to 1820, when his father George III was insane, though when used as a period label it generally covers a wider period. Philippe II, Duke of Orléans was Regent of France from the death of Louis XIV in 1715 until Louis XV came of age in 1723; this is also used as a period label for many aspects of French history, as "Régence" in French, again tending to cover a rather wider period than the actual regency.

Other Uses

The term regent may refer to positions lower than the ruler of a country. The term may be used in the governance of organisations, typically as an equivalent of "director", and held by all members of a governing board rather than just the equivalent of the chief executive. Some university managers in North America are called regents and a management board for a college or university may be titled the "Board of Regents". The term "regent" is also used for members of governing bodies of institutions such as the national banks of France and Belgium.