Sunday, January 31, 2016

Croissant History


A croissant is a buttery, flaky viennoiserie or Vienna-style pastry named for its well-known crescent shape. Croissants and other viennoiserie are made of a layered, yeast-leavened dough. The dough is layered with butter, then rolled and folded several times in succession before being rolled into a sheet, in a technique called laminating. The process results in a layered, flaky texture similar to that of puff pastry.

 


Crescent-shaped breads have been made since the Middle Ages, and crescent-shaped cakes possibly since antiquity.

Croissants have long been a staple of Austrian and French bakeries and pâtisseries. In the late 1970s, the development of factory-made, frozen, pre-formed but unbaked dough turned them into a fast food that could be freshly baked by unskilled labor. The croissanterie was explicitly a French response to American-style fast food, and today 30–40% of the croissants sold in French bakeries and patisseries are baked from frozen dough. The croissant also remains a popular part of the continental breakfast.

The birth of the croissant itself – that is, its adaptation from the plainer form of Kipferl, before the invention of viennoiserie – can be dated to at latest 1839 (some say 1838), when an Austrian artillery officer, August Zang, founded a Viennese bakery ("Boulangerie Viennoise") at 92, rue de Richelieu in Paris.  This bakery, which served Viennese specialities including the Kipferl and the Vienna loaf, quickly became popular and inspired French imitators (and the concept, if not the term, of viennoiserie, a 20th-century term for supposedly Vienna-style pastries). The French version of the Kipferl was named for its crescent (croissant) shape and has become an identifiable shape across the world.

Saturday, January 30, 2016

Thomas Gifford, novelist

Thomas Eugene Gifford (May 16, 1937 – October 31, 2000) was a best-selling American author of thriller novels. He was a graduate of Harvard University.

He gained international fame with the crime novel The Glendower Legacy and later with the Vatican thriller The Assassini. The books posited George Washington as a British spy and the Roman Catholic Church as a criminal organization. The Glendower Legacy was made into a movie in 1981 under the name Dirty Tricks.

Gifford also published under the names Dana Clarins and Thomas Maxwell.

He died of cholangiocarcinoma in his home in Dubuque, Iowa, on Halloween 2000.

Biography

He won awards at Harvard for creative writing, worked at The Sun newspaper and The Guthrie, and won Putnam's prize for best first novel (The Wind Chill Factor).

From Dubuque, Iowa, after graduating from college he moved to the Twin Cities in Minnesota, where he and his wife, Kari Sandven, had two children (Thomas Eaton and Rachel Claire). Divorced in 1969, he went on to marry Camille D'Ambrose, a local actress. They moved to Los Angeles for a few years, then returned to Orono, MN. Novels continued to flow from his fountain pen through the years. Gifford eventually moved to New York, a city he loved and whose people were of infinite importance to him. In 1996, he turned his attention to renovating his childhood home in Dubuque, spending more time in Iowa than New York during his last years. He embraced the community of Dubuque, as it embraced its prodigal son. Featured in the Dubuque Telegraph Herald, Gifford recounted his everyday occurrences, from learning the pleasures of getting a dog (Katie Maxwell, the Scottie) to the peeves and pleasures of the town. Diagnosed with terminal cancer in February 2000, Gifford spent his remaining months reading, watching old movies, and chatting with friends and family. He died on October 31, 2000.

Gifford lived large, had friends throughout the world, and lived by his favorite credo: we're not here for a long time; we're here for a good time.


= = = = = = = = = = = = = = = = = = = = = = =

Afterword

The blog author regards The Wind Chill Factor and The Man from Lisbon as Gifford’s best books.

Friday, January 29, 2016

Patristics Defined

Patristics or patrology is the study of the early Christian writers who are designated Church Fathers. The names derive from the combined forms of Latin pater and Greek patḗr (father). The period is generally considered to run from the end of New Testament times or end of the Apostolic Age (c. AD 100) to either AD 451 (the date of the Council of Chalcedon) or to the 8th century Second Council of Nicaea.

Key Persons

Among those whose writings form the basis for patristics (i.e., prominent early Church Fathers) are:

  • Ignatius of Antioch (c. 35 – c. 108),
  • Pope Clement I (1st century AD – c. 101),
  • Polycarp of Smyrna (c. 69 – c. 155),
  • Justin Martyr (c. 100 – c. 165),
  • Irenaeus of Lyons (c. 120 – c. 202),
  • Clement of Alexandria (c. 150 – c. 215),
  • Tertullian (c. 160 – c. 225),
  • Origen (c. 185 – c. 254),
  • Cyprian of Carthage (d. 258),
  • Athanasius (c. 296 – c. 373),
  • Gregory of Nazianzus (329–389),
  • Basil of Caesarea (c. 330 – 379),
  • Gregory of Nyssa (c. 330 – c. 395),
  • Theodore of Mopsuestia (c. 350 – 428),
  • Jerome (347–420),
  • Augustine of Hippo (354–430),
  • Vincent of Lérins (d. before 450),
  • Cyril of Alexandria (d. 444),
  • Maximus the Confessor (580–662),
  • Isaac of Nineveh (d. c. 700)
https://en.wikipedia.org/wiki/Patristics

Thursday, January 28, 2016

Splendid Cancer Breakthrough

A Cancer Riddle Solved
University of Iowa researchers reveal how cancer cells form tumors
By Richard C. Lewis, University of Iowa, January 25, 2016

Cancer is a mysterious disease for many reasons.  Chief among the unknowns are how and why tumors form.

Two University of Iowa studies offer key insights by recording in real time, and in 3-D, the movements of cancerous human breast tissue cells.  It’s believed to be the first time cancer cells’ motion and accretion into tumors have been continuously tracked.

The team discovered that cancerous cells actively recruit healthy cells into tumors by extending a cable of sorts to grab their neighbors—both cancerous and healthy—and reel them in. Moreover, the Iowa researchers report that as little as five percent of cancerous cells are needed to form the tumors, a ratio that heretofore had been unknown.

“It’s not like things sticking to each other,” said David Soll, biology professor at the UI and corresponding author on the paper, published in the American Journal of Cancer Research. “It’s that these cells go out and actively recruit. It’s complicated stuff, and it’s not passive. No one had a clue that there were specialized cells in this process, and that it’s a small number that pulls all the rest in.”

The findings could lead to a more precise identification of tumorigenic cells (those that form tumors) and testing which antibodies would be best equipped to eliminate them. Soll's Monoclonal Antibody Research Institute and the Developmental Studies Hybridoma Bank, created by the National Institutes of Health as a national resource, directed by Soll and housed at the UI, together contain one of the world’s largest collections of antibodies that could be used for the anti-cancer testing, based on the new findings.

In a paper published last spring in the journal PLOS One, Soll’s team showed that only cancerous cells (from a variety of cancers, including lung, skin, and aggressive brain tumors known as glioblastomas) engaged in tumor formation by actively soliciting other cells. Like evil-minded envoys, individual cancer cells extend themselves outward from the original cluster, probing for other cells in the area, the researchers observed. Once it detects one, the extended cell latches on and pulls it in, forming a larger mass. The activity continues, the cancerous extensions drawing in more and more cells—including healthy cells—as the tumor enlarges.

“There’s nothing but tumorigenic cells in the bridge (between cells),” Soll said, “and that’s the discovery. The tumorigenic cells know what they’re doing. They make tumors.”

The question is how these cells know what to do. Soll hypothesizes they’re reaching back to a primitive past, when these cells were programmed to form embryos. If true, perhaps the cancerous cells—masquerading as embryo-forming cells—recruit other cells to make tissue that then forms the layered, self-sustaining architecture needed for a tumor to form and thrive.

Think of a Death Star that’s built up enough defenses to ward off repeated attacks. Or, less figuratively, how bacteria can conspire to create an impenetrable film on surfaces, from orthopedic implants to catheters.

“There must be a reason,” Soll said. “You might want one big tumor capable of producing the tissue it needs to form a micro-environment. It’s as if it’s building its own defenses against the body’s efforts to defeat them.”

In the AJCR paper, the researchers compared the actions of human breast tissue cells (MoVi-10') to a weakly tumorigenic, parental breast cancer cell line (MCF-7). First, they found that over a 50-hour period, MoVi-10'–only cells grew more in density, primarily by joining together, than did MCF-7.

Also, in all instances, regardless of the ratio of MCF-7 to MoVi-10' cells in the cluster, only MoVi-10' cells reached out and drew in other cells—including healthy cells—to the growing mass.

“The results here extend our original observation that tumorigenic cell lines and fresh tumor cells possess the unique capacity to undergo coalescence through the active formation of cellular cables,” the authors write.

The finding lends more weight to the idea that tumors are created concurrently, in multiple locations, by individual clusters of cells that employ the cancer-cell cables to draw in more cells and enlarge themselves. Some have argued that tumors come about more by cellular changes within the masses, known as the “cancer stem cell theory.”

Soll’s team also discovered that the MoVi-10' cells move at 92 microns per hour, about twice the speed of healthy cells. That’s important because it helps scientists better understand how quickly tumors can be created.

Contributing authors, all from the University of Iowa, include Joseph Ambrose, Michelle Livitz, Deborah Wessels, Spencer Kuhl, Daniel Lusche, and Edward Voss. Amanda Scherer, now at the University of Michigan, also contributed to the research while at the UI.

Wednesday, January 27, 2016

Kuiper Belt -- an overview

The Kuiper belt /ˈkaɪpər/ or /'køypǝr/ (as in Dutch), sometimes called the Edgeworth–Kuiper belt, is a circumstellar disc in the Solar System beyond the planets, extending from the orbit of Neptune (at 30 AU) to approximately 50 AU from the Sun. It is similar to the asteroid belt, but it is far larger—20 times as wide and 20 to 200 times as massive.  Like the asteroid belt, it consists mainly of small bodies, or remnants from the Solar System's formation. Although many asteroids are composed primarily of rock and metal, most Kuiper belt objects are composed largely of frozen volatiles (termed "ices"), such as methane, ammonia and water. The Kuiper belt is home to three officially recognized dwarf planets: Pluto, Haumea, and Makemake. Some of the Solar System's moons, such as Neptune's Triton and Saturn's Phoebe, are also thought to have originated in the region.

The Kuiper belt was named after Dutch-American astronomer Gerard Kuiper, though he did not actually predict its existence. In 1992, 1992 QB1 was discovered, the first Kuiper belt object (KBO) identified since Pluto.  Since its discovery, the number of known KBOs has increased to over a thousand, and more than 100,000 KBOs over 100 km (62 mi) in diameter are thought to exist. The Kuiper belt was initially thought to be the main repository for periodic comets, those with orbital periods of less than 200 years. However, studies since the mid-1990s have shown that the belt is dynamically stable and that comets' true place of origin is the scattered disc, a dynamically active zone created by the outward motion of Neptune 4.5 billion years ago; scattered disc objects such as Eris have extremely eccentric orbits that take them as far as 100 AU from the Sun.

The Kuiper belt should not be confused with the theorized Oort cloud, which is a thousand times more distant and is not flat. The objects within the Kuiper belt, together with the members of the scattered disc and any potential Hills cloud or Oort cloud objects, are collectively referred to as trans-Neptunian objects (TNOs).

Pluto is likely the largest and most-massive member of the Kuiper belt and the largest and the second-most-massive known TNO, surpassed only by Eris in the scattered disc. Originally considered a planet, Pluto's status as part of the Kuiper belt caused it to be reclassified as a dwarf planet in 2006. It is compositionally similar to many other objects of the Kuiper belt, and its orbital period is characteristic of a class of KBOs, known as "plutinos", that share the same 2:3 resonance with Neptune.

Tuesday, January 26, 2016

Bob Cummings, Actor and Aviator


Charles Clarence Robert Orville Cummings (June 9, 1910 – December 2, 1990) was an American film and television actor, known mainly for his roles in comedy films such as The Devil and Miss Jones (1941) and Princess O'Rourke (1943), who was also effective in dramatic films, especially two of Alfred Hitchcock's thrillers, Saboteur (1942) and Dial M for Murder (1954). Cummings received five Primetime Emmy Award nominations and won the Primetime Emmy Award for Best Actor in a Single Performance in 1955. In 1960, he received two stars on the Hollywood Walk of Fame, for motion pictures and for television.

While attending Joplin High School, Cummings was taught to fly by his godfather, Orville Wright, the aviation pioneer.  His first solo was on March 3, 1927.  During high school, Cummings gave Joplin residents rides in his aircraft for $5 per person.  When the government began licensing flight instructors, Cummings was issued flight instructor certificate No. 1, making him the first official flight instructor in the United States.

Cummings studied briefly at Drury College in Springfield, Missouri, but his love of flying caused him to transfer to the Carnegie Institute of Technology in Pittsburgh, Pennsylvania. He studied aeronautical engineering for a year before he dropped out because of financial reasons, his family having lost heavily in the 1929 stock market crash.  Since the American Academy of Dramatic Arts in New York City paid its male actors $14 a week, Cummings decided to study there.

He achieved stardom in 1939 in Three Smart Girls Grow Up, opposite Deanna Durbin. His many film comedies include: The Devil and Miss Jones (1941) with Jean Arthur, Moon Over Miami (1941), and The Bride Wore Boots (1946) with Barbara Stanwyck.

Cummings gave memorable performances in three notable dramas. In Kings Row (1942), he played the lead role, Parris Mitchell, alongside friend Ronald Reagan, Claude Rains, Ann Sheridan and an all-star cast. In spite of its mixed critical reaction, the film was nominated for three Academy Awards, including one for Best Picture.

Cummings starred in the spy thriller Saboteur (1942) with Priscilla Lane and Norman Lloyd. He played Barry Kane, an aircraft worker wrongfully accused of espionage, trying to clear his name.

In 1947, Cummings reportedly had earned $110,000 over the previous 12 months.

Cummings appeared in the Hitchcock film, Dial M for Murder (1954), as Mark Halliday, co-starring with Grace Kelly and Ray Milland. The film was a box-office smash. Cummings also starred in You Came Along (1945), with a screenplay by Ayn Rand. The Army Air Forces pilot Cummings played ("Bob Collins") died off camera, but was resurrected ten years later for his television show.

Cummings was chosen by producer John Wayne as his co-star to play airline pilot Captain Sullivan in The High and the Mighty, partly due to Cummings' flying experience; however, director William A. Wellman overruled Wayne and hired Robert Stack for the part.

In 1955 Cummings announced he would form his own production company, Laurel, named after his daughter Laurel Ann and the street he lived on, Laurel Way. He intended to make a film called The Damned, from a novel by John D. MacDonald, to be directed by Frank Tashlin. However, no film resulted.

Cummings made his mark in the CBS Radio network's dramatic serial titled Those We Love, which ran from 1938 to 1945. Cummings played the role of David Adair, opposite Richard Cromwell, Francis X. Bushman, and Nan Grey. He was also one of the four stars featured in the short-run radio version of Four Star Playhouse.

During the 1970s and for over 10 years, Cummings traveled the US, performing in dinner theaters and in short stints in plays while living in an Airstream travel trailer. He recounted those experiences in the introduction he wrote for the book "AIRSTREAM" by Robert Landau and James Phillippi, published in 1984.

Television

Cummings began a long career on television in 1952, starring in the comedy My Hero. He received the Primetime Emmy Award for Outstanding Lead Actor for his portrayal of "Juror Number Eight", in the first televised performance of Twelve Angry Men, a live production that aired in 1954 (Henry Fonda played the same role in the feature film adaptation).  Cummings was one of the anchors on ABC’s live broadcast of the opening day of Disneyland on July 17, 1955.

From 1955 through 1959, Cummings starred in a successful NBC sitcom, The Bob Cummings Show (known as Love That Bob in reruns), in which he played Bob Collins, an ex–World War II pilot who had become a successful professional photographer. As a bachelor in 1950s Los Angeles, the character Bob Collins considered himself to be quite the ladies' man, and the sitcom was noted for some very risqué humor for its time. A popular feature of the program was Cummings' portrayal of his elderly grandfather. His co-stars were Rosemary DeCamp as his sister, Margaret MacDonald; Dwayne Hickman as his nephew, Chuck MacDonald; and Ann B. Davis, in her first television success, as his assistant Charmaine "Schultzy" Schultz. Cummings was also a guest on the NBC interview program Here's Hollywood.

In 1960 Cummings starred in "King Nine Will Not Return", the opening episode of the second season of CBS's The Twilight Zone.

The New Bob Cummings Show followed on CBS for one season, from 1961 to 1962. Cummings was depicted as the owner and pilot of Aerocar N102D, and this aircraft was featured on the show.

In 1964–65 Cummings starred in another CBS sitcom, My Living Doll, which co-starred Julie Newmar as Rhoda the robot. Cummings' last significant role was the 1973 television movie Partners in Crime, co-starring Lee Grant. He also appeared in 1979 as Elliott Smith, the father of Fred Grandy's Gopher on ABC's The Love Boat.

In 1986, Cummings hosted the televised 15th Anniversary Celebration of Walt Disney World in Walt Disney's Wonderful World of Color.

Robert Cummings' last public appearance was on Disneyland's 35th Anniversary Special in 1990.

Personal Life

Cummings married five times and fathered seven children. He remained an avid aviator and owned a number of planes (all named "Spinach").  He was a staunch advocate of natural foods and a healthy diet and in 1960 wrote a book, Stay Young and Vital, which focused upon health foods and exercise.

Despite his interest in health, Cummings was a methamphetamine addict from the mid-1950s until the end of his life. Cummings began receiving injections from Max Jacobson, the notorious "Dr. Feelgood", in 1954 during a trip to New York to star in the TV production of Twelve Angry Men.

Cummings' friends Rosemary Clooney and José Ferrer recommended the doctor to him when he complained of a lack of energy. While Jacobson insisted that his injections contained only "vitamins, sheep sperm and monkey gonads", they actually contained a substantial dose of methamphetamine.

Cummings continued to use a mixture provided by Jacobson, eventually becoming a patient of Jacobson's son Thomas, who was based in Los Angeles, and later injecting himself. The changes in Cummings' personality caused by the euphoria of the drug and the subsequent depression damaged his career and led to an intervention by his friend, television host Art Linkletter. The intervention was not successful, and Cummings' drug abuse and subsequent career collapse were factors in his divorces from his third wife, Mary, and his fourth wife, Gina Fong.

After Jacobson was forced out of business in the 1970s, Cummings developed his own drug connections based in the Bahamas. Suffering from Parkinson's Disease, he was forced to move into homes for indigent older actors in Hollywood.

Monday, January 25, 2016

Concrete that Melts Ice


De-icing Concrete Could Improve Roadway Safety
By Scott Schrage, University of Nebraska, January 21, 2016

A 200-square-foot slab of seemingly ordinary concrete sits just outside the Peter Kiewit Institute as snowflakes begin parachuting toward Omaha on a frigid afternoon in late December.  The snow accumulates on the grass surrounding the slab and initially clings to the concrete, too. But as the minutes pass and the snow begins melting from only its surface, the slab reveals its secret: Like razors, stoves and guitars before it, this concrete has gone electric.

Its designer, UNL professor of civil engineering Chris Tuan, has added a pinch of steel shavings and a dash of carbon particles to a recipe that has literally been set in concrete for centuries. Though the newest ingredients constitute just 20 percent of Tuan’s otherwise standard concrete mixture, they conduct enough electricity to melt ice and snow in the worst winter storms while remaining safe to the touch.

Tuan’s research team is demonstrating the concrete’s de-icing performance to the Federal Aviation Administration during a testing phase that runs through March 2016. If the FAA is satisfied with the results, Tuan said the administration will consider scaling up the tests by integrating the technology into the tarmac of a major U.S. airport.

“To my surprise, they don’t want to use it for the runways,” Tuan said. “What they need is the tarmac around the gated areas cleared, because they have so many carts to unload – luggage service, food service, trash service, fuel service – that all need to get into those areas.

“They said that if we can heat that kind of tarmac, then there would be (far fewer) weather-related delays. We’re very optimistic.”

A unique bridge that resides about 15 miles south of Lincoln has given Tuan reason to feel confident. In 2002, Tuan and the Nebraska Department of Roads made the 150-foot Roca Spur Bridge the world’s first to incorporate conductive concrete. Inlaid with 52 conductive slabs that have successfully de-iced its surface for more than a decade, the bridge exemplifies the sort of targeted site that Tuan envisions for the technology.

“Bridges always freeze up first, because they’re exposed to the elements on top and bottom,” Tuan said. “It’s not cost-effective to build entire roadways using conductive concrete, but you can use it at certain locations where you always get ice or have potholes.”

Potholes often originate from the liberal use of salt or de-icing chemicals that can corrode concrete and contaminate groundwater over time, Tuan said, making the conductive concrete an appealing alternative with lower operating and maintenance costs. The power required to thermally de-ice the Roca Spur Bridge during a three-day storm typically costs about $250 – several times less than a truckload of chemicals, he said.

Tuan said the conductive concrete could also prove feasible for high-traffic intersections, exit ramps, driveways and sidewalks. Yet the technology offers another, very different application that doesn’t even require electric current.

Catching the next wave

By replacing the limestone and sand typically used in concrete with a mineral called magnetite, Tuan has shown that the mixture can also shield against electromagnetic waves. The electromagnetic spectrum includes the radiofrequency waves transmitted and received by cell phones, which Tuan said could make the concrete mixture useful to those concerned about becoming targets of industrial espionage.

Using the magnetite-embedded concrete, Tuan and his colleagues have built a small structure in their laboratory that demonstrates the material’s shielding capabilities.

“We invite parties that are interested in the technology to go in there and try to use their cell phones,” said Tuan, who has patented his design through NUtech Ventures. “And they always receive a no-service message.”

While Tuan’s collaborations have him dreaming big about the future of conductive concrete, he’s currently enjoying its benefits much closer to home.

“I have a patio in my backyard that is made of conductive concrete,” he said with a laugh. “So I’m practicing what I preach.”

Tuan developed the concrete with the assistance of Lim Nguyen, associate professor of electrical and computer engineering; Bing Chen, professor of electrical and computer engineering; and Sherif Yehia, a professor at the American University of Sharjah who earned his doctorate in civil engineering at UNL. The FAA is currently funding the team’s research, which has also received past support from the Nebraska Department of Roads.

Sunday, January 24, 2016

Musical Film -- A Summary

The musical film is a film genre in which songs sung by the characters are interwoven into the narrative, sometimes accompanied by dancing.

The songs usually advance the plot or develop the film's characters, though in some cases they serve merely as breaks in the storyline, often as elaborate "production numbers".

The musical film was a natural development of the stage musical after the emergence of sound film technology. Typically, the biggest difference between film and stage musicals is the use of lavish background scenery and locations that would be impractical in a theater. Musical films characteristically contain elements reminiscent of theater; performers often treat their song and dance numbers as if there is a live audience watching. In a sense, the viewer becomes the diegetic audience, as the performer looks directly into the camera and performs to it.

The First Musicals

Musical short films were made by Lee de Forest in 1923-24. Beginning in 1926, thousands of Vitaphone shorts were made, many featuring bands, vocalists and dancers. The earliest feature-length films with synchronized sound had only a soundtrack of music and occasional sound effects that played while the actors portrayed their characters just as they did in silent films: without audible dialogue.  The Jazz Singer, released in 1927 by Warner Brothers, was the first to include an audio track with both non-diegetic and diegetic music, but it had only a short sequence of spoken dialogue. This feature-length film was also a musical, featuring Al Jolson singing "Dirty Hands, Dirty Face", "Toot, Toot, Tootsie", "Blue Skies" and "My Mammy". Historian Scott Eyman wrote, "As the film ended and applause grew with the houselights, Sam Goldwyn's wife Frances looked around at the celebrities in the crowd. She saw 'terror in all their faces', she said, as if they knew that 'the game they had been playing for years was finally over.'"  Still, only isolated sequences featured "live" sound; most of the film had only a synchronous musical score.

In 1928, Warner Brothers followed this up with another Jolson part-talkie, The Singing Fool, which was a blockbuster hit.  Theaters scrambled to install the new sound equipment and to hire Broadway composers to write musicals for the screen.  The first all-talking feature, Lights of New York, included a musical sequence in a night club. The enthusiasm of audiences was so great that in less than a year all the major studios were making sound pictures exclusively. The Broadway Melody (1929) had a show-biz plot about two sisters competing for a charming song-and-dance man. Advertised by MGM as the first "All-Talking, All-Singing, All-Dancing" feature film, it was a hit and won the Academy Award for Best Picture for 1929. There was a rush by the studios to hire talent from the stage to star in lavishly filmed versions of Broadway hits. The Love Parade (Paramount, 1929) starred Maurice Chevalier and newcomer Jeanette MacDonald, written by Broadway veteran Guy Bolton.

Warner Brothers produced the first screen operetta, The Desert Song, in 1929. They spared no expense and photographed a large percentage of the film in Technicolor. This was followed by the first all-color, all-talking musical feature, On with the Show (1929). The most popular film of 1929 was the second all-color, all-talking feature, Gold Diggers of Broadway (1929). This film broke all box-office records and remained the highest-grossing film ever produced until 1939. Suddenly the market became flooded with musicals, revues and operettas. The following all-color musicals were produced in 1929 and 1930 alone: The Show of Shows (1929), Sally (1929), The Vagabond King (1930), Follow Thru (1930), Bright Lights (1930), Golden Dawn (1930), Hold Everything (1930), The Rogue Song (1930), Song of the Flame (1930), Song of the West (1930), Sweet Kitty Bellairs (1930), Under a Texas Moon (1930), Bride of the Regiment (1930), Whoopee! (1930), King of Jazz (1930), Viennese Nights (1930), and Kiss Me Again (1930).  In addition, there were scores of musical features released with color sequences.

Hollywood released more than 100 musical films in 1930, but only 14 in 1931. By late 1930, audiences had been oversaturated with musicals and studios were forced to cut the music from films that were then being released. For example, Life of the Party (1930) was originally produced as an all-color, all-talking musical comedy. Before it was released, however, the songs were cut out. The same thing happened to Fifty Million Frenchmen (1931) and Manhattan Parade (1932) both of which had been filmed entirely in Technicolor.  Marlene Dietrich sang songs successfully in her films, and Rodgers and Hart wrote a few well-received films, but even their popularity waned by 1932.  The public had quickly come to associate color with musicals and thus the decline in their popularity also resulted in a decline in color productions.

Musicals after the 1950s

From the 1960s and 1970s onward, the musical film became a less bankable genre that could no longer be relied upon for sure-fire hits. Audiences shrank, and fewer musical films were produced as the genre became less mainstream and more specialized.


In Other Nations

Musical film has also been important in Spanish-speaking nations, in India (to the present day), and, as propaganda, in the Soviet Union during the Stalin era.

Saturday, January 23, 2016

Dog People and Cat People

Some people base a significant portion of their identity around their affinity for either cats or dogs, describing themselves as a "cat person" or a "dog person". This builds on the perceived dichotomy between cats and dogs as pets in society.  The two terms refer to people's self-identification, regardless of what pets they actually own, if any.

A 2010 study at the University of Texas found that those who identified as "dog people" tended to be more social and outgoing, whereas "cat people" tended to be more neurotic and "open," meaning creative, philosophical, or nontraditional.  In a 2014 study at Carroll University, Wisconsin, people who said they were dog lovers were found to be more energetic and outgoing, and tended to follow rules closely, while cat lovers were more introverted, open-minded and sensitive. Cat people also tended to be non-conformists and scored higher on intelligence tests than dog lovers.

See Also

Cat lady, a derogatory depiction of a female cat person


= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

A cat lady is a single woman, often a stock character, who owns many pet cats. The term is usually considered pejorative,  though it is sometimes embraced.

Usage and Association

Women who have cats have long been associated with the concept of spinsterhood. In more recent decades, the concept of a cat lady has been associated with "romance-challenged (often career-oriented) women".

A cat lady may also be an animal hoarder who keeps large numbers of cats without having the ability to properly house or care for them.  They may be ignorant about their situation.

Some writers, celebrities, and artists have challenged the gender based "Crazy Cat Lady" stereotype, and embraced the term to mean an animal lover or rescuer who cares for one or multiple cats, and who is psychologically healthy.

Toxoplasma Gondii

Recent research indicates a link between the parasite Toxoplasma gondii, which sexually reproduces exclusively in cats, and numerous psychiatric conditions, including OCD.  The compulsive hoarding of cats, a symptom of obsessive compulsive disorder (OCD), has long been associated with "crazy cat ladies".  Mass media has drawn on this stereotype to coin the term Crazy Cat Lady Syndrome to refer to the association between T. gondii and psychiatric conditions.

Friday, January 22, 2016

The Silk Road

The Silk Road or Silk Route is an ancient network of trade routes that were central to cultural interaction through regions of the Asian continent connecting the West and East from China and India to the Mediterranean Sea.

The Silk Road derives its name from the lucrative trade in Chinese silk carried out along its length, beginning during the Han dynasty (207 BC – 220 AD). The Central Asian sections of the trade routes were expanded around 114 BC by the Han dynasty, largely through the missions and explorations of the Chinese imperial envoy Zhang Qian.  The Chinese took great interest in the safety of their trade products and extended the Great Wall of China to ensure the protection of the trade route.

Trade on the Silk Road was a significant factor in the development of the civilizations of China, the Indian subcontinent, Persia, Europe, the Horn of Africa and Arabia, opening long-distance, political and economic relations between the civilizations.  Though silk was certainly the major trade item from China, many other goods were traded, and religions, syncretic philosophies, and various technologies, as well as diseases, also travelled along the Silk Routes. In addition to economic trade, the Silk Road served as a means of carrying out cultural trade among the civilizations along its network.

The main traders during antiquity were the Chinese, Persians, Somalis, Greeks, Syrians, Romans, Armenians, Indians, and Bactrians, and from the 5th to the 8th century the Sogdians. Following the emergence of Islam, Arab traders became prominent.

In June 2014 UNESCO designated the Chang'an-Tianshan corridor of the Silk Road as a World Heritage Site.



Thursday, January 21, 2016

The Green Children of Woolpit

The legend of the green children of Woolpit concerns two children of unusual skin colour who reportedly appeared in the village of Woolpit in Suffolk, England, some time in the 12th century, perhaps during the reign of King Stephen. The children, brother and sister, were of generally normal appearance except for the green colour of their skin. They spoke in an unknown language, and the only food they would eat was beans. Eventually they learned to eat other food and lost their green pallor, but the boy was sickly and died soon after he and his sister were baptised. The girl adjusted to her new life, but she was considered to be "rather loose and wanton in her conduct".  After she learned to speak English, the girl explained that she and her brother had come from St Martin's Land, an underground world inhabited by green people.

The only near-contemporary accounts are contained in William of Newburgh's Historia rerum Anglicarum and Ralph of Coggeshall's Chronicum Anglicanum, written in about 1189 and 1220 respectively. Between then and their rediscovery in the mid-19th century, the green children seem to surface only in a passing mention in William Camden's Britannia in 1586, and in Bishop Francis Godwin's fantastical The Man in the Moone, in both of which William of Newburgh's account is cited.

Two approaches have dominated explanations of the story of the green children: that it is a folktale describing an imaginary encounter with the inhabitants of another world, perhaps one beneath our feet or even extraterrestrial, or that it is a garbled account of a historical event. The story was praised as an ideal fantasy by the English anarchist poet and critic Herbert Read in his English Prose Style, published in 1931. It provided the inspiration for his only novel, The Green Child, written in 1934.

Wednesday, January 20, 2016

Soundtrack Maestros


Introduction by the Blog Author

After World War II, pop music changed substantially.  Big Band music never really returned to live tours, due to the expense of taking so many musicians out on the road and due to competition from the crisp sound of the new FM radio.  Instrumental arrangers and conductors like Percy Faith, Henry Mancini, Hugo Montenegro, Nelson Riddle, Les Baxter, Mantovani and many others were heard on the radio and on record albums in the new, long-playing 33 rpm format.

The AM radio arena became the medium of choice for rock and roll and for talk radio.  But as music progressed into the 1960s, the cost of getting a full orchestra together for recording sessions skyrocketed.  There were fewer and fewer instrumental hits.  Who could afford to assemble a full orchestra?

The answer was supplied by motion picture budgets.  The motion picture audience still wanted lush, new melodic lines for the pictures they went to see.  As motion pictures were released, the soundtrack album became part of the overall scene by plan and promotion, a trend that started with the hypnotic main theme for the movie Spellbound in 1945.  The composer/conductor/arranger became the standard.  After the death of Victor Young and the retirement of Max Steiner in the 1950s, the artists in the list below took over and have provided millions of listeners with continued instrumental music, some of which is so infectious and hypnotic that modern classical orchestras (sometimes reluctantly!) have turned to it to increase ticket sales for their live performances.
 
Very Significant Modern Soundtrack Composers
 
John Barry

Jerry Goldsmith

Henry Mancini

Alan Silvestri

Lalo Schifrin

Ennio Morricone

Stanley Turrentine

Hans Zimmer

James Horner

John Powell

Danny Elfman

Michael Giacchino

Harry Gregson-Williams

Paul Desmond

David Arnold

Alexandre Desplat

Peter Kater

Angelo Badalamenti

Thomas Newman

Ramin Djawadi

Tuesday, January 19, 2016

Judicial Review

Judicial review is the doctrine under which legislative and executive actions are subject to review by the judiciary. A court with judicial review power may invalidate laws and decisions that are incompatible with a higher authority, such as the terms of a written constitution.  Judicial review is one of the checks and balances in the separation of powers: the power of the judiciary to supervise the legislative and executive branches when the latter exceed their authority. The doctrine varies between jurisdictions, so the procedure and scope of judicial review may differ between and within countries.

General

Judicial review can be understood in the context of two distinct—but parallel—legal systems, civil law and common law, and also by two distinct theories of democracy regarding the manner in which government should be organized with respect to the principles and doctrines of legislative supremacy and the separation of powers.

First, two distinct legal systems, civil law and common law, have different views about judicial review. Common-law judges are seen as sources of law, capable of creating new legal principles, and also capable of rejecting legal principles that are no longer valid. In the civil-law tradition, judges are seen as those who apply the law, with no power to create (or destroy) legal principles.

Secondly, the idea of separation of powers is another theory about how a democratic society's government should be organized. In contrast to legislative supremacy, the idea of separation of powers was first introduced by Montesquieu; it was later institutionalized in the United States by the Supreme Court ruling in Marbury v. Madison under the court of John Marshall. Separation of powers is based on the idea that no branch of government should be able to exert power over any other branch without due process of law; each branch of government should have a check on the powers of the other branches of government, thus creating a regulative balance among all branches of government. The key to this idea is checks and balances. In the United States, judicial review is considered a key check on the powers of the other two branches of government by the judiciary.

Differences in organizing "democratic" societies led to different views regarding judicial review, with societies based on common law and those stressing a separation of powers being the most likely to utilize judicial review. Nevertheless, many countries whose legal systems are based on the idea of legislative supremacy have learned the possible dangers and limitations of entrusting power exclusively to the legislative branch of government. Many countries with civil-law systems have adopted a form of judicial review to stem the tyranny of the majority.

Another reason why judicial review should be understood in the context of both the development of two distinct legal systems (civil law and common law) and two theories of democracy (legislative supremacy and separation of powers) is that some countries with common-law systems do not have judicial review of primary legislation. Though a common-law system is present in the United Kingdom, the country still has a strong attachment to the idea of legislative supremacy; consequently, judges in the United Kingdom do not have the power to strike down primary legislation. However, since the United Kingdom became a member of the European Union there has been tension between its tendency toward legislative supremacy and the EU's legal system, which specifically gives the Court of Justice of the European Union the power of judicial review.

Judicial Review of Administrative Acts

Most modern legal systems allow the courts to review administrative acts (individual decisions of a public body, such as a decision to grant a subsidy or to withdraw a residence permit). In most systems, this also includes review of secondary legislation (legally enforceable rules of general applicability adopted by administrative bodies). Some countries (notably France and Germany) have implemented a system of administrative courts which are charged with resolving disputes between members of the public and the administration. In other countries (including the United States and United Kingdom), judicial review is carried out by regular civil courts although it may be delegated to specialized panels within these courts (such as the Administrative Court within the High Court of England and Wales). The United States employs a mixed system in which some administrative decisions are reviewed by the United States district courts (which are the general trial courts), some are reviewed directly by the United States courts of appeals and others are reviewed by specialized tribunals such as the United States Court of Appeals for Veterans Claims (which, despite its name, is not technically part of the federal judicial branch). It is quite common that before a request for judicial review of an administrative act is filed with a court, certain preliminary conditions (such as a complaint to the authority itself) must be fulfilled. In most countries, the courts apply special procedures in administrative cases.


= = = = = = = = = = = = =  = = = = = = = = = = = = = = = =

Judicial Review in the United States

In the United States, judicial review is the ability of a court to examine and decide if a statute, treaty or administrative regulation contradicts or violates the provisions of existing law, a State Constitution, or ultimately the United States Constitution. While the U.S. Constitution does not explicitly define a power of judicial review, the authority for judicial review in the United States has been inferred from the structure, provisions, and history of the Constitution.

Two landmark decisions by the U.S. Supreme Court served to confirm the inferred constitutional authority for judicial review in the United States: In 1796, Hylton v. United States was the first case decided by the Supreme Court involving a direct challenge to the constitutionality of an act of Congress, the Carriage Act of 1794 which imposed a "carriage tax".  The Court engaged in the process of judicial review by examining the plaintiff's claim that the carriage tax was unconstitutional. After review, the Supreme Court decided the Carriage Act was not unconstitutional. In 1803, Marbury v. Madison was the first Supreme Court case where the Court asserted its authority for judicial review to strike down a law as unconstitutional. At the end of his opinion in this decision, Chief Justice John Marshall maintained that the Supreme Court's responsibility to overturn unconstitutional legislation was a necessary consequence of their sworn oath of office to uphold the Constitution as instructed in Article Six of the Constitution.

As of 2014, the United States Supreme Court has held 176 Acts of the U.S. Congress unconstitutional.

Monday, January 18, 2016

"Declaratory Judgment" Explained

A declaratory judgment, also called a declaration, is the legal determination of a court that resolves legal uncertainty for the litigants. It is a form of legally binding preventive adjudication by which a party involved in an actual or possible legal matter can ask a court to conclusively rule on and affirm the rights, duties, or obligations of one or more parties in a civil dispute (subject to any appeal).  The declaratory judgment is generally considered a statutory remedy and not an equitable remedy in the United States, and is thus not subject to equitable requirements, though there are analogies that can be found in the remedies granted by courts of equity.  A declaratory judgment does not by itself order any action by a party, or imply damages or an injunction, although it may be accompanied by one or more other remedies.

The declaratory judgment is distinguished from another important non-monetary remedy, the injunction, in two main ways. First, the injunction has, and the declaratory judgment lacks, a number of devices for managing the parties.  Second, the declaratory judgment is sometimes available at an earlier point in a dispute, because it is not subject to the equitable ripeness requirement.

A declaratory judgment is generally distinguished from an advisory opinion because the latter does not resolve an actual case or controversy. Declaratory judgments can provide legal certainty to each party in a matter when this could resolve or assist in a disagreement. Often an early resolution of legal rights will resolve some or all of the other issues in a matter.

A declaratory judgment is typically requested when a party is threatened with a lawsuit but the lawsuit has not yet been filed; or when a party or parties believe that their rights under law and/or contract might conflict; or as part of a counterclaim to prevent further lawsuits from the same plaintiff (for example, when only a contract claim is filed, but a copyright claim might also be applicable). In some instances, a declaratory judgment is filed because the statute of limitations against a potential defendant may pass before the plaintiff incurs damage (for example, a malpractice statute applicable to a certified public accountant may be shorter than the time period the IRS has to assess a taxpayer for additional tax due to bad advice given by the C.P.A.).

Declaratory judgments are authorized by statute in most common-law jurisdictions. In the United States, the federal government and most states enacted statutes in the 1920s and 1930s authorizing their courts to issue declaratory judgments.

Sunday, January 17, 2016

"Standing" in U.S. Law

In law, standing or locus standi is the term for the ability of a party to demonstrate to the [American] court sufficient connection to and harm from the law or action challenged to support that party's participation in the case. Standing exists from one of three causes:

  1. The party is directly subject to an adverse effect by the statute or action in question, and the harm suffered will continue unless the court grants relief in the form of damages or a finding that the law either does not apply to the party or that the law is void or can be nullified. This is called the "something to lose" doctrine, in which the party has standing because they directly will be harmed by the conditions for which they are asking the court for relief.
  2. The party is not directly harmed by the conditions by which they are petitioning the court for relief but asks for it because the harm involved has some reasonable relation to their situation, and the continued existence of the harm may affect others who might not be able to ask a court for relief. In the United States, this is the grounds for asking for a law to be struck down as violating the First Amendment, because while the plaintiff might not be directly affected, the law might so adversely affect others that one might never know what was not done or created by those who fear they would become subject to the law – the so-called "chilling effects" doctrine.
  3. The party is granted automatic standing by act of law.  Under some environmental laws in the United States, a party may sue someone causing pollution to certain waterways without a federal permit, even if the party suing is not harmed by the pollution being generated. The law allows them to receive attorney's fees if they substantially prevail in the action. In some U.S. states, a person who believes a book, film or other work of art is obscene may sue in their own name to have the work banned directly without having to ask a District Attorney to do so.

In the United States, the current doctrine is that a person cannot bring a suit challenging the constitutionality of a law unless the plaintiff can demonstrate that he/she/it is or will "imminently" be harmed by the law. Otherwise, the court will rule that the plaintiff "lacks standing" to bring the suit, and will dismiss the case without considering the merits of the claim of unconstitutionality. To have a court declare a law unconstitutional, there must be a valid reason for the lawsuit. The party suing must have something to lose in order to sue unless it has automatic standing by action of law.

Prudential Limitations

Additionally, there are three major prudential (judicially created) standing principles. Congress can override these principles via statute:

  1. Prohibition of third-party standing: A party may only assert his or her own rights and cannot raise the claims of a third party who is not before the court; exceptions exist where the third party has interchangeable economic interests with the injured party, or where a person unprotected by a particular law sues to challenge the oversweeping of the law into the rights of others. For example, a party suing over a law prohibiting certain types of visual material may sue because the First Amendment rights of that party, and of others engaged in similar displays, might be damaged.

    Additionally, third parties who do not have standing may be able to sue under the next friend doctrine if the third party is an infant, mentally handicapped, or not a party to a contract. One example of a statutory exception to the prohibition of third party standing exists in the qui tam provision of the Civil False Claims Act.
  2. Prohibition of generalized grievances: A plaintiff cannot sue if the injury is widely shared in an undifferentiated way with many people. For example, the general rule is that there is no federal taxpayer standing, as complaints about the spending of federal funds are too remote from the process of acquiring them. Such grievances are ordinarily more appropriately addressed in the representative branches.
  3. Zone of interest test: There are in fact two tests used by the United States Supreme Court for the zone of interest:
    1. Zone of injury - The injury is the kind of injury that Congress expected might be addressed under the statute.
    2. Zone of interests - The party is arguably within the zone of interest protected by the statute or constitutional provision.

Saturday, January 16, 2016

John O'Hara's short stories

The Dark Gift of Writer John O’Hara

Below is a review from Amazon.com about a collection of O’Hara short stories, followed by three comments from review readers, and then an afterword by the blog author.

Not among the great writers of short stories

June 27, 2010

By Shalom Freedman

This review is from: Collected Stories of John O'Hara: Selected and With an Introduction by Frank MacShane (Hardcover)

Frank MacShane in his informative introduction to these Selected Stories claims that O'Hara is among the first rank of story writers. MacShane says it is the skill at dialogue, the brilliance at capturing so many different human types and voices which puts O'Hara in the first rank of story-writers. The opening story of the collection which was the best of those I read 'The Doctor's Son' gives some substantiation to MacShane's assertion. Set during the influenza epidemic during the First War it tells of a doctor's son and his adventures with the young doctor who comes to fill in for his father. In making the rounds we meet various communities each a part of the American mosaic. The dialogue brings the various voices to life. But what makes the story powerful is the harrowing and frightening descriptions of the effects of the epidemic. There is a sharpness and strength in the writing. One has a sense of a writer who really knows his characters and world. And yet the story ends to my mind in a disappointing way. Somehow the characters are not explored deeply enough. And if this is true in the opening story it is even more so in many of the more incidental smaller pieces which are more properly called vignettes. There is in the O'Hara world much incidental meaning, much off-hand connection. There are not the passions of Chekhov and Isaac Singer, nothing like the linguistic brilliance of Joyce, no characters who win our sympathy as Anderson's do in 'Winesburg Ohio'. There is not the humor of the Damon Runyan and Ring Lardner worlds and nothing of the incredible haunting originality of Kafka, or the colloquial loveableness of Salinger. There is rather a sense of toughness and mean-spiritedness, of down-and-outers and high-society folks who all are in some way or another not very generous, loving, kind, sensitive to the beautiful and the good.
In one sense the greatest Literature is that which makes us love life more, and see it as something greater than what we ourselves have known and experienced. Unfortunately O'Hara's work has for me the opposite effect.

= = = = = = = = = = = = = = = = = = = = = = = = =

Three reviews of the above review

Magyar says:

You are certainly entitled to your opinion. I, personally, disagree with you VERY strongly. I don't know what stories are in here, EXCEPT "Imagine Kissing Pete", which (again in my opinion) is one of the best Novellas ever written.
My suggestion to you is to pick up "Cape Cod Lighter". Read "You don't remember Me", "Your Funeeh, Funeeh face", and the short story with Lefty Gaines as the main character. (The story that starts at a funeral). If you STILL do not think John O'Hara was one of America's best short story writers, then we agree to disagree.

= = = = = = = = = =

manny says:

writing is subjective. I believe I have read most of his shortstories. my sister recently gave me the gift of "the horse knows the way", which I had read long ago. still wonderful. if you don't get the writing , well, whatever.

= = = = = = = = = =

Anna Hunt says:

I would have to say I'm for the defense on this. O'Hara is a brilliant writer, but requires close reading. In his best writing the real story is in the background, the Doctor's Son being a perfect example. The characters in the foreground of the story are the son, Myers and Evans; the affair being the center of the plot, but the real main character is the influenza and how it is affecting everyone's lives. The change in the relationship between Myers and the father shows this. The father obviously finds out about the affair and cuts it short by returning to work; he casually mentions he doesn't have a place on his staff for Myers and steers him away from Gibbsville. How the epidemic affects everyone is dealt with in a very matter of fact way, the way it happens in real life. It's something he does over and over again. It is more obvious in works like Pal Joey, but much more subtle in, for example, a Rage to Live. Having said that, it can be easy to miss what is happening in the background; that just makes him more of a pleasure to reread. But I would agree there is a toughness, and sometimes a meanness, towards his characters and you can feel him toying with them.


= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

Afterword by the Blog Author

More than a little bit like Kipling, John O’Hara remains one of the genuinely hated twentieth century writers.  I think O’Hara will survive and grow in the twenty-first century, partly because of the bias in that hatred.

O’Hara himself was needled by the critics of his own time and not regarded as one of the best writers; specifically, in the eyes of critics he was a second banana to John Steinbeck.  O’Hara’s defense, which is my own defense of O’Hara as well, is that he was writing about the actual lives of most Americans in the first half of the twentieth century rather than writing sentimentally about rural America, as Steinbeck did famously and successfully.

O’Hara believed in the inherent pettiness and resentfulness of the American people, a view that is abhorrent to many but which has not been successfully challenged as a matter of intelligent criticism.  Because O’Hara brought the concept of original sin into modern life, vividly and through the storytelling of a classic, Greek, cynic narrator, he was a significant and even great writer.  O’Hara is telling us that the great American religion is resentment and that the overwhelming mood of Americans is competitiveness.

O’Hara’s peer and fellow-traveler, at least in terms of my own reading, is Margaret Mitchell and her not-yet-fully-appreciated masterpiece Gone with the Wind, a war epic that hauntingly remains the great story of obsessive love that falls apart.  These are both writers who shunned happy endings.  And contrary to most cookbooks on how to write fiction, they both wrote books that became fabulously successful movies.  O’Hara’s Butterfield 8 secured a best actress award deservedly given to Elizabeth Taylor.

O’Hara’s world is neither pretty nor optimistic.  But I want to say something about his overview because this is 2016 and I am 64 years old.  My grandparents were born in 1894, 1894, 1900 and 1907.  They were in their 20s and 30s when O’Hara wrote his best work during the Depression.  He was right.  He wrote accurately and journalistically about a mean world that worshipped resentment.  The worth inherent in such narration may finally be valued when the last of the Depression-era children are gone.