Irma – Yes All Staff Have Evacuated Florida


 

All our staff have evacuated. We were all under mandatory evacuation orders, so thank you for all the concern. Our decision to leave was driven primarily by the storm's shift to the west and the likelihood of power outages lasting days or weeks. Generators also depend upon natural gas, and if the storm is intense above Sarasota, there is a potential that even the gas could be turned off. While Irma is not forecast to strike Cuba, cyclically speaking it should. We will see what happens. If this is correct, then Irma may impact the western part of Florida even more than the eastern.

This is the 32nd (Pi) Category 5 hurricane since 1924. That is not a good number cyclically speaking. Only five times has more than one Category 5 hurricane formed during the same season — 1932, 1933, 1961, 2005, and 2007. Only in 2005 did more than two Category 5 hurricanes form, and only in 2007 did more than one make landfall at Category 5 strength.

Moreover, there is a 96-year cycle in Category 5 hurricanes. There was a serious hit in 1921, but the 1825 Hurricane Santa Ana took place on July 26, hitting near Guadeloupe. It continued west-northwestward to hit Puerto Rico, causing 1,300 deaths, before tracking to the west of Bermuda by August 2. The Santa Ana Hurricane was quite honestly one of the most intense storms to strike Puerto Rico in the previous few hundred years. While no atmospheric pressure readings were taken in Puerto Rico, a minimum pressure of 918 mbar (27.1 inHg) in Guadeloupe indicated that the storm was likely a Category 5 hurricane (there are no officially observed Category 5 hurricanes before 1924).

The Tampa Bay hurricane of 1921 (also known as the 1921 Tarpon Springs hurricane) is the most recent major hurricane to strike the Tampa Bay Area. The storm developed from a trough in the southwestern Caribbean Sea on October 20. Initially a tropical storm, the system moved northwestward and intensified into a hurricane on October 22 and a major hurricane by October 23. Later that day, the cyclone peaked as a Category 4 before it hit Tampa. Typically, Category 5 storms develop most often in the Atlantic.

There is evidence of even a monster storm that may have been greater than Category 5. Using paleotempestological research, scientists have been able to identify past major hurricanes by comparing sedimentary evidence of recent and past hurricane strikes. A massive hurricane, significantly more powerful than Category 5, has been identified in Belizean sediment, having struck the region sometime before 1500.

We also know that in 1715 and again in 1733, Spain’s treasure fleets were devastated by hurricanes off the coast of Florida. Although the Spanish managed to recover some treasure, much more remained on the ocean floor. The sunken ships lay forgotten for more than 200 years until modern treasure hunters discovered several of them. The coins recovered have been sold off at auctions.

The Last Total Eclipse Was 2 Days Before the Signing of the Declaration of Independence So Celebrate Today


Solar eclipses have been observed throughout history. As is to be expected, they have often been seen as omens, along with comets. We hear of the star of Bethlehem that announced the birth of Jesus, and of the comet that appeared the night Julius Caesar was assassinated, which was recorded on coins as proof of his divinity.

Yet eclipses have also been seen as major markers of events throughout time. There is of course the famous eclipse that marked the Crucifixion of Jesus. The Christian gospels say that the sky was darkened for hours after the crucifixion of Jesus, which historians viewed either as a miracle or as a portent of dark times to come. Later historians used astronomy to pinpoint the death of Christ based on this eclipse mention. Some historians tie the crucifixion to a total solar eclipse lasting 1 minute and 59 seconds that occurred in the year 29 AD, while others say a second total eclipse, blocking the sun for 4 minutes and 6 seconds in 33 AD, marked Jesus' death. The Vatican Codex, the oldest copy of the Greek Bible, speaks of the solar eclipse in the Book of Luke. This is supported by a Greek historian named Phlegon, who also recorded a solar eclipse at the same period. Phlegon wrote a history entitled The Olympiads, in which he records a three-hour eclipse of the sun during the rule of Tiberius Caesar, in whose reign Jesus was crucified, along with the great earthquakes that then took place.

Yet what is very interesting is the fact that the eclipse of 33 AD also coincided with the Great Financial Panic in Rome. The Financial Panic of 33 AD is one of the few events recorded in detail by the ancient historian Tacitus (56–117 AD), and his account focuses primarily on moneylending.

Where a comet marked the birth of Jesus, it was an eclipse that marked the birth of Mohammed. The Koran mentions an eclipse that preceded the birth of Mohammed, which historians later tied to a total eclipse lasting 3 minutes and 17 seconds in 569 AD. The sun also disappeared for 1 minute and 40 seconds after the death of Mohammed's son Ibrahim. But the world's first Muslims didn't believe that eclipse was a sign from God. Instead, according to Islamic texts called the Hadiths, Mohammed proclaimed that "the sun and the moon do not suffer eclipse for any one's death or life." Muslims pray five times daily, but during eclipses they additionally perform the "eclipse prayer," whose purpose is to remember the might and gifts of Allah the Creator.

Eclipses of the Sun are indeed awe-inspiring phenomena that have given rise to legends and correlations throughout time. Such events have truly inspired a variety of claims. In many early cultures they were seen as omens of the coming end of the world. The word eclipse is of Greek origin, from a word meaning abandonment, desertion, or relinquishment.

In China, India, and Southeast Asia, as well as in Peru, there were beliefs that dragons or demons attacked the Sun during such eclipses, trying to devour it. The ancient Egyptian myth of the snake Apep attacking the boat of the Sun god is now believed to refer to solar eclipses. The cult of the Sun god Sol remains with us today in so many ways. The cult was that of Sol Invictus, the invincible sun that reappeared every day. Sol was depicted with rays of sunlight coming from his head, which is the model for the Statue of Liberty. The rays also appeared as a halo around his head, and this tradition is why Christians often used the same symbol around the heads of Jesus and saints.

The Chinese and the Incas tried to frighten away these dragons and demons that were trying to devour the sun and end the world, using various incantations. The incantations were believed to work, since the eclipse would always end, and everyone most likely cheered and gave money to the priests. The Indians immersed themselves in water as a religious ritual to help the Sun struggle against the dragon. The bathing rituals are often linked to this idea.

Even today, in some countries, it is still traditional to bang pots, chant, or shoot into the air when an eclipse happens to frighten off bad omens and preserve the world for another day. Some superstitions have even attributed disease to solar eclipses, which were said to usher in famine and pestilence.

It was the Babylonians who in fact ran the first state-funded research project, recording everything that ever moved, from the planets to the migration of animals. Indeed, it was this massive state-funded project that not merely gave rise to astrology, which tried to link events on earth and traits in people to the cyclical movement of the heavens, but also made the Babylonian priests very powerful. They interpreted events as omens that indicated an impending miracle, the wrath of God, or the doom of a ruling dynasty. They understood the eclipse and could pronounce that they were so powerful they could darken the sun tomorrow. Who could possibly do that without awesome power? For that, the people gave them piles of gold. In 129 BC, the Greek astronomer Hipparchus published his star charts, relying upon Babylonian records from several hundred years prior.

A Saros series is a series in which similar eclipses recur every 18 years and 10 or 11 and one-third days. Eclipses happen more often than that, but the cycles are complex, for there are many combinations of circumstances that can produce an eclipse. At any given time there are about 42 Saros cycles running at once, which produces more than 2 eclipses per year. The same cycle comes together to produce the same set of similar eclipses every 18 years and 10 or 11 and one-third days. This period is what the Babylonian priests/astronomers discovered 2,500 years ago, and we retain the name Saros, meaning "repetition." This too is fractal, and we are currently in Saros series 125. The series contains 72 eclipses, occurring over 1,280 years.
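To make that arithmetic concrete, here is a minimal sketch in Python (not part of the original post; the 6,585.32-day figure is the commonly cited rounded value for one Saros, and the dates it prints are only rough estimates). Stepping a known eclipse date forward by one Saros lands about 18 years and 10 or 11 days later, and the leftover third of a day is what shifts each successive eclipse roughly 120 degrees of longitude to the west.

```python
from datetime import datetime, timedelta

SAROS_DAYS = 6585.32  # one Saros: ~18 years plus 10 or 11 days plus about a third of a day

def later_in_series(eclipse_date, steps=1):
    """Rough date of a later eclipse in the same Saros series (illustrative only)."""
    return eclipse_date + timedelta(days=SAROS_DAYS * steps)

great_american = datetime(2017, 8, 21)
print(later_in_series(great_american))     # about early September 2035: one Saros later, shifted ~1/3 day west
print(later_in_series(great_american, 3))  # three Saros (one "exeligmos"), roughly 54 years later, near the same longitude
```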

Ancient eclipse records made in China and Babylonia are believed to be over 4,000 years old. Recent research has demonstrated that solar eclipses were depicted in the fascinating mythology of ancient Egypt, and has produced evidence that the ancient Egyptians observed solar eclipses over 4,500 years ago.

The earliest recorded eclipse is described on an ancient clay tablet found in Ugarit in modern-day Syria. A total solar eclipse was also linked to an uprising in an ancient Assyrian city. So will the total eclipse over the United States be an omen of what our models project to begin in 2018, rekindling this view of a total eclipse as an omen of what is to come?

After all, the eclipse taking place on Aug. 21, 2017 will be the FIRST total solar eclipse visible in the United States in nearly four decades. It is being called the Great American Total Solar Eclipse, and the 70-mile-wide shadow cast by the moon will darken skies from Oregon to South Carolina. In most solar eclipses, the moon hides just a tiny bit of the sun, but this will be a rare total solar eclipse.

Believe it or not, this August event will go down as the FIRST total solar eclipse whose path of totality stays completely in the United States since 1776, when the American Revolution took place. The 1776 Eclipse was a total eclipse of the Moon which occurred on Wednesday, July 31st, 1776. The Declaration of Independence was presented on July 1, 1776, and on the following day 12 of the 13 colonies voted in favor of Richard Henry Lee's motion for independence. The delegates then spent the next two days debating and revising the language of a statement drafted by Thomas Jefferson. Then on July 4th, Congress officially adopted the Declaration of Independence. However, it was nearly a month before the actual signing of the document took place. First, New York's delegates didn't officially give their support until July 9th, which is why many have long viewed "New Yorkers" as marching to a different beat, and why red lights are optional in the city for pedestrians. Then it took two weeks for the Declaration to be "engrossed," which meant written on parchment in a clear hand. Most of the delegates then signed the Declaration of Independence on August 2nd, 1776, two days after the eclipse, which some said was also an omen.

This time, the 2017 eclipse comes in August when our model has been projecting an important turning point. It is very strange that this is coming at a time when our model is warning of the start of the Monetary Crisis in 2018. Is this also an omen as was the 1776 eclipse?

The fascinating question is whether this eclipse will make the TOP TEN eclipses in history. There was the Ugarit Eclipse, one of the earliest solar eclipses ever recorded, which darkened the sky for 2 minutes and 7 seconds on May 3, 1375 BC, according to an analysis of a clay tablet discovered in 1948. However, this was revised when a report appeared in the journal Nature in 1989 suggesting that the eclipse actually took place on March 5, 1223 BC, based upon historical dating of the tablet and the text, which mentioned the visibility of the planet Mars during the eclipse. In approximately 1230 BC, there was the Battle of Nihriya, which marked the fall of the Hittite Empire to the Assyrian Empire. The rise of the Assyrians even intimidated the Egyptians, and 2 Chronicles 32:15 reads:

‘Now therefore, do not let Hezekiah deceive you or mislead you like this, and do not believe him, for no god of any nation or kingdom was able to deliver his people from my hand or from the hand of my fathers. How much less will your God deliver you from my hand?’”

About 1221 BC, the Pharaoh Merneptah defeated the Libyan invasion, a combined force allied with the Sea Peoples, who were most likely invading Greeks.

The next major event is known as the Assyrian Eclipse. In 763 BC, the Assyrian empire, which occupied modern Iraq, saw a total eclipse of the sun for 5 minutes. Early records from the period mention the eclipse in the same passage as an insurrection in the city of Ashur, now known as Qal'at Sherqat in Iraq, suggesting that the ancient people linked the two in their minds. This marked the real beginning of the Assyrian Empire's aggression, for it then conquered Egypt in 671 BC following the good omen of the appearance of Halley's Comet in 674 BC. The Assyrian Empire lasted for only three waves of 51.6 years before it fell at the Battle of Carchemish in 605 BC to the armies of Babylonia, allied with the Medes, Persians, and Scythians. There were eclipses in 605 BC and 602 BC. Three months, December 689 BC, November 678 BC, and October 602 BC, had two eclipses. The Babylonian king Nebuchadnezzar recorded in his Chronicle that he:

“…crossed the river to go against the Egyptian army which lay in Karchemiš. They fought with each other and the Egyptian army withdrew before him. He accomplished their defeat, decisively. As for the rest of the Egyptian army which had escaped from the defeat so quickly that no weapon had reached them, in the district of Hamath the Babylonian troops overtook and defeated them so that not a single man escaped to his own country. At that time Nebuchadnezzar conquered the whole area of Hamath.”

In 1302 BC, Chinese historians documented a total eclipse which blocked out the sun for 6 minutes and 25 seconds. Since the sun was a symbol of the emperor, an eclipse was naturally a serious warning to him. After an eclipse, the duty of the emperor was to fast, eating only vegetarian meals, and to perform rituals to rescue the sun. In fact, in ancient China, solar and lunar eclipses were regarded as heavenly signs that foretold the future of the Emperor, and if an astrologer failed to predict an eclipse, he was executed. Finally, by 20 BC, Chinese astronomers realized the true nature of solar eclipses, and by 206 AD they were able to predict solar eclipses by analyzing the motion of the Moon.

Ruins at Yin

This event also seems to have marked the time when Pan Geng moved the capital of the Shang Dynasty to Yin in China, perhaps because of this eclipse. However, this is also the period (c. 1312–1300 BC) when the revelation of the Torah to Moses took place.

Herodotus, the Greek father of history, reported that Thales (ca. 624-547 BC), the Greek philosopher, predicted the solar eclipse of May 28th, 585 BC that put an end to the conflict between the Lydians and the Medes.

Herodotus wrote:

… day was all of sudden changed into night. This event had been foretold by Thales, the Milesian, who forewarned the Ionians of it, fixing for it the very year in which it took place. The Medes and the Lydians when they observed the change, ceased fighting, and were alike anxious to have terms of peace agreed on.

The Eclipse of 241 BC marked the end of the First Punic War (264–241 BC), and the first Emperor of China, Qin Shi Huang (259–210 BC), came to the throne in his home state in 246 BC. He then engaged in the Warring States period, conquering all of China and becoming emperor in 221 BC. There was a total eclipse on June 4th, 241 BC at 02:47:32 in Saros Cycle 72. This seems to have marked the fall of Carthage and the rise of Rome, as well as the start of the Qin conquest of China. We can see the cost of the Punic Wars in the debasement of the bronze coinage of Rome. The silver denarius was introduced in 211 BC. It was also Eratosthenes who calculated the circumference of the Earth in 240 BC and determined that it was round. His data was rough, but he wasn't far off.
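As an aside on how that measurement worked, here is a small illustrative calculation in Python using the figures traditionally attributed to Eratosthenes (assumed here for the example, not taken from this post): at the summer solstice the noon sun was directly overhead at Syene while a vertical rod at Alexandria cast a shadow at about 7.2 degrees, and the two cities were reckoned to be about 5,000 stadia apart.

```python
# Figures traditionally attributed to Eratosthenes (assumptions for illustration)
shadow_angle_deg = 7.2   # noon shadow angle at Alexandria while the sun was overhead at Syene
distance_stadia = 5000   # estimated Alexandria-to-Syene distance

# 7.2 degrees is 1/50 of a full 360-degree circle, so the circumference
# is 50 times the distance between the two cities.
circumference_stadia = distance_stadia * (360 / shadow_angle_deg)
print(circumference_stadia)  # 250000.0 stadia, on the order of 40,000 km depending on the stadion used
```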

Also on this list of the TOP TEN eclipses in history we have King Henry’s Eclipse in England. When King Henry I of England, the son of William the Conqueror, died in A.D. 1133, the event coincided with a total solar eclipse that lasted 4 minutes and 38 seconds. After this eclipse and the death of Henry I, a struggle for the throne plunged England into chaos and civil war. In the Anglo-Saxon Chronicle, a passage is also found recording this eclipse:

“In this year King Henry went over sea at Lammas, and the second day as he lay and slept on the ship the day darkened over all lands; and the Sun became as it were a three-night-old Moon, and the stars about it at mid-day. Men were greatly wonder-stricken and were frightened, and said that a great thing should come thereafter. So it did, for the same year the king died on the following day after St Andrew’s Mass-day, Dec 2 in Normandy.”

Not all eclipses are bad omens. The element helium was discovered on August 18th, 1868 by the French astronomer Jules Janssen (1824-1907) when he observed the spectrum of the Sun during a total eclipse in India. Then there was also Einstein's Eclipse of 1919. It was this solar eclipse where the sun vanished for 6 minutes and 51 seconds. This allowed the observation that gravity could in fact bend light from the stars as it passed near the sun. The findings confirmed Einstein's theory of general relativity, which describes gravity as a warping of space-time. The British astronomer Sir Arthur Eddington (1882-1944) travelled to the island of Príncipe near Africa to observe that eclipse and verify Einstein's conclusion that light is deflected in the gravitational fields of celestial objects.

So the question becomes, does this 2017 Solar Eclipse mark the fall of the Western Culture and the beginning of the shift in the financial capital of the world to China?

It is curious that such events tend to line up at times with our model. Of course, our model is derived from the data, not astrology. So we have tracked the trends (effects) but have never attempted to attribute the cause to some mystical event.

We track the effects and the ebb and flow of human events that leave their footprints throughout the centuries. Is there any connection? It is far too complex to try to attribute such things to a single event. There are numerous total eclipses throughout history. Where they occur seems to have little correlation to events in other regions.

We will see with this one since it is the first time since 1776 to appear in the United States. Have a party today and say here’s to posterity. We will.

Time will tell.

Thank a White Male


Re-Post from American Thinker on progress and invention, by Jeff Lipkes

Do you like internal combustion engines?

Thank a few white men.  (Jean Lenoir, Nikolaus Otto, Karl Benz, Rudolf Diesel, Gottlieb Daimler, Emil Jellinek, Henry Ford among others.)

Are you a fan of flush toilets and indoor plumbing?

Thank white males Alexander Cumming, Thomas Twyford, and Isaiah Rogers.

Toilet paper?

Thank Joseph Gayetty, W.M.

How about washing machines and dryers?

Thank white males Alva Fisher and J. Ross Moore.

“When you’ve got your health, you’ve got just about everything” ran the tag-line in a famous Geritol commercial from the 1970s, and the guys we most have reason to be grateful for are undoubtedly those who’ve developed the medical practices and the drugs and devices that have transformed our lives over the past hundred fifty years.

Before the turkey gets carved, it’s worth taking a moment to remember a few of these brilliant, persistent, and lucky men, and recall their accomplishments.  Even when they’ve won Nobel Prizes in Medicine, their names are virtually unknown.  They’re not mentioned in the Core Curriculum or celebrated by Google on their birthdays.

Pain

If you ever had surgery, did you opt for anesthesia?

If so, thank a few more white males, beginning with William Clarke in New York and Crawford Long in Georgia, who both used ether in minor surgeries in 1842.  A paper published four years later by William Morton, after his own work in Boston, spread the word.  Chloroform came into use alongside ether during the next decade.  There are now scores of general and regional anesthetics and sedatives and muscle relaxants, administered in tandem.  The first local anesthetic has also been superseded.  It was cocaine, pioneered by a Viennese ophthalmologist, Carl Koller, in 1884.

Ever take an analgesic?

Next time you pop an aspirin, remember Felix Hoffmann of Bayer.  In 1897, he converted salicylic acid to acetylsalicylic acid, much easier on the stomach.  Aspirin remains the most popular and arguably the most effective drug on the market.  In 1948 two New York biochemists, Bernard Brodie and Julius Axelrod, documented the effect that acetaminophen (Tylenol), synthesized by Harmon Morse in 1878, had on pain and fever.  Gastroenterologist James Roth persuaded McNeil Labs to market the analgesic in 1953.

Infectious Diseases

Most Americans today die of heart disease or cancer, but before the twentieth century, it was infectious diseases that struck people down, and children were the primary victims.  In pre-industrial England, still with the most developed economy in the world in the late 17th century, 50% of all children didn’t survive the age of 15.  With the phenomenal growth of cities during the 19th century, cholera, typhoid fever, and tuberculosis became the leading killers.

In 1854, a London medical inspector, John Snow, proved that a cholera epidemic in Soho was caused by infected sewage seeping into the water supply.  Until then it was thought the disease spread through the air.  The sanitary disposal of sewage and the provision of clean water, possible thanks to mostly anonymous metallurgists and engineers — an exception is the famous Thomas Crapper, who pioneered the u-shaped trap and improved, though he didn’t invent, the flush toilet — has saved more lives than any drug or surgical innovation.

Dramatic improvements in food supply have also had an incalculable effect on health.  Agricultural innovations, beginning with those introduced in England in the 18th century, were disseminated globally by the end of the 20th century — the “Green Revolution.”  Famines struck Europe as recently as the late 1860s.  (The man-made famines of the 20th century are another story.)  A transportation revolution made possible the provision of more than sufficient protein, calories, and nutrients worldwide.  Needless to say, it was white males who designed and built the roads, canals, railroads, and ports and airports, and the ships, trains, planes, and trucks that used them, and the mines, and then wells, pipelines, and tankers that supplied the fuel they ran on.

Whatever the merits of taking vitamins and supplements today, no one has to take vitamin C to prevent scurvy, or vitamin B to prevent pellagra, or vitamin D and calcium to prevent rickets.  And, for the time being, we all live in a post-Malthusian world.  The global population was about 800 million to 1 billion when the gloomy parson wrote his famous book in 1798.  It’s now over 7 billion.

***

Dr. Snow had no idea what was actually causing cholera.  It was Louis Pasteur who gave the world the germ theory of disease, as every schoolchild once knew.  Studying the fermentation of wine, he concluded that this was caused by the metabolic activity of microorganisms, as was the souring of milk.  The critters were responsible for disease, too, he recognized, and identified three killer bacteria:  staphylococcus, streptococcus, and pneumococcus.  Nasty microorganisms could be killed or rendered harmless by heat and oxygenation, Pasteur discovered, and the weakened microbes would then prevent the disease in those who were inoculated with them.  He went on to develop vaccines for chicken cholera, anthrax, and rabies.  Edward Jenner had demonstrated in the late 1790s that the dreaded smallpox could be prevented by injecting patients with material from the pustules of cowpox victims, a much milder disease.  (The word vaccine comes from vacca, one of the Latin words for cow.)  Pasteur, however, was the first to immunize patients by modifying bacteria rather than through cross-vaccination.

A parade of vaccines followed.  People in their mid-60s and older can remember two of the most famous:  the Salk and Sabin vaccines against poliomyelitis, a paralyzing disease that had panicked American parents in the late ‘40s and early ‘50s.  Children preferred Albert Sabin’s 1962 version:  the attenuated virus was administered on a sugar cube.  Jonas Salk’s inactivated vaccine, available in 1955, was injected.

In 1847, more than a decade before Pasteur disclosed his germ theory, the Viennese obstetrician Ignaz Semmelweis documented the effectiveness of hand washing with chlorinated water before entering a maternity ward.  He brought mortality rates from puerperal fever down from 8% to 1.3%.  Two decades later, having read a paper by Pasteur, Joseph Lister demonstrated the effectiveness of carbolic acid to sterilize wounds and surgical instruments.  Mortality rates fell from around 50% to about 15%.  The efforts of both men, especially Semmelweis, were met with ridicule and disdain.

Pasteur’s German rivals Robert Koch and Paul Ehrlich made monumental contributions to biochemistry, bacteriology, and hematology, but left the world no “magic bullet” (Ehrlich’s term).  Koch identified the organism causing tuberculosis, the leading killer of the 19th century, but his attempts at finding a vaccine failed.  His purified protein derivative from the bacteria, tuberculin, could be used to diagnose the disease, however.  It was two French researchers, Albert Calmette and Camille Guerin, who developed a successful vaccine, first administered in 1921, though it was not widely used until after World War II.

Ehrlich joined the search for antibacterial drugs that were not denatured bacteria or viruses.  He synthesized neoarsphenamine (Neo-Salvarsan), effective against syphilis, a scourge since the late 15th century, but which had toxic side effects.  It was not until the 1930s that the first generation of antibiotics appeared.  These were the sulfa drugs, derived from dyes with sulfur-nitrogen chains.  The first was a red dye synthesized by Joseph Klarer and Fritz Mietzsch.  In 1935, Gerhard Domagk at I. G. Farben demonstrated its effectiveness in cases of blood poisoning.

The anti-bacterial properties of Penicillium had already been discovered at this point by Alexander Fleming.  The Scottish bacteriologist had famously left a window open in his lab when he went on vacation in 1928, and returned to find that a mold had destroyed the staphylococcus colony in one of his petri dishes.  But it’s one thing to make a fortuitous discovery and another thing to cultivate and purify a promising organic compound and conduct persuasive trials.  This was not done until 1941.  Thank Oxford biochemists Howard Florey and Ernst Chain.  A Pfizer chemist, Joseph Kane, figured out how to mass-produce penicillin and by 1943 it was available to American troops.  The wonder drug of the 20th century, penicillin killed the Gram-positive bacteria that caused meningitis, diphtheria, rheumatic fever, tonsillitis, syphilis, and gonorrhea.  New generations of antibiotics followed, as bacteria rapidly developed resistance:  among them, streptomycin in 1943 (thank Selman Waksman), tetracycline in 1955 (thank Lloyd Conover), and, the most widely prescribed today, amoxicillin.

Diagnostic technologies

Microscope:  While the Delft draper Antonie van Leeuwenhoek didn’t invent the compound microscope, he improved it, beginning in the 1660s, increasing the curvature of the lenses, and so became the first person to see and describe blood corpuscles, bacteria, protozoa, and sperm.

Electron microscope:  Physicist Ernst Ruska and electrical engineer Max Knoll constructed the prototype in Berlin in 1933, using a lens by Hans Busch.  Eventually, electron microscopes would be designed with two-million power magnification.  Leeuwenhoek's had about two hundred.

Stethoscope:  Thank the French physician René Laennec, who introduced what he called a microphone in 1816.  British nephrologist Golding Bird substituted a flexible tube for Laennec’s wooden cylinder in 1840, and the Irish physician Arthur Leared added a second earpiece in 1851.  Notable improvements were made by Americans Howard Sprague, a cardiologist, and electrical engineer Maurice Rappaport in the 1960s (a double-sided head), and Harvard cardiologist David Littmann in the same decade (enhancing the acoustics).  The device undoubtedly transformed medicine, and with good reason became the symbol of the health care professional.

Sphygmograph:  The first machine to measure blood pressure was created by a German physiologist, Karl von Vierordt in 1854.

X-rays:  Discovered by Wilhelm Conrad Röntgen at Würzburg in 1895, this was probably the single most important diagnostic breakthrough in medical history.  Before Röntgen noticed that the radiation given off by a cathode tube traveled through objects and created images on a fluorescent screen, physicians could only listen, palpate, examine stools, and drink urine.

PET scans:  James Robertson designed the first machine in 1961, based on the work of a number of American men at Penn, Wash U., and Mass General.  The scanner provides an image from the positron emissions coming from a radioactive isotope injected into the patient, and is particularly useful for mapping activity in the brain.

CAT scans:  The first model was developed by electrical engineer Godfrey Hounsfield, in London, 1972, drawing on the work of South African physicist Alan Cormack in the mid-1960s.  It generates three-dimensional and cross-sectional images using computers and X-rays.

MRI:  Raymond Damadian, a SUNY professor of medicine with a degree in math, performed the first full-body scan in 1977.  His design was anticipated by theoretical work by Felix Bloch and Edward Purcell in the 1940s, and, later, Paul Lauterbur.  MRIs map the radio waves given off by hydrogen atoms exposed to energy from magnets, and are particularly useful in imaging tissue, without exposing the patient to ionizing radiation.

Ultrasound:  Ian Donald, a Glasgow obstetrician, in the mid-1950s adopted a device already used in industry that generated inaudible, high frequency sound waves.  The machine quickly and cheaply displays images of soft tissue, and now provides most American parents with the first photo of their baby.

Endoscopes:  Georg Wolf produced the first flexible gastroscope in Berlin in 1911, and this was improved by Karl Storz in the late '40s.  The first fiber optic endoscope was introduced in 1957 by Basil Hirschowitz, a South African gastroenterologist, drawing on the work of British physicist Harold Hopkins.  The scope is indispensable in diagnosing GI abnormalities.

Angiogram:  Werner Forssmann performed the first cardiac catheterization, on himself, in Eberswalde in 1929.  He inserted a catheter into his lower left arm, walked downstairs to a fluoroscope, threaded the catheter to his right atrium, and injected a radiopaque dye.  The technique was further developed by Dickinson Richards and André Cournand at Columbia in the '40s, and then extended to coronary arteries, initially accidentally, by Frank Sones at the Cleveland Clinic in 1958.

X-rays and scopes were quickly used in treatment as well as diagnosis.  Roentgen himself used his machines to burn off warts.  Similarly, in 1964, Charles Dotter and Melvin Judkins used a catheter to open a blocked artery, improving the technique in 1967.  Andreas Gruentzig then introduced balloon angioplasty in 1975, an inflated balloon opening the narrowed or blocked artery.  In 1986, Jacques Puel implanted the first coronary stent at U. of Toulouse, and soon afterwards a Swiss cardiologist, Ulrich Sigwart, developed the first drug-eluting stent.

***

The men who developed five of the most dramatically effective and widely used drugs in internal medicine deserve mention.

In the late '30s, two Mayo Clinic researchers hoping to cure rheumatoid arthritis, Philip Hench and Edward Kendall, isolated four steroids extracted from the cortex of the adrenal gland atop the kidneys.  The fourth, "E," was very difficult to synthesize, but Merck chemist Lewis Sarett succeeded, and in 1948, the hormone was injected into fourteen patients crippled by arthritis.  Cortisone relieved the symptoms.  Mass produced, with much difficulty, by Upjohn chemists in 1952, it was refined by their rivals at Schering three years later into a compound five times as strong, prednisone.  In addition to arthritis, corticosteroids are used in the treatment of other inflammatory diseases, like colitis and Crohn's, and in dermatitis, asthma, hepatitis, and lupus.

Anyone over fifty can remember peptic ulcers, extremely painful lesions on the stomach wall or duodenum.  They were thought to be brought on by stress.  “You’re giving me an ulcer!” was a common expression.  Women were especially affected, and a bland diet was the only treatment, other than surgery.  The lesions were caused by gastric acid, and two British pharmacologists and a biochemist, George Paget, James Black, and William Duncan, investigated compounds that would block the stomach’s histamine receptors, reducing the secretion of acid.  There were endless difficulties.  Over 200 compounds were synthesized, and the most promising, metiamide, proved toxic.  Tweaking the molecule, replacing a sulfur atom with two nitrogen atoms, yielded cimetidine in 1976.  As Tagamet, it revolutionized gastroenterology.  It was also the first drug to generate over $1 billion in annual sales.  Its successors, the proton pump inhibitors Prilosec and its near-twin Nexium, more than doubling the acid reduction, have also been blockbuster drugs.

Cimetidine was the culmination of one line of research that began in 1910, when a London physiologist, Henry Dale, isolated a uterine stimulant he called "histamine."  Unfortunately, when it was given to patients, it caused something like anaphylactic shock.  The search began for an "antagonist" that would block its production, even before it was recognized as the culprit in hay fever (allergic rhinitis).  The most successful antagonist was one developed in 1943 by a young chemist in Cincinnati, George Rieveschl:  diphenhydramine, marketed as Benadryl.  Ten to thirty percent of the world's population suffers from seasonal allergies, so this was hailed as a miracle drug.  In the early '80s a second generation of antihistamines appeared that didn't cross the blood-brain barrier and thus didn't sedate the user.  Loratadine (Claritin), the first, was generating over $2 billion in annual sales before it went generic.

Diabetes, resulting in high blood glucose levels (hyperglycemia), has been known for two millennia.  It was a deadly disease:  type 1 rapidly fatal; type 2, adult onset, debilitating and eventually lethal.  By the end of the 19th century, the Islets of Langerhans in the pancreas had been identified as the source of a substance that prevented it, insulin, but this turned out to be a fragile peptide hormone, broken down by an enzyme in the pancreas during attempts to extract it.  In 1921, Canadian surgeon Frederick Banting and medical student Charles Best determined a way to disable the production of the enzyme, trypsin.  Injected in a teenager with type 1 diabetes, insulin was immediately effective.  There is still no cure for diabetes, but today the 380 million sufferers globally can live normal lives thanks to Banting and Best.

Finally, millions of men and their wives and girlfriends owe a big debt to British chemists Peter Dunn and Albert Wood, and Americans Andrew Bell, David Brown, and Nicholas Terrett.  They developed sildenafil, intended to treat angina.  It works by suppressing an enzyme that degrades a molecule that relaxes smooth muscle tissue, increasing blood flow.  Ian Osterloh, running the clinical trials for Pfizer, observed that the drug induced erections, and it was marketed for ED.  Viagra made the cover of Time Magazine after it was approved in March 1998.  The blue pill still generates about $2 billion annually in sales, despite competition, and is prescribed for 11 million men.

***

Two incredible machines built in the mid-20th century revolutionized the practice of medicine.  Both remove blood from the body.

During World War II, the Dutch physician Willem Kolff constructed a machine to cleanse the blood of patients suffering from renal failure by filtering out urea and creatinine.  Over 400,000 Americans are on dialysis today.

In 1953, after 18 years of work, John Gibbon, a surgeon in Philadelphia, produced a machine that oxygenated blood and pumped it around the body, permitting operations on the heart, like those performed a decade later by Michael DeBakey in Houston and René Favaloro in Cleveland.  The two surgeons pioneered coronary bypass grafts, using a blood vessel in the leg or chest to re-route blood around a blocked artery.  About 200,000 operations are performed each year, down from about 400,000 at the turn of the century, thanks to stents.  Gibbon's machine enabled the most widely covered operation in history, the heart transplant, first performed by South African surgeon Christiaan Barnard in 1967, based on research by Norman Shumway and others.  Over 2,000 Americans receive heart transplants each year.

The cardiac device Americans are most likely to encounter is the defibrillator, now in airports, stadiums, supermarkets, and other public places.  Thank two Swiss professors, Louis Prévost and Frédéric Batelli, who, in 1899, induced ventricular fibrillation, an abnormal heartbeat, in dogs with a small electrical shock, and restored normal rhythm with a larger one.  It was not until the 1940s that a defibrillator was used in heart surgery, by Claude Beck in Cleveland.  A Russian researcher during World War II, Naum Gurvich, discovered that biphasic waves, a large positive jolt followed by a small negative pulse, were more effective, and a machine was constructed on this basis by an American cardiologist, Bernard Lown.  Improvements by electrical engineers William Kouwenhoven and Guy Knickerbocker, and cardiologist James Jude at Hopkins in 1957, and subsequently by Karl and Mark Kroll, and Byron Gilman in the '90s made the device much smaller and portable.

Over three million people worldwide don’t have to worry about defibrillators or face open-heart surgery.  These are the recipients of pacemakers, and can thank a Canadian electrical engineer, John Hopps.  Predecessors were deterred by negative publicity about their experiments, which were believed to be machines to revive the dead.  Gurvich had faced this as well.  Hopps’ 1950 device used a vacuum tube.  With the invention of the transistor, a wearable pacemaker became possible, and Earl Bakken designed one in 1958.  Not long afterward, two Swedish engineers, Rune Elmquist and Åke Senning created an implantable pacemaker.  The first recipient eventually received 26 and lived to age 86.  Lithium batteries, introduced in 1976, enabled the creation of devices with a much longer life.

Cardiac Drugs

Cardiac stimulants have been around since the late 18th century.   Thank William Withering, who published his experiments with the folk-remedy digitalis (from foxglove) in 1785.

Anti-anginal drugs were introduced a century later, also in Britain:  amyl nitrite in the mid-1860s and nitroglycerin a decade later.  Both compounds had been synthesized by French chemists.  Thank Thomas Lauder Brunton and William Murrell.

The first diuretics, to reduce edema (swelling) and lower blood pressure, were alkaloids derived from coffee and tea.  These were not very effective, but better than leeches.  Mercury compounds were pioneered by the Viennese physician Arthur Vogel in 1919.  These worked, but were tough on the kidneys and liver.  The first modern diuretics, carbonic anhydrase inhibitors, were developed in the 1940s, with the American Karl Beyer playing a leading role.

The first anti-coagulants date from the ‘20s.  A Johns Hopkins physiologist, William Howell, extracted a phospholipid from dog livers that he called heparin and that appeared to prevent blood clots.  The first modern anti-coagulant, and still the most widely prescribed, was warfarin (Coumadin), developed as a rat-poison by Karl Link in Wisconsin in 1948.  Its effectiveness, and lack of toxicity, was revealed when an army recruit took it in a suicide attempt.

Anti-arrhythmic drugs, to stabilize the heartbeat, were introduced in the opening decade of the 20th century.   The first was derived from quinine.  The big breakthrough occurred in 1962.  Thank, once again, the Scotsman James Black, who synthesized propranolol in that year, the first beta-blocker.  What they block are the receptors of epinephrine and norepinephrine.  These two chemicals (catecholamines) increase the heart rate, blood pressure, and blood glucose levels, useful for many purposes, but not a good thing in patients with cardiac arrhythmia, irregular heartbeats.  Beta-blockers are also prescribed to lower blood pressure.

ACE inhibitors lower the levels of an enzyme secreted by the kidneys and lungs that constricts blood vessels.  The unpromising source for the first inhibitor was the venom of the Brazilian pit-viper.  It was extracted, purified, and tested by three Squibb scientists in 1975, David Cushman, Miguel Ondetti, and Bernard Rubin.  It’s still widely prescribed, though many other ACE inhibitors have since been designed.  They are used for patients with congestive heart failure or who have had a heart attack, as well as those with hypertension.

Finally, mention must be made of the statins, which, though over-hyped and over-prescribed, lower serum cholesterol and reduce the risks of a second heart attack.  A Japanese microbiologist, Akira Endo, derived, from a species of Penicillium, a substance that inhibited the synthesis of cholesterol, but it was too toxic to use on humans.  In 1978, a team at Merck under Alfred Alberts had better luck with another fungus, and called the compound lovastatin.  Statins work by inhibiting the activity of an enzyme called HMGR.

Cancer Drugs

In the forty-three years since Richard Nixon’s “war on cancer” was launched, the disease has received the lion’s share of government, foundation, and pharmaceutical industry funding, though heart disease kills more people — 596,577 Americans last year to 576,691 for cancer, according to the most recent data.  This makes it particularly difficult, and invidious, to single out individual researchers.

There is still, of course, nothing close to a magic bullet, though cancer deaths have dropped about 20% since their peak in 1991.  Around 27% of cancer deaths this year will be from lung cancer, so the rate will continue to fall as more people stop smoking.

The originators of a few therapies with good five-year survival rates ought to be singled out and thanked.

Seattle oncologist Donnall Thomas performed the first successful bone marrow transplant in 1956.  The donor was an identical twin of the leukemia patient.  With the development of drugs to suppress the immune system’s response to foreign marrow, Thomas was able to perform a successful transplant from a non-twin relative in 1969.  About 18,000 are now performed each year.

One of the more notable successes of chemotherapy has been in the treatment of the childhood cancer acute lymphoblastic leukemia (ALL).  Sidney Farber in the late ‘40s carried out clinical trials with the antifolate aminopterin, synthesized at Lederle by the Indian biochemist Yellapragada Subbarow.  This proved the first effective compound in treating the disease.  It was superseded by methotrexate, and now, as in all chemo treatments, a combination of agents is used.  The five-year survival rate for ALL has jumped from near zero to 85%.

Early detection is the key to successful treatment in all cancers, and survivors of breast cancer can thank at least four men who pioneered and popularized mammography over a fifty-year period beginning in 1913:  Albert Salomon, Stafford Warren, Raul Leborgne, and Jacob Gershon-Cohen.

A second key to the comparatively high survival rates for women with breast cancer is tamoxifen.  First produced in the late ‘50s by British endocrinologist Arthur Walpole, it was intended as a “morning-after” birth control pill because it blocked the effects of estrogen.  However, it failed to terminate pregnancy.  Researchers had meanwhile discovered that some, though not all, women with breast cancer recovered when their ovaries were removed.  Walpole thought tamoxifen might block breast cancer estrogen receptor cells, inhibiting their reproduction, and persuaded a professor of pharmacology, Craig Jordan, to conduct experiments.  These demonstrated the drug’s efficacy, and after clinical trials it was approved and marketed in 1973.  Think of Arthur W. the next time you see one of those ubiquitous pink ribbons.

Most chemo agents are cytotoxic metal-based compounds that do not distinguish between abnormal cells and healthy cells that also divide rapidly.  The nasty side effects range from hair-loss and nausea to decreased production of red blood cells, nerve and organ damage, osteoporosis and bone fusion, and loss of memory and cognition.  More selective drugs, monoclonal antibodies, have been used for some time.  These were first produced by Georges Köhler and César Milstein in 1975 and "humanized" by Greg Winter in 1988, that is, made more effective by using recombinant DNA from mammals.  Over 30 "mab" drugs have been approved, about half for cancer.

Research has also been underway for years into delivery systems using "nano-particles" that will target tumors exclusively.  Another approach, pioneered by Judah Folkman, has been to find drugs that will attack the blood supply of tumors, angiogenesis inhibitors.  This turned out not to be the magic bullet Folkman hoped for, but more than fifty of these drugs are in clinical trials, and a number are effective owing to other mechanisms, and are currently used.

Psychiatric medicine

Drugs have revolutionized the practice of psychiatry since the 1950s, and brought relief to millions suffering from depression, anxiety, and psychoses.  For obvious reasons, these are some of the most highly addictive and widely abused drugs.

A few men to thank:

Adolph von Baeyer, Emil Fischer, Joseph von Mering:  barbiturates, synthesized in 1865, but not marketed until 1903.  The most commonly prescribed today are pentobarbital sodium (Nembutal) and mephobarbital (Mebaral).

Bernard Ludwig and Frank Berger:  meprobamate, the tranquilizer Miltown.  By the end of the '50s, a third of all prescriptions in America were for this drug.

Leo Sternbach:  the anxiolytic (anti-anxiety) benzodiazepines, first synthesized in 1955.  The most successful initially was diazepam, Valium, marketed in 1963.  The most widely prescribed benzodiazepine today is alprazolam, Xanax.  It's also the most widely prescribed psychiatric drug, with nearly 50 million prescriptions.  It increases concentrations of dopamine and suppresses stress-inducing activity of the hypothalamus.

Leandro Panizzon:  methylphenidate (Ritalin).  The Swiss chemist developed it in 1944 as a stimulant, and named it after his wife, whose tennis game it helped improve.  Until the early ‘60s amphetamines were used, counter-intuitively, to treat hyperactive children.  Thirty years after its patent expired, the controversial dopamine reuptake inhibitor is still the most widely prescribed medication for the 11% of children who’ve been diagnosed with ADHD.

Klaus Schmiegel and Bryan Molloy:  the anti-depressant fluoxetine, the first SSRI (selective serotonin reuptake inhibitor), increasing serotonin levels.  Marketed as Prozac in 1988, it made the cover of Newsweek and is still prescribed for over 25 million patients.

Paul Janssen:  risperidone (Risperdal), the most widely prescribed antipsychotic drug worldwide.  The Belgian researcher developed many other drugs as well, including loperamide HCL (Imodium).  When commenters on web articles advise trolls to take their meds, they might want to specify risperidone.

Seiji Sato, Yasuo Oshiro, and Nobuyuki Kurahashi:  aripiprazole (Abilify), which blocks dopamine receptors and was the top-selling drug at the end of 2013, grossing $1.6 billion in Q4.

***

A few observations.

Japanese and Indian researchers will make important contributions to future drugs, as the trio responsible for Abilify reminds us.

And, naturally, some women have played roles in the advances that have been summarized.  Mary Gibbon, a technician, assisted her husband on the heart-lung machine.  Lina Stern did important research on the blood-brain barrier, and it was in her lab that Gurvich improved the defibrillator.  Jane Wright conducted early trials of methotrexate that helped demonstrate its efficacy.  Lucy Wills did pioneering work on anemia in India.  Rosalyn Yalow helped develop radioimmunoassay, which measures concentrations of antigens in the blood.  Anne-Marie Staub did interesting work on antihistamines, though her compounds proved toxic.

They are exceptions.  Our benefactors have not only been overwhelmingly European males, but are mostly from England and Scotland, Germany, France, Switzerland, and the Netherlands, as well as Americans and Canadians whose families emigrated from those countries.  And, of course, Jews, who’ve won 28% of the Nobel Prizes in Medicine.

Some of the beneficiaries in particular might want to think about this.

Muslims boast that their faith has over 2 billion followers throughout the world.  If this number is accurate it has far less to do with the appeal of Islam or with Arab or Turkish conquests, and everything to do with the work of some Northern Europeans and Jews, along with the “imperialists” who built roads, canals, and ports and the vehicles that use them, as well as schools and hospitals — like the traveling eye clinics in Egypt funded by the Jewish banker Ernest Cassel, which nearly eliminated blinding trachoma, then endemic.

The fact that we in the U.S. idolize our entertainers as no society has before is not going to cut off the supply of outstanding medical researchers.  Very bright and inquisitive people usually don’t pay much attention to popular culture.  But it diminishes us.

It’s the ingratitude, though, not the indifference, that’s more troubling.

Biting the hand that feeds is a core principle of Leftists.  For 150 years, they’ve sought to concentrate power in their own hands by exploiting the resentment of ignorant people against a system that has finally enabled mankind to spring the Malthusian trap.

Multiculturalism, with its simple-minded relativism, has broadened the scope of the party line.  Not only shadowy “capitalists” are vilified, but whites and males.  Ignorant people can now think well of themselves by opposing “racism” and “the patriarchy” — and by voting for an unqualified and deceptive poseur because, though a male, he is not white.

The first step parents can take to help spare America from being “fundamentally transformed” is to insist that history be properly taught.  This means, among other things, recognizing the accomplishments of a few men who’ve found cures for or relieved the symptoms of diseases that have killed and tortured humans for millennia.