Federal management of our health and well-being: car safety

Here’s part 3 of our small series, inspired by the health-care debate, on whether the federal government can properly look after our health and well-being. We turn here from food and drug safety to cars.

The safety of the cars manufactured by U.S. automakers was completely unmonitored by anyone before the 1960s. For decades Americans drove cars that not only were often unsafe, but were under absolutely no pressure to be safe. There was no consumer protection service for drivers. If your car was dangerous, that was your problem. Causes of accidents were not investigated with an eye to forcing car manufacturers to improve their products. In 1958 the UN established an international “forum” for vehicle regulation, but the U.S. refused to join it. As is so often the case, manufacturers assumed—and protested loudly—that any oversight would be fatal to them, that bankruptcy was the only possible outcome of regulation, and that U.S. consumers did not want safety regulations.

By 1965, this total lack of regulation had created a situation where, according to a report released the next year by the National Academy of Sciences, car accidents were the leading cause of death “in the first half of life’s span” (from the “History of US National Highway Traffic Safety Administration NHTSA” website at http://www.usrecallnews.com/2008/06/history-of-the-u-s-national-highway-traffic-safety-administration-nhtsa.html).

The Big Three responded as they always had—by saying that all accidents were the result of driver error or bad roads. Since the 1920s, U.S. car manufacturers had pushed what they called the “Three E’s”—Engineering, Enforcement, and Education. As Ralph Nader put it (much more about him later), “Enforcement” and “Education” were directed at drivers, and “Engineering” was directed at all those bad roads causing accidents.

With the federal government still reluctant to step in and regulate car manufacturers’ safety standards—just as Congress, lobbied relentlessly by criminal food manufacturers, had refused to step in to regulate food and drug safety—it took a bombshell book to shake up the status quo.

Next: Ralph Nader and Unsafe at Any Speed

The FDA and government regulation of food safety

Part 2 in the series, basically Truth v. Myth, on whether the federal government can be trusted to compassionately and capably protect the public health and well-being, in which we continue our look at the founding of the Food and Drug Administration.

We’ve seen the dangerous and criminal state of food production in the U.S. by the turn of the 20th century. Starting in the late 1800s, some U.S. food manufacturers were petitioning the government to regulate their industry. These were manufacturers that actually spent the money to produce good-quality food, and they were afraid of being driven out of business by those companies that saved a fortune by pasting ashes together, canning the result, and calling it potatoes. (It’s amazing how many things were canned early on. Potatoes are one example. This was still true even in the 1930s—I saw a movie from the late ’30s in which a woman says she’s running to the store to buy a can of potato salad.)

Farmers also protested that they took the blame for adulterated butter, eggs, and milk even though they sent good quality material to the manufacturers. “Shady processors …deodorized rotten eggs, revived rancid butter, [and] substituted glucose for honey” (Young, “The Long Struggle for the 1906 Law“).

During the Populist era of reform, one bill for federal food manufacturing standards managed to clear the Senate but was blocked in the House. Congressional representatives, well paid by shady manufacturers’ lobbies, blocked every clean food and drug law that came to them. One bit of progress came in 1890-91, when meat was set aside for special inspection after a scare in Europe over tainted pork from America led to a ban of U.S. meat on the continent. Later in that decade, tainted beef sickened U.S. soldiers fighting the Spanish-American War in Cuba, causing a furor at home. The culmination of the meat uproar was the famous publication, in 1906, of Upton Sinclair’s novel The Jungle, which described in a brutally unsparing section how meat was processed at the great packing houses in Chicago, detailing the rats, feces, human fingers, and floor sweepings that were incorporated into the finished product. American consumers boycotted meat, with sales falling by half.

This was enough to get Congress to finally act on President Roosevelt’s December 1905 demand for a pure food bill. It wasn’t as easy as it should have been–there was still plenty of resistance in both houses. But thanks to pressure from the president and Dr. Harvey Wiley, a longtime pure food advocate who would be placed in charge of its enforcement, the Pure Food and Drugs Act was passed in 1906.

There was protest from criminal food manufacturers. Whiskey producers complained the loudest, as they were the largest producers of quack medicines. Quack medicines actually accounted for more advertising dollars than any other product in the nation in 1906. Their manufacturers claimed the federal government had no right to “police” what consumers chose to buy. This, of course, rested on the incorrect presumption that consumers knew what was in the products they consumed and had decided to take the risk (Janssen, “The Story of the Laws behind the Labels”).

Many manufacturers of impure foods claimed that being forced to list ingredients on their labels would put them out of business. Their secret recipes would be exposed! The cost of printing long labels would bankrupt them! Such “technical” labels would turn off customers! Of course, none of this came to pass. Americans were grateful for protection from fraudulent food and medicine, and the Act would go through a few more iterations. The Bureau of Chemistry created to enforce the Act would become the Food and Drug Administration in 1930, and cosmetics would be added to its charge in 1938, when the Food, Drug, and Cosmetic Act was signed by FDR.

The FDA was weakened in the late 20th and early 21st centuries. Food supplements, like vitamins and diet potions, are not subject to FDA scrutiny, and are the new quack medicines, just as dangerous and fraudulent as 19th-century snake oil. The agency has also had its funding cut by deregulation-minded politicians who wanted to re-establish a completely free marketplace for food and drugs.

This negative turn of events merely proves that bad things happen to our food and drug supply when the federal government relaxes or impairs its oversight of that market. A strong FDA is a vital necessity to American manufacturers and consumers, and a shining example of the power of good federal management of consumer health and well-being to drastically improve both.

Next time, the federal government and car safety.

The federal government–can it run health care? Check with the FDA

The uproar over the proposed health care legislation that is ongoing in the summer of 2009 is puzzling to the historian. Americans who oppose the legislation seem to feel, when you boil their arguments down, that the main problem is the government itself: they do not want the government running any health care program. The government, they argue, has neither the experience nor the ability, nor even the humanity, to oversee any health care program. (This, of course, when the federal government already runs a health program, namely Medicare.)

This lack of faith in government programs is odd. It’s historically unfounded in three major consumer areas: food safety, car safety, and Social Security. These are three areas that the federal government completely overhauled and improved in the 20th century and maintains well to this day. We’ll look at all three, starting with food safety and the founding of the FDA (Food and Drug Administration).

In 1906, the federal government passed the Pure Food and Drugs Act. By that time, Washington had been petitioned for decades to create and enforce food safety laws. It’s hard to imagine today what food was like at the turn of the 20th century. We think of that era as one of pure, wholesome, real food—the kind of food we’re trying to get back to now, in a 21st century filled with pre-packaged, trans-fat-adulterated food substitutes.

But the early 20th century was actually little different from—and in many ways, much worse than—today. Here is a description of a meal served by a respectable woman to house guests in the early 1900s, which Dr. Edward A. Ayers  included in his article “What the Food Law Saves Us From: Adulterations, Substitutions, Chemical Dyes, and Other Evils”: 

“We had a savory breakfast of home-made sausage and buckwheat cakes. The coffee, bought ‘ground,’ had a fair degree of coffee, mixed in with chicory, peas, and cereal. There was enough hog meat in the ‘pure, homemade’ sausage to give a certain pork flavor and about one grain of benzoic acid [a chemical preservative] in each [serving]. I also detected saltpetre, which had been used to freshen up the meat. [Either] corn starch or stale biscuit had been used as filler…

“The buckwheat cakes looked nice and brown [from the] caramel [used to color them]…. and added one more grain of benzoic acid to my portion. The maple syrup [was] 90 percent commercial glucose… one-third a grain of benzoic acid and some cochineal [red dye derived from insects] came with the brilliant red ketchup. [At lunch] I figure about seven grains of benzoic acid and rising. …The ‘home-made’ quince jelly, one of the ‘Mother Jones Pure Jellies’…worked out as apple juice, glucose, gelatin, saccharin, and coal tar.

“I had to take a long walk after lunch; having overheard the order for dinner, I figured on about 10 to 15 grains more of benzoic acid reaching my stomach by bedtime.” 

I looked up benzoic acid, which is still used today as a preservative, and found that the World Health Organization’s limit on the stuff is 5 mg per kilogram of body weight per day. I don’t know how much a “grain” of benzoic acid was, but I think the poor houseguests described above were getting way more than that.
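As a rough check, assuming the standard apothecary grain of about 65 mg and an adult of roughly 70 kg (the article specifies neither), the arithmetic works out something like this:

15 grains × 65 mg/grain ≈ 975 mg, versus a WHO limit of 70 kg × 5 mg/kg/day = 350 mg per day.

On those assumptions, the evening’s projected 10 to 15 grains alone would come to roughly two to three times the modern daily limit, before even counting the seven-plus grains the guest had already tallied at breakfast and lunch.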

Why was food so awful in America at that time? Progress. As Arthur Wallace Dunn described it in his 1911 article “Dr. Wiley and Pure Food…”,

“During the preceding quarter of a century [from the 1880s to 1911], the whole system of food supply [in the U.S.] had changed. Foods were manufactured and put up in packages and cans in large quantities at central points where immense plants had been erected. To preserve the food all manner of ingredients were used, and to increase the profits, every known device and adulteration and misrepresentation was adopted. Many states passed strict pure food [and drug] laws, but they were powerless to [control interstate shipping–any state could ship its impure foods and drugs to another state]. Besides, many state laws were not uniform and were easily evaded.”

The Industrial Revolution brought mass production of foods to the U.S. As the population rapidly shifted from rural to urban, more people were in towns and cities where food was not locally grown but shipped in for them to buy in grocery stores. Meat was shipped across the country with minimal or no refrigeration. Rotten foods were canned to disguise their state. Chemicals were added in enormous amounts to all types of food. Coal tar—yes, from real, black coal—was slightly altered and used to brighten the colors of ketchup, peas, coffee, margarine, and more. Here is a short list of such “adulterations,” again from Dr. Ayers’ article:

Jams and Jellies: apple juice mixed with glucose and gelatine

Milk: diluted with water, bulked back up with starch or chalk (yes, chalkboard chalk)

Butter: made of carrot juice, turmeric, and coal tar

Worse yet, margarine, or “butterine”: made from oleo oil [this comes from beef fat], lard, coloring, and cottonseed or peanut oil

Filled cheese: skim milk injected with fat from oil

This was the state that modernization had brought American food production to. Eager to make as large a profit as possible, many food manufacturers basically used scraps, glucose, and oil to make a wide range of foods. Drugs were just as bad or even worse, with unhealthy or even fatal miracle cures constantly on the market. Coca-Cola contained not only real cocaine, but unimaginable amounts of caffeine.

Food manufacturers were not required to label their products. Most canned goods had the name of the product, the name of the manufacturer, and a lovely drawing on them. That was it. No list of ingredients. No expiration date.

How did Americans survive? Through the intervention of the federal government.

Part 2–the Pure Food and Drugs Act of 1906

The Gettysburg Address on Powerpoint

This famous illustration of how PowerPoint presentations are necessarily bereft of all real power and meaning is by Peter Norvig (www.norvig.com): Gettysburg

Just click the Gettysburg link above to see one of the great moments in all human history reduced to marketing slides. It is both hilarious and deeply troubling. Once you’ve seen it, you realize how easily almost all political speeches today could be transferred to PowerPoint without losing any of their original depth.

“The Puritan Experience” on YouTube

Here is another video about the Puritans that I found by typing “New England Puritans” into YouTube. It is called “The Puritan Experience: Making of a New World” and it is an episode in a continuing story:

It is an interesting mixture of truth and myth. Apparently the protagonists are a family with an 18-year-old daughter who is deemed rebellious. In this episode, her parents are confronted by their minister because their daughter hasn’t married yet. But the Puritans did not generally marry in their teens; the average man was 26 at his marriage, the average woman 22. So the unnamed daughter is not pushing any limits here.

The minister rebukes the father for saying he will trust his own judgment on the issue; God will judge, the minister reminds him, and this is an accurate depiction of the Puritan attitude. Having too much confidence in your own judgment was a sign of pride.

Next, the father is confronted by a friend who says his daughter should have officially joined the church by now. “She is past the age,” he says. The father responds that the girl is not sure she is ready to join yet, to which the friend replies they must make “a truce upon their doubts” because they can’t build a civilization in the wilderness unless they are all united in their religious practice.

This is all inaccurate. It was very rare for a teenager to become a full member of a Puritan church; you could only do that once you had spent many hard years searching your soul, studying the Bible, and generally going through the complicated and thorough discernment process of the Puritan faith. Most adults never became full members of their church. They were never certain that they had received God’s grace. Indeed, many times when someone did try to become a full member, they were rejected because the congregation felt they were not yet ready. So an 18-year-old was by no means too old to become a full member of her church.

There also could be nothing further from the Puritan mind than to “make a truce with our doubts.” They relished religious debate and gave every questioning voice a full hearing. To doubt oneself, one’s faith and goodness, was to realize that one could never earn God’s grace. Those who were certain of their virtue were dangerously deluded, led by pride to deny their complete reliance on God’s grace. Building a new civilization was based on this kind of doubt, not threatened by it.

The friend closes this episode by reminding the father that loving a child too much, taking too much delight in her, is dangerous, and this is a true depiction of Puritan thinking. They were always worried about loving someone too much, more than they loved God; such love could lead one to do things for the loved one regardless of the spiritual consequences. It could also keep a child from realizing her precarious state; her parents’ unconditional love might lead her to think she was fine as she was, and to question God’s possible damning of her soul. Puritans tried to temper their love with objective correction as often as they could, but you get the feeling they preached it more than they could practice it.

So while this video is wrong on some key issues, it’s still intriguing. It does nail the lack of privacy amongst Puritans when it came to one’s spiritual life. Your friends, congregation, and neighbors were duty-bound to instruct you and to receive your instruction on matters large and small. It was a kind of neighborhood watch of the soul. If your neighbor was in error and you did not try to help, he might die in his sin and then how would that look for you? Rather than resenting it as intrusion, most Puritans seem to have welcomed constant meddling, as it kept them on the straight and narrow.

Just a note–the minister is called Mr. Endecott; is he perhaps meant to be the famous John Endecott of Salem?

Puritans on YouTube

I thought I’d do a search for Puritans on YouTube and see what’s out there. There’s a lot, it turns out, of varying purpose and quality.

Today, I’ll share a video simply called “Puritans”:

This video states that it wants to clear up the negative myths about the Puritans, and counter their bad image in popular culture, by explaining all the wonderful things the Puritans did and their legacy to modern Americans. You know I’m on board with that! But the video fails to give those explanations, constantly telling us that the Puritans were amazing but never telling us why. “There are few more misunderstood people in America,” it claims, but it does little to provide understanding.

My favorite moments are:

The video uses images from the 18th and 19th centuries at will. Some of those images don’t even pretend to show Puritans, but they are marshalled for the cause. At 1:36 you see a Victorian drawing of a Puritan woman, tightly corseted, wearing blush, and sporting a fashionable beauty mark.

At 2:15 an image that looks like a representation of the Thirty Years’ War in Europe is used to illustrate the Puritans’ “in-dome-itable” spirit.

At 2:21 an 18th-century image is slowly scanned as the narrator says, “By the end of the 17th century, the Puritans forever altered the world in which they had arrived, finding success where others found only death and desolation.” This odd claim must apply to English colonists, since Native Americans had been successfully living in North America for thousands of years. But many colonies were thriving by the end of the 17th century; Pennsylvania, Delaware, and Maryland were paradises of plenty that put Massachusetts to shame. The odd claim is made odder by the painting being scanned, which includes a Native American on his knees as if crawling toward busy 18th-century Puritans gathering firewood and cooking out in the middle of a wintry forest.

At 2:55 Puritan ministers are claimed as some of the earliest, most outspoken opponents of slavery, which is untrue if one means slavery of Native Americans or Africans. The claim is then quickly made that the Puritans wrote the first diaries and the first love letters, which isn’t even debatably plausible if restricted to New England, let alone British North America or, perhaps, the world.

The video ends tantalizingly with this: “Who were the Puritans, and what did they actually believe? How did one of the most influential societies in America turn itself inside out? Enter the world of those who called themselves the godly…” I don’t know what they mean by turning itself inside out. I would have liked to hear that theory, because it sounds interesting. One might hope that the video itself would explain what they actually believed and how they were influential. As this narration rolls, a painting of Pocahontas’ wedding in London is scanned.

So this video has a good idea—truth v. myth—but doesn’t deliver. Surely my next search for Puritans on YouTube will uncover something more promising.

Are re-enactors real historians?

This is a question I first encountered two years ago at a conference, where the answer was emphatically no. But as I think about it, it seems like people who re-enact historical battles are almost uncannily similar to the historical people they are pretending to be.

Think of the Civil War re-enactors. Why do people like to re-enact its battles? For a number of reasons: a friend of theirs is doing it, and they get interested and start themselves; they are military buffs; they have an ancestor who fought; it’s exciting and involves travel and audiences; they become an established part of state and local celebrations; it’s an expression of their patriotism; they want to try to understand how those long-ago men felt, fought, lived, and died.

Now think of the actual Civil War. Why did men fight in its battles? For a number of reasons: friends were enlisting and they wanted to go with them; they had ancestors who had fought in earlier wars; it was exciting and involved travel; it was an expression of their patriotism; they wanted to take part in the most important event in American history since the Revolutionary War.

In each case, men have multiple reasons for going into battle, some personal and some political, some deep and some a little more shallow. Both groups are motivated by the same spirit of joining in something larger than themselves. Just as re-enactors try to live the experience of real soldiers, so those men who enlisted during the Civil War tried to be real soldiers, since the vast majority of them were farmers who had no experience of soldiering or war. There’s a steep learning curve regarding uniforms, weapons, drilling, camp life, and battle for both groups.

So I say re-enactors are as much historians as anyone else who comes to know a period in time intimately, and they come closer to really living as their subjects did than most of the rest of us who work strictly in our own time from our own computers.  Somehow it is seen as dreadfully amateurish to dress up and re-enact history, while reading and writing about it is respectable. Re-enactment is seen as pop culture, publishing articles and books as “real” history. But I side with the re-enactors here. It may be very hard to translate their experience into scholarly writing, but we should all have to walk an actual mile in our subjects’ shoes before we write one more word about them.

Revolution Myth #5: America had no chance of winning the war

Welcome to the last in our Truth v. Myth series on 5 Myths about the American Revolution. Here we re-examine the cherished idea that we were total underdogs in our war of independence. This article was inspired by a re-listening to the insightful Prof. Allen Guelzo’s lecture series “The American Revolution.”

Yes, the British Army was bigger than the Continental Army, and better organized. And most British officers and politicians in the spring of 1775 thought the war could be won fairly quickly.

But the British Army was not that big—at least not in America. In 1775, there were about 38,000 men serving in the British Army around the world, and around 18,000 men in the Royal Navy (in 270 ships) also spread around the world. To fight in America, men had to be impressed and mercenaries hired, because Britain did not want to pull its forces from the invaluable sugar islands in the Caribbean, which would be snapped up by France or Holland if left unguarded. Sugar was the oil of the 18th century, to borrow Prof. Robert Bucholz’s inspired phrase, and the sugar islands were far more valuable to Britain than all the colonies in North America. So when 16,000 American men enlisted to fight the British in 1775, they were fairly equal in numbers to the redcoats.

The British Army was well-organized and well-run, far more so than the Continental Army. That did stand in Britain’s favor. British soldiers were under no illusions about having control over how long they served (though there were desertions from the British Army during the war).

As for the British attitude to the war, it was far more complex than we imagine. The British knew that those Americans in rebellion would not go down easily. They knew that they could not hope to conquer the vast territory of the 13 colonies, and that any attempt to conquer land battle-by-battle would result in a hopeless loss of men and drain on money and supplies in a war of attrition. They understood that an occupied people almost always win wars of attrition because they have the motivation and the resources to resist for many, many years.

The British approach was to try to destroy the heart of the rebellion—Boston, Washington’s army, the Congress—and get Loyalists to take over local governments. The British were hampered by poor communication, infighting between generals, the months it took to get orders from London, lack of support from Loyalists, and often conflicting goals (for instance, in 1776 Howe was told both to occupy New York City and to destroy Washington’s army; the impossibility of doing both at once led to delay and paralysis).

So while the British Army itself was well-organized internally, from the start it had management problems at the level of Parliament and its generals, and it was always low on supplies.

By 1778, opposition to the war was making itself heard in Parliament. We picture a vindictive empire trying to keep America in its clutches to the bitter end, committed to stamping out revolution, but in reality there was strong opposition to the war after three unproductive years. Boston had been occupied, and so had New York, but Washington’s army remained at large, the British had lost an army at Saratoga and an important battle at Trenton. The rebellion remained strong despite the occupation of two major cities, and the Loyalists had yet to rise up. Most important, France had joined the war on America’s side, which meant Britain had to increase its expenditures to supply its army and  navy against a stronger—and now much more important—enemy. The sugar islands were at higher risk, and the sugar planters lobbied Parliament vigorously, threatening to oppose any move to relocate  British soldiers from their islands to America. War with France meant war not only in America but in the Caribbean and India.

In these circumstances, Parliament came close to voting not to send any more soldiers to America at all in 1779, and Lord North’s government actually sent a peace committee to Congress, offering the colonies control over their taxation, no more quartering British soldiers on civilians, and acknowledgement of Congress—in short, everything the colonies wanted but independence. This offer was rejected, but it is significant to realize that by 1779, Britain was looking for a way out of the war. Washington fought his last battle against the British in July 1779, a full two years before the official surrender at Yorktown.

By the time Cornwallis’ disastrous attempt to take the Carolinas, organize Loyalists into an army, defeat Nathanael Greene, and turn the tide of the war ended in 1781 with the surrender of his army to Washington, Parliament was mostly resigned to the loss, and already turning its attention to India, Africa, and the West Indies. It would hold on to its western territories in America, and try to foment Native American rebellion against the U.S. It would happily engage the U.S. in war in 1812, vengefully burning down our capital. But for Britain, its ever-expanding eastern empire and its wars against France in Europe were more important.

We see, then, that the deck was not totally stacked against us. This is not to say that Washington was not a genius and a powerful leader who kept our fight for liberty alive when the odds of success looked bleak. We could have lost that war. But we had more going for us than we think. Britain knew it faced substantial difficulties, just as America did.  Everyone likes an underdog, but we shouldn’t overdo it.

Revolutionary Myth #4: All was well before the war

Part four of our series on 5 Myths about the Revolutionary War dwells on the pre-war period.

In the shorthand version of American history, the colonial period is one of peace and prosperity right up to the 1770s. But especially in New England, the 17th and 18th centuries were strewn with political conflict and open war.

Canada and New England ended up acting out the wars between France and England over and over from 1689 through 1763. In 1689, New Englanders overthrew the Dominion imposed on them by James II. From 1689 to 1697, New England was a battlefield in King William’s War. Just five years later, word came to New England that they were at war with Canada once again, for in 1702, Queen Anne’s War began. This war lasted until 1713. The War of Jenkins’ Ear (1739-42) involved New Englanders recruited to fight in the Caribbean, most notably in the attempt to take Cartagena (in what is today Colombia). 65% of the 4,183 Americans who went to fight at Cartagena died.

In 1745 and 1758, New Englanders went to Louisbourg, the major French fort guarding Canada from the sea, successfully capturing it in 1745 only to have Great Britain return it to France in their peace treaty. Harried by French-sponsored Native American attacks from 1748-58, New Englanders retook the fort at great cost in 1758 (it was destroyed in 1760).

So we see that New England was in a state of almost constant turmoil in its colonial years, turmoil almost always caused by England’s wars with France. England spared few troops for North America, focusing on the naval battles in Europe, and more than once promised to send soldiers to back up New England, then failed to do so. This caused great anger and bewilderment in New England, which felt it was being deliberately endangered by its mother country.

Of course, the last in this series of wars between France and Britain in America was the French and Indian War, 1754-63. The road to revolution was taken the next year, when the Sugar Act was passed.

No wonder New England was the hotbed of revolution against England by 1764. A sense of betrayal and separateness had been forged by all those battles against France that New Englanders fought without British help. It would not be until 1815, when the War of 1812 ended, that New England breathed several decades’ worth of peace.

Next—our final myth!

Revolutionary Myth #3: the Revolution happened quickly

We look back and tend to see a constant boil of activity in the 1770s that led to revolution. But here in part 3 of my series of 5 Myths about the Revolutionary War, we will see this is not really so.

When you study colonial town or precinct records, you see that towns, villages, and precincts met once a year (“town meeting”) to set policy, settle debates, take actions, and elect officers. Reading the record books, it seems like discussions took years–meet in April 1737 to debate the town border, meet in April 1738 to debate the town border, April 1739, etc.  While discussions must have taken place between meetings, official actions, committees, and decisions took place only at town meeting. Sometimes an emergency meeting was called to expedite things, but not always.

So with the Revolution. While history books and narratives compress events, leading you to feel like the timespan between the first punitive Act (the Sugar Act) and Lexington and Concord was about 1-2 years, it all unfolded much more slowly. The Sugar Act was imposed in April 1764. The Stamp Act was imposed in March 1765. Patrick Henry gave his famous “if this be treason” speech about the Stamp Act that same year. The Townshend Acts came two years later, in 1767. The Boston Massacre was three years later, in 1770.

Usually the Boston Massacre is presented as the tipping point, after which Revolution happened with lightning speed. Many people, if quizzed, think the Boston Massacre must have been in 1775. But it was a good five years before the fighting began. Between the Massacre and Lexington and Concord was the Boston Tea Party, in December 1773. The “Intolerable Acts” were put in place the next year, 1774.  The first Continental Congress met in fall 1774, issuing a declaration of principles…

…but still it was not until April 1775 that the war began. Why did things move so slowly?

First, of course, there was the problem of communication. It took weeks to months for the Sugar Act or Townshend Acts to take full effect throughout the colonies. Thus it took that long for indignation to build amongst Americans. South Carolina would have heard about the March 1770 Boston Massacre in the spring of 1771. And they only heard about it through the determined letter writing of men like Samuel Adams; newspapers from Boston that published the story would have had it picked up by newspapers in neighboring colonies, like Connecticut and New York, and from there it might have been picked up in Pennsylvania or New Jersey, but beyond that point the story as a newspaper story would have died. It had to be disseminated by individuals’ letters, whether private or public.

There was also the winter. Winter made travel much slower—sometimes impossible—and this meant not only that newspapers and letters traveled far more slowly, but also that all actions were on hold until the spring. Just as armies made winter camp in December and did not fight at all until April or May, so protests and meetings were on hiatus over the snowy months.

Finally, of course, there was people’s reluctance to go to war. Each event—Act, riot, shooting, speech—was endured or taken in and then made sense of in a way that would allow people to avoid the terror of war. Each time something happened, Americans hoped it was the last time something dangerous would happen, and that the troubles would die down and life could go back to normal. No one wants to endure a war. So there was a great deal of effort expended on diplomacy and peaceful efforts to turn things around.

So really, there was no sustained fever of revolutionary activity in the 1770s, not even in Boston. Events hit people in the spring and summer, went into hibernation through the winter, and were superseded by less inflammatory, daily events the next spring—for which people were understandably grateful. It was really not until March 1774, when the Port Act was passed against Boston, that events came rapidly, and even then the winter of 1774-75 was quiet, with Paul Revere’s ride coming in April 1775.

It is only when we look back and compress events from the Sugar Act to the North Bridge that it seems like a frenzy of revolution. In reality, it took 11 years, from the 1764 Sugar Act to the 1775 shootouts in Lexington, Concord, and especially Menotomy, for the Revolution to begin.

Next: Myth 4: All was well before the war