It’s legacy-burnishing time at the Obama White House, the New York Times reports, and the administration plans to make the president available for “articles that will allow Mr. Obama to showcase his major achievements.” In this brief interlude before the national party conventions rivet our attention on the fresh horrors to come, ’tis the season for “exit interviews” and think pieces about our 44th president’s place in history.
The Washington Post recently debuted a hagiographic “Virtual Museum” of Obama’s tenure, accompanied by “The Content of His Presidency,” a 3,000-word chin-puller by Obama biographer David Maraniss.
Maraniss writes that as an undergraduate, Obama developed “an intense sense of mission … sometimes bordering messianic,” and by the time he had the Oval Office in his sights, Obama had decided “his mission was to leave a legacy as a president of consequence.” Has he done that? Maraniss’s timid, triple-hedged answer is: “it is now becoming increasingly possible to argue that he has neared his goal.”
Seven years in, it’s clear that Obama has forged a legacy of enormous consequence. But the most transformational aspect of his presidency is something liberals never hoped for: as president, Barack Obama’s most far-reaching achievement has been to strip out any remaining legal limits on the president’s power to wage war.
Obama’s predecessor insisted that he didn’t need approval from Congress to launch a war; yet in the two major wars he fought, George W. Bush secured congressional authorization anyway. By the time Obama hit the dais at Oslo to accept the Nobel Peace Prize in 2009, our 44th president had already launched more drone strikes than “43” carried out during two full terms. Since then, he’s launched two undeclared wars, and—as Obama bragged in a speech last year defending the Iran deal—bombed no fewer than seven countries.
In 2011, what officials called “kinetic military action” in Libya completed the evisceration of the War Powers Resolution by successfully advancing the theory that if the U.S. bombs a country that can’t hit back, we’re not engaged in “hostilities” against them. In the drone campaign and the current war with ISIS, Obama has turned a 14-year-old congressional resolution targeting al-Qaeda and the Taliban into a blank check for endless war, anywhere in the world. Last year, the army chief of staff affirmed that finishing the fight against ISIS will take another “10 to 20 years.”
The issue that first animated Obama as an undergraduate was “the relentless, often silent spread of militarism in the country,” as he wrote in an article for the Columbia University Sundial as a college senior in 1983. In “Breaking the War Mentality,” Obama worried that the public’s distance from the costs of war made resisting it “a difficult task,” but a vital one of “shifting America off the dead-end track” and undoing “the twisted logic of which we are today a part.”
“It was his first expression of his views on any foreign policy subject,” James Mann writes in The Obamians, his 2012 account of national security decision-making in the Obama administration. “And years later, his aides felt it was deeply felt and lasting.”
Yet, as president, instead of “breaking the war mentality,” Obama has institutionalized it.
Will history judge Obama harshly because of that? Probably not. When it comes to presidential legacies, history has lousy judgment.
With the exception of Lyndon Johnson, whose presidential standing has suffered because of Vietnam, waging war rarely hurts a president’s historical reputation. In fact, it usually helps.
Obama needn’t fret too much about getting short shrift from historians. Not only has he been the sort of warrior president too many of them love, but by relentlessly expanding presidential war powers, he’s also empowered the presidents to come.
An article about Hillary Clinton’s foreign policy instincts in last week’s New York Times Magazine (“How Hillary Clinton Became a Hawk”) escaped widespread discussion on account of the New York and other Acela primaries last Tuesday. It deserves a second look in light of Clinton and Donald Trump’s resounding victories, and Trump’s foreign policy speech last week. Clinton is, according to the Times’ Mark Landler, “the last true hawk left in the race.” Why might that be?
Landler framed much of his analysis as a contrast between Barack Obama’s relative restraint and Hillary Clinton’s relative activism. He also emphasized the close ties Clinton has cultivated with senior military officers, particularly those who affirm her faith in the military as an instrument of policy.
Clinton has, according to Landler, an “appetite for military engagement abroad” that far exceeds that of her few remaining GOP rivals, and that even surprised Defense Secretary Robert Gates and senior military officers during Obama’s first term. She was “a little more eager than they are” to get involved militarily around the world, explained Bruce Riedel, a long-time foreign policy hand.
Clinton’s enthusiasm for military intervention was not shaken by the foreign policy debacles of the recent past. When consulting military officers for advice, she gravitated toward those urging the use of force and shied away from those who raised concerns.
For example, Clinton first met Gen. Buster Hagenbeck in October 2001, when he was the commander of the Army’s 10th Mountain Division based in upstate New York. The following year, he warned her that Bush’s plan to invade Iraq would be “like kicking over a bee’s nest.” Clinton didn’t listen, casting a vote in favor of the war that has haunted her ever since. She was similarly dismissive of former Army general Karl Eikenberry, who, as U.S. ambassador to Afghanistan in 2009, warned against a surge of U.S. troops there. Clinton also disliked Douglas Lute, another former general, who clashed often with Clinton confidante Richard Holbrooke over policy concerning Afghanistan and Pakistan.
Clinton preferred the counsel of men like David Petraeus and Jack Keane, “perhaps the greatest single influence on the way Hillary Clinton thinks about military issues.” Keane had tried but failed to convince Sen. Clinton of the need for a troop surge in Iraq in 2007. She later confided to Keane that he had been right about the surge, and she had been wrong.
They continued to confer regularly, and in 2009 she crucially influenced the debate over the Afghan surge. By wholeheartedly embracing Gen. Stanley McChrystal’s call for 40,000 additional troops, she “made it harder,” Landler writes, “for Obama to choose a lesser option.” “Hillary was adamant in her support for what Stan asked for,” Gates recalled. “She was, in a way, tougher on the numbers in the surge than I was.”
And that wasn’t the only time. In other foreign policy debates within the Obama administration—from leaving troops in Iraq after 2011, to arming anti-Assad rebels in the Syrian civil war, to extracting concessions from Russia as part of the vaunted reset (“I’m not giving up anything for nothing,” she said)—Clinton consistently adopted a hawkish line.
We know from other sources that Gates had grown tired of Clinton’s bellicosity by 2011. When she was pushing for action in Libya to overthrow Muammar Qaddafi, the then-secretary of defense complained that the military’s plate was already quite full. “Can I finish the two wars I’m already in before you guys go looking for a third one?” he recalled asking. The public, then and since, has seemed no more enthusiastic about starting new wars than Gates. But Clinton retains her interventionist biases.
In December 2015, Jake Sullivan, Clinton’s top foreign policy adviser, confided to Landler that “There’s no doubt that Hillary Clinton’s more muscular brand of American foreign policy is better matched to 2016 than it was to 2008”; by February 2016, that claim seemed dubious at best. Public support for using U.S. ground troops against ISIS had spiked briefly after the terrorist attacks in Paris and San Bernardino, but the uptick proved short-lived. Bernie Sanders’s repeated attacks on Clinton’s foreign policy views have earned the septuagenarian strong support from millennials, the most war-averse generation in recent U.S. history.
Trump has gotten into the game, too. He declared last week that “the legacy of the Obama-Clinton interventions will be weakness, confusion and disarray.” In short, “a mess.” He’ll have many more opportunities to make that case over the next several months, but he’ll struggle to articulate a coherent worldview that can reliably be called less hawkish than Clinton’s. After all, while Trump scoffs at nation-building wars, or wars to implant Western-style democracy, his generally bellicose nature, enthusiasm for fights over trade, and chauvinistic nationalism could lead him to stumble into foolish conflicts. If he did, his rhetoric over the course of this campaign suggests that he’d be anything but restrained as commander-in-chief.
Still, Trump has singled out the American foreign policy establishment for criticism. Elites are more inclined toward global activism than the public at large, and Hillary Clinton is more activist even than the establishment. That means that if Clinton wins in November, she will have been elected in spite of her foreign policy views, not because of them.
Elected officials tend not to parse voter sentiment so closely, however. If Clinton defeats a candidate who scorned her interventionist instincts, expect her to claim an electoral mandate to do more, in more places, than her predecessor. Expect her to generally lean toward acting abroad when others counsel caution, and to forcefully make the case for war when she meets public resistance. And expect the members of the foreign policy establishment to cheer her on.
Congress is poised yet again to deny the Pentagon’s request to reduce its excess overhead. Last month, Deputy Secretary of Defense Robert Work wrote to the leaders of the relevant congressional committees making the case for a round of military base closures (also known as BRAC—Base Realignment and Closure). It was the fifth time the Pentagon has asked Congress to approve another BRAC round; the last one occurred in 2005. Rep. Mac Thornberry (R-TX), the chairman of the House Armed Services Committee, was quick with an answer: no. The National Defense Authorization Act passed out of his committee late last month bans another BRAC. The versions under consideration on the Senate side would as well. Sen. Kelly Ayotte (R-NH), chair of the SASC’s Readiness subcommittee, explained, “I do not want to give the department the open-ended authority to pursue another BRAC round that will incur significant upfront costs.”
For his part, Thornberry dismissed the Pentagon’s claim that another BRAC was needed, saying that he was “interested in objective data that leads them to think there is too much infrastructure.” But, as the AP’s Robert Burns noted, “The data is fairly clear, even if Thornberry doesn’t believe it is objective.”
In a thoroughgoing review, the first of its kind in twelve years, the Pentagon concluded that the military will have 22 percent excess capacity as of 2019. The Army will be carrying the greatest excess overhead—33 percent according to the DoD study—while the Air Force will have a 32 percent surplus. The Navy and Marine Corps combined will have a 7 percent surplus in 2019. These projections are not based primarily on expectations of a much smaller force. For example, the Pentagon estimates that the Army will have only one fewer active brigade combat team, and one fewer reserve brigade combat team (BCT), in 2019 than in 2016, while the Navy will have 18 more ships and the Air Force 47 more aircraft. Combined active duty and reserve end strength, meanwhile, will decline by a mere 2 percent between now and 2019 (from 2.09 to 1.97 million), with the Army accounting for more than 83 percent of the decline. Even if Congress or the next administration succeeds in slowing or reversing proposed personnel cuts, the Pentagon will still be saddled with considerable excess capacity well into the 2020s.
“Absent another BRAC round,” the report explained, “the Department will continue to operate some of its installations suboptimally as other efficiency measures, changing force structure, and technology reduce the number of missions and personnel.” Calling the BRAC process “the fairest approach for working with Congress and local elected officials to close installations,” the DoD noted that the alternative of “incremental reductions” “will have an economic impact on local communities without giving them the ability to plan effectively for the change.”
Work echoed these sentiments in his cover letter to congressional leaders:
Under current fiscal restraints, local communities will experience economic impacts regardless of a congressional decision regarding BRAC authorization. This has the harmful and unintended consequence of forcing the Military Departments to consider cuts at all installations, without regard to military value. A better alternative is to close or realign installations with the lowest military value. Without BRAC, local communities’ ability to plan and adapt to these changes is less robust and offers fewer protections than under BRAC law.

This is certainly correct, as I wrote last year. But Work and other advocates for another BRAC round must not limit themselves to green-eyeshade talk of cost savings and greater efficiency. They must also show that former defense sites don’t all become vast, barren wastelands devoid of jobs and people.
This is actually rather easy to do. Most former military bases are converted to other uses, and some quite quickly. This is true for several of the cases that I’ve studied over the past five years, including Bergstrom Air Force Base in Austin, Texas; the Brunswick Naval Air Station in Maine; the Philadelphia Navy Yard; and Pease Air Force Base in Portsmouth, New Hampshire. Some of these stories are told in this edited volume, due out next month. The communities in and around these former defense facilities were blessed by favorable locations, but active community involvement substantially eased the transition.
Keep this in mind when you hear Ayotte and Thornberry and their colleagues solemnly proclaim their great concern for protecting the economic well-being of nearby defense communities. By blocking base closures, they are saddling the military with unnecessary costs, and preventing local communities from accessing potentially valuable land and infrastructure.
In his 1961 farewell address, President Dwight D. Eisenhower warned of the growing influence of the “military-industrial complex” on American politics and policy. Interestingly, Eisenhower’s original formulation of the menace was the even more accurate “military-industrial-congressional complex” (emphasis added). Seeing how that network of special interests has worked its tentacles into so many aspects of American political and economic life in the intervening decades shows just how prescient Eisenhower’s warning was. I now refer to it, with Eisenhower’s “congressional” restored and “security” added, as the MICS: the “military-industrial-congressional-security complex.”
But there has been an even more subtle and pervasive militarization of American culture. It has been evident since World War II, and it has accelerated markedly in recent years. Perhaps the most corrosive domestic effect of the global interventionist foreign policy that Washington adopted after World War II has been on national attitudes. Americans have come to accept intrusions in the name of “national security” that they would have strongly resisted in previous decades. The Patriot Act and the NSA surveillance regime, with its attendant abuses, are a case in point.
The trend toward a more intrusive, militaristic state has become decidedly more pronounced since the September 11 attacks and the government’s response, but there were unmistakable signs even before that terrible day. My colleagues at the Cato Institute have done an excellent job documenting the gradual militarization of America’s police forces, beginning in the 1980s, with the proliferation of SWAT teams and the equipping of police units with ever more lethal military hardware. The terrorism threat simply provides the latest, most convenient justification to intensify a trend that was already well underway. Most SWAT raids in fact have nothing to do with terrorism; they are used to serve search or arrest warrants in low-level drug cases.
Politicians learned early that the fastest way to overcome opposition to a pet initiative was to portray it as essential to national security. Thus, the statute that first involved the federal government in elementary and secondary education in the 1950s was christened the National Defense Education Act. Similarly, the legislation establishing the interstate highway system was officially the National Interstate and Defense Highways Act. In retrospect, President George W. Bush probably missed an opportunity when he did not label his legislation for a Medicare prescription drug benefit the National Defense Elderly Care Act.
And then there is the overall militarization of language. The rise of America’s imperial era coincided with the popular use of the “war” metaphor. In recent decades, we’ve had “wars” on everything from cancer to poverty to illiteracy to obesity. And, of course, we still have the ever-present war on illegal drugs that Richard Nixon declared more than four decades ago. Language matters, and the fondness for such rhetoric is a revealing and disturbing indicator of just how deeply the garrison-state mentality has become embedded in American culture.
Yet another sign is the growing tendency to misapply the term “commander-in-chief.” The Constitution makes it clear that the president is commander-in-chief of the armed forces. There were two reasons for that provision. One was to assure undisputed civilian control of the military. The other was to prevent congressional interference with the chain of command.
One thing, however, is abundantly clear: the Constitution did not make the president commander-in-chief of the country. Unfortunately, that is a distinction increasingly lost on politicians, pundits, and ordinary Americans. The notion that the president is a national commander who directs the country, and that it is our obligation as subordinates to salute and follow his lead, is an alien and profoundly un-American concept. It also implicitly ratifies the perverse doctrine of the imperial presidency—that the president alone (our commander-in-chief) gets to decide when the nation goes to war. Both are thoroughly unconstitutional, ahistorical, and unhealthy attitudes. Yet they have become common, if not dominant, in late twentieth and early twenty-first century America. And that is frightening. Viewing the president as the commander-in-chief of the nation is the epitome of a mentally militarized society.
At the dawn of the Cold War, social commentator Garet Garrett warned that America could not indefinitely remain a republic at home, enjoying the values of limited government and robust civil liberties, while taking on more and more trappings of empire abroad. Gradually, he predicted, the requirements of the latter would drastically alter and eventually eclipse the former. As in the case of Eisenhower’s Farewell Address, Garrett’s warning seems all too prescient.
Americans are rapidly approaching the point where they must make a stark choice. Either the United States adopts a more circumspect role in the world—in part to preserve what is left of its domestic liberties—or those liberties will continue to erode (perhaps beyond the point of recovery) in the name of national security. That choice will determine not only how the United States is defended in the future but whether this country retains the values and principles that make it worth defending.
The educational establishment seems to be expending a great deal of effort these days to excise “offensive” material from the curricula of history and literature. For example, Mark Twain’s great anti-racist novel The Adventures of Huckleberry Finn has been removed from the study materials in many schools because of its use of the word “nigger” in the dialogue—as if any accurate representation of the time and place Twain portrays in this book could have been written without this key word. Recently this censorial campaign has reached such heights of stupidity that new editions of Twain’s books The Adventures of Tom Sawyer and The Adventures of Huckleberry Finn are being published with the word “nigger” replaced by the word “slave.” With friends like this misguided editor, anti-racists need no enemies. One is not likely to produce an intelligent end by the use of foolish means.
More generally, the wrongheaded effort to produce feel-good instruction in history and literature undermines the entire purpose of studying these subjects as part of a liberal education; it aims to make the students feel comfortable and unchallenged rather than to help them acquire knowledge and understanding of the human past and human nature with all its potential for both good and evil. A well-warranted study of history and literature certainly will on many occasions leave the students feeling very bad indeed, as they gain knowledge and understanding of the horrible deeds that people have done and of the twists and turns of human motivations, actions, and—all too often—crimes against their fellows, frequently founded on the most brutal and senseless rationalizations.
Yet the study of true history and an unfettered immersion in great literature can also reveal mankind in its most splendid and shining moments. Rising like beacons above the monstrous ideas and savage mayhem have been individuals who resisted the mob, who defended the defenseless, who gave sustenance and protection to the victims, who put decency before popularity, who rebelled against the inhumane dominant ideologies, religions, and prejudices of elites and masses alike. But the comprehensiveness that permits a real liberal education to become uplifting as well as depressing cannot find a place in a feel-good curriculum in which avoidance of hurting someone’s feelings receives priority.
Human beings and their historical record, viewed with warts and all, give any serious student ample reason for taking offense and feeling dismayed by what people have believed, said, and done. But unless we face these aspects of our species and its actions frankly and fearlessly, we will never be able to appreciate in stark contrast the true heights to which people at their best can rise and actually have risen in the past—and we will thereby deprive ourselves of the most inspiring models we can have for carrying on our own struggles for a more humane world.
President Obama’s visit to still-communist Vietnam, a former American rival, as part of his “pivot to Asia” to recruit more countries to shelter against a rising China, only serves to illustrate the global American Empire’s overextension. At the same time, he is opening missile defenses in Europe, quadrupling U.S. military spending there, and deploying more military forces near Russia—all of which will have the effect of continuing to provoke that already insecure country. Also, Obama has failed to withdraw U.S. ground forces from Afghanistan, inserted them into Iraq and Syria to battle the terror group ISIS, and continued his accelerated air wars over Afghanistan, Iraq, Syria, Pakistan, Somalia, Yemen, and Libya. Finally, the president sent the Army’s top general to Africa to showcase U.S. efforts to train 38 countries to battle terror groups that could attack Europe, including affiliates of ISIS and al Qaeda. These U.S. military forces may be valiantly battling threats to the Empire, but most of those threats pose very little danger to America.
In fact, in many cases—especially vis-à-vis terrorists—U.S. military action may be making largely local problems worse. For example, in Yemen, journalists have documented that the number of fighters in the al Qaeda affiliate there actually increased after U.S. forces, seen as “foreign infidels,” started bombing. Also, retaining non-Muslim U.S. and Western occupation forces on Muslim soil after the initial invasions of Afghanistan and Iraq led, respectively, to a resurgent Islamist Taliban and to the creation of al Qaeda in Iraq, which morphed into ISIS. Furthermore, U.S. interventions in Iraq and Afghanistan destabilized surrounding areas, such as Syria and the nuclear-armed state of Pakistan, respectively. Similarly, the U.S. and Western overthrow of dictator Muammar Gaddafi destabilized not only Libya (allowing chaos to reign and an ISIS affiliate to arise), but also sent many of Libya’s weapons and fighters migrating to Mali and other parts of Africa—hence contributing to the alleged need to send the Army’s top general to coordinate with 38 countries in battling Islamist terror groups there.
All of these post-9/11 brushfire wars led that general—Gen. Mark A. Milley—to make an astounding statement: “Today, a major in the Army knows nothing but fighting terrorists and guerrillas, because he came into the Army after 9/11. But as we get into the higher-end threats, our skills have atrophied over 15 years.” Milley continued that the U.S. Army has forgotten how to fight more sophisticated enemies, such as Russia or China. So instead of being capable of deterring potentially larger threats to the United States (even this requires some imagination), the U.S. military has become bogged down in never-ending, faraway brushfire wars, which make the usually low-probability threat of anti-U.S. terrorism worse.
Even in the case of Russia and China, rich U.S. European and East Asian allies—whose combined GDPs are, respectively, at least five times and about the same size as those of the threats they face—should take over the first line of defense, as presidential candidate Donald Trump has implied. However, if these allies can’t contain these regional threats, the U.S. military should be configured and prepared to serve as a backstop of last resort in case of any emergency—a defense posture that worked in World War II.
Reconfiguring U.S. forces to let regional allies’ militaries fight guerrillas and terrorists, as well as serve as the first line of defense against major potentially hostile powers, would allow the United States to form a coherent strategy of being a “balancer of last resort.” Such a more restrained policy could save bucketloads of money and help pay down the nation’s $19 trillion debt. That massive debt has impaired robust U.S. economic growth for far too long and threatens the long-term status of the United States as a great power. In the not-so-distant past, the British, French, and Russian Empires all became financially overextended and collapsed. The same could happen to the more informal American Empire of permanent and entangling alliances and overseas military bases—and the armed interventions and huge amounts of military and economic aid to foreign countries needed to maintain it. In other words, as the American Constitution stipulates, the U.S. military needs to defend the country, not maintain an overseas empire that causes global instability and undermines American security.
During a symposium of social workers recently, I encountered a jarring reminder of how far the country has traveled in thinking about welfare. A session on at-risk youths had been given the title “A hand up not a handout.” I thought I knew what that cliché meant but found myself mystified.
The speaker, a young manager of a shelter for homeless teens, explained: “When the kids come into the shelter,” he said, “we give them everything—food, sleeping accommodation, clothing, medicines, counseling.” To his way of thinking, providing everything is a “hand up.” A “handout” would be giving only one thing, say, food.
But the phrase originally meant that it was bad policy to simply give things away to the needy. Better to establish constructive quid pro quo exchanges. Aid recipients would be asked to set the table or mop the floor to earn supper or a bed. These tasks cultivated skills, a work ethic and a sense of accomplishment.
One problem with handouts is that if you offer something for nothing, the numbers lining up for it expand indefinitely. In 1963 the U.S. secretary of agriculture assured lawmakers that federal food stamps “could be expanded over a period of years to about 4 million needy people.” Fifty years later the country’s population had not even doubled, but this handout had grown to 47.6 million recipients—and in a time of economic recovery.
Handouts can leave recipients less capable of taking care of themselves by lowering the drive to work, encouraging recipients to have children they can’t support, or enabling addicts to continue indulging.
Older generations were aware of these dangers. Octavia Hill spent a lifetime helping London’s poor in the late 19th century and concluded that personal relationships and quid pro quo arrangements were vital. “I have often,” she wrote, “felt bound to urge, not only the evils of indiscriminate almsgiving, but the duty of withholding all such gifts as the rich have been accustomed to give to the poor.” Hill pioneered a system of housing for the poor that required punctual payment of modest rents, in buildings purchased by philanthropic investors. When tenants became unemployed, they were given maintenance work.
Franklin D. Roosevelt was clear as well. “Continued dependence upon relief,” he said in 1935, “induces a spiritual and moral disintegration fundamentally destructive to the national fiber. To dole out relief in this way is to administer a narcotic, a subtle destroyer of the human spirit.” Yet government programs, being shallow and impersonal, tend to drift into handouts. They are like the superficial giver who drops a dollar into the beggar’s cup and walks on, feeling self-satisfied.
A hand up is what Habitat for Humanity offers. Recipients must contribute “sweat equity” by spending 300 hours building the home. They make payments on a modest, no-interest loan. The Habitat chapter in my town of Sandpoint, Idaho, has built 17 homes since 1991. Sixteen of those families are still living in the houses, with two having paid off their loans in full.
To thrive, human beings need healthy expectations and purposeful challenges. In today’s era of trillion-dollar handouts, it’s time to resurrect past wisdom and find ways to offer a hand up instead.
People have asked me what political party I belong to, or what political philosophy I believe in. The answer is “none.” I believe in none of them at all. I have abandoned the government altogether, and the political system as a whole. “Oh, really? Then why do you write about them?” Because I have no proverbial “dog in the fight”; I don’t really care who wins, be it an election or a war. I call it as I see it, and because I don’t really care what popular culture or society thinks, I am not bound by worries about those things.
Now, the government needs people paying attention to it non-stop. Its constant need for attention and ego-stroking is rather like that of a spoiled eight-year-old used to being told how great his woefully poor grades are by permissive parents. The government requires constant input and interaction because politicians are pretty much the same creatures as Hollywood celebrities. This is the whole premise behind voting: it’s like the Academy Awards, except for politics. Best Supporting Actor is what we call the vice president, and so forth. The government even relishes being the “bad guy” and being criticized as such, as long as the criticism comes from the “other side” of the political spectrum. What it really doesn’t like, though, is being called out on its poppycock and then ignored.
If you abandon the government and, indeed, the entire concept of one, you will see the illusion for what it actually is: a very poorly-crafted charade by which we are led to believe we have a say in the government. However, this is the same government that has the power to jail you over non-violent offenses like failing to register your car or smoking weed. Therefore, what power do you truly have, I ask? No, this entire enterprise is to be abandoned and ignored as much as possible. What do we honestly need from this government that they alone supply? Protection? From whom? They create the enemies they protect us from, yet who protects us from this government itself? No one does. The fraud is obvious.
See, the government is actually not threatened by rebels or opposition parties. Because those people simply want a different government. However, in the end, it is still a government they want. What really scares the government is someone who sees nothing he needs from a government whatsoever. Such a person could be slandered, criticized, blamed, or threatened by the government and this man does not care. Neither would he feel anything if the government praised him or promised something because he would know the government is merely trying to convince him to believe in the government. In fact, when one abandons the government, he neither cares if he is slandered or praised by the government because the government is not a concept to which he is attached.
What then? Do we go on believing in this government and interacting with it? Only so far as the obligations we must distastefully endure in order to have them rid from the doorstep. Ok, here’s your tax money, now go away. Vote? To what end? Another con artist? One needs to abandon as many attachments to the government as possible. The government obviously plays for attention, which is why elections are such a spectacle of infantile silliness and the vulgar antics of has-been celebrities. This is to be abandoned and ignored. If one participates in this, then why is one surprised when they deliver what the campaign so aptly demonstrated was to come? That being, sheer stupidity, a couple no-win wars, more taxes, and political corruption.
Take, for example, the concept of automobiles. Some people have no choice but to drive. But, for myself, I have discovered that a bicycle needs no driver’s license, vehicle registration, or insurance. I am free from government interference to use this vehicle. Therefore, I have succeeded in abandoning one aspect of government life-charting: movement tracking and money confiscation. There are many government laws that I am now free from, and my vehicle cannot be impounded for infractions, not that many of those exist for this vehicle. I can ride it without a license or registration, as I said. Indeed, this is what we need to look to. Not revolution, not change we can believe in, but a government to be abandoned and ignored. Change we don’t believe in, because there is nothing that needs to be changed or believed in as far as government participation goes.
People are afraid of this concept. “What will we do without the government?!” Oh, I don’t know, be at peace with other nations perhaps. Be less at risk of being fined, jailed, or assessed an endless procession of taxes, fees, penalties, and obligations. This is the genuine reason the government hates voter apathy. They don’t fear voters voting against them. What they fear most is being ignored and made irrelevant. Being made a laughingstock and then walked away from. People that have no need for the government don’t even need a new government, so what can possibly be promised to them? The government operates on promises it then fails to deliver every time. He who abandons the government has realized this and, therefore, no longer desires anything the government allegedly has to offer. Not that he needed anything from the government in the first place.
What has this government given us that we could not do for ourselves? Oh, right, the possibility of a nuclear war that could wipe out the entire planet. That is what the government has given us that we couldn’t do on our own. Every enemy we have has come because the government went over to their countries and provoked them. They didn’t just wake up one day and say, “Gosh, I think I’ll devote my life to killing Americans because I hate freedom!” Excuse me, but the government successfully took away a lot of our freedom quite some time ago. There is little left here for the terrorists to take away. Someone said that if Islamic radicals took over America, they’d force non-Muslims to pay a special tax. Look here, the government already does that to us but they are equal-opportunity taxers. Why, then, are we so afraid of what cannot possibly happen while we just accept what actually does happen? At this point, fine, you want your tax money? Great, here, take it. Then go away.
It’s not that I don’t vote because I don’t think it’ll make a difference. I don’t vote because I have abandoned not only the government but the entire concept of one. I feel sorry for folks that devote their lives to political parties and campaigns. They feel like they’re saving America. In truth, there is nothing to be saved. It is the government that tells us there is in order to win elections. And saving America from what? What it needs “saving” from is its own government and the political system as a whole. The government to me has as much value as forgotten, torn, and discarded plastic shopping bags blowing across the desert here. They’re not even as valuable as an empty plastic yogurt tub, because that is a valuable container that can be cleaned and re-used. This government cannot be cleaned and re-used. Neither can the political system.
Yes, I write about the government. Because the truth must be told. I don’t need to take a side because there is no side in a system as politically clone-ridden as this one. They have all been cloned from the same cell off the buttocks of North America. Some people still aren’t convinced. So, allow me to demonstrate: Hillary Clinton just said that she will help all the working people in America who are falling into poverty. How? Because she can call on her husband, who knows how to do so. What?! Is it her husband running for president, or her? Or both? Excuse me, but her husband has already had two terms as president here. If he was so wonderful and such a veritable Solon of economic wisdom, why then have his economic policies not borne fruit in the years since? Because of George W. Bush? Well, Bush has not been president for the past eight years. That dubious honor belongs to Hillary’s former boss, President Barack Obama. How, then, can this individual claim to be concerned with helping the poor when she comes from the same party as Obama and worked for Obama? Again, she is but a clone from the Clinton Administration, though cloned from Bill with a couple gene-splices from Obama.
What, therefore, can Hillary possibly do that would be remarkably different from what Obama would do or did? Or what Bill did? There are re-tread tires with less mileage than Hillary. The working poor will feel it is their “duty” to vote and maybe vote for her. I have no such duty. For me, the duty does not exist because I am not asleep at the political switch. I have abandoned the switch and couldn’t care less if the train ends up on the wrong track. In truth, it is already on the wrong track and has been for quite some time. Thus, it does not require me, Switchman Jack, to pull the switch at the polling place and vote. I don’t work for this railroad. I’m not even a hobo on it.
But I’ll say this for them: It is supremely hilarious. What makes it so funny is that everyone takes this all so seriously. Even local elections are carried on with the seriousness of a death march. People even cry actual tears! Fascinating! And it’s not even good drama! This is like a grade-school pageant and people act like this is Gone With The Wind. Well, what ended up “gone with the wind” was reality and sanity. Remember those old laugh tracks they had for 1970s sitcoms? As if you needed to hear other people laugh so you knew what was funny and to be laughed at. Yes, it was television Pavlov stuff, but that was the idea. I think we need a laugh track for the entire government. Because if you don’t know this is funny and to be laughed at, you need other people to let you know.
The paradoxical divergence between the government's data on initial jobless claims, which in just over half an hour is expected to print at or near another multi-decade low, and the actual number of layoff announcements by employers as tracked by Challenger, Gray & Christmas, which continues to soar, is puzzling to say the least.
"Helicopter money" started out as, and long remained, nothing more than a heuristic device — and a brazenly counterfactual one at that — employed by monetary economists as a means for gaining a better theoretical understanding of the consequences of changes in the stock of money. "Suppose," the analysis went, that instead of increasing the monetary base by buying bonds in the open market, central banks dropped new supplies of currency from helicopters, thereby instantly increasing everyone's money balances. What would that do to spending and, eventually, to prices?
Lately, however, helicopter money has made its way from the inner recesses of economics textbooks to the financial pages of major newspapers and magazines, where a debate has been joined concerning its merits, not as an abstract analytical tool, but as an actual policy tool for relieving Japan, and perhaps some other economies, of their deflationary woes. Look, for some examples, here, here, and here. And see as well this recent blog post by our dear friend Jerry Jordan, written for the Atlas Foundation's Sound Money Project.
Yet for all the controversy surrounding the suggestion that Japan should actually try dropping money from helicopters (or something close to that), my own response to it consisted, not of either surprise or dismay, but of a strong sense of déjà vu. For I myself wrote an op-ed proposing helicopter money for Japan in the spring of 1997, that is, almost exactly 19 years ago. I never tried to publish it, in part because I myself couldn't quite decide just how firmly my tongue was poking my cheek as I wrote it, and because I had then, as I do still, an abiding dislike of "clevernomics," which is the sort of stuff economists write to show people how smart they are, and not because they are seriously trying to help the world along. Fearing that I was myself lapsing into clevernomics, I stuffed the essay into a file cabinet, where it has been buried ever since.
All the recent writing on the subject has, however, emboldened me to resurrect my dusty old essay and to publish it here on Alt-M under its original title. I don't pretend that it adds anything to what recent commentators have had to say on the topic. Consider it a bagatelle, if you like: you'll get no argument from me.
They said it was like "pushing on a string." It was the middle of the 1930s, and the U.S. and much of the rest of the world were in the midst of an unparalleled deflationary crisis. Normally the way out of such a crisis would have been for central banks, including the Federal Reserve, to inject more reserves into their banking systems by buying securities in the open market and paying for them with central bank credits. That policy would work provided banks put the new reserves to work by lending them out, thereby stimulating an increase in demand for investment or consumer goods. But in the U.S. interest rates on loans and securities had fallen so low, the Fed claimed, that adding to bank reserves no longer helped: the new reserves "pushed" into commercial banks would simply pile up there, instead of causing the banks to extend more private credit. Hence, "pushing on a string." Economists, following John Maynard Keynes, referred to the conundrum in question as a "liquidity trap."
Whether the U.S. economy was really stuck in a liquidity trap during the Great Depression remains controversial. For several decades afterwards, however, the issue was moot, as inflation replaced deflation throughout the world's economies. Only recently has it again taken on practical significance, with economists and Japanese monetary authorities pointing to Japan today as another instance of an economy faced with an insatiable demand for liquidity. Japanese consumer and producer spending has been shrinking for months, causing wholesale prices to decline and inventories to accumulate. The overnight call loan rate has hit zero, and short-term lending rates are at historically low levels. Although the Bank of Japan has been pumping reserves into the banking system, bank lending remains sluggish. The banks have more reserves than ever, but seem to lack any incentive for putting them to use.
Japan's dilemma has at least one Federal Reserve official worried that the same thing might happen again (or, as some would have it, for the first time) in the United States. Marvin Goodfriend, a Vice-President of the Richmond Fed, proposes in a recent paper that, in the event that we should fall into a liquidity trap, Congress should grant the Fed authority to tax bank reserves, causing them, in effect, to earn a negative return. Since even a zero interest rate on loans beats a negative return on reserves, the banks would have reason to lend even at zero rates. For good measure, Goodfriend recommends that public currency holdings be taxed as well, so as to discourage hoarding by the public.
Goodfriend's proposed taxes are meant to be emergency measures only, which would be removed in good times. Still, one shudders to think what might happen should the government decide to take advantage of the new measures' capacity for enhancing its share of the profits from the Fed's money monopoly.
Fortunately, central banks don't need new taxing powers to free their economies from liquidity traps. All they need to do is to supply new money directly to the public, instead of trying to get it to them indirectly by first adding it to bank reserves. Individual citizens, unlike commercial banks and other financial firms, do not have to decide either to hoard money or to lend it at some trivial rate of interest. They have a third, more tempting, option, namely, that of spending unwanted money balances directly on goods and services. Central banks, on the other hand, don't have to issue new money in exchange for securities or collateral owned by private financial firms: they can simply give it away to citizens, avoiding the middlemen.
In Japan today, the strategy could work as follows: the Bank of Japan could announce its intention of giving away, say, Y5000 (roughly $50 U.S.) to every Japanese citizen each month until private spending picks up, bringing Japan's deflationary crisis to an end. The giveaway could be engineered in a manner similar to that employed during the 1990 German monetary unification, when the Bundesbank supplied East German citizens with limited quantities of Deutsche marks in exchange for Ostmarks. The policy would assure Japan's citizens that, one way or another, their money earnings were about to permanently increase, giving them ample reason to consume more. Given Japan's population, the promised rate of new money creation would increase Japan's monetary base by around ten percent after a year — a substantial rate, but not large compared to recent annual figures. Moreover, the mere announcement of the policy might suffice to revive spending quickly, allowing the policy to expire in relatively short order.
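The back-of-envelope arithmetic behind that "around ten percent" figure can be sketched in a few lines. The population and monetary-base figures below are rough illustrative assumptions on my part (mid-1990s orders of magnitude), not numbers taken from the essay:

```python
# Sketch of the giveaway arithmetic: a fixed yen amount per citizen per
# month, expressed as a fraction of the monetary base after one year.
# ASSUMED figures (illustrative only): ~125 million citizens and a
# monetary base of roughly 60 trillion yen.

def giveaway_share_of_base(yen_per_person_per_month: float,
                           population: int,
                           monetary_base_yen: float,
                           months: int = 12) -> float:
    """Return the cumulative giveaway as a fraction of the monetary base."""
    total_new_money = yen_per_person_per_month * months * population
    return total_new_money / monetary_base_yen

POPULATION = 125_000_000      # assumed population
MONETARY_BASE = 60e12         # assumed base, in yen

share = giveaway_share_of_base(5_000, POPULATION, MONETARY_BASE)
print(f"New money after a year: ¥{5_000 * 12 * POPULATION:,.0f}")
print(f"Share of monetary base: {share:.1%}")
```

Under these assumed figures, a ¥5,000 monthly giveaway comes to about ¥7.5 trillion over a year, or a bit over twelve percent of the base: the same ballpark as the essay's "around ten percent" estimate.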
Critics of the monetary giveaway proposed here might fear that it would ultimately trigger inflation. The same sort of thinking led the Federal Reserve, in the mid-1930s, to actually raise bank reserve requirements out of fear that banks might change their minds any minute and begin lending their hoards of cash. The Fed's fears turned out to be exaggerated, to put it charitably: its decision actually helped to keep the U.S. depression going for several more years. Of course, if spending had actually revived on its own, surpassing the level necessary to revive the economy, the Fed could have dealt with the "problem" easily enough, by reabsorbing the excess money through bond sales.
Keynes had a good quip about Fed officials who worried, in 1936, about inflation: they “professed to fear that for which they dared not hope.” Let's hope that the Bank of Japan won't harbor such misplaced fears, and that it doesn't otherwise allow the liquidity-trap bogey to keep it from doing all it can to revive Japan's economy.