In 1928, the English economist Arthur Pigou coined the term “human capital” to describe investments in the acquisition and application of knowledge, asserting that these investments, just like traditional physical capital, were essential ingredients of economic production. Today, we might divide human capital into three major components. Education plays an important role in developing human capital, of course. Equally important is whether a society’s farms, factories, offices, and shops take advantage of its workers’ abilities through efficient management and the latest technology. And welcoming foreigners infuses a country’s stock of human capital with fresh energy and initiative.
Americans have long been deeply committed to building up all three aspects of human capital, even if they haven’t called it that. The United States pioneered the idea of schooling all children (as opposed to only the children of the rich); as early as 1890, it had more schools per capita than any European country, and 35 percent more of the population went to school than in England, France, and Germany, the next-best-educated large nations. American industry and agriculture were generally the first to harness the most advanced technologies in production, transportation, and communications, as well as the most apt to hire, pay, and promote workers on the basis of merit (see “Who Killed Horatio Alger?”). Continuously replenished by eager and well-assimilated migrants, the American workforce outshone its rivals in its high aspirations, openness to opportunity, and extraordinary diligence. All this paid off handsomely: by 1900, perhaps even earlier, America was the world’s most productive and prosperous nation.
There are signs, however, that America’s era of human-capital superiority may be coming to an end. That has disturbing economic implications. For one thing, the severity of the recent recession—and the limping recovery—can be attributed partly to deficiencies in human capital. For another, when economic growth eventually resumes, the condition of the country’s human capital will determine how well Americans profit from the recovery. A recent study from the Organisation for Economic Co-operation and Development (OECD) estimates that if the school performance of Americans born this year could be raised to the level of their higher-scoring international peers, the resulting U.S. economic growth would boost GDP by $41 trillion within 20 years.
Happily, Americans have always done their part by acquiring as much human capital as they can. What is left is for federal, state, and local governments to do their part by fashioning more effective human-capital policies.
Start with education, which has properly troubled the country since at least 1983, when the National Commission on Excellence in Education published A Nation at Risk: The Imperative for Educational Reform. This explosive report charged that “the educational foundations of our society are presently being eroded by a rising tide of mediocrity that threatens our very future as a Nation and a people.” The tide hasn’t receded much since then. In various recent international tests in mathematics, science, and language arts, American students scored about the same as they did in 1983, or only marginally better. As of 2009, the average college-bound student’s Scholastic Aptitude Test reading score was virtually the same as in 1983, and his math score was only modestly higher. American K–12 students spend less time in class than those in Europe and Asia do. And the volume of college remedial courses today is higher than in 1983.
Dominating the education-reform debate today are two broad ideas: increasing schools’ accountability and improving teacher quality. The No Child Left Behind Act of 2001, by far the most significant of the accountability initiatives, mandates annual testing of students’ performance, and by reducing federal aid to schools that perform poorly, it motivates immediate administrative responses to inferior test results. Though NCLB’s testing requirements were a leap forward in education reform, the law contained a fatal flaw: it let states measure progress with their own tests rather than nationally normed ones, giving them a powerful incentive to escape sanctions by adopting weak tests.
As for better teaching, there is no proven way of achieving it. Most American teachers are trained in schools of education that impart limited teaching content and few useful teaching skills to their typically mediocre students—yet no teacher-quality initiative that I know of has targeted teacher-training programs. There is little evidence that raising overall pay levels increases teaching effectiveness, despite U.S. Secretary of Education Arne Duncan’s recent call for big pay increases for teachers. The highly lauded Teach for America program lures smart graduates of elite colleges into teaching, but there is more to good teaching than intelligence; further, many TFA recruits view teaching as merely a career detour and leave after a few years. True improvement in teacher performance would probably require rewriting teachers’ collective-bargaining agreements to give administrators greater discretion in hiring, firing, pay-setting, and class assignments.
There is widespread belief that efforts to improve education must begin with preschool. E. D. Hirsch, one of the foremost scholars of educational development, blames much of the poor academic performance of low-income minority students, which drags down overall U.S. academic outcomes, on the lack of cultural stimulation that they receive at home in their formative years. It seems logical to think that free preschool for disadvantaged children might overcome the cultural deficit. Unfortunately, America’s largest preschool program for poor youngsters—Head Start, now enrolling about 1 million children and costing $9.5 billion annually—is a near-total failure. A 2010 study of Head Start by the U.S. Department of Health and Human Services concluded that by the end of first grade, children who had formerly been enrolled in Head Start displayed no measurable social or academic advantage over those who had never participated in the program. Other data show that Head Start enrollees’ literacy and math proficiency are no greater than those of children kept at home.
Nevertheless, some early-childhood programs have actually succeeded in giving culturally deprived children a “head start”—notably, programs based on “core knowledge” and “direct instruction,” which use highly detailed, extensively tested curricula to introduce math and language-arts concepts to preschoolers in a systematic way. This means that increasing access to preschool, given the right instructional approach, could be a highly cost-effective human-capital initiative, generating lifetime income gains for socially disadvantaged children as well as reduced costs for later remediation and special education worth many times the initial outlay. Publicly funded preschool for all American three- and four-year-olds would cost about $30 billion annually, twice the $15 billion that we currently spend on Head Start and other preschool programs. If we want it to succeed, however, it shouldn’t be operated by local “community” agencies and funded by the Department of Health and Human Services, as Head Start is. Rather, local school districts would receive preschool funding directly from the U.S. Department of Education, with future federal school aid contingent on the academic progress that preschoolers made once they entered grade school.
Two reforms yielding proven results that states can implement with administrative consistency are requiring a rigorous, evidence-based curriculum in all grades and extending time in school, both daily and annually. Massachusetts—populous, urban, and ethnically diverse—made rapid progress after it instituted both measures. The Massachusetts Education Reform Act of 1993 imposed a rigorous, statewide K–12 core curriculum and a somewhat longer school day and school year; in 2005, the state began piloting an even longer school day in certain districts. The reforms worked. Massachusetts now leads the nation in K–12 education gains. A recent Harvard study, examining results from international math and reading tests, found that Massachusetts eighth-graders not only scored highest among American states on the tests; they outscored test takers in all but a few countries. The state’s own annual assessment reports show that between 1998 and 2009, tenth-graders’ proficiency rates jumped from 38 percent to 79 percent in reading and writing and from 24 percent to 75 percent in math.
Another encouraging statistic from Massachusetts: 61 percent of high school freshmen there proceed to college, and 53 percent of the state’s young adults have college degrees, both the highest proportions of any American state. But Massachusetts’s success contrasts sharply with conditions in most other American states, where too many high school students fail to graduate and where those who do aren’t prepared for college. A February 2011 report issued by New York’s Board of Regents, for example, found that last year, only 77 percent of the students who had been ninth-graders four years earlier managed to get a high school diploma; further, only 41 percent of students who went on to college were “college ready.”
It’s true that the U.S. education system partly makes up for these failures by providing opportunities for underperforming Americans to get their act together later in life. By allowing high school dropouts to earn their degrees as adults, the General Educational Development (GED) test raises the overall U.S. high school completion rate to that of the leading European and Asian countries. And there is an accelerating trend of older adults’ going to college: people over 35 now account for 19 percent of all U.S. college students.
Still, to address the problem of high school graduates who haven’t been properly prepared for college, U.S. governors and the Obama administration have been eagerly promoting the Common Core State Standards Initiative (CCSSI), which commits its 43 enlisted states to setting stringent English and math content standards from first through 12th grades. Surprisingly, a recent Brookings Institution study found that current differences in states’ content standards didn’t appear to correlate with their students’ scores on national tests. But if content standards are set high enough and, as in Massachusetts, embedded in detailed, pedagogically proven curricula, CCSSI may turn out to be one of the most effective of the current crop of educational reforms.
The importance of a college education in the global, information-age economy is now taken for granted. So the push by the federal and many state governments to raise rates of college attendance would make a lot of human-capital sense—if college students actually completed their studies and mastered college-level material. Unfortunately, despite devoting massive resources to “remediation,” American colleges have largely failed to offset the inadequacy of the typical student’s high school preparation. Only two-thirds of baccalaureate enrollees today graduate with degrees even after six years of study, and only a third of those attending community or technical colleges do so, including those who transfer to baccalaureate schools. And many students who do finish college are graduates in name only, never having done true college-level work.
One way to encourage better performance is to change the terms under which the federal government distributes over $100 billion each year in student grants and subsidized loans. Such a restructuring, modeled on the Cameron government’s education-reform initiative in Britain, would give anyone graduating from high school with good grades an interest-free college loan. The loan would be large enough to cover the average student’s education costs at a public college or university (currently, roughly $12,000 a year), though it could vary among regions and specialized academic programs, such as engineering or nursing. It would have none of the family means-testing that complicates and distorts federal student aid today. To stay eligible for the loan each year, college students would need to take a minimum number of courses and remain in good academic standing. In paying off their loans, students who graduated would get a sizable rebate, and all repayment would be through the income-tax system, with more forgiving repayment schedules for those with lower incomes.
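The repayment mechanics of such a program might resemble the British scheme’s, which collects a fixed share of income above a protected threshold through the tax system, so that lower earners pay less or nothing in a given year. A minimal sketch of that structure, assuming hypothetical figures (the 9 percent rate mirrors the U.K. scheme; the $25,000 threshold and the sample balance are invented for illustration):

```python
def annual_repayment(income, balance, threshold=25_000, rate=0.09):
    """Income-contingent repayment: pay a fixed share of income above a
    protected threshold, never more than the remaining loan balance.
    The threshold and rate here are illustrative, not proposed figures."""
    owed = max(0.0, income - threshold) * rate
    return min(owed, balance)

# A graduate earning $40,000 with a $48,000 balance
# (four years at the article's ~$12,000-a-year public-college cost)
print(annual_repayment(40_000, 48_000))  # 1350.0

# A graduate earning below the threshold pays nothing that year
print(annual_repayment(20_000, 48_000))  # 0.0
```

Because the payment tracks income rather than a fixed amortization schedule, the “more forgiving repayment schedules for those with lower incomes” fall out of the formula automatically.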
Such a federal college-loan program would open up great vistas of student choice and institutional competition, and it would place no additional burden on the national budget, provided that it replaced, rather than supplemented, the current system of financial aid. With federal loans pegged to public-college costs, students choosing private colleges would have to absorb any tuition differential, probably constraining tuition hikes in the private-college sector. Whatever state funding remained would ideally be distributed to campuses based on their academic outcomes, including graduation rates—a practice that currently exists in only a few states.
The second piece of the human-capital picture—workplace productivity—happens to be the most encouraging. In 2010, the OECD, measuring countries’ productivity according to GDP per hour worked, placed the United States ahead of all leading industrialized countries, at $59.50, beating France ($54.80), Germany ($53.40), the United Kingdom ($46.70), Canada ($45.00), and Japan ($39.40). A study published in the Journal of Economic Perspectives by Bart van Ark, Mary O’Mahony, and Marcel Timmer found that between 1995 and 2004, annual American productivity growth exceeded that of Europe by 68 percent, and 70 percent of that difference could be traced to what they called “knowledge economy” factors and what we would call advantages in human capital: better workplace technology, greater worker skill and effort, and more efficient management. America’s challenge now is to maintain its technological supremacy and to motivate its labor force to work even harder and better.
In his recent book The Great Stagnation, economist Tyler Cowen argues that the slowing of American technological progress since the 1970s accounts for most of the country’s current economic woes, and he asserts that unless we make further technological progress, workplace productivity and incomes will not rise. The best gauge of Cowen’s thesis is annual national patent volume, since patents are the principal channel through which science and technology feed into workplace productivity (see “Patently American”). The good news here is that the U.S. patent office still leads the world in patent volume, with more than 490,000 patent applications in 2010, followed by Japan, China, South Korea, and the European Union. But the rate of growth in U.S. patents is falling, a growing share of those patents have been filed by foreigners, and the volume of patents for civilian industrial products (as opposed to defense-related items) is declining relative to international competitors.
The United States’s technological edge is grounded in its leadership in scientific research. The federally funded National Science Foundation, National Institutes of Health, and other agencies competitively award grants to more than 1,000 American research universities; most recent American technological advances, from barcode scanning and the Hubble telescope to global positioning systems and countless biomedical discoveries, grew out of these grants. The federal government also directly oversees a network of university-affiliated national laboratories, such as Brookhaven in New York and Lawrence Berkeley in California. American research and development expenditures in 2008 amounted to $398 billion—in absolute terms, the highest number in the world—and roughly $104 billion of that came from the federal budget (much of it from the defense budget). But the U.S. cannot rest on its laurels; Japan spends a higher share of its GDP on research, and China has doubled its investment in research over the last 20 years.
A large hike in the federal government’s direct support of civilian research, in grants to research universities and institutes and to the national research laboratories, could generate a flood of new science and technology. But private-sector research is also critical, and the tax law allowing corporations to write off research investments is grudging and temporary, requiring annual renewals. Currently costing the Treasury $8.4 billion a year, the corporate research tax credit should be made permanent and more generous. If you add that $8.4 billion to all the federal government’s nondefense and non-NASA grants, you have a total of $45 billion (less than 1.2 percent of the federal budget and 0.3 percent of GDP). So doubling direct federal support for research and increasing the corporate research tax credit would come to another $40 billion, an investment that would pay for itself in new tax revenues generated by higher earnings. The real payoff of such a move, however, is that it would greatly multiply the human-capital output of America’s workers and catapult the United States back into its status as unchallenged global technological leader.
The Journal of Economic Perspectives study ascribes roughly a quarter of America’s productivity advantage over other advanced economies to the fact that its workers work longer and harder than those in most other rich nations do. In 2008, according to the OECD, the average American worker put in 1,796 hours, while the equivalent figure was 1,727 in Canada, 1,652 in the United Kingdom, 1,560 in France, and 1,430 in Germany. In 2005, the International Social Survey Programme, a survey of workers’ attitudes in the world’s most advanced economies, also documented Americans’ superior work ethic. Asked if a job was only a way to earn money, 60 percent of U.S. workers disagreed, a number higher than Germany’s 55 percent, France’s 53 percent, the U.K.’s 51 percent, Japan’s 42 percent, and South Korea’s 36 percent. Americans also outpolled workers of most surveyed countries when asked whether they would work even if they didn’t need the money, whether they were willing to work for less pay, and whether they would accept a part-time job if nothing else was available.
Still, Americans are working a lot less than they used to. Adult male labor-force participation, which peaked at 78 percent in the 1970s, stood at just 73 percent before the recent economic collapse and had fallen to 71 percent by 2010. The decline can’t be blamed entirely on American workers. Government and labor unions, believing that keeping potential workers out of the labor market is a viable way to reduce the official unemployment rate, have promoted early-retirement policies and authorized disability indiscriminately, leading fully capable laid-off or discontented workers to jump at pension or disability stipends rather than look for work.
At the moment, federal entitlement programs contribute to the problem by encouraging Americans to retire early. Social Security allows enrollees to receive partial retirement benefits at 62, an option of which roughly 40 percent of those eligible avail themselves, and Medicare begins offering benefits at 65 (see “Is There a Retirement Crisis?”). Budget hawks these days suggest raising the age of eligibility for both programs to reduce the federal deficit, but a better reason to do so is to encourage mature workers to stay employed. Further, the current rules modestly increase payouts to those who defer taking Social Security benefits until their late sixties. But to motivate them to retire later still, benefits need to increase enough that, on a lifetime basis, workers will be significantly better off. The cost of higher payouts would be defrayed by the additional years of payroll tax contributions and fewer years of drawing benefits, as well as the increased contribution to GDP growth. A parallel Medicare reform might enrich the benefits package with age, enabling older retirees to forgo the need for supplemental private “Medigap” insurance.
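The lifetime tradeoff described above can be sketched with back-of-the-envelope arithmetic. The roughly 6-percent-per-year reduction for early claiming and 8-percent-per-year delayed-retirement credit approximate the shape of the actual Social Security schedule, but the $2,000 full benefit and the age-85 horizon are purely illustrative:

```python
def lifetime_benefit(claim_age, full_benefit=2_000, death_age=85):
    """Rough lifetime Social Security payout by claiming age.
    Benefits shrink ~6%/yr for claiming before the full retirement
    age of 67 and grow ~8%/yr for deferral up to 70 (approximations
    of the real schedule); the dollar figure and lifespan are
    illustrative assumptions, not official numbers."""
    if claim_age <= 67:
        monthly = full_benefit * (1 - 0.06 * (67 - claim_age))
    else:
        monthly = full_benefit * (1 + 0.08 * (min(claim_age, 70) - 67))
    return monthly * 12 * (death_age - claim_age)

for age in (62, 67, 70):
    print(age, round(lifetime_benefit(age)))
```

Under these assumptions, deferring from 67 to 70 lifts the lifetime payout by only about 3 percent, which illustrates why the current rules’ “modest” increase gives mature workers so little reason to stay on the job.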
The national system of unemployment insurance—with funding shared by the federal and state governments at an annual cost that ranges from $50 billion in good times to $132 billion and counting during the current recession—is an indispensable component of the American safety net. Economists disagree about whether its current rules deter unemployed workers from seeking or accepting employment. But even if that isn’t the case, the billions of dollars backing the program might be better deployed, as in Germany, if they subsidized employment rather than idleness. Unemployed German workers are aggressively encouraged to take available jobs and, if necessary, are trained for them—somewhat as in American welfare-to-work programs. To discourage employers from laying off workers when their business drops off, Germany also allows them to reduce workers’ hours and uses public funds to maintain the employees’ benefits and partly offset their reduced pay. As a result, Germany’s unemployment rate today is considerably lower than America’s.
A third—often unrecognized, but essential—component of America’s human capital consists of immigrants, who have always brought fresh, diverse skills and a fierce work ethic to this country, along with a keen appreciation of its freedom and opportunity. Without them, the United States would never have become the world’s richest and most powerful nation. The level of national benefit from immigration does depend, however, on the contours of immigration policy. Until the 1920s, that policy consisted essentially of an open door to anyone who wished to come (except for the Chinese after the 1880s). At a time when most of Europe was quite poor and when migrating meant leaving one’s native land forever, immigrants to the United States were likely to be Europe’s—and, to a lesser extent, Asia’s—most talented and venturesome people. Further, coming to an America that hadn’t yet instituted a social welfare safety net or labor protections, immigrants had to be extremely hardy and hardworking—highly valuable attributes in a rapidly industrializing society. America attracted the world’s best and brightest without even trying.
If the United States is to replicate its earlier success today, however, it must do so by design. In the most recent three years for which data exist, 2007 through 2009, the U.S. admitted 3.29 million legal immigrants. Over 65 percent were sponsored by families, 15 percent were admitted as victims of persecution, and 14 percent were skilled workers sponsored by employers. (The small remainder fell into various other admissions categories, including the 4 percent selected by lottery.) Like most immigrants, those admitted under family and persecution quotas might be hardworking, but the majority have limited education and training and are working in low-skill, low-productivity jobs.
That need not be the case, however, for future immigrants. Even if total immigration levels don’t change, bending the admissions trajectory in favor of hundreds of thousands of immigrants who are well educated, skilled, and perhaps English-speaking would give the U.S. an instantaneous infusion of human capital. It would also have a profound impact on the future. Children raised by educated immigrants in stable families and encouraged to high achievement in school would unquestionably have academic, social, and economic prospects superior to those of American immigrant children today. As adults, they in turn would provide rich human-capital nurturing grounds for subsequent generations.
To see such an immigration policy at work, look to Canada, the only country in the world that admits more immigrants proportionately and assimilates them more successfully than the United States does. Of the 736,000 immigrants entering that country from 2007 to 2009, 60 percent were admitted because of superior education or skills, 27 percent because of family sponsors, and 10 percent because they faced persecution. In the Canadian system, an unsponsored adult applicant seeking residency must compile sufficient “points,” awarded on the basis of education, work experience, intended occupation, employer sponsorship, knowledge of English or French, at least two years of Canadian college, or marriage to an educated spouse. The result is that among immigrants to Canada today, 55 percent have college degrees, 82 percent speak a Canadian language, over 70 percent become citizens, and the average household income exceeds $50,000. Among current legal U.S. immigrants, by contrast, only 26 percent have completed college, 47 percent speak English well, and 40 percent seek citizenship, while household income averages under $40,000.
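The Canadian selection mechanism is, at bottom, a scoring function: tally points per factor, compare with a pass mark. A minimal sketch, with the 67-point pass mark taken from Canada’s Federal Skilled Worker program but per-factor caps simplified for illustration:

```python
# Factor caps loosely modeled on Canada's Federal Skilled Worker grid;
# the exact weights here are illustrative, not the official values.
WEIGHTS = {
    "education": 25,            # completed degrees and diplomas
    "language": 28,             # English and/or French proficiency
    "experience": 15,           # years of skilled work
    "age": 12,
    "arranged_employment": 10,  # employer sponsorship
    "adaptability": 10,         # prior Canadian study, educated spouse, etc.
}
PASS_MARK = 67  # the actual FSW pass mark

def admissible(applicant: dict) -> bool:
    """Sum the applicant's points, capped per factor, against the pass mark."""
    score = sum(min(applicant.get(factor, 0), cap)
                for factor, cap in WEIGHTS.items())
    return score >= PASS_MARK

# A well-educated, fluent applicant with skilled work experience clears the bar
print(admissible({"education": 25, "language": 20,
                  "experience": 15, "age": 10}))  # True
```

The design choice worth noting is that no single factor suffices: an applicant must accumulate strength across several dimensions, which is precisely how the system selects for the broad human capital the article describes.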
Adopting an immigration system like Canada’s could upgrade America’s human-capital mix more quickly than even the most effective program of education reform. A good first step would be admitting as permanent residents all foreign graduates of American colleges and universities who wish to stay. However, the failure to stem illegal immigration over three decades has so poisoned the national immigration dialogue that legal admissions criteria probably cannot be altered until that issue is settled.
The United States can regain its human-capital supremacy with the right federal and state policies. But none of them will be easy to adopt in the current political climate. Education reform is blocked on the left by the doctrinal rigidity of the professional education establishment and the job-security concerns of the teachers’ unions, and on the right by those who, believing that education policy should be strictly a local matter, reject reasonable national academic standards, as well as by those who are more concerned with holding down school expenditures than with lifting school outcomes. Workplace innovation is thwarted on the left by those who demand more regulation and taxation of business, as well as needlessly early retirement and unsustainable fringe benefits, and on the right by antagonism to anything that smacks of “industrial policy,” a reluctance to devote more federal funds to basic research, and a general mistrust of scientists and the academy. Immigration reform is constrained on the left by Hispanic and other ethnic advocacy groups opposed to stemming illegal immigration and directing legal admissions away from family sponsorship, and on the right by numerous groups that want to curtail immigration altogether.
Beyond political considerations, some critics fear that a robust human-capital initiative may succeed only too well. Would American workers then possess qualifications—and aspirations—greater than the labor market could absorb? Would the new crop of high school, college, and professional-school graduates, supplemented by educated immigrants, find high-paying jobs to match their skills? And who would be left to do the dirty work? We needn’t worry. Historical experience and a considerable body of research clearly show that no national economy is bound by a fixed proportion of skilled or unskilled jobs. We also know that labor markets rapidly adapt to the rising skills of their workers with higher productivity and wages.
The worst that might happen is that some of the work currently being done by America’s lowest-skilled workers might have to be performed by teenagers and machines or, say, homeowners mowing their own lawns and supermarket customers using self-checkout machines. In the meantime, we can draw reassurance from the unassailable truth that two centuries of growing human capital have yielded spectacular increases in American prosperity and well-being.