Image: 401kcalculator.org via Flickr (CC BY-SA 2.0)

by Adam Goldstein and Neil Fligstein

The resurgence of finance over the past three decades represents one of the most remarkable trends in the recent history of capitalism. “Financialization” has become a common byword to describe the growing role of financial markets, motives, actors, and institutions in the operation of the overall economy.

One aspect of financialization that has received less attention is the role of households. As the financial industry has expanded, it has done so in large part by marketing more products to households, such as mortgages, second mortgages, mutual funds, stock trading accounts, student loans, car loans, and various forms of retirement products. But how have households themselves changed their attitudes and behavior in relation to financial markets? Should we view their primary role as that of consumers who supply the raw inputs for Wall Street’s machinations?  Or have households also started to think about their own economic activity in more financial terms? What is the scope of popular financialization?

To answer these questions we examined eighteen years of survey data from the U.S. Federal Reserve Board’s Survey of Consumer Finances. We charted changes in the financial activities and attitudes of U.S. households from 1989 to the onset of the financial crisis in 2007. Our goal was to provide a global view of the various ways that households at different points in the income distribution have become more involved with the financial economy, and how this has coevolved with their attitudes towards risk and debt.

The patterns are revealing: In terms of financial consumption, we find secular growth in the use of all kinds of financial products across the socio-economic structure. This implies that financial firms sought out customers for their products and made them available to people up and down the income distribution.

Read More

It’s been a rough summer for academics. Just in the last few months, two black women sociologists have become the subjects of national news stories when comments they wrote on Twitter drew the ire of conservatives who branded them racists and demanded that the institutions where they worked fire them. First Saida Grundy, then Zandria Robinson drew media attention when conservative websites critiqued their Twitter comments on the Confederate flag, white college men, and other subjects related to issues of race and inequality. In Grundy’s case, she issued a statement saying that she wished she’d chosen her words more carefully, and the furor essentially died down. In Robinson’s case, after public speculation that the university fired her, she wrote a lengthy blog post describing the details of her long association with her former employer and ultimate decision to leave for another university.

Read More

Source: pixabay.com

Matthew Futterman’s recent piece in The Wall Street Journal, “The U.S. Soccer Double Standard,” should sound very familiar to scholars of work.

He makes a very strong case that women athletes in the U.S. are held to a different standard than men.  The evidence?  Female soccer players on the UNDEFEATED 2015 U.S. World Cup team are accused of “sloppy, uncreative play” (despite outscoring opponents by a wide margin) and having “unsatisfying wins,” and some players have even apologized for victories, promising that their play will improve.  The coach has even been defensive in response to recent victories.

In contrast, Futterman goes on to point out, soccer players on last year’s men’s U.S. World Cup team won one game (against Ghana), tied a game (against Portugal), and lost to Germany and Belgium, yet were hailed as the “spunky underdogs storming the gates of the soccer establishment” and widely celebrated upon their return home.  I’m not able to find a record of ANY member of the men’s national team apologizing for their 2014 World Cup performance. The double standard, some argue, is because we have heightened expectations for women to win—and win big—and expect less of the men.  The women had previously won two World Cup titles (in 1991 and 1999) and at the start of World Cup play were ranked second in the world by FIFA. The men have qualified for every World Cup since 1990 but have never won a title.

Read More

Image: Dave Crosby via Flickr (CC BY-SA 2.0)

by Peter Fleming

Are you paid what you are worth? What is the relationship between the actual work you do and the remuneration you receive?

The revelation that London dog walkers are paid considerably more (£32,356) than the national average wage (£22,044) tells us much about how employment functions today. Not only are dog walkers paid more, but they work only half the hours of the average employee.

It is clear that the relationship between jobs and pay is now governed by a new principle. The old days in which your pay was linked to the number of hours you clocked up, the skill required, and the societal worth of the job are long over. Other factors play a bigger role in determining how much you are rewarded today. This is why we live in a world where the task of walking a millionaire’s dog through Hyde Park is considered more valuable than the work of an NHS nurse (starting salary £21k).

Read More

by Melissa E. Wooten

Earlier this year, South Carolina State University became a national topic of conversation. PBS, NPR, and the New York Times each ran stories documenting the school’s financial woes and the resulting tumult. The South Carolina House Ways and Means Subcommittee on Higher Education proposed to shut down the state’s only publicly supported historically black university because the school was in debt to the tune of $11 million.

The university’s trustees voted to place the school’s president on administrative leave, alumni protested, and ultimately, South Carolina legislators did not close the school.

The fact that casual observers mostly hear about historically black colleges and universities in moments of crisis adds fuel to the fire of those who wonder, “Are black colleges still necessary?” More than any other, this is the question I was asked as I researched, discussed, and wrote about historically black colleges and universities (HBCUs).

A consequence of living in a multi-cultural society that purports to value diversity is that we are suspicious of black colleges. At a fundamental level, the question, “Are black colleges still necessary?” implies that it is easy to identify the value in some colleges – those that are predominantly white – but not those that are predominantly black.

HBCUs play a critical role in the production of highly educated, successful black Americans. Though they account for a relatively small proportion (3%) of U.S. colleges and universities, roughly 40 percent of blacks earning science, technology, engineering, and math degrees do so at black colleges. Eighty-five percent of black medical doctors attend a black college at some point in their educational career. Forty percent of black doctoral degree holders earned their bachelor’s degree at a black college. These statistics raise the question of why it is so difficult to conceive of HBCUs as prestigious entities worthy of the same level of respect and accord we so easily dole out to so-called “mainstream” or predominantly white colleges.

Read More

Image: Lendingmemo via Flickr (CC BY 2.0)

by Michael Kumhof

The United States has experienced two major economic crises during the last century, beginning in 1929 and 2008. In each case the pre-crisis decades were characterized by a sharp increase in income inequality, and by a similarly sharp increase in household debt leverage. In new research, we propose a theoretical mechanism that links growing income inequality to growing debt leverage, and ultimately to financial fragility and financial crises. We find that this mechanism can account for around three quarters of the 1983-2008 increase in the U.S. debt-to-income ratio, and therefore for the increased probability of a financial crisis such as the one observed in 2008.

Read More

by Deborah A. Harris and Patti Giuffre

Imagine you’ve stepped inside one of those foodie television shows. You know, the ones set in fine dining restaurants where waiters and sous chefs dash around the kitchen at a frenetic pace, calling out food orders, and tasting dishes in hopes they will live up to the executive chef’s exacting palate. The executive chef moves through the kitchen and is clearly in charge of the action. Maybe the chef you’re imagining is barking orders at subordinates. Maybe they’re appraising the kitchen with a cool eye.

Now, imagine you’re in a different type of kitchen—a kitchen in the “typical” middle-class American home. In this setting, the “chef” is grabbing food out of the refrigerator and, instead of sous chefs, young children are whipping around the kitchen talking about soccer games and piano lessons that have to be worked into everyone’s schedule. Instead of worrying about earning another Michelin star or impressing a food reviewer, this “chef” is just trying to get dinner on the table for the family.

In the two scenarios above, what genders did you imagine for the chefs? If you’re like most people, you probably pictured the professional chef as a man dressed in a white jacket and toque, while the second scene may have conjured visions of harried mothers, perhaps still in the clothes they wore to work, frantically trying to get dinner on the table for their families.

Read More

Since 1980 there has been a vast expansion of the economic power and centrality of the financial service sector. Less well known is the simultaneous shift in the investment strategies of non-financial firms, at least in the U.S., toward investing in financial instruments of various sorts. By the early 2000s financial investments had risen to almost 30 percent of total assets in the U.S. private sector. In our ongoing research we have tried to figure out whether this shift in corporate investment strategies has been economically destructive. Our answer is that it has, and that economic growth and U.S. standards of living have suffered as a result.

We already know that the concentration of wealth and power in the financial service industry has introduced fragility into the world economy, reduced both fixed investment and R&D, and increased inequality in the advanced economies. Instability, sector shifts, and inequality are not, however, evidence that financialization has been harmful to general economic growth. In a capitalist system, shocks, sectoral shifts, cyclic destruction, and even rising inequality are not inconsistent with long-run growth in standards of living.

It is possible that financial investment strategies, despite increasing inequality, lifted all boats. If this is what happened then the policy case against financialization is weakened considerably. On the other hand, if financialization of the non-finance sector is associated with decreased total production then contemporary movements toward a financialized non-finance sector are economically as well as socially destructive.

Prior to 1980 firms like GE and GM used debt financing of their products to support growth in market share. During this period we find that financial investments by non-finance firms were much lower and encouraged economic growth.

After 1980 many non-financial firms moved in more speculative directions, into stock, credit, currency, and even derivative markets. After 1980 we find that financial investment strategies by non-financial corporations are associated with reductions in value added, as Figure 2 shows. We estimate that since 1980 financial investments on Main Street stripped the U.S. economy of at least 3.9 percent of aggregate growth, or about three years of lost growth in GNP. For comparison’s sake, the Great Recession of 2008 produced aggregate negative growth of 2.9 percent.

Read More

Image: ScoRDS via Flickr (CC BY-NC-SA 2.0)

For the first time in its history, the City Council in my hometown of Austin, Texas is run by a female majority. This important milestone should be cause for celebration. Instead, Austin was embroiled in controversy when it was revealed that the city manager’s office had brought in experts to help staffers “cope” with this new reality.

Consultants from Florida provided a two-hour training session to explain how new approaches would be needed to present issues before the council now that women are in charge. Staffers were told that men and women do not process information in the same way. Nor do they care about the same things: men are interested in the financial bottom line, while women want to know how various issues impact the community, families, and children. Women also ask a lot more questions than men do, and take more time reaching a decision.

The staffers at the event (most of whom were women) were encouraged to adopt a gender-appropriate repertoire as soon as possible because more women political leaders are inevitable, thanks to the inspiration of Hillary Clinton.

This egregious stereotyping of women leaders led to a barrage of embarrassing press coverage. Consequently, the City Manager suspended the person responsible for arranging the training session.

Read More

A recent New York Times article discusses research showing that pressures to work long hours and a culture of overwork are reinforcing gender inequality.

The article quotes Robin Ely, professor of business administration at Harvard Business School, who states that “24/7 work cultures lock gender inequality in place, because the work-family balance problem is recognized as primarily a woman’s problem.” The study was coauthored by sociologists Irene Padavic of Florida State University and Erin Reid of Boston University.

The article also quotes sociologist Mary Blair-Loy, of University of California, San Diego, and refers to her research on gender identities and cultural expectations regarding gender and work performance.