Matthew Futterman’s recent piece in The Wall Street Journal, “The U.S. Soccer Double Standard,” should sound very familiar to scholars of work.
He makes a strong case that women athletes in the U.S. are held to a different standard than men. The evidence? Female soccer players on the undefeated 2015 U.S. World Cup team have been accused of “sloppy, uncreative play” (despite outscoring opponents by a wide margin) and of “unsatisfying wins,” and some players have even apologized for victories, promising that their play will improve. Even the coach has been defensive in response to recent wins.
Image: Dave Crosby via Flickr (CC BY-SA 2.0)
by Peter Fleming
Are you paid what you are worth? What is the relationship between the actual work you do and the remuneration you receive?
The revelation that London dog walkers earn considerably more (£32,356) than the national average wage (£22,044) tells us much about how employment functions today. Not only are dog walkers paid more, but they work only half the hours of the average employee.
It is clear that the relationship between jobs and pay is now governed by a new principle. The old days in which your pay was linked to the number of hours you clocked up, the skill required, and the societal worth of the job are long gone. Other factors play a bigger role in determining how much you are rewarded today. This is why we live in a world where the task of walking a millionaire’s dog through Hyde Park is considered more valuable than the work of an NHS nurse (starting salary £21,000).
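To see just how large the gap is, consider the implied hourly rates. Here is a minimal back-of-the-envelope sketch using only the figures above; the exact number of hours worked per year is an assumption, but it cancels out of the hourly comparison:

```python
# Back-of-the-envelope comparison of implied hourly pay, using only the
# figures cited above. Annual hours are normalized, so the absolute value
# is an assumption, but it cancels out of the ratio.

avg_salary = 22_044         # national average annual wage (GBP)
dog_walker_salary = 32_356  # average annual pay of London dog walkers (GBP)
nurse_salary = 21_000       # NHS nurse starting salary (GBP)

avg_hours = 1.0             # normalize the average employee's hours to 1
dog_walker_hours = 0.5      # dog walkers reportedly work half those hours

avg_hourly = avg_salary / avg_hours
dog_walker_hourly = dog_walker_salary / dog_walker_hours

print(f"Implied hourly pay, dog walker vs. average worker: "
      f"{dog_walker_hourly / avg_hourly:.1f}x")      # roughly 2.9x
print(f"Annual pay, dog walker vs. starting NHS nurse: "
      f"{dog_walker_salary / nurse_salary:.2f}x")    # roughly 1.54x
```

On an hourly basis, in other words, the dog walker earns nearly three times what the average British employee does.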
by Melissa E. Wooten
Earlier this year, South Carolina State University became a national topic of conversation. PBS, NPR, and the New York Times each ran stories documenting the school’s financial woes and the resulting tumult. The South Carolina House Ways and Means Subcommittee on Higher Education proposed to shut down the state’s only publicly supported historically black university because the school was in debt to the tune of $11 million.
The university’s trustees voted to place the school’s president on administrative leave, alumni protested, and ultimately, South Carolina legislators did not close the school.
The fact that casual observers mostly hear about historically black colleges and universities in moments of crisis adds fuel to the fire for those who wonder, “Are black colleges still necessary?” More than any other, this is the question I was asked as I researched, discussed, and wrote about historically black colleges and universities (HBCUs).
A consequence of living in a multi-cultural society that purports to value diversity is that we are suspicious of black colleges. At a fundamental level, the question, “Are black colleges still necessary?” implies that it is easy to identify the value in some colleges – those that are predominantly white – but not those that are predominantly black.
HBCUs play a critical role in the production of highly educated, successful black Americans. Though they account for a relatively small proportion (3%) of U.S. colleges and universities, roughly 40 percent of blacks earning science, technology, engineering, and math degrees do so at black colleges. Eighty-five percent of black medical doctors attended a black college at some point in their educational careers. Forty percent of black doctoral degree holders earned their bachelor’s degree at a black college. These statistics raise the question of why it is so difficult to conceive of HBCUs as prestigious institutions worthy of the same respect we so readily grant to so-called “mainstream,” or predominantly white, colleges.
Image: Lendingmemo via Flickr (CC BY 2.0)
by Michael Kumhof
The United States has experienced two major economic crises during the last century, the first beginning in 1929 and the second in 2008. In each case the pre-crisis decades were characterized by a sharp increase in income inequality, and by a similarly sharp increase in household debt leverage. In new research, we propose a theoretical mechanism that links growing income inequality to growing debt leverage, and ultimately to financial fragility and financial crises. We find that this mechanism can account for around three quarters of the 1983-2008 increase in the U.S. debt-to-income ratio, and therefore for the increased probability of a financial crisis such as the one observed in 2008.
by Deborah A. Harris and Patti Giuffre
Imagine you’ve stepped inside one of those foodie television shows. You know, the ones set in fine dining restaurants where waiters and sous chefs dash around the kitchen at a frenetic pace, calling out food orders, and tasting dishes in hopes they will live up to the executive chef’s exacting palate. The executive chef moves through the kitchen and is clearly in charge of the action. Maybe the chef you’re imagining is barking orders at subordinates. Maybe they’re appraising the kitchen with a cool eye.
Now, imagine you’re in a different type of kitchen—a kitchen in the “typical” middle-class American home. In this setting, the “chef” is grabbing food out of the refrigerator and, instead of sous chefs, young children are whipping around the kitchen talking about soccer games and piano lessons that have to be worked into everyone’s schedule. Instead of worrying about earning another Michelin star or impressing a food reviewer, this “chef” is just trying to get dinner on the table for the family.
In the two scenarios above, what genders did you imagine for the chefs? If you’re like most people, you probably pictured the professional chef as a man dressed in a white jacket and toque, while the second scene may have conjured visions of a harried mother, perhaps still in the clothes she wore to work, frantically trying to get dinner on the table for her family.
Since 1980 there has been a vast expansion in the economic power and centrality of the financial services sector. Less well known is the simultaneous shift in the investment strategies of non-financial firms, at least in the U.S., toward investing in financial instruments of various sorts. By the early 2000s financial investments had risen to almost 30 percent of total assets in the U.S. private sector. In our ongoing research we have tried to figure out whether this shift in corporate investment strategies has been economically destructive. Our answer is that it has, and that economic growth and U.S. standards of living have suffered as a result.
We already know that the concentration of wealth and power in the financial services industry has introduced fragility into the world economy, reduced both fixed investment and R&D, and increased inequality in the advanced economies. Instability, sector shifts, and inequality are not, however, evidence that financialization has been harmful to general economic growth. In a capitalist system, shocks, sectoral shifts, cyclical destruction, and even rising inequality are not inconsistent with long-run growth in standards of living.
It is possible that financial investment strategies, despite increasing inequality, lifted all boats. If this is what happened, then the policy case against financialization is weakened considerably. On the other hand, if financialization of the non-finance sector is associated with decreased total production, then contemporary movements toward a financialized non-finance sector are economically as well as socially destructive.
Prior to 1980, firms like GE and GM used debt financing of their products to support growth in market share. During this period we find that financial investments by non-finance firms were much lower and encouraged economic growth.
After 1980 many non-financial firms moved in more speculative directions, into stock, credit, currency, and even derivative markets. In this period we find that financial investment strategies by non-financial corporations are associated with reductions in value added, as Figure 2 shows. We estimate that since 1980 financial investments on Main Street have stripped the U.S. economy of at least 3.9 percent of aggregate growth, or about three years of lost growth in GNP. For comparison’s sake, the Great Recession of 2008 produced aggregate negative growth of 2.9 percent.
Image: ScoRDS via Flickr (CC BY-NC-SA 2.0)
For the first time in its history, the City Council in my hometown of Austin, Texas, is run by a female majority. This important milestone should have been cause for celebration. Instead, Austin was embroiled in controversy when it was revealed that the city manager’s office had brought in experts to help staffers “cope” with this new reality.
Consultants from Florida provided a two-hour training session to explain how new approaches would be needed to present issues before the council now that women are in charge. Staffers were told that men and women do not process information in the same way. Nor do they care about the same things: men are interested in the financial bottom line, while women want to know how various issues impact the community, families, and children. Women also ask a lot more questions than men do, and take more time reaching a decision.
The staffers at the event (most of whom were women) were encouraged to adopt a gender-appropriate repertoire as soon as possible because more women political leaders are inevitable, thanks to the inspiration of Hillary Clinton.
This egregious stereotyping of women leaders led to a barrage of embarrassing press coverage. Consequently, the City Manager suspended the person responsible for arranging the training session.