Archive for the 'Books' Category

Reflections of our era’s zeitgeist visionary on the alternate reality world

Thursday, January 18th, 2024

Doppelganger: A Trip Into the Mirror World, Naomi Klein, 2023


Naomi Wolf on Steve Bannon’s War Room

It’s a reminder that just because something is currently enclosed in a certain kind of financial arrangement does not mean it must forever stay enclosed. History is filled with successful struggles against earlier forms of enclosure – colonial powers were ejected from their onetime colonies; foreign-owned mines and oil fields have been nationalized and put under public control; Indigenous peoples have won legal victories reclaiming sovereign control over their ancestral territories. Unjust ownership structures have been changed before and they can be changed again.

It bears remembering that many of the technologies that form the building blocks of modern tech giants were first developed in the public sector, with public dollars, whether by government agencies or public research universities. These technologies range from the internet itself to GPS and location tracking. In essence, Big Tech has appropriated commonly held tools for private gain, while adopting the discourse of the commons to describe their gated platforms.

That is the real source of my speechlessness in this unreal period: a feeling of near violent rupture between the world of words and the world beyond them. In recent years, left social movements have won huge victories in transforming the way we talk about all kinds of issues – billionaires and oligarchic rule, climate breakdown, white supremacy, prison abolition, gender identity, Palestinian rights, sexual violence – and I have to believe that those changes represent real victories, that they matter. And yet, on almost every front, tangible ground is being lost. Changing the discourse did not prevent the world’s ten richest men from doubling their collective fortunes from $700 billion to $1.5 trillion in the first two years of the (Covid) pandemic; it did not stop police forces from increasing their budgets while teachers have to pay for basic supplies out of pocket; it did not prevent fossil fuel companies from collecting more billions in subsidies and new permits; it did not prevent the Israeli police forces from attacking the funeral of the revered Palestinian American journalist Shireen Abu Akleh after a bullet that was almost certainly fired by an Israeli soldier took her life.


Bill Gates at Davos solving the world’s problems

On the very exclusive meetings at Davos, Aspen, etc.: “In every case they take up the mantle of solving the world’s problems – climate breakdown, infectious diseases, hunger – with no mandate and no public involvement and, most notably, no shame about their own central roles in creating and sustaining these crises.”

Novelist Daisy Hildyard: “You are stuck in your body right here, but in a technical way you could be said to be in India and Iraq, you are in the sky causing storms, and you are in the sea herding whales towards the beach. You probably don’t find your body in those places; it is as if you have two distinct bodies. You have an individual body in which you exist, eat, sleep and go about your day-to-day life. You also have a second body which has an impact on foreign countries and on whales…a body which is not so solid as the other one, but much larger.”

In Hildyard’s conception, our complicity in wars fought with our tax dollars to protect the oil and gas that likely warms our houses, cooks our food, and propels our vehicles, and in turn fuels extinction, is not separate from us, it’s an extension of our physical beings. “This second body,” she writes, “is your own literal and physical biological existence—it is a version of you.”

Elite Capture and Value Capture

Wednesday, August 30th, 2023

Elite Capture: How the Powerful Took Over Identity Politics (And Everything Else), Olufemi O. Taiwo, 2022

In democracies, ostensibly, the elites (policymakers) are put in office by the non-elites (citizens), who can remove and replace them if they fail to defend public interests. Much like the mythical market, mythical liberal democracy is supposed to be self-correcting and self-justifying by definition. This way of casting the conversation about power and governance has been integral to the framing that links “freedom” and “capitalism” in the ideals and practices of liberal democracy: a country’s freedom need only be found at its ballot boxes rather than in, say, its workplaces. Thus, if one believes in liberal democracy, they may believe that imbalances of power everywhere could be fixed by instituting arrangements like the “rules-based international order,” “democratic elections,” and “formal political representation.” In a nutshell, if the right ideals are embodied in the right formal systems, then the outcomes of those systems are justified.

Ronald Reagan’s 1986 Anti-Drug Abuse Act, which helped supercharge mass incarceration by establishing mandatory minimum sentencing guidelines and adding $1.7 billion toward the drug war, while welfare programs were cut…The consequences led Democratic senator Daniel Patrick Moynihan to make a striking appraisal: “If we blame crime on crack, our politicians are off the hook. Forgotten are failed schools, the malign welfare programs, the desolate neighborhoods, the wasted years. Only crack is to blame. One is tempted to think that if crack did not exist, someone, somewhere would have received a federal grant to develop it.”

And then there’s capital. The 1950s and ’60s saw important innovations in corporate management (particularly in the United States, which stood comfortably atop the post-World War II global economy): leveraged buyouts, divestitures, mergers, major sell-offs of “non-core businesses,” and other forms of reorganization of business by profit-hungry shareholders. These trends intensified in the 1980s, producing what researchers called the “shareholder revolution”: a proliferation of management techniques that put previously complacent industry managers under the strict discipline of activist shareholders. This second phase of shareholder revolution coincided with and helped produce a larger “global business revolution,” a “fast-developing process of concentration at a global level in numerous industries supplying goods and services” to “system integrators”–the few large firms who can reorganize global production around their “core” business models and assets.

These institutions emerged as the world order was being reconstructed in the waning years of the Second World War, with the United States newly emergent as a global hegemon. The architects met in Bretton Woods, New Hampshire, where they set up the International Monetary Fund (IMF) and what later became the World Bank. Whatever the “narrow” technical pretensions of their mandates, these organizations in fact have immense governing powers. They offer aid packages that are conditional on certain governance decisions by the receiving country – decisions that help determine the availability of jobs, public services, and the price of food. These basic features of non-elite life are thus placed in the hands of foreign bureaucrats over whom the country’s population has no means of democratic control, nor even the pretense of any sort of democratic relationship.

Rather than a cataclysmic putsch or violent event, for (Wolfgang) Streeck, the end of democracy simply is the gradual capture of the political by the elites. “[A]s one crisis followed the next, and the fiscal crisis of the state unfolded alongside them, the arena of distributional conflict shifted, moving upward and away from the world of collective action of citizens toward ever more remote decision sites where interests appear as ‘problems’ in the abstract jargon of technocratic specialists.”

Value capture is a process by which we start with rich and subtle values, encounter simplified versions of them in the social wild, and revise our values in the direction of simplicity—thus rendering them inadequate…Capitalism is itself such a system: it rewards the relentless and single-minded pursuit of profit and growth—extremely narrow value systems that exclude much of what makes life worth living. But societies organized around fundamentalisms (whether religious or secular) and war have resulted in similarly warped value systems long before capitalism arrived on the scene.

The Laissez-Faire Myth in America

Tuesday, August 15th, 2023

The Big Myth: How American Business Taught Us to Loathe Government and Love the Free Market, Naomi Oreskes and Erik M. Conway, 2023


Merchants of Doubt Movie 2014


Nierenberg, Seitz, Singer, Ebell

(In Merchants of Doubt, 2010, featuring Bill Nierenberg, Fred Seitz, Fred Singer, and Myron Ebell) we stumbled across the story of four physicists who laid the foundations for climate change denial as far back as the late 1980s. These men were prominent scientists—one was a former president of the U.S. National Academy of Sciences, another headed a major NASA lab—so it wasn’t remotely plausible that they didn’t understand the facts. We discovered that they hadn’t just rejected climate science but had fought settled science on a host of public health and environmental issues, starting with the harms of tobacco. Two of these four scientists had worked with tobacco companies. (Why?) The right answer was ideology: market fundamentalism.
These men feared that government regulation of the marketplace—whether to address climate change or protect consumers from lethal products—would be the first step on a slippery slope to socialism, communism, or worse. This fear was rooted in their personal histories designing weapons systems in the Cold War. On behalf of the U.S. Government, they had worked to build the atomic bomb, the hydrogen bomb, and the rockets and submarines to deliver those weapons. They saw the Soviet threat as serious, a threat they had to help “contain”. When the Cold War ended, they couldn’t stop fighting. Instead they found a new enemy—environmentalism–which they viewed as a back door to socialism, if not communism. As one of them put it, “if we do not carefully delineate the government’s role in regulating…dangers there is essentially no limit to how much government can ultimately control our lives.”…So these men would do whatever they could to prevent government regulation of the marketplace, even if it meant fighting facts, challenging hard-won knowledge, and betraying the science they had helped to build.
Market Fundamentalism is not just the belief that free markets are the best means to run an economic system but also the belief that they are the only means that will not ultimately destroy our other freedoms. It is the belief in the primacy of economic freedom not just to generate wealth but as a bulwark of political freedom. And it is the belief that markets exist outside of politics and culture, so that it can be logical to speak of leaving them “alone”.
Milton Friedman, America’s most famous market fundamentalist, went so far as to argue that voting was not democratic, because it could so easily be distorted by special interests and because in any case most voters were ignorant.
Because (Morris Llewellyn) Cooke and (Gifford) Pinchot put social considerations on par with—or perhaps even ahead of—financial ones, critics accused them of being “communistic”, but neither man had any sympathy for communism or socialism. Rather, they advocated a more rational form of capitalism than what Americans at that time had, in which the needs of the people would be a primary consideration and in which engineering decisions would be made by engineers rather than financiers. Cooke envisaged Giant Power as but one part of a “Great State”…“placing the government of our individual states on a plane of effective social purpose.” We might call it “guidez-faire”: a form of capitalism in which government combated corruption, remedied market failure, and redressed social injustice.
What was radical about Giant Power was its reorganization of the entire system for generating and distributing electricity in Pennsylvania. Under Pinchot’s plan, the Giant Power Survey Board would be authorized to construct and operate coal-fired power plants, to mine the necessary coal, to appropriate land for mining, and to build and operate transmission lines, or issue permits to others to do so. Private operators could still function—indeed, the system would need them—but they would have to play by the rules and rates the board established. The board could also buy electricity from operators in neighboring states, or sell to them. Farmers could create rural power districts and mutual companies, and if the board chose to, it could subsidize them.
The goals of Giant Power were socialistic in one important respect: the “chief idea” behind it was “not profit but the public welfare.” Its object was not “greater profit to the companies”, but “greater advantage to the people.” To make this happen, “effective public regulation of the electric industry” was an essential condition.
When the Giant Power Survey Board’s report came out in 1923, industry leaders attacked both its contents and its sponsor. Immediately they mobilized to prevent Pinchot’s reelection. (He eked out a victory.) They also launched a disinformation campaign, using advertising, public relations, experts-for-hire, and academic influence to counter any suggestion that public management of electricity was desirable (much less necessary). Above all, the industry sought to discredit the very idea that the public sector could do anything more fairly or efficiently than the private sector. The goal was to strengthen the American people’s conviction that the private sector knew best, and to promote the idea that anything other than complete private control of industry was socialistic and un-American.

The Federal Trade Commission (FTC) held six years of hearings, from 1930 to 1936, on a wide range of regulatory issues in the aftermath of the 1929 stock market crash that initiated the Great Depression. Their reports covered eighty volumes.

Economic historians William Hausman and John Neufeld: “private utilities led by [their] industry trade group the National Electric Light Association (NELA)…mounted a large and sophisticated propaganda campaign that placed particular emphasis on making the case for private ownership to the press and in schools and universities.” Historian David Nye: “The thousands of pages of testimony revealed a systematic covert attempt to shape public opinion in favor of private utilities, in which half-truths and at times outright lies presented municipal utilities in a consistently bad light.” Historian Ronald Kline called the campaign “underhanded” and “unethical”.
The FTC found that industry actors had attempted to control the entire American educational system—from grade school to university—in their own economic interest. This effort focused on the social sciences—economics, law, political science, and government—but also included engineering and business. Its purpose was to ensure “straight economic thinking”–by which NELA meant capitalist free market principles—and to supply young people and their teachers with “correct information”. The goal was to mold the minds of the current generation and those to come…This was achieved, the FTC concluded, through “false and misleading statements of fact, as well as opinions on public policy, found in reports and expert testimony of prominent university professors who are now discovered to have been in the pay of the private utilities.”
The opinions NELA promoted were embedded in a larger argument that private property was the foundation not only of the American economy but of American life, so any attempt to interfere with the private operation of the electric industry threatened to undermine that way of life. Opinions to the contrary were denigrated as “unsound”, “socialistic”, and fundamentally un-American. The FTC found that the “character and objective of these activities was fully recognized by NELA and its sponsors as propaganda,” and that in their documents they “boasted that the ‘public pays’ the expense.”…The goal—expressed outright in numerous documents—was to change the way Americans thought about private property, capitalism, and regulation.
On the surface NELA lost its fight. David Nye concludes that the “public revulsion that the NELA hearings caused put [rural] electrification back on the public agenda…and prepared the way for New Deal utility legislation.”
The United States today still has a predominantly private electric system (about 90 percent) that is less strongly regulated than in many other countries. On average, customers of publicly owned utilities pay about 10 percent less than customers of investor-owned utilities and receive more reliable service. When attempts were made in the 1990s to deregulate the system entirely (California), it was a disaster for customers. The Enron company gamed the system before going bankrupt, and several of its executives went to jail for fraud, conspiracy, and insider trading. Electricity deregulation also proved a disaster for the people of Texas: when the state’s power grid failed in the face of an extreme winter storm in 2021, it left more than seven hundred dead and somewhere between $80 and $130 billion in damages.

Since the crash of 1929, (National Association of Manufacturers) NAM had hemorrhaged members…The organization might have collapsed, but a new group of executives—mostly from large corporations—took over, with a bold idea to increase membership and their political power…NAM’s “education” efforts centered on a massive campaign to persuade the American people that big business’s interests were the American people’s interests. Americans needed to see that the real threat came not from “Big Business” but from “Big Government”. While NAM took care to call the campaign “public relations” or “education”, its goals were overtly political and much of what was said was untrue.
Historian Richard Tedlow: “Unnerved by the impact of the depression, apprehensive of the growing strength of labor, enraged at critics of the failures of business, and rejecting almost in toto the devices of the new administration in Washington to find solutions to the problems inherited in 1933”, NAM leaders refused to engage in a serious attempt to identify the structural weaknesses, errors, and abuses that had contributed to the Great Depression, and honestly consider how they might be corrected. Instead, they turned to propaganda, spending millions to persuade the American people of the greatness of business and industry, and that the fault lay entirely with government and unions standing in the way of effective management.
The purpose of NAM’s radio series: “The American Family Robinson seeks to emphasize the countless benefits which derive from living in a free country with CIVIL AND RELIGIOUS LIBERTY, REPRESENTATIVE DEMOCRACY, FREE PRIVATE ENTERPRISE.” The key claim of the NAM statement to the broadcasters hinged on the word inseparable: that freedom of speech and of the press, freedom of religion, and freedom of enterprise are inseparable.
In 1947, NAM put its weight behind (the Taft-Hartley) bill…that banned many kinds of strikes, restored employers’ propaganda rights, barred union donations to federal political campaigns, and allowed state “right to work” laws…For the rest of the century, American businessmen would celebrate unregulated markets and corporate freedom, while disparaging any protection of workers, consumers, or the environment as government encroachment, even “shackles.” They would insist that reforms intended to address market failures or defend workers were alien, un-American ideas. They would decry “excessive” government spending on social programs, while accepting an orgy of military spending from which they would often benefit. Above all, they would develop and hone what in time became the mantra of American conservatism in the second half of the twentieth century: limited government, low taxation, individual responsibility, personal freedom.

All of this was a definite strike against (Austrian laissez-faire economist Ludwig von) Mises in FDR’s America, where the theories of the British philosopher and economist John Maynard Keynes dominated government offices and university departments. Keynes argued that the state had to manage the marketplace, to smooth out otherwise crushing business cycles and attend to needs not met by the private sector. Laissez-faire capitalism had produced devastating social evils—brutal child labor, deadly working conditions, poor public health, low education levels, an impoverished elderly population, and so on—as well as serious economic problems such as boom and bust cycles, widespread unemployment, and bank panics…In 1924, even before the Great Depression hit America, Keynes could credibly declare that the world had come to “The End of Laissez-Faire”…In his 1936 work The General Theory of Employment, Interest, and Money, Keynes dismantled the notion that markets were rational and could be trusted to restore prosperity.
Mises collapsed socialism into centralized planning…Was socialism indistinguishable from communism in this regard? Most Americans in the mid-twentieth century did not think so.

Milton Friedman, Friedrich Hayek, John Maynard Keynes

In the hands of ideologues, (Austrian laissez-faire economist Friedrich August von Hayek’s) The Road to Serfdom was transmogrified from a complex and subtle argument about the risks of governmental control into an anti-government polemic.
To the evidence-free works of the Chicago school of economics were added the myth-making novels of Ayn Rand (The Fountainhead, Atlas Shrugged) and Laura Ingalls Wilder (Little House on the Prairie).
Under Teddy Roosevelt and his successor, William Howard Taft, the federal government prosecuted several corporations, most famously Standard Oil. But within a few decades, the Chicago school was turning facts of history on their head, arguing that state power was the real threat, both to democracy and to capitalism.
With the publication of Capitalism and Freedom, Luhnow, Crane, Read, and Pew, along with their allies in NAM, the American Enterprise Institute, and the Foundation for Economic Education, had their wish fulfilled. They now had the American version of The Road to Serfdom, their New Testament of market fundamentalism. Above all, they had what appeared to be a serious academic argument to transform their cruel and self-interested “opposition to unions and the welfare state from reactionary politics to good judgment in the public’s mind.” They had taken a self-interested and essentially unsubstantiated ideology—one with scant empirical foundation and bucketloads of available historical refutations—and transmogrified it into respectable academic theory. What had begun in the 1930s as self-interested propaganda had been reconstructed as respectable intellectualism.
What (Ronald) Reagan carried forward from Mises, Hayek, and other twentieth century neoliberal thinkers was an unfalsifiable reverence for markets and hostility to government…It also underpinned one of the most disturbing (and ironic) aspects of the Reagan legacy: the rejection of information.
“Overregulation” was a libertarian construct linked to the myth of the indivisibility thesis and the metaphor of the road to serfdom…With the election of Ronald Reagan, market fundamentalists finally had the opportunity to put their ideology into practice, whether it was supported by facts or not.
In the 1970s, the idea gained a new name, “supply-side economics”, because it would stimulate the economy not through government spending or increased consumer demand, but by boosting production…The resurrection of supply-side economics did not come from NAM, however, but from a new economic guru: Arthur Laffer. The “Laffer Curve” was a graphical argument for a Goldilocks approach to taxation: rates that were too high would prevent investment in new production or suppress the desire to work hard; rates that were too low would be inadequate to finance necessary government functions like law enforcement and national defense.
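The Goldilocks logic is easy to see with a toy model. The sketch below is my own illustration, not anything from Oreskes and Conway: it assumes a made-up revenue function in which the taxable base shrinks as the rate rises, so revenue vanishes at both a 0 percent and a 100 percent rate and peaks somewhere in between.

```python
# Toy illustration of the Laffer Curve's "Goldilocks" logic (my own sketch, not from the book).
# Assume a taxable base that shrinks linearly as the tax rate rises, so that
# revenue = rate * base * (1 - rate). Both extremes (0% and 100%) raise nothing;
# some intermediate rate raises the most.

def revenue(rate: float, base: float = 100.0) -> float:
    """Hypothetical tax revenue at a given rate between 0.0 and 1.0."""
    return rate * base * (1.0 - rate)

if __name__ == "__main__":
    rates = [i / 10 for i in range(11)]
    for r in rates:
        bar = "#" * int(revenue(r))
        print(f"rate {r:>4.0%}  revenue {revenue(r):6.1f}  {bar}")
    best = max(rates, key=revenue)
    print(f"Peak of this toy curve: a {best:.0%} rate; higher or lower and revenue falls off.")
```

Where the peak sits for a real economy is an empirical question the toy model cannot answer, which is the point of the passage that follows.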
Supply-side economics—now referred to as Reaganomics—didn’t produce an American investment boom or surging tax receipts; neither did it produce the vaunted “trickle-down” effect. What it did demonstrably do was explode income inequality…The rich got richer, the poor got poorer, and the middle class treaded water. Supply-side economics was a shell game. The theory had been tested and had failed. In a rational world, this should have discredited the idea and the economists who preached it. Instead, in 2019, this refuted theory earned Arthur Laffer the Presidential Medal of Freedom.
Lord Nicholas Stern, a former chief economist at the World Bank, would later (2006) call climate change the “greatest and most wide-ranging market failure” ever seen.
This (climate change) was a potentially fatal blow to the idea of the “magic of the marketplace”. At minimum it showed there were potentially enormous sacrifices associated with business as usual, evading the conventional cost-benefit analysis conservatives demanded to justify regulation. At maximum, it suggested that capitalism as practiced threatened the future of life on earth.
Both Carter and Reagan worked to deregulate large swaths of the American economy, but Clinton, in some ways, went further, with dramatic deregulation of telecommunications and financial markets. In 1996, he declared that the “era of Big Government is over.”
Financial regulation—like telecom regulation—needed retooling for the twenty-first century. But instead of updating the relevant regulations, Congress and the Clinton administration gutted them. The Gramm-Leach-Bliley Act (1999) effectively removed (by repealing Glass-Steagall) the guardrails that for six decades had done the job they were built to do. When they were removed, the system crashed (2008).
All of this may make one wonder: was this ever really about capitalism? Or freedom? Or was it all just one long, semi-continuous, shape-shifting defense of the prerogatives of big business? Of freedom for capital and capitalists? Or had the market fundamentalists spent so much time defending the magic of the marketplace that they simply couldn’t accept market failure on a global scale? To accept the enormity of what climate change portended for civilization was to accept that capitalism, as practiced, was undermining the very prosperity it was supposed to deliver. And not just in some distant future but now.
The conservative preoccupation with constraining government power has left us with a federal government too weak and too divided to handle big problems like Covid-19 and climate change. Even as the pandemic raged, millions of Americans refused to get vaccinated in large part because of distrust of “the government”, and the lion’s share of those Americans were political conservatives.

How Foreign Policy Elites Launched U.S. Hegemony in 1940

Wednesday, July 26th, 2023

Tomorrow, the World: The Birth of U.S. Global Supremacy, Stephen Wertheim, 2023

Quincy Institute for Responsible Statecraft, Henry Luce, Walter Lippmann

In the 1930s the United States found itself surrounded by totalitarian governments in Russia, Italy, Japan, Spain, and Germany. The U.S. belief in its democratic exceptionalism was under direct economic and ideological threat from aggressive totalitarian regimes in both Europe and Asia. The U.S. economy and way of life required a broader reach than could be supplied in the Western Hemisphere. Foreign policy elites too numerous to list, together with public influencers such as Henry Luce, publisher of Time, Life, and Fortune magazines, and the political commentator and journalist Walter Lippmann, began seriously to wrestle with new conceptions of the proper role of the U.S. in world affairs. Consensus could not be reached in such diverse intellectual circles, but there was broad agreement that any new foreign policy focus would require widespread public support. This stalemate might have continued had Nazi Germany not invaded and quickly defeated a superior French-British military force in France in 1940. This book is a study of how our current, 83-year-old foreign policy stance of U.S. global military supremacy came into existence.

Peace, however, came at an unprecedented price after Germany conquered France and briefly bestrode Europe. For the United States to maintain a hemispheric military posture could potentially leave Europe to the worst Europeans and Asia to the worst Asians—totalitarian dictatorships harnessing the tools of industrial modernity to achieve armed conquests and subjugation. This, too, was problematic for the traditional self-definition of the United States, even though the country faced no grave and imminent threat to its physical security or economic prosperity. If the Axis powers, or any other hostile combination, were to achieve domination of Europe and Asia, the United States could no longer fulfill its mission of ushering the world into an “American” emancipated future. Much of the earth, moreover, might effectively close down to U.S. participation, at least in ways that would be compatible with liberal commerce and law. These objectives, which once dictated that the United States practice peaceful intercourse outside its hemisphere, now became militarized, requiring force to back them. For this reason many Americans decided prior to Pearl Harbor that it was worse to risk leaving Europe and Asia to a brutal fate than to risk engaging in global warfare evermore. To them, World War II was a war of choice—the right choice—and so would be U.S. dominance after victory.

Each side formulated a reasonably coherent account of U.S. interests and responsibilities. Each side developed a plausible ideological presentation of its case as American and internationalist alike. Each side also possessed significant political power at elite and popular levels, enough to make the outcome genuinely uncertain until 1941. Still, the sides did not face equal odds of success. America’s foreign policy class, built up over the two decades between the wars in Europe, was not about to see U.S. political, economic, and cultural influence reduced to a hemispheric husk if such a fate could be avoided. American elites expected, and felt entitled, to traverse the globe and exchange ideas, goods, and money widely. They saw themselves as part of a small cohort of people who governed the world, whether they understood world governance to be a principally American, Anglophone, white, Western, or civilizational project.

In these respects, U.S. policymakers and intellectuals in World War II continued the trajectory of American foreign policy since the turn of the twentieth century. Presidents William McKinley and Theodore Roosevelt had brought the country into the ranks of the colonial great powers. They positioned the United States as one of a handful of guardians of civilization tasked with disciplining lawless savages and lawbreaking aggressors, although the United States would exercise its police power within a hemispheric and Pacific realm. In the First World War, President Woodrow Wilson sent U.S. troops into the heart of the balance of power in Europe and then attempted to commit the United States to enforce the peace, albeit on the assumption that the peace would not need much enforcing. Moral suasion would replace physical coercion, or so Wilson and Wilsonians anticipated. Once totalitarian powers dashed those hopes in the 1930s, the officials and intellectuals who had stewarded America’s rise in previous decades forthrightly aligned with Anglo-French preeminence. Then, once totalitarians conquered much of Europe and East Asia, destroying France and weakening Britain, U.S. foreign policy elites overwhelmingly decided to cast off old restraints and seek maximum power for the United States. Not without reason have some scholars interpreted this outcome as the logical and perhaps inevitable conclusion to decades of American conduct.

Yet it took the most improbable of events—the German military’s rapid and absolute victory over its superior French counterpart—for the United States to contemplate and pursue global supremacy, to make itself the armed policeman of the world. Had Hitler not made a massive gamble by ordering an invasion of France, had Allied leaders anticipated that Germany might launch an offensive through the Ardennes Forest, had the Wehrmacht been slightly slowed in its advance to the Meuse River, had the French generals not sent their best troops and tanks to the wrong spot, then Germany would not have overrun France and might never have come close to dominating Europe. Without the credible prospect of an ascendant Axis, the United States might well have played a circumscribed role in the war and the world to come. U.S. policymakers would have had insufficient cause to abandon their traditional extra-hemispheric formula of peaceful engagement by Americans and power balancing by Asians and Europeans. They applied this formula during the opening months of the war, prior to the fall of France. They likely would have continued to do so if French defenses had held. It remains possible, of course, that the United States would have entered the war nonetheless, perhaps in roughly the way it did. Even so, the United States may well have declined to police the postwar world, a costly and morally dubious undertaking that could have been left to Britain and France if it needed to be performed at all. Alternatively, the United States might have expanded its postwar security perimeter in the Far East but played more of a supporting role in Europe and the rest of Asia. Only the specter of a Nazi Europe—with France flattened and Britain potentially next—created a sense of crisis capable of breaking the old consensus and forging a new one. Axis world leadership, leading to a hemispherically “isolated” United States, compelled elites to envision U.S. military dominance and opened the political space to achieve it.

As it turned out, France did fall. The United States eyed a hemispheric existence, if only for a matter of months. And with all the vehemence with which prior generations vowed to avoid entangling alliances and faraway wars, a new class of leaders insisted that it was precisely America’s reluctance to commit to applying military power that courted disaster. In a world tormented by rampaging great powers, their prescription looked responsible and humane. At the time, however, many planners of U.S. dominance also understood themselves to be making a tragic choice. In order to stop existing aggressors and prevent future ones, they risked putting the United States in the position of an aggressor itself. Some planners, like Isaiah Bowman, wondered aloud how to distinguish U.S. ambitions from Nazi ones. Others were troubled that the United States was inheriting the basic role of the British Empire; Arthur Sweetser, for example, warned that global policing by the United States, in tandem with Britain, constituted a bid for “Anglo-Saxon domination” that would provoke the rest of the world. Those who made the decision for primacy appreciated that American statecraft had operated differently for so long. They understood U.S. military supremacy to be fraught with moral compromise and strategic risk.

Since then, Americans have lost sight of both the specific circumstances that elicited the decision for primacy and the weighty trade-offs that primacy entails. Primacy has come to seem obvious, not contestable; thrust upon the United States, not ambitiously chosen by it. This predicament is rooted in the story Americans have told themselves about their global ascendance in World War II. The story says that the United States turned away from selfish isolationism in order to take the lead in world affairs. By developing the pejorative concept of isolationism, and applying it to all advocates of limits on military intervention, American officials and intellectuals found a way to make global supremacy sound unimpeachable. With isolationism as its foil, primacy became the only basis through which the United States could participate in the world. Anything else would be an abdication, tantamount to inactivity, absence, and head-in-the-sand disregard for the fate of the world. In other words, the only way to practice internationalism, to constrain and transcend power politics, was to dominate power politics. The United States forged global supremacy by erasing this fundamental contradiction from view.

As the decades advanced, primacy’s privileged status in American politics was reinforced by several factors that primacy itself helped to bring into being. One was a succession of external threats, starting with Soviet-backed communism, to the now expansive global interests of the United States. Another was what President Dwight D. Eisenhower called the “military-industrial complex”: domestic interests dependent on, and perpetuating, large-scale mobilization. Perhaps these factors suffice to explain the endurance of America’s commitment to armed supremacy for eight decades and counting after the fall of France triggered its conception. Yet perhaps they do not. The existence of foreign threats might have given rise to the criticism that it was America’s excessive definition of its interests, and its provocative actions in this pursuit, that produced unnecessary adversaries. Moreover, few other world-shaping initiatives—even those supported by entrenched interests—have generated so little intellectual scrutiny and political opposition as has the global dominance of the United States. American supremacy has been sustained, virtually without challenge, by a policymaking elite and a collective imagination that holds supremacy to be the only viable course and rejects those who disagree as beyond the pale.


Quincy Wright

In 1966, four years before his death, (Quincy) Wright saw American B-52s over Vietnam and no world constitution on the horizon. He was ready to reckon with U.S. supremacy. He wrote to Walter Lippmann, himself dismayed by the indiscriminate use of American power, that “the trouble with the American people is that they do not recognize the difference between ‘imperialism’ and ‘internationalism’”. In the 1940s, the country jumped from ‘isolationism’ to ‘imperialism’, acquiring a taste for unilateral intervention everywhere in order to remake the world in the image of the United States. Wright recognized the same impulse in the imperialism of Rudyard Kipling’s Britain. No longer willing to assume the best of U.S. policymakers, he specified exactly what they needed to do. “We should renounce unilateral intervention in both Europe and Asia”; accept Ho Chi Minh’s victory in a unified Vietnam; and bring both Germanys, both Koreas, and Communist China into the United Nations. “Such are the policies,” after all, “to which we committed ourselves in the (UN) San Francisco Charter.”

Americans told themselves they were casting off isolationism and committing, through the U.N. Charter, to build a just and durable order. Far from openly espousing imperialism, foreign policy elites generated a surfeit of terms to evoke the scale of imperial power while sidestepping its moral undertones: the American Century, Pax Democratica, the Grand Area, world leadership. Their favorite formula called on Americans to graduate from isolationism into precisely the internationalism in which Wright still invested his hopes. The American people did not necessarily fail to appreciate what this kind of internationalism meant. It was Wright who misunderstood. He did not see that so long as the phantom of isolationism is held to be the most grievous sin, all is permitted.

One strange and glaring omission from this study is China. During the period 1940-1941, when appeasing Japan was being discussed by the foreign policy elites, it was suggested that Japan be acknowledged as the legitimate colonizer of Manchuria if Japan in return agreed to abandon its invasion of China. The Communist takeover of China in 1949 sent a shockwave through the foreign policy community probably second only to the defeat of France in 1940. China is not even listed in the index of this work. China, with possibly the world’s biggest economy, is now competing with the U.S. for global hegemony.

Congo Blood Powers Our Electric Vehicles and Cell Phones

Friday, July 21st, 2023

Cobalt Red: How the Blood of the Congo Powers Our Lives, Siddharth Kara, 2023

One must acknowledge, however, the following crucial fact—for centuries, the enslavement of Africans was the nature of colonialism. In the modern era, slavery has been universally rejected and basic human rights are deemed erga omnes and jus cogens in international law. The ongoing exploitation of the poorest people of the Congo by the rich and powerful invalidates the purported moral foundation of contemporary civilization and drags humanity back to a time when the people of Africa were valued only by their replacement cost.

The harsh realities of cobalt mining in the Congo are an inconvenience to every stakeholder in the chain. No company wants to concede that the rechargeable batteries used to power smartphones, tablets, laptops, and electric vehicles contain cobalt mined by peasants and children in hazardous conditions.

As scrutiny over the conditions under which cobalt is mined has increased, stakeholders have formulated international coalitions to help ensure that their supply chains are clean. The two leading coalitions are the Responsible Minerals Initiative (RMI) and the Global Battery Alliance (GBA)…In all my time in the Congo, I never saw or heard of any activities linked to either of these coalitions, let alone anything that resembled corporate commitments to international human rights standards, third party audits, or zero-tolerance policies on forced child labor. On the contrary, across twenty-one years of research into slavery and child labor, I have never seen more extreme predation for profits than I witnessed at the bottom of global cobalt supply chains…Our daily lives are powered by a human and environmental catastrophe in the Congo.

The truth, however, is this—but for their demand for cobalt and the immense profits they accrue through the sale of smartphones, tablets, laptops, and electric vehicles, the entire blood-for-cobalt economy would not exist.

No one knew at the outset that the Congo would prove home to some of the largest supplies of almost every resource the world desired, often at the time of new inventions or industrial developments—ivory for piano keys, crucifixes, false teeth, and carvings (1880s), rubber for car and bicycle tires (1890s), palm oil for soap (1900s), copper, tin, zinc, silver, and nickel for industrialization (1910s), diamonds and gold for riches (always), uranium for nuclear bombs (1945), tantalum and tungsten for microprocessors (2000s), and cobalt for rechargeable batteries (2012+)…At no point in their history have the Congolese people benefited in any meaningful way from the monetization of their country’s resources. Rather, they have often served as a slave labor force for the extraction of those resources at minimum cost and maximum suffering.

The battery packs in electric vehicles require up to 10 kilograms of refined cobalt each, more than a thousand times the amount required for a smartphone battery. As a result, demand for cobalt is expected to grow by almost 500 percent from 2018 to 2050, and there is no known place on earth to find that amount of cobalt other than the DRC (Democratic Republic of Congo).
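Some back-of-the-envelope arithmetic (my own check of the quoted figures, not a calculation from Kara’s book) makes those numbers concrete: a thousandth of a 10-kilogram pack is about 10 grams of cobalt per smartphone, and growth of almost 500 percent over 2018–2050 works out to roughly 5 to 6 percent compounded per year.

```python
# Back-of-the-envelope arithmetic implied by the quoted figures
# (my own check, not calculations from the book).

EV_PACK_COBALT_KG = 10.0   # "up to 10 kilograms of refined cobalt" per EV battery pack
PHONE_RATIO = 1000         # an EV pack holds roughly "a thousand times" a phone battery's cobalt
GROWTH_2018_2050 = 5.0     # "almost 500 percent" growth, i.e. about 6x total demand
YEARS = 2050 - 2018

phone_cobalt_g = EV_PACK_COBALT_KG * 1000 / PHONE_RATIO
total_multiple = 1 + GROWTH_2018_2050
cagr = total_multiple ** (1 / YEARS) - 1

print(f"Implied cobalt per smartphone: about {phone_cobalt_g:.0f} g")
print(f"Almost 500% growth over {YEARS} years is about a {total_multiple:.0f}x increase,")
print(f"or roughly {cagr:.1%} compound annual growth in cobalt demand.")
```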

As of 2022, there is no such thing as a clean supply chain of cobalt from the Congo. All cobalt sourced from the DRC is tainted by various degrees of abuse, including slavery, child labor, forced labor, debt bondage, human trafficking, hazardous and toxic working conditions, pathetic wages, injury and death, and incalculable environmental harm. Although there are bad actors at every link in the chain, the chain would not exist were it not for the substantial demand for cobalt created by the companies at the top.

The industrial mining operations in the DRC are typically structured as joint ventures between the state-owned mining company, Gecamines, and a foreign mining company. As of my last ground count in November 2021, there were nineteen major copper-cobalt companies operating in Haut-Katanga and Lualaba provinces, fifteen of which were owned or financed by Chinese mining companies. Most of the Chinese-owned mining sites I visited were secured either by a military force called the FARDC or the elite Republican Guard.

In 2021 China produced 75 percent of the world’s refined cobalt…The largest lithium-ion battery manufacturers in the world are CATL and BYD in China, LG Energy Solution, Samsung SDI, and SK Innovation in South Korea, and Panasonic in Japan. In 2021, these six companies produced 86 percent of the world’s lithium-ion rechargeable batteries, with CATL alone holding one-third of global share.

Gloria, a student at the University of Lubumbashi:

“Let me tell the most important thing that no one is discussing. The mineral reserves in Congo will last another forty years, maybe fifty? During that time, the population of Congo will double. If our resources are sold to foreigners for the benefit of the political elite, instead of investing in education and development for our people, in two generations, we will have two hundred million people who are poor, uneducated, and have nothing left of value. This is what is happening, and if it does not stop, it will be a disaster.”

Industrial mining is like doing surgery with a shovel. Artisanal mining is like doing it with a scalpel. During industrial excavation, tons of dirt, stone, and ore are gathered indiscriminately with large machinery, crushed down to pebbles, and processed to extract minerals of value. It is by design a blunt-force, low-yield, high-volume business. Artisanal miners, on the other hand, can use more precise tools to dig or tunnel for high-grade deposits of ore, extract only the ore, and leave the value-less dirt and stones behind…Artisanal mining techniques can yield up to ten or fifteen times a higher grade of cobalt per ton than industrial mining can.

Pentagon Militarizes Climate Change

Thursday, June 15th, 2023

The Pentagon, Climate Change, and War: Charting the Rise and Fall of U.S. Military Emissions, Neta C. Crawford, 2023

For most of the last 250 years humans have carried on with making “progress”–industrialization in the service of the good life—assuming that fossil fuels were an essential ingredient in shaping the world we were certain would be better than the one we were leaving behind. Our grand strategy for security took for granted that we would need fossil fuel for industry and fossil fuel for war. Only in the last fifty years or so has it become clear that burning all that fuel, and at the same time destroying the forests and the wetlands that take up the carbon released by the fire, is not just leaving the past behind, but destroying the possibility of preserving what is increasingly—in essential life-giving respects—understood as a better world.

In sum, the economy, foreign policy beliefs, and military doctrine institutionalized greater demand for fossil fuels. The deep cycle of oil demand, consumption, militarization, and conflict begins with demand for oil and increasing consumption. Then, when U.S. policy makers feel anxious about guaranteeing oil supplies in the face of dependency, or are concerned about the price of oil, they back allies in the Persian Gulf and greater Middle East, as occurred in 1946 and 1949, in 1957, in 1973, in 1980, and in 1990. The United States played favorites among the states that had large oil reserves, even if some of those government leaders were autocratic, such as the Shah of Iran, Saddam Hussein, and the leaders of Saudi Arabia.
Yet the risk of supporting authoritarian regimes is that those regimes are increasingly unstable as citizens who demand more say in their government push back against authoritarian kings, emirs, and shahs. When challenges to the undemocratic, autocratic, or kleptocratic rulers within states with large oil supplies occurred, or there were external challenges, the United States sometimes backed the leaders or system that it thought could bring stability. Thus with the Eisenhower Doctrine, the United States backed Saudi Arabia’s King Saud and Crown Prince Faisal as a way to balance against Egyptian leader Gamal Abdel Nasser. At times, as in the case of backing the Shah of Iran in the 1970s and Saddam Hussein in Iraq in the 1980s, these alliances backfired…This in turn increases the sense among U.S. elites that the Middle East is a volatile region that needs U.S. intervention to remain stable.

The military has understood the science and the consequences of global warming quite well for decades. They paid for much of that research. National security strategists have sounded muted alarms, the Pentagon has adapted some of their equipment and operations, and experts have imagined scenarios of increasingly dire complex emergencies and catastrophes and climate-caused wars.
Pentagon leaders have been farsighted and tactically flexible. Some of the smartest, best-trained and most determined people on the planet, given the resources of the richest nation on earth, the people at the Pentagon are trying to make things better…They have developed better batteries, put up solar arrays at bases, and even thought about moving some bases.


Camp Lejeune Hurricane Destruction 2018

Camp Lejeune (North Carolina) was hit by Hurricane Florence in September 2018 and suffered $3.6 billion worth of damage. In October 2018, Michael, a Category 5 hurricane, devastated much of Tyndall Air Force Base in Florida, damaging F-22 aircraft beyond repair and destroying hundreds of buildings…Repairing and reconstructing the base was estimated to cost $4.9 billion, and the rebuilding was expected to take as long as five to seven years.

TYNDALL AIR FORCE BASE, Florida — Hangars once used to keep aircraft out of the elements now lie scattered across the flight line following Hurricane Michael on October 10, 2018. Hurricane Michael was the third largest hurricane to make landfall in the United States, reaching peak winds of 155 miles per hour.


Tyndall Hurricane 2018

And yet at the same time, the Pentagon has been strategically inflexible and blind…For the most part, the armed forces, and our political leaders, have not put away the tools and habits of mind that got us here in the first place. Our grand strategy for national security has not fundamentally changed. In some ways it can’t, because it is premised on the anticipation and fear of war—the idea that for us to be safe, we have to be prepared to meet every threat anywhere at any time with overwhelming force. The national security strategy is also premised on the idea that force works, that the threat of coercion and the actuality of destruction can get us what we want. So once we believe those things—that war is possible and may be imminent, that we must have a capacity to make war that far exceeds our enemy’s abilities, and that coercion and destruction are effective—it seems that the only way to deal with the threat of climate change-caused war is to prepare for more war. Of course, in preparing for more war, governments give the armed forces everything they need: money, weapons, people, bases, and fossil fuels. We defend and protect the oil we think we need to defend ourselves…At the same time that we are making our weapons more energy efficient and the installations more resilient, we scarcely question whether war is inevitable or in fact made more likely by our bases and our burning fuel to be the most powerful nation on earth…The military is inadvertently or perhaps deliberately militarizing climate change.

American Banks Fail to Serve Half of the People

Saturday, April 22nd, 2023

How the Other Half Banks: Exclusion, Exploitation, and the Threat to Democracy, Mehrsa Baradaran, 2015

In the process of deregulation, several bedrock principles of banking policy were sidelined. First, the principle that banks had unique public responsibilities was rejected. Banks were relieved not only of public-serving functions; even laws protecting consumers from harmful bank products were weakened by the regulators tasked with their enforcement…For example, the two regulators of the national banks, the Office of the Comptroller of the Currency (OCC) and the Office of Thrift Supervision (OTS), announced that national banks did not have to follow state consumer protection laws–or state laws designed to protect their citizens from predatory financial practices…Here the state laws were “preempted” by exactly zero consumer protection federal laws. Most laws affecting banks had a requirement written into them that bank supervisors deciding on bank-related issues should only allow an action if it benefits the public. However, the question of whether a certain bank action would benefit the public morphed into an inquiry about bank profitability.

Between 1980 and 2000, the assets held by commercial banks, securities firms, and the securitizations they created grew from 55 percent of GDP to 95 percent. Today, a handful of behemoth banks control most of the country’s assets.

The deregulatory ideology of the new financial oligarchy infiltrated the “Wall Street–Washington Corridor” by means of campaign money and ideological capture. An ideological capture of key policy makers–Larry Summers, Robert Rubin, Alan Greenspan, Timothy Geithner, Henry Paulson, and others–who had spent their careers marinating in the industry, working in “captured” regulatory agencies, or captured by extreme laissez-faire ideology, also took place during this era. Demands are rising across the political spectrum, including from Joseph Stiglitz, Simon Johnson, Paul Krugman, Richard Fisher, and even the king of deregulation, Alan Greenspan, for breaking up the banks.

It was inevitable that in an era of deregulated banks, large failures would occur. What was surprising was that the market rules would only be applied when banks were making profits and not when they ultimately failed. Instead of allowing the market to enforce its discipline and allow banks to fail, as the repudiation of the social contract dictated, the government stepped in and bailed out the banking industry…In other words, instead of using the crisis to effect real reform in banking like Roosevelt did, these policymakers focused myopically on maintaining bank profitability without requiring anything in return—“the government bent over backward to make the deal attractive for the banks, charging below market interest and eschewing any significant ownership–so shareholders, not taxpayers, would benefit when the banks recovered.”

The average American has $15,000 in credit card debt, $33,000 in student loan debt, and $156,000 in mortgage debt. Not only do the majority of the American public borrow their way up the income ladder, but federal mortgage and student loan markets and loose credit policies led to the creation of the American middle class…However, in a society built on credit as a means to wealth, a large portion of people at the bottom are currently left out…We cannot tolerate such heavy state involvement in providing credit to the banks while leaving the less well-to-do at the mercy of the modern-day sharks. These are real people who live and work in cities and towns, poor neighborhoods and wealthy ones, both public servants and blue-collar workers. They pay for things that are widely considered essential. They borrow with forethought and with care. They are mainstream, ordinary people forced to borrow on the fringe. And fringe lenders (payday loans, title loans, money transfers) are the only ones meeting this large market demand because banks, credit unions, and other mainstream lenders have chosen not to.

Fringe banking has grown exponentially since the 1980s and hasn’t stopped…”Virtually nonexistent in this country 20 years ago, [this sector] has grown into a $100 billion business. Since the mid-1990s, the number of payday lenders nationwide has grown over 10 percent annually.” With over twenty thousand stores, the payday lending industry makes $40 billion in loans annually. There are more payday lender storefronts than Starbucks and McDonald’s combined…Banks do give credit in the form of credit cards and overdrafts, but relying on these products as loans can get expensive, and banks hide many of their fees and interest rates in small print and do not make them clear up front. Research has shown that payday borrowers who have “credit card liquidity”, or the ability to borrow on a credit card, still opt for payday loans…This is another critical, and somewhat ironic, aspect of the alternative financial service industry’s success: its ability to take advantage of the federally sponsored banking system, using its access to clearinghouses and even its banking charters to lend.

But many people are failing. They are failing even as the banks are succeeding, and they are failing because the banks are no longer involved in providing them credit. The government has outsourced the provision of credit to the banking system, and it provides this system with state support and cheap credit–cheap credit that is not flowing out to reach those that need it most. In fact, while the government has provided interest rates to banks at 0 percent, it has allowed interest rates to the poor to skyrocket to triple digits. If the intended result is to help the public with credit, why continue to use the banks as a medium when they are clearly leaving out a significant portion of the population? Why not provide credit directly to those who need it in order to be certain they get it?

A social contract has existed between banks and the government since the early days of the Republic. The government supports the banks through trust-inducing insurance, bailouts, liquidity protection, and a framework that allows the allocation of credit to the entire economy. Banks, in turn, operate as the central machinery of the economy by providing transaction services, a medium for trade, and individual and business loans that spur economic growth. This entanglement between the state and the banking system must surely mean that banks should not exclude a significant portion of the public from the bounty of government support. This is not just a banking market problem but a threat to our society’s democratic principles. When the state becomes involved in the banking system, that system cannot create or contribute to such a vast inequality. The supply of credit has always been a public policy issue, with banks functioning as intermediaries. Insofar as the state enables credit markets, all creditworthy Americans deserve equal access to credit, especially because reasonable and safe credit can provide a smoother path both through and out of poverty. If banks are not providing credit to the poor, the state should provide it directly.

The existing post office framework represents the most promising path toward effecting such a public option. American banks long ago deserted their most impoverished communities, but post offices, even two centuries later, have remained–still rooted in an egalitarian mission. There have never been barriers to entry at post offices, and their services have been available to all, regardless of income. And so, it is not unreasonable to suggest that as America’s oldest instrument of democracy in action, the post office can once again level the playing field, and in the process, save itself from imminent demise…The social contract has been breached. Banks enjoy government support but do not serve the entire public. Direct government involvement remedies the breach and bridges the gap in services.

A deep-seated problem exists within our stratified banking system: the state is shoring up a powerful banking industry that is, in turn, excluding those Americans most in need. This is corrupting our democracy, and we ignore it at our own peril.

Ode to (Gordon) Moore’s Law

Monday, February 6th, 2023

Chip War: The Fight for the World’s Most Critical Technology, Christopher Miller, 2022

This reader is a retired software engineer whose 40-year career began with punch-card mainframes and ended with microcontrollers integrating embedded graphic displays, WiFi, and flash memory on a single chip. My introduction to computing was as a graduate research assistant on an ARPA-funded project to study the dimensionality of nations. Since that time I have followed the development of the ARPANET. I have lived the profound impact of Moore’s Law, constantly needing to anticipate where technology would be, since project development often took several years before a product’s introduction. Moore’s Law has still not been broken after all these years.
To get some sense of what an exponential increase in transistors looks like, consider the 1980s Cray-2 supercomputer, which was export-restricted for national security reasons. The Cray-2 stood nearly 4 feet tall with a 5.5-foot diameter and weighed 5,500 pounds. The iPhone 12 is 5,000 times faster than the Cray-2!
Exponential growth of a technology is unique in human history.
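As a rough illustration of what that exponential looks like in numbers, here is a tiny sketch of my own (not from the book) projecting transistor counts under the classic two-year doubling; the starting count of a few thousand transistors on an early-1970s microprocessor is an order-of-magnitude assumption.

```python
# Back-of-the-envelope Moore's Law projection: doubling roughly every two years.
# The starting count (~3,500 transistors, early-1970s microprocessor) is an assumed round figure.

def transistors_after(years: float, start: float = 3_500, doubling_period: float = 2.0) -> float:
    """Project the transistor count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

for years in (10, 20, 30, 40, 50):
    print(f"after {years:2d} years: ~{transistors_after(years):,.0f} transistors per chip")

# Fifty years at a two-year doubling is 2**25, roughly a 33-million-fold increase:
# approximately the gulf between an early microprocessor and a modern
# multi-billion-transistor chip.
```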

  • If the computing power on each chip continued to grow exponentially, Moore realized, the integrated circuit would revolutionize society far beyond rockets and radars…At Fairchild, Noyce and Moore were already dreaming of personal computers and mobile phones.

    Alongside the rise of these new industrial titans (Intel and Micron), a new set of scientists was preparing a leap forward in chipmaking and devising revolutionary new ways to use processing power. Many of these developments occurred in coordination with government efforts, usually not the heavy hand of Congress or the White House, but the work of small, nimble organizations like DARPA (originally ARPA; Larry Roberts, director 1966-1973) that were empowered to take big bets on futuristic technologies – and to build the educational and R&D infrastructure that such gambles required.


    Morris Chang, founder of TSMC

    (Dutch) ASML‘s history of being spun out of Philips helped in a surprising way, too, facilitating a deep relationship with Taiwan’s TSMC (founded by Morris Chang). Philips had been the cornerstone investor in TSMC, transferring its manufacturing process technology and intellectual property to the young foundry. This gave ASML a built-in market, because TSMC’s fabs were designed around Philips’s manufacturing processes. An accidental fire in TSMC’s fab in 1989 helped too, leading TSMC to buy nineteen additional lithography machines, paid for by the fire insurance. Both ASML and TSMC started as small firms on the periphery of the chip industry, but they grew together, forming a partnership without which advances in computing today would have ground to a halt.

    The next-generation EUV (extreme ultraviolet) lithography tools would therefore be mostly assembled abroad, though some components continued to be built in a facility in Connecticut. Anyone who raised the question of how the U.S. would guarantee access to EUV tools was accused of retaining a Cold War mindset in a globalizing world. Yet the business gurus who spoke about technology spreading globally misrepresented the dynamic at play. The scientific networks that produced EUV spanned the world, bringing together scientists from countries as diverse as America, Japan, Slovenia, and Greece. However, the manufacturing of EUV wasn’t globalized, it was monopolized. A single supply chain managed by a single company would control the future of lithography.

    By the mid-2000s, just as cloud computing was emerging, Intel had won a near monopoly over data center chips, competing only with AMD. Today nearly every major data center uses x86 chips from either Intel or AMD. The cloud can’t function without their processors…Some companies tried challenging x86’s position as the industry standard in PCs. In 1990 Apple and two partners established a joint venture called Arm, based in Cambridge, England. The aim was to design processor chips using a new instruction set architecture based on the simpler RISC (reduced instruction set computer) principles that Intel had considered but rejected. As a startup, Arm faced no costs of shifting away from x86, because it had no business and no customers. Instead, it wanted to replace x86 at the center of the computing ecosystem. Arm’s first CEO, Robin Saxby, had vast ambitions for the twelve-person startup…However, Arm failed to win market share in PCs in the 1990s and 2000s, because Intel’s partnership with Microsoft’s Windows operating system was simply too strong to challenge. However, Arm’s simplified, energy-efficient architecture quickly became popular in small, portable devices that had to economize on battery use. Nintendo chose Arm-based chips for its handheld video games…

    The problem wasn’t that no one realized Intel ought to consider new products, but that the status quo was simply too profitable. If Intel did nothing at all it would still own two of the world’s most valuable castles–PC and server chips–surrounded by a deep x86 moat.

    Intel turned down the iPhone contract…Apple looked elsewhere for its phone chips. Jobs turned to Arm’s architecture, which, unlike x86, was optimized for mobile devices that had to economize on power consumption. The early iPhone processors were produced by Samsung (founder Lee Byung-chul), which had followed TSMC into the foundry business…By the time Otellini (Intel) realized his mistake, however, it was too late.

    By the 2000s, it was common to split the semiconductor industry into three categories. “Logic” refers to the processors that run smartphones, computers, and servers. “Memory” refers to DRAM, which provides the short-term memory computers need to operate, and flash, also called NAND, which remembers data over time. The third category of chips is more diffuse, including analog chips like sensors that convert visual or audio signals into digital data, radio frequency chips that communicate with cell phone networks, and semiconductors that manage how devices use electricity.

    It (America’s second-class status) dates to the late 1980s, when Japan first overtook the U.S. in DRAM output. The big shift in recent years is the collapse in the share of logic chips produced in the United States. Today, building an advanced logic fab costs $20 billion, an enormous capital investment that few firms can afford…Given the benefits of scale, the number of firms fabricating advanced logic chips has shrunk relentlessly.


    Jensen Huang, CEO of Nvidia

    Nvidia (which became dominant in graphics) not only designed chips called graphics processing units (GPUs) capable of handling 3D graphics, it also devised a software ecosystem around them. Making realistic graphics requires use of programs called shaders, which tell all the pixels in an image how they should be portrayed in, say, a given shade of light. The shader is applied to each of the pixels in an image, a relatively straightforward calculation conducted over many thousands of pixels. Nvidia’s GPUs can render images quickly because, unlike Intel’s microprocessors or other general-purpose CPUs, they’re structured to conduct lots of simple calculations – like shading pixels – simultaneously.
    In 2006, realizing that high-speed parallel computations could be used for purposes besides computer graphics, Nvidia released CUDA, software that lets GPUs be programmed in a standard programming language, without any reference to graphics at all. Even as Nvidia was churning out top-notch graphics chips, Huang (CEO) spent lavishly on this software effort, at least $10 billion,…to let any programmer – not just graphics experts – work with Nvidia’s chips…Nvidia discovered a vast new market for parallel processing, from computational chemistry to weather forecasting. At the time, Huang could only dimly perceive the potential growth in what would become the biggest use case for parallel processing, artificial intelligence.
    Today Nvidia’s chips, largely manufactured by TSMC, are found in most advanced data centers.
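As a toy illustration of the idea above (the same simple calculation applied independently to every pixel), here is a short NumPy sketch of my own, not CUDA and not Nvidia’s actual shader pipeline, that applies one brightness “shader” to every pixel of a synthetic frame in a single data-parallel operation instead of a per-pixel loop.

```python
import numpy as np

# Toy "shader": scale the brightness of every pixel by the same factor.
# A GPU runs this kind of per-pixel arithmetic across thousands of cores at once;
# the vectorized NumPy expression below simply stands in for that data-parallel step.

def shade(pixels: np.ndarray, brightness: float) -> np.ndarray:
    """Apply the same simple calculation (a brightness scale) to every pixel at once."""
    return np.clip(pixels.astype(np.float32) * brightness, 0, 255).astype(np.uint8)

frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)  # synthetic RGB frame
darker = shade(frame, brightness=0.8)
print(darker.shape, darker.dtype)  # (1080, 1920, 3) uint8
```

CUDA’s contribution, as the excerpt notes, was to let programmers express exactly this kind of bulk arithmetic for non-graphics problems in an ordinary programming language.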


    Dr. Irwin Jacobs, co-founder of Qualcomm

    For each generation of cell phone technology after 2G, Qualcomm contributed key ideas about how to transmit more data in the radio spectrum and sold specialized chips with the computing power capable of deciphering this cacophony of signals. The company’s patents are so fundamental that it’s impossible to make a cell phone without them. Qualcomm soon diversified into a new business line, designing not only the modem chips in a phone that communicate with a cell network, but also the application processors that run a smartphone’s core systems. These chip designs are monumental engineering accomplishments, each built on tens of millions of lines of code.

    For many years, each generation of manufacturing technology was named after the length of the transistor’s gate, the part of the silicon chip whose conductivity would be turned on and off, creating and interrupting the circuit. The 180nm node was pioneered in 1999, followed by 130nm, 90nm, 65nm, and 45nm, with each generation shrinking transistors enough to make it possible to cram roughly twice as many in the same area. This reduced power consumption per transistor, because smaller transistors needed fewer electrons to flow through them.
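The claim that each new node crammed “roughly twice as many” transistors into the same area follows from simple geometry: shrinking the linear dimensions by about 0.7x halves the area per transistor. Here is a quick sketch of that arithmetic using the node names from the excerpt; the ~0.7 scaling factor is the usual industry rule of thumb, not a figure from the book.

```python
# Why successive nodes roughly doubled density: a ~0.7x linear shrink halves transistor area.
nodes_nm = [180, 130, 90, 65, 45]  # node names mentioned in the excerpt

for prev, curr in zip(nodes_nm, nodes_nm[1:]):
    linear_shrink = curr / prev            # about 0.7x each generation
    density_gain = 1 / linear_shrink ** 2  # area (and thus density) scales with the square
    print(f"{prev}nm -> {curr}nm: linear {linear_shrink:.2f}x, density ~{density_gain:.1f}x")
```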
    Around the early 2010s, it became unfeasible to pack transistors more densely by shrinking them two-dimensionally. One challenge was that, as transistors were shrunk according to Moore’s Law, the narrow length of the conductor channel occasionally caused power to “leak” through the circuit even when the switch was off. On top of this, the layer of silicon dioxide atop each transistor became so thin that quantum effects like “tunneling” – jumping through barriers that classical physics said should be insurmountable – began seriously impacting transistor performance. By the mid-2000s, the layer of silicon dioxide on top of each transistor was only a couple of atoms thick, too small to keep a lid on all the electrons sitting in the silicon.
    To better control the movement of electrons, new materials and transistor designs were needed. Unlike the 2D design used since the 1960s, the 22nm node introduced a new 3D transistor, called a FinFET (pronounced finfet), that sets the two ends of the circuit and the channel of semiconductor material that connects them on top of a block, looking like a fin protruding from a whale’s back. The channel that connects the two ends of the circuit can therefore have an electric field applied not only from the top but also from the sides of the fin, enhancing control over the electrons and overcoming the electricity leakage that was threatening the performance of new generations of tiny transistors…These nanometer-scale 3D structures were crucial for the survival of Moore’s Law, but they were staggeringly difficult to make, requiring even more precision in deposition, etching, and lithography. This added uncertainty about whether the major chipmakers would all flawlessly execute the switch to FinFET architectures or whether one might fall behind…Moreover, the 2008-2009 financial crisis was threatening to reorder the chip industry. Consumers stopped buying electronics, so tech firms stopped ordering chips.

    Smartphones and PCs are both assembled largely in China with high-value components mostly designed in the U.S., Europe, Japan, or Korea. For PCs, most processors come from Intel and are produced at one of the company’s fabs in the U.S., Ireland, or Israel. Smartphones are different. They are stuffed full of chips, not only the main processor (which Apple designs itself), but modem and radio frequency chips for connecting with cellular networks, chips for WiFi and Bluetooth connections, an image sensor for the camera, at least two memory chips, chips that sense motion (so your phone knows when you turn it horizontal), as well as semiconductors that manage the battery, the audio, the wireless charging. These chips make up most of the bill of materials needed to build a smartphone.
    As semiconductor fabrication capacity migrated to Taiwan and South Korea, so too did the ability to produce many of these chips. Application processors, the electronic brain inside each smartphone, are mostly produced in Taiwan and South Korea before being sent to China for final assembly inside a phone’s plastic case and glass screen. Apple’s iPhone processors are fabricated exclusively in Taiwan.

    These tricks kept Moore’s Law alive, as the chip industry shrank transistors from the 180nm node in the late 1990s, through the early stages of 3D FinFET chips, which were ready for high-volume manufacturing by the mid-2010s.
    However, there were only so many optical tricks that could help 193nm light carve smaller features. Each new workaround added time and cost money. By the mid-2010s, it might have been possible to eke out a couple of additional improvements, but Moore’s Law needed better lithography tools to carve smaller shapes. The only hope was that the hugely delayed EUV lithography tools, which had been in development since the early 1990s, could finally be made to work at a commercial scale.

    Even the deep pockets of the Persian Gulf royals who owned GlobalFoundries weren’t deep enough. The number of companies capable of fabricating leading-edge (7nm) chips fell from four to three. (TSMC, Intel, and Samsung).

    As investors bet that data centers will require ever more GPUs, Nvidia has become America’s most valuable semiconductor company. Its ascent isn’t assured, however, because in addition to buying Nvidia chips the big cloud companies – Google, Amazon, Microsoft, Facebook, Tencent, Alibaba, and others – have also begun designing their own chips, specialized to their processing needs, with a focus on artificial intelligence and machine learning.


    Master Chip Designer Jim Keller

    Gordon Moore’s famous law is only a prediction, not a fact of physics…At some point, the laws of physics will make it impossible to shrink transistors further. Even before then, it could become too costly to manufacture them. The rate of cost declines has already significantly slowed. The tools needed to make ever-smaller chips are staggeringly expensive, none more so than the EUV lithography machines that cost more than $100 million each.
    The end of Moore’s Law would be devastating for the semiconductor industry — and for the world. We produce more transistors each year only because it’s economically viable to do so.
    The durability of Moore’s Law, in other words, has surprised even the person it’s named after and the person who coined it. It may well surprise today’s pessimists too. Jim Keller, the star semiconductor designer who is widely credited with transformative work on chips at Apple, Tesla, AMD, and Intel, has said he sees a clear path toward a fifty-times increase in the density with which transistors can be packed on chips. “We’re not running out of atoms,” Keller has said. “We know how to print single layers of atoms.”

Trauma, Illness & Healing in a Toxic Culture

    Tuesday, December 6th, 2022

    The Myth of Normal: Trauma, Illness & Healing in a Toxic Culture, Gabor Maté with Daniel Maté, 2022

    If we begin to see such illness itself not as a cruel twist of fate or some nefarious mystery but rather as an expected and therefore normal consequence of abnormal, unnatural circumstances, it would have revolutionary implications for how we approach everything health related.
    The current medical paradigm, owing to an ostensibly scientific bent that in some ways bears more resemblance to an ideology than to empirical knowledge, commits a double fault. It reduces complex events to their biology, and it separates mind from body, concerning itself almost exclusively with one or the other without appreciating their essential unity.

    My own observations of self and others have led me to endorse fully what a review of the stress literature concluded, namely that “psychological factors such as uncertainty, conflict, lack of control, and lack of information are considered the most stressful stimuli and strongly activate the HPA axis.” A society that breeds these conditions, as capitalism inevitably does, is a superpowered generator of stressors that tax human health.

    Capitalism is “far more than just an economic doctrine,” Yuval Noah Harari observes in his influential bestseller Sapiens. “It now encompasses an ethic – a set of teachings about how people should behave, educate their children, and even think. Its principal tenet is that economic growth is the supreme good, because justice, freedom, and even happiness all depend on economic growth.” Capitalism’s influence today runs so deep and wide that its values, assumptions, and expectations potently infuse not only culture, politics, and law but also such subsystems as academia, education, science, news, sports, medicine, child-rearing, and popular entertainment. The hegemony of materialist culture is now total, its discontents universal.

    “The political system seems to be failing as much as the economic system,” (Joseph) Stiglitz writes in his 2012 book, The Price of Inequality. In the eyes of many, he continues, “capitalism is failing to produce what was promised — inequality, pollution, unemployment, and most important of all, the degradation of values to the point where everything is acceptable and no one is accountable.”

    The Swiss bank UBS reported in October 2020 that during the COVID-19-induced market turmoil the international billionaire stratum had grown their fortunes to over ten trillion dollars between April and July of that year. The world’s then-richest individual, Jeff Bezos, had increased his wealth by over $74 billion, and Tesla owner Elon Musk by up to $103 billion…the Toronto Star reported: “That’s in the midst of an economic crisis that has left millions of Canadians unemployed or working reduced hours and struggling with bills, and our governments are borrowing to fund emergency financial aid for individuals and businesses to stave off even greater hardship.”
    In the realm of political decision making, a widely circulated U.S. study showed that the views of ordinary people make no difference to public policy: a lack of control on a mass scale. “When a majority of citizens disagree with economic elites or with organized interests, they generally lose.” “Even when fairly large majorities favor policy change, they generally do not get it.”

    Scottish labor leader Jimmy Reid:

    “Alienation is the precise and correctly applied word for describing the major social problem in Britain today. People feel alienated by society…Let me right at the outset define what I mean by alienation. It is the cry of men who feel themselves the victims of blind economic forces beyond their control. It’s the frustration of ordinary people excluded from the processes of decision-making. The feeling of despair and hopelessness that pervades people who feel with justification that they have no real say in shaping or determining their own destinies.”

    Not only does our individual and societal sanity depend on connection; so does our physical health. Because we are biopsychosocial creatures, the rising loneliness epidemic in Western culture is much more than just a psychological phenomenon: it is a public health crisis.


    Psychoanalyst Steven Reisner:

    “Narcissism and sociopathy describes corporate America. But it’s flat-out wrong to think in twenty-first century America that narcissism and sociopathy are illnesses. In today’s America, narcissism and sociopathy are strategies. And they’re very successful strategies, especially in business and politics and entertainment.”

    In what the Wall Street Journal called “an unprecedented plea,” editors of two hundred health journals internationally, including the Lancet, the British Medical Journal, and the New England Journal of Medicine, called the failure of political leaders to confront the climate crisis “the greatest threat to global public health.” The harms of climate change include acute and chronic physical illness such as cardiovascular disease and susceptibility to infections, along with mental health challenges. Especially at risk are people with heart or kidney conditions, diabetes, and respiratory ailments. I need hardly mention food and water insecurity, major stressors already affecting millions.

    Underlying the active and callous disregard of our Earth’s health is the sociopathology of the most powerful entities, whose planetary poison-pushing removes any hint of metaphor from this book’s subtitle phrase “toxic culture.” The oil companies pumped billions of dollars into thwarting government action. They funded think tanks and paid retired scientists and fake grassroots organizations to pour doubt and scorn on climate science. They sponsored politicians, particularly in the U.S. Congress, to block international attempts to curtail greenhouse gas emissions. They invested heavily in greenwashing their public image…In 2020 the top hundred or more American corporations channeled their political donations largely to lawmakers with a record of stalling climate legislation…Compared with financial gain, the climate is, well, small change.

    Historically, the idea of race arose from the impulse of European capitalism to enrich itself by subjugating, enslaving, and, if necessary, destroying Indigenous people on other continents, from Africa to Australia to North America. Indeed, the word “race” did not exist in any meaningful way until it was created in the late eighteenth century. Psychologically, on the individual level, the “othering” of racism entails an antidote to self-doubt: if I don’t feel good about myself, at least I can feel superior to somebody and gain a sense of power and status by claiming privilege over them.

    The brilliant writer James Baldwin once said, “What white people have to do is try and find out in their own hearts why it was necessary to have a n_____ in the first place. If you, the white people, invented him, then you’ve got to find out why.”

    For another take on the current worldwide health crisis, see Deep Medicine.

    Why China Will Not Rule the World

    Saturday, November 5th, 2022

    The China Boom: Why China Will Not Rule the World, Ho-fung Hung, 2016

    The China Boom, Ho-fung Hung

    Hong Kong-born Hung is a sociologist who has studied capitalism and Chinese history extensively. This book cites many studies and much research on the rise of modern China: its politics, its economics, its capitalist development, and its place in the global world order. Here is a short summary of his conclusions.

    Amid the late twentieth-century rise of global neoliberalism, under which the United States and Europe shifted to financial expansion, debt-driven consumption, and reliance on imported manufactured goods from low-wage countries, China eschewed central economic planning and absorbed substantial foreign capital accumulated during the industrial takeoff of its Asian neighbors, particularly those of Chinese diasporic origins, turning itself into a dynamic center of export-driven capitalism…It is apparent that China has no intention of or capacity for transforming the global neoliberal order, because the China boom has been relying heavily on transnational free trade and investment flow. China also makes a significant contribution to the perpetuation of U.S. global dominance through its addiction to U.S. public debt.


    Mao and Deng

    …SOEs (state-owned enterprises) and state control of the marketing of agricultural products as a means to speed up rural surplus extraction and industrial capital accumulation began in certain KMT (Kuomintang)-controlled areas before 1949. What the CCP (Chinese Communist Party) did after 1949 was to expand this state-owned sector to the whole economy and to collectivize agriculture, turning the state into the sole agent of capital accumulation. As a consequence, China managed to build an extensive network of heavy industries and infrastructure despite its international isolation in 1949-1979. It also successfully defended its sovereignty and geopolitical security vis-à-vis both the United States and the Soviet Union. The Mao period in China represented the culmination of a century of the state elite’s quest for state-led industrialization…the expansion of KMT-controlled state enterprises, the successful land reform, and the rise of state-directed rural cooperatives that facilitated agriculture-to-industry surplus transfers in Taiwan can be seen as a mild variation of the SOEs and the People’s Communes in Mao’s China. This continuity attests to Immanuel Wallerstein’s provocative formulation that the “actually existing socialist countries” emerging in the mid-twentieth century were always part of the capitalist world system and that their socialist systems have been little more than a strategy of rapid capital accumulation and industrial catch-up under the strong hands of mercantilist states.

    In retrospect, many Deng and post-Deng reform measures would not have been that successful had it not been for the legacies of the Mao era. The SOEs and infrastructure constructed in Mao times, though moribund and unprofitable at the advent of reform, were important foundations for the capitalist takeoff during the reform period. For example, many foreign companies investing in China did not start from scratch but began as joint ventures with preexisting SOEs. At the same time, many SOEs developed into sizeable transnational capitalist corporations with financial and policy support from the state, though ownership changed from the state by itself to other combinations – for example, public listing but with the government owning a majority share. Most of China’s biggest corporations today originated in the Mao era or were built on state assets developed in that era…It is not surprising that many other former socialist countries in Russia and Eastern Europe have also witnessed a similar predominance of state corporations.

    Other Mao-era legacies include the restriction of rural-urban migration by means of the household registration system and public investment in rural education and rural health care in the People’s Communes. These policies created a generation of literate and healthy rural laborers available in great numbers for private, export-oriented enterprises as well as TVEs (Township and Village Enterprises) from the 1980s on. The self-reliance policy of the Mao period prevented the large-scale external borrowing in the 1970s that many other developing or socialist countries indulged in, thus sparing China from the international debt crisis of the 1980s that brought large setbacks to the developing world and the Soviet bloc.

    China has not challenged U.S. global dominance despite its leaders’ postures and its nationalist press’s rhetoric. On the contrary, it has been a key force in helping perpetuate U.S. global dominance. China’s SOEs have been transformed into U.S.-style capitalist corporations, many of them with the aid of Wall Street financial firms, and floated on overseas stock markets such as Hong Kong and New York. China’s export-oriented growth relies on the United States and Europe, the two biggest markets for its manufactured goods, and China’s exports to both places have been paid for mostly in U.S. dollars. The massive flow of U.S. dollars into China in the form of trade surplus impels China to invest addictively in U.S. Treasury bonds as the most liquid and largest U.S. dollar-denominated store of value. Since 2008, China has replaced Japan as the biggest foreign creditor to the United States, and such financing enables the United States to continue living and fighting beyond its means. This investment in U.S. Treasury bonds in turn facilitates the perpetuation of the global dollar standard, which has been the single most important foundation of U.S. global power. The foreign exchange brought in by China’s export sector has been the foundation of the state banks’ profligate creation of liquidity that fuels fixed-asset investment. In short, the China boom relies on the global free market instituted and warranted by the United States. It is thus far from China’s interest to undermine the global neoliberal status quo and U.S. leadership in it.

    Any readjustment of the structure of capitalist development in China will have to involve an increase in domestic consumption’s share in GDP and a corresponding reduction in export and investment’s share…such restructuring must be associated with a profound redistribution of wealth and income that will let average households share a larger slice of the pie of the expanding economy, reducing the advantages that the state has been offering to the export sector and state enterprises, both of which have been protected by the entrenched interests in the political process. Such readjustment, coupled with the cleaning up of existing bad debts in the system, will inevitably bring a slowdown in economic growth through either a disorderly hard landing or an orderly soft landing…Although such a slowdown is inevitable and normal in the adjustment and rebalancing process, it is unknown whether existing political institutions in China can withstand it.