Archive for the 'Technology' Category

TechnoFeudalism Killed Capitalism

Thursday, May 16th, 2024

Technofeudalism: What Killed Capitalism, Yanis Varoufakis, 2023


Self-proclaimed libertarian Marxist.

The Rise of Big Finance and Big Business

To produce the rivers of credit necessary to fund the Edisons, the Westinghouses, and the Fords of early twentieth-century capitalism, small banks merged to form large ones and lent either to the industrialists directly or to speculators eager to buy shares in the new corporations…And it led to the emergence of Big Finance, which grew up alongside Big Business in order to lend it monies borrowed effectively from the future: from profits not yet realized but which Business promised to deliver.

The Creation of the American technostructure in WWII (Eisenhower’s military industrial complex)

(after Pearl Harbor brought the US into WWII) the US government began to emulate…the Soviet one.


Galbraith at work in 1940
It told factory owners how much to produce and to what specifications, from aircraft carriers to processed food. It even employed a price czar – the economist John Kenneth Galbraith – whose job, literally, was to decide the price of everything, to fend off inflation, and to ensure a smooth economic transition from wartime to peacetime. It is no exaggeration to say that American capitalism was run according to Soviet planning principles, with the exception that the networked factories remained under the private ownership of Big Business.

Under President Roosevelt, the US government’s deal with Big Business was simple: they would produce what was necessary to win the war and, in exchange, the state would reward them with four incredible gifts. First, state-guaranteed sales translated into state-guaranteed profits. Second, freedom from competition, since prices were fixed by government. Third, huge government-funded scientific research (e.g. the Manhattan Project, jet propulsion) that provided Big Business with wonderful new innovations and a pool of highly skilled scientific personnel to recruit from during and after the war. And fourth, a patriotic aura to help rinse off the stench of corporate greed that clung to them after the crash of 1929 and make them over as heroic enterprises that helped America win the war.

Galbraith called this nexus (at the end of the war) the technostructure.

With the war behind them, one thing kept the good folks of the technostructure up at night: if the government would no longer guarantee sales and prices, where would they find the customers ready and willing to pay for all the chocolate bars, cars, and washing machines that they were planning to manufacture…(hence the rise of Madison Avenue and consumer behavior modification)

In the 1960s, a decade marked by an ideological and nuclear clash between America and the Soviet Union that almost blew up the world, Soviet planning principles were implemented with remarkable success in…the United States. Irony has seldom taken a more effective revenge over earnest ideology.

The American Golden Age of Bretton Woods 1944-1971

This dazzling design, America’s Global Plan to remake Europe and Japan in the image of its technostructure, led to capitalism’s Golden Age. From the war’s end until 1971, America, Europe and Japan enjoyed low unemployment, low inflation, high growth and massively diminished inequality.

As long as America was the major surplus nation, Bretton Woods was safe as houses. And that’s why, by the late 1960s, the Bretton Woods system was dead in the water. The reasons? Three developments which caused America to lose its surplus and become a chronically deficit economy. The first was the escalating Vietnam War which forced the US government to spend billions in South East Asia on supplies and services for its military. The second was President Lyndon Johnson’s attempt to make amends for the ill effects of conscription on working-class America, its black communities in particular. His valiant but expensive Great Society program substantially reduced poverty but, at once, sucked lots of imported goods from Japan and Europe into the United States. Lastly, Japan’s and Germany’s factories surpassed America’s in terms of quality and efficiency, partly due to the support successive US governments had extended to Japan’s and Germany’s manufacturing sectors – the car industry being an obvious example.

Nixon Shock: The Death of Bretton Woods, 1971

…on 15 August 1971 President Nixon announced the eviction of Europe and Japan from the dollar zone. Bretton Woods was dead. The door had been opened on a new and truly dismal phase in capitalism’s evolution.

The Fed’s Dismal Chairmen: Volcker, Greenspan, Bernanke

In 2002, thirty years after the Nixon Shock, humanity’s total income approximated $50 trillion. In the same year, financiers around the world had wagered $70 trillion on a variety of bets…. By 2007 humanity’s total income had risen from $50 to $75 trillion – a decent 50 percent increase over five years. But the sum of bets in the global money market had gone up from $70 to $750 trillion – a rise of nearly 1,000 percent…Bretton Woods was designed to prevent such greed-fueled recklessness from bringing humanity to the brink of another Great Depression, indeed another world war, ever again.
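The gap described above becomes starker when expressed as annualized growth. A minimal arithmetic sketch (the dollar figures are as quoted in the book; the helper function is mine):

```python
# Compound annual growth rate implied by the figures quoted above:
# world income $50T -> $75T and money-market bets $70T -> $750T, 2002-2007.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

income_growth = cagr(50, 75, 5)   # world income, trillions of dollars
bets_growth = cagr(70, 750, 5)    # global money-market bets

print(f"income grew ~{income_growth:.1%}/yr; bets grew ~{bets_growth:.1%}/yr")
```

Real incomes compounding at roughly 8 percent a year while financial bets compound at roughly 60 percent a year is precisely the divergence the passage is pointing at.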

Once they lost their fixed exchange with the dollar, the dollar value of European and Japanese money began fluctuating wildly…The dollar became the only safe harbor, courtesy of its exorbitant privilege, namely, that if any French, Japanese, or Indonesian company, indeed anyone wanted to import oil, copper, steel, or even just space on a freight ship, they had to pay in dollars…The Nixon Shock produced a magic trick for the ages: the country going deeper and deeper into the red was the country whose currency was becoming more and more hegemonic…But there was another reason why the dollar’s hegemony grew: the intentional impoverishment of America’s working class…It is also no coincidence that union busting became a thing in the 1970s.

Crash of 2008

With investment first knocked out by the crash of 2008 and finished off soon after by austerity, throwing new money at the financiers was never going to resurrect it. Put yourself in the position of a capitalist at a time when austerity is eliminating your customers’ income. Suppose I give you a billion dollars to play with for free, i.e. at a zero interest rate. Naturally you will take the free billion but, as we’ve established, you would be mad to invest it in new production lines. So what are you going to do with the free cash? You could buy real estate or art or, better still, shares in your own company. That way, the shares in your company appreciate in value, and if you are the CEO running it, your stature and share-linked bonuses rise too. No new investment, in other words, but a lot more power in the hands of the powerful.

For while the American deficit returned with a vengeance a year after the crash of 2008 and the subsequent bankers’ bailouts, it never restored the beast’s capacity to recycle the world’s profits. True, the rest of the world continued to send most of its profits to Wall Street. But the recycling mechanism was broken: only a small fraction of the monies rushing to Wall Street returned in the form of tangible investments in factories, technologies, agriculture. Most of the world’s money rushed to Wall Street to stay in Wall Street. There, it sloshed around doing nothing useful. As it piled up, it bid up share prices, thus giving the Jills and the Jacks of finance yet another opportunity to do stupid things at a mammoth scale.

When an activist state makes fabulously wealthier the same bankers whose quasi-criminal activities brought misery to the majority, while the majority are punished with self-defeating austerity, two new calamities beckon: poisoned politics and permanent stagnation. The poisoned politics we need not elaborate on – from Greece’s neo-Nazis to America’s Donald Trump we have all lived through the nightmare. But permanent stagnation? Why would more wealth for the ultra-rich stagnate capitalism? And how did it lead to the funding of cloud capital?

Cloud proles and cloud serfs

…capital has hitherto been reproduced within some labor market – within the factory, the office, the warehouse. Aided by machines, it was waged workers who produced the stuff that was sold to generate profits, which in turn financed their wages and the production of more machines – that’s how capital accumulated and reproduced. Cloud capital, in contrast, can reproduce itself in ways that involve no waged labor. How? By commanding almost the whole of humanity to chip in to its reproduction – for free…By doing so, we shall see that while workers have become ‘cloud proles’ we all have become ‘cloud serfs’…Cloud proles – my term for waged workers driven to their physical limits by cloud-based algorithms – suffer at work in ways that would be instantly recognized by whole generations of earlier proletarians. (As in Chaplin’s 1936 movie Modern Times)
Workers employed by General Electric, Exxon-Mobil, General Motors or any other major conglomerate collect in salaries and wages approximately 80 percent of the company’s income. This proportion grows larger in smaller firms. Big Tech’s workers, in contrast, collect less than 1 percent of their firm’s revenues. The reason is that paid labor performs only a fraction of the work that Big Tech relies on. Most of the work is performed by billions of people for free…The fact that we do so voluntarily, happily even, does not detract from the fact that we are unpaid manufacturers – cloud serfs whose daily self-directed toil enriches a tiny band of multibillionaires residing mostly in California and Shanghai.

Amazon and Jeff Bezos: The End of Capitalism

Enter Amazon.com and you have exited capitalism. Despite all the buying and selling that goes on there, you have entered a realm which can’t be thought of as a market, not even a digital one…Even the ugliest of markets are meeting places where people can interact and exchange information reasonably freely. In fact, it’s even worse than a totally monopolized market – there at least, the buyers can talk to each other, form associations, perhaps organize a consumer boycott to force the monopolist to reduce a price or to improve quality. Not so in Jeff’s realm, where everything and everyone is intermediated not by the disinterested invisible hand of the market but by an algorithm that works for Jeff’s bottom line and dances exclusively to his tune.
(Amazon is) a type of digital fief…A post-capitalist one, whose historical roots remain in feudal Europe but whose integrity is maintained today by a futuristic, dystopian type of cloud-based capital.

Tesla and Elon Musk: Amazon Copycat


Copycat ecommerce platforms, offering variations on the Amazon theme, are springing up everywhere, in the Global South as well as the Global North. More significantly, other industrial sectors are turning into cloud fiefs too. Take for example Tesla…Elon Musk’s successful electric car company. One reason financiers value it so much higher than Ford or Toyota is that its cars’ every circuit is wired into cloud capital. Besides giving Tesla the power to switch off its cars remotely if, for instance, the driver fails to service them as the company wishes, merely by driving around Tesla owners are uploading in real time information (including what music they are listening to!) that enriches the company’s cloud capital.

A.I. Algorithms produce Cloud Proles and Cloud Serfs

It took mind-bending scientific breakthroughs, fantastical-sounding neural networks and imagination-defying A.I. programs to accomplish what? To turn workers toiling in warehouses, driving cabs and delivering food into cloud proles. To create a world where markets are increasingly replaced by cloud fiefs. To force businesses into the role of vassals. And to turn all of us into cloud serfs, glued to our smartphones and tablets, eagerly producing the cloud capital that keeps our new overlords on cloud nine.

Privatization of the Internet Commons

Capitalism surfaced when owners of capital goods (steam engines, machine tools, spinning jennies, telegraph poles, etc.) acquired the power to command people and nations – powers that far exceeded, for the first time, those of landowners. It was a Great Transformation made possible by the prior privatization of common lands. Same with cloud capital. To acquire its ever greater powers to command, it too required the prior privatization of another crucial commons: Internet One.
Previously, to exercise capital’s power to command and make other humans work faster and consume more, capitalists required two types of professionals: managers and marketeers. Especially under the auspices of the post-war technostructure, these two service professions achieved greater prominence even than bankers and insurance brokers…Then cloud capital arrived. In one fell swoop it automated both roles. The exercise of capital’s power to command workers and consumers alike was handed over to the algorithms. This was a far more revolutionary step than replacing autoworkers with industrial robots. After all, industrial robots simply do what automation has been doing since before the Luddites: making proletarians redundant, or more miserable, or both. No, the truly historic disruption was to automate capital’s power to command people outside the factory, the shop or office – to turn all of us, cloud proles (blue-collar working class) and everyone else, into cloud serfs in the direct (unremunerated) service of cloud capital, unmediated by any market.
Meanwhile, conventional capitalist manufacturers increasingly have no option but to sell their goods at the discretion of the cloudalists, paying them a fee for the privilege, developing a relationship with them no different to that of vassals vis-a-vis their feudal overlords.

The Apple iPhone and the Apple Store


The stroke of genius that unlocked cloud rent for Steve Jobs was his radical idea to invite ‘third party developers’ to use free Apple software with which to produce applications for sale via the Apple Store. In one fell swoop Apple had created an array of unwaged laborers and vassal capitalists whose hard work yielded a host of capabilities available exclusively to iPhone owners in the form of thousands of desirable apps that Apple engineers could never have produced themselves in such variety or volume.

Google’s Android Operating System and Google Play


Only one other conglomerate managed to persuade a significant proportion of those developers to create apps for its own store: Google. Long before the iPhone arrived, Google’s search engine had become the centerpiece of a cloud empire which included Gmail and YouTube, and which would later include Google Drive, Google Maps and a host of other online services…Google followed a different strategy to Apple’s. Instead of manufacturing a handset in competition with the iPhone, it developed Android, an operating system that could be installed for free on the smartphones of any manufacturer, including Sony, Blackberry and Nokia, who chose to use it. The idea was that if enough of Apple’s competitors installed it (Android) on their phones, the pool of smartphones operating on the Android software would be large enough to lure third-party developers to produce apps not only for the Apple Store but for a new store running on Android software. That’s how Google created Google Play, the only serious alternative to the Apple Store.

Creation of Vassal Capitalists and the Precariat

But large or small, powerful or otherwise, all vassal capitalists are by definition dependent to a greater or lesser extent on selling their wares via an ecommerce site, whether Amazon or Ebay or Alibaba, with a sizable portion of their net earnings being skimmed off by the cloudalists they depend on.
Meanwhile, as Amazon was snaring makers of physical products within its cloud fief, other cloudalists were focusing their attention on the precariat (people whose employment and income are insecure). Companies like Uber, Lyft, Grubhub, DoorDash and Instacart in the Global North, along with their imitators in Asia and Africa, wired into their cloud fiefs a vast array of drivers, delivery people, cleaners, restaurateurs – even dog walkers – collecting from these unwaged, piece-rate workers a fixed cut of their earnings too. A cloud rent.
The Great Transformation from feudalism to capitalism was predicated on the usurpation of rent by profits as the driving force of our socio-economic system. That was why the word capitalism proved so much more useful and insightful than a term like market feudalism. It is this fundamental fact – that we have entered a socio-economic system powered not by profit but by rent – that demands we use a new term to describe it. To think of it as hyper-capitalism or rentier capitalism would be to miss this essential defining principle. And to reflect the return of rent to its central role, I can think of no better name than technofeudalism.

Technofeudalism Underlies the Great Inflation

…the Great Inflation and cost-of-living crisis that have followed the recent pandemic cannot be properly understood outside the context of technofeudalism…I recounted how for twelve long years after the crash of 2008, central banks printed trillions to replace the bankers’ losses. We saw how socialism for bankers and austerity for the rest of us dampened investment, blunted Western capitalism’s dynamism and pushed it into a state of gilded stagnation. The only serious investment of the central banks’ poisoned money during this time went into the accumulation of cloud capital. By 2020, cloud rents accruing to cloud capital accounted for much of the developed world’s aggregate new income.
…rent’s stunning comeback could only mean deeper and more toxic stagnation. Wages get spent by the many struggling to make ends meet. Profits get invested in capital goods to maintain the capitalists’ capacity to profit. But rent is stashed away in property (mansions, yachts, art, cryptocurrencies, etc.) and stubbornly refuses to enter circulation, stimulate investment in useful things, and revive flaccid capitalist societies. And so the vicious cycle begins: deeper stagnation.
The pandemic (2020) exacerbated the same trend. The only significant difference from the pre-pandemic period was that, this time, and for the first time since 2008, some of the fresh trillions printed by the central banks were spent by governments on the population, to keep their citizens alive while locked down. Nevertheless, most of the new monies ended up bolstering the share price of Big Tech corporations. This explains the report of the Swiss bank UBS, published in October 2020, which found that billionaires had increased their wealth by more than a quarter (27.5 per cent) between April and July of that year, just as millions of people around the world lost their jobs or were struggling to get by on government schemes. What happens when supply suddenly dies? Especially during times when the locked-down masses get some income support from the central banks’ money tree. The price of groceries, exercise bikes, bread makers, natural gas, petrol, housing and a host of other goods goes through the roof and, following a dozen years of subdued prices, a Great Inflation sets in.
When, for whatever reason, prices surge across the board, a social power game is afoot in which everyone attempts to suss out their bargaining power. Business managers try to work out how far they can raise prices – if not to profit then, at least, to recoup their own rising costs. Rentiers, both traditional and cloudalists, test the water with rent hikes. Workers assess the extent to which they can push for a pay rise – at least to compensate for the higher bills they must meet. Governments play the game too: do they intervene by using the greater income and VAT tax receipts flowing from the rising prices to assist weaker citizens being crushed by inflation? Or do they subsidize Big Business as it is squeezed by high energy prices? Or do they do nothing much? Until these questions get answered inflation continues to roll.

Delayed Green Energy Adoption

The need to switch from fossil fuels to green energy could not be more urgent. The rise in energy costs that is an integral part of the Great Inflation would seem to have taken us away from that goal, offering a windfall to the fossil fuel industry. But this will not last long. Advances in green energy are rapidly pushing down the costs of green electricity generation. Even though the life cycle of fossil fuels has been extended, ruinously for the planet, cloud-based green energy is growing – and, with it, so is the relative power of cloudalists.

China’s Dark Deal: Post-1971 Global Capitalism

From the 1970s onward, global capitalism was founded on this fascinating recycling of, mainly, Asian manufacturing profits into American rents, which in turn sustained the American imports that provided Asian factories with sufficient demand.
Why call it a Dark Deal? Because in the small print of this pact between America’s and East Asia’s ruling classes was written misery for workers on both sides of the Pacific. American workers faced the exploitation and immiseration that resulted from underinvestment and its industrial heartland being hollowed out by manufacturing in Asia and the underdeveloped Global South. Meanwhile, in China’s fast-industrializing coastal cities, workers suffered the frenzied exploitation associated with overinvestment…
Then came the crash of 2008. This had two main effects that, together, underpin today’s New Cold War: it strengthened China’s position in the global recycling mechanism, and it turbocharged the build-up of cloud capital both in the United States and China.
…when the bottom fell out of Wall Street, China stabilized global capitalism by cranking up domestic investment to more than half of China’s national income. It worked in that Chinese investment took up much of the global slack caused by Western commitment to austerity. China’s international stature rose, and its accumulating dollar surpluses allowed Beijing, in addition to feeding Wall Street, to become a major investor in Africa, Asia, even in Europe through its famed Belt and Road Initiative.

Chinese cloudalist agglomeration

…to grasp the enormity and nature of China’s big five cloudalist conglomerates – Alibaba, Tencent, Baidu, Ping An and JD.com – consider the following thought experiment. Imagine if, in the West, we were to roll into one Google, Facebook, Twitter, Instagram and the version of Chinese-owned TikTok still available to American users. Then include the applications that play the role that telephone companies used to: Skype, WhatsApp, Viber, Snapchat. Add to the mix ecommerce cloudalists like Amazon, Spotify, Netflix, Disney Plus, Airbnb, Uber and Orbitz. Lastly, throw in PayPal, Charles Schwab and every other Wall Street bank’s own app.
Unlike Silicon Valley’s Big Tech, China’s is directly bound into government agencies that make all-pervading use of this cloudalist agglomeration: to regulate urban life, to promote financial services to unbanked citizens, to link its people with state health care facilities, to conduct surveillance of them using facial recognition, to guide autonomous vehicles through the streets – and, outside its borders, to connect Africans and Asians participating in China’s Belt and Road Initiative to its super cloud fief.
With this great leap into financial services, China’s cloudalists acquire a 360-degree view of their users’ social and financial life. If cloud capital is a produced means of behavior modification, Chinese cloudalists have accumulated cloud capital beyond the wildest dreams of their Silicon Valley competitors, who, by comparison, enjoy far less power per capita to accumulate cloud rent.

American Dollar Reign

It (dollar’s reign) has allowed countries with large trade surpluses, like China and Germany, to convert their excess production – their net exports – into property and rents in the United States: real estate, US government bonds, and any companies that Washington allowed them to own. Without the dollar’s global role, Chinese, Japanese, Korean, or German capitalists would never have been able to extract such colossal surplus value from their workers and then stash it away somewhere safe. Michael Pettis: “While the US dollar may create an exorbitant privilege for certain American constituencies, this status creates an exorbitant burden for the US economy overall, especially for the vast majority of Americans who must pay for the corresponding trade deficits either with higher unemployment, more household debt, or greater fiscal deficits.”

Cloud Capital’s Effect on the Liberal Individual

It (cloud capital) has produced individuals who are not so much possessive as possessed, or rather persons incapable of being self-possessed. It has diminished our capacity to focus by co-opting our attention. We have not become weak-willed. No, our focus has been stolen. And because technofeudalism’s algorithms are known to reinforce patriarchy, stereotypes and pre-existing oppressions, those that are most vulnerable – girls, the mentally ill, the marginalized and yes, the poor – suffer the outcomes most…Bigotry is technofeudalism’s emotional compensation for the frustrations and anxieties we experience in relation to identity and focus…it is intrinsic to cloud capital, whose algorithms optimize for cloud rents, which flow more copiously from hatred and discontent.
And therein lies the greatest contradiction: to rescue that foundational liberal idea – the liberty of self-ownership—will therefore require a comprehensive reconfiguration of property rights, over the increasingly cloud-based instruments of production, distribution, collaboration, and communication. To resuscitate the liberal individual, we need to do something that liberals detest: plan a new revolution.
To stand a chance of overthrowing technofeudalism and putting the demos back into democracy, we need to gather together not just the traditional proletariat and the cloud proles but also the cloud serfs and, indeed, at least some of the vassal capitalists. Nothing less than such a grand coalition that includes them all can undermine technofeudalism sufficiently.

Cloud mobilization

The beauty of cloud mobilization is that it stands on its head the conventional calculus of collective action. Instead of maximal personal sacrifice for minimal collective gain, we now have the opposite: minimal personal sacrifice delivering large collective and personal gains. This reversal has the potential to pave the way toward a coalition of cloud serfs and cloud proles that is large enough to disrupt cloudalists control over billions of people.
Under technofeudalism, we no longer own our minds. Every proletarian is turning into a cloud prole during working hours and into a cloud serf the rest of the time. Every self-employed striver mutates into a cloud vassal, while every self-employed struggler becomes a cloud serf. While privatization and private equity asset-strip all physical wealth around us, cloud capital goes about the business of asset-stripping our brains. To own our minds individually, we must own cloud capital collectively. It’s the only way we can turn our cloud-based artifacts from a produced means of behavior modification into a produced means of human collaboration and emancipation.
For a more thorough discussion of the required mobilization see Yanis Varoufakis’s novel, Another Now, 2021.

Pentagon Militarizes Climate Change

Thursday, June 15th, 2023

The Pentagon, Climate Change, and War; Charting the Rise and Fall of U.S. Military Emissions, Neta C. Crawford, 2023

For most of the last 250 years humans have carried on with making “progress” – industrialization in the service of the good life – assuming that fossil fuels were an essential ingredient in shaping the world we were certain would be better than the one we were leaving behind. Our grand strategy for security took for granted that we would need fossil fuel for industry and fossil fuel for war. Only in the last fifty years or so has it become clear that burning all that fuel, and at the same time destroying the forests and the wetlands that take up the carbon released by the fire, is not just leaving the past behind, but destroying the possibility of preserving what is increasingly – in essential life-giving respects – understood as a better world.

In sum, the economy, foreign policy beliefs, and military doctrine institutionalized greater demand for fossil fuels. The deep cycle of oil demand, consumption, militarization, and conflict begins with demand for oil and increasing consumption. Then, when U.S. policy makers feel anxious about guaranteeing oil supplies in the face of dependency, or are concerned about the price of oil, they back allies in the Persian Gulf and greater Middle East, as occurred in 1946 and 1949, in 1957, in 1973, in 1980 and 1990. The United States played favorites among the states that had large oil reserves, even if some of those government leaders were autocratic, such as the Shah of Iran, Saddam Hussein, and the leaders of Saudi Arabia.
Yet, the risk of supporting authoritarian regimes is that those regimes are increasingly unstable as the citizens who demand more say in their government push back against authoritarian kings, emirs, and shahs. When challenges to the undemocratic, autocratic or kleptocratic rulers within states with large oil supplies occurred, or there were external challenges, the United States sometimes backed the leaders or system that it thought could bring stability. Thus with the Eisenhower Doctrine, the United States backed Saudi Arabia’s King Saud and Crown Prince Faisal as a way to balance against Egyptian leader Gamal Abdel Nasser. At times, as in the case of backing the Shah of Iran in the 1970s and Saddam Hussein in Iraq in the 1980s, these alliances backfired…This in turn increases the sense among U.S. elites that the Middle East is a volatile region that needs U.S. intervention to remain stable.

The military has understood the science and the consequences of global warming quite well for decades. They paid for much of that research. National security strategists have sounded muted alarms, the Pentagon has adapted some of their equipment and operations, and experts have imagined scenarios of increasingly dire complex emergencies and catastrophes and climate-caused wars.
Pentagon leaders have been farsighted and tactically flexible. Some of the smartest, best-trained and most determined people on the planet, given the resources of the richest nation on earth, the people at the Pentagon are trying to make things better…They have developed better batteries, put up solar arrays at bases, and even thought about moving some bases.


Camp Lejeune Hurricane Destruction 2018

Camp Lejeune (North Carolina) was hit by Hurricane Florence in September 2018 and suffered $3.6 billion worth of damages. In October 2018, Hurricane Michael, a Category 5 storm, devastated much of Tyndall Air Force Base in Florida, damaging F-22 aircraft beyond repair and destroying hundreds of buildings…Repairing and reconstructing the base was estimated to cost $4.9 billion, and the rebuilding was expected to last as long as five to seven years.

TYNDALL AIR FORCE BASE, Florida — Hangars once used to keep aircraft out of the elements now lie scattered across the flight line following Hurricane Michael on October 10, 2018. Hurricane Michael is the third largest hurricane to make landfall in the United States, reaching peak winds of 155 miles per hour.


Tyndall Hurricane 2018

And yet at the same time, the Pentagon has been strategically inflexible and blind…For the most part, the armed forces, and our political leaders, have not put away the tools and habits of mind that got us here in the first place. Our grand strategy for national security has not fundamentally changed. In some ways it can’t, because it is premised on the anticipation and fear of war—the idea that for us to be safe, we have to be prepared to meet every threat anywhere at any time with overwhelming force. The national security strategy is also premised on the idea that force works, that the threat of coercion and the actuality of destruction can get us what we want. So once we believe those things—that war is possible and may be imminent, that we must have a capacity to make war that far exceeds our enemy’s abilities, and that coercion and destruction are effective—it seems that the only way to deal with the threat of climate change-caused war is to prepare for more war. Of course, in preparing for more war, governments give the armed forces everything they need: money, weapons, people, bases, and fossil fuels. We defend and protect the oil we think we need to defend ourselves…At the same time that we are making our weapons more energy efficient and the installations more resilient, we scarcely question whether war is inevitable or in fact made more likely by our bases and our burning fuel to be the most powerful nation on earth…The military is inadvertently or perhaps deliberately militarizing climate change.

Ode to (Gordon) Moore’s Law

Monday, February 6th, 2023

Chip War, The Fight For the World’s Most Critical Technology, 2022, Christopher Miller

This reader is a retired software engineer whose 40-year career began with punch card mainframes and ended with microcontrollers with embedded graphic displays, WiFi, and flash memory on a single chip. My introduction to computing was as a graduate research assistant on an ARPA-funded project to study the dimensionality of nations. Since that time I have followed the development of the ARPANET. I have lived through the profound impact of Moore's Law, needing to constantly anticipate where technology would be when project development required several years before product introduction. Moore's Law has still not been broken after all these years.
To get some sense of what an exponential increase in transistors looks like, consider the 1980s Cray-2 supercomputer, which was export restricted for national security reasons. The Cray-2 stood nearly 4 feet tall with a 5.5-foot diameter and weighed 5,500 pounds. The iPhone 12 is 5,000 times faster than the Cray-2!
Exponential growth of a technology is unique in human history.
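To make that doubling concrete, here is a minimal back-of-the-envelope sketch (my own illustration, not from Miller's book), taking the 1971 Intel 4004's roughly 2,300 transistors and the usual two-year doubling period as assumptions:

```python
# Back-of-the-envelope Moore's Law: transistor count doubles every two years.
# Starting point: Intel 4004 (1971), ~2,300 transistors (the textbook figure).
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count, assuming one doubling every `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{transistors(year):,.0f} transistors per chip")
```

Run forward fifty years, the same simple rule projects tens of billions of transistors per chip, which is the right order of magnitude for today's largest processors.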

  • If the computing power on each chip continued to grow exponentially, Moore realized, the integrated circuit would revolutionize society far beyond rockets and radars…At Fairchild, Noyce and Moore were already dreaming of personal computers and mobile phones.

    Alongside the rise of these new industrial titans (Intel and Micron), a new set of scientists were preparing a leap forward in chipmaking and devising revolutionary new ways to use processing power. Many of these developments occurred in coordination with government efforts, usually not the heavy hand of Congress or the White House, but the work of small, nimble organizations like DARPA (originally ARPA; Larry Roberts, director 1966-1973) that were empowered to take big bets on futuristic technologies — and to build the educational and R&D infrastructure that such gambles required.


    Morris Chang Founder of TSMC

    (Dutch) ASML's history of being spun out of Philips helped in a surprising way, too, facilitating a deep relationship with Taiwan's TSMC (founder Morris Chang). Philips had been the cornerstone investor in TSMC, transferring its manufacturing process technology and intellectual property to the young foundry. This gave ASML a built-in market, because TSMC's fabs were designed around Philips's manufacturing processes. An accidental fire in TSMC's fab in 1989 helped too, causing TSMC to buy nineteen additional lithography machines, paid for by the fire insurance. Both ASML and TSMC started as small firms on the periphery of the chip industry, but they grew together, forming a partnership without which advances in computing today would have ground to a halt.

    The next generation EUV (extreme ultraviolet) lithography would therefore be mostly assembled abroad, though some components continued to be built in a facility in Connecticut. Anyone who raised the question of how the U.S. would guarantee access to EUV tools was accused of retaining a Cold War mindset in a globalizing world. Yet the business gurus who spoke about technology spreading globally misrepresented the dynamic at play. The scientific networks that produced EUV spanned the world, bringing together scientists from countries as diverse as America, Japan, Slovenia, and Greece. However, the manufacturing of EUV wasn't globalized, it was monopolized. A single supply chain managed by a single company would control the future of lithography.

    By the mid-2000s, just as cloud computing was emerging, Intel had won a near monopoly over data center chips, competing only with AMD. Today nearly every major data center uses x86 chips from either Intel or AMD. The cloud can't function without their processors… Some companies tried challenging x86's position as the industry standard in PCs. In 1990 Apple and two partners established a joint venture called Arm, based in Cambridge, England. The aim was to design processor chips using a new instruction set architecture based on the simpler RISC (reduced instruction set computer) principles that Intel had considered but rejected. As a startup, Arm faced no costs of shifting away from x86, because it had no business and no customers. Instead, it wanted to replace x86 at the center of the computing ecosystem. Arm's first CEO, Robin Saxby, had vast ambitions for the twelve-person startup…However, Arm failed to win market share in PCs in the 1990s and 2000s, because Intel's partnership with Microsoft's Windows operating system was simply too strong to challenge. However, Arm's simplified, energy-efficient architecture quickly became popular in small, portable devices that had to economize on battery use. Nintendo chose Arm-based chips for its handheld video games…

    The problem wasn’t that no one realized Intel ought to consider new products, but that the status quo was simply too profitable. If Intel did nothing at all it would still own two of the world’s most valuable castles–PC and server chips–surrounded by a deep x86 moat.

    Intel turned down the iPhone contract…Apple looked elsewhere for its phone chips. Jobs turned to Arm's architecture, which, unlike the x86, was optimized for mobile devices that had to economize on power consumption. The early iPhone processors were produced by Samsung (founder Lee Byung-chul), which had followed TSMC into the foundry business… By the time Otellini (Intel) realized his mistake, however, it was too late.

    By the 2000s, it was common to split the semiconductor industry into three categories. "Logic" refers to the processors that run smartphones, computers, and servers. "Memory" refers to DRAM, which provides the short-term memory computers need to operate, and flash, also called NAND, which remembers data over time. The third category of chips is more diffuse, including analog chips like sensors that convert visual or audio signals into digital data, radio frequency chips that communicate with cell phone networks, and semiconductors that manage how devices use electricity.

    It (America's second-class status) dates to the late 1980s when Japan first overtook the U.S. in DRAM output. The big shift in recent years is the collapse in the share of logic chips produced in the United States. Today, building an advanced logic fab costs $20 billion, an enormous capital investment that few firms can afford…Given the benefits of scale, the number of firms fabricating advanced logic chips has shrunk relentlessly.


    Jensen Huang CEO Nvidia

    Nvidia (which became dominant in graphics) not only designed chips called graphics processing units (GPUs) capable of handling 3D graphics, it also devised a software ecosystem around them. Making realistic graphics requires the use of programs called shaders, which tell all the pixels in an image how they should be portrayed in, say, a given shade of light. The shader is applied to each of the pixels in an image, a relatively straightforward calculation conducted over many thousands of pixels. Nvidia's GPUs can render images quickly because, unlike Intel's microprocessors or other general-purpose CPUs, they're structured to conduct lots of simple calculations–like shading pixels–simultaneously.
    In 2006, realizing that high-speed parallel computations could be used for purposes besides computer graphics, Nvidia released CUDA, software that lets GPUs be programmed in a standard programming language, without any reference to graphics at all. Even as Nvidia was churning out top-notch graphics chips, Huang (CEO) spent lavishly on this software effort, at least $10 billion,…to let any programmer–not just graphics experts–work with Nvidia's chips…Nvidia discovered a vast new market for parallel processing, from computational chemistry to weather forecasting. At the time, Huang could only dimly perceive the potential growth in what would become the biggest use case for parallel processing, artificial intelligence.
    Today Nvidia’s chips, largely manufactured by TSMC, are found in most advanced data centers.
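The pixel-shading idea described above can be sketched in a few lines of ordinary Python (a toy illustration of the concept, not actual GPU or CUDA code; the `brightness_shader` function and sample image are invented for the example):

```python
# Toy "shader": one simple function applied independently to every pixel.
# Because each pixel's calculation is independent of the others, a GPU can
# run thousands of them at once -- the parallelism the passage describes.
def brightness_shader(pixel, factor=0.5):
    """Scale a grayscale pixel value (0-255), clamping to the valid range."""
    return min(255, int(pixel * factor))

image = [[10, 200, 255],
         [0, 128, 64]]

# On a CPU this runs as a sequential loop; a GPU would map each pixel
# to its own thread and shade them all simultaneously.
shaded = [[brightness_shader(p) for p in row] for row in image]
print(shaded)  # [[5, 100, 127], [0, 64, 32]]
```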


    Dr. Irwin Jacobs, co-founder of Qualcomm

    For each generation of cell phone technology after 2G, Qualcomm contributed key ideas about how to transmit more data in the radio spectrum and sold specialized chips with the computing power capable of deciphering this cacophony of signals. The company's patents are so fundamental that it's impossible to make a cell phone without them. Qualcomm soon diversified into a new business line, designing not only the modem chips in a phone that communicate with a cell network, but also the application processors that run a smartphone's core systems. These chip designs are monumental engineering accomplishments, each built on tens of millions of lines of code.

    For many years, each generation of manufacturing technology was named after the length of the transistor’s gate, the part of the silicon chip whose conductivity would be turned on and off, creating and interrupting the circuit. The 180nm node was pioneered in 1999, followed by 130nm, 90nm, 65nm, and 45nm, with each generation shrinking transistors enough to make it possible to cram roughly twice as many in the same area. This reduced power consumption per transistor, because smaller transistors needed fewer electrons to flow through them.
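The scaling arithmetic in that node sequence can be checked directly (a sketch using the figures from the passage; the ~0.7x linear shrink per generation is the standard rule of thumb, not a number from the book): since transistors per unit area scale with the inverse square of feature size, shrinking linear dimensions by about 0.7x roughly doubles density.

```python
# Each process node shrinks linear dimensions by roughly 0.7x;
# density scales with the square of the shrink, giving ~2x per generation.
nodes = [180, 130, 90, 65, 45]  # nm, the node sequence from the passage

for prev, new in zip(nodes, nodes[1:]):
    linear = new / prev              # ~0.7x linear shrink
    density = (prev / new) ** 2      # ~2x transistors per unit area
    print(f"{prev}nm -> {new}nm: linear x{linear:.2f}, density x{density:.1f}")
```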
    Around the early 2010s, it became unfeasible to pack transistors more densely by shrinking them two-dimensionally. One challenge was that, as transistors were shrunk according to Moore's Law, the narrow length of the conductor channel occasionally caused power to "leak" through the circuit even when the switch was off. On top of this, the layer of silicon dioxide atop each transistor became so thin that quantum effects like "tunneling"–jumping through barriers that classical physics said should be insurmountable–began seriously impacting transistor performance. By the mid-2000s, the layer of silicon dioxide on top of each transistor was only a couple of atoms thick, too small to keep a lid on all the electrons sitting in the silicon.
    To better control the movement of electrons, new materials and transistor designs were needed. Unlike the 2D design used since the 1960s, the 22nm node introduced a new 3D transistor, called a FinFET (pronounced finfet), that sets the two ends of the circuit and the channel of semiconductor material that connects them on top of a block, looking like a fin protruding from a whale's back. The channel that connects the two ends of the circuit can therefore have an electric field applied not only from the top but also from the sides of the fin, enhancing control over the electrons and overcoming the electricity leakage that was threatening the performance of new generations of tiny transistors…These nanometer-scale 3D structures were crucial for the survival of Moore's Law, but they were staggeringly difficult to make, requiring even more precision in deposition, etching, and lithography. This added uncertainty about whether the major chip-makers would all flawlessly execute the switch to FinFET architectures or whether one might fall behind…Moreover, the 2008-2009 financial crisis was threatening to reorder the chip industry. Consumers stopped buying electronics, so tech firms stopped ordering chips.

    Smartphones and PCs are both assembled largely in China with high-value components mostly designed in the U.S., Europe, Japan, or Korea. For PCs, most processors come from Intel and are produced at one of the company’s fabs in the U.S., Ireland, or Israel. Smartphones are different. They are stuffed full of chips, not only the main processor (which Apple designs itself), but modem and radio frequency chips for connecting with cellular networks, chips for WiFi and Bluetooth connections, an image sensor for the camera, at least two memory chips, chips that sense motion (so your phone knows when you turn it horizontal), as well as semiconductors that manage the battery, the audio, the wireless charging. These chips make up most of the bill of materials needed to build a smartphone.
    As semiconductor fabrication capacity migrated to Taiwan and South Korea, so too did the ability to produce many of these chips. Application processors, the electronic brain inside each smartphone, are mostly produced in Taiwan and South Korea before being sent to China for final assembly inside a phone's plastic case and glass screen. Apple's iPhone processors are fabricated exclusively in Taiwan.

    These tricks kept Moore’s Law alive, as the chip industry shrank transistors from the 180nm node in the late 1990s, through the early stages of 3D FinFET chips, which were ready for high-volume manufacturing by the mid-2010s.
    However, there were only so many optical tricks that could help 193nm light carve smaller features. Each new workaround added time and cost money. By the mid-2010s, it might have been possible to eke out a couple of additional improvements, but Moore's Law needed better lithography tools to carve smaller shapes. The only hope was that the hugely delayed EUV lithography tools, which had been in development since the early 1990s, could finally be made to work at a commercial scale.

    Even the deep pockets of the Persian Gulf royals who owned GlobalFoundries weren’t deep enough. The number of companies capable of fabricating leading-edge (7nm) chips fell from four to three. (TSMC, Intel, and Samsung).

    As investors bet that data centers will require ever more GPUs, Nvidia has become America's most valuable semiconductor company. Its ascent isn't assured, however, because in addition to buying Nvidia chips the big cloud companies — Google, Amazon, Microsoft, Facebook, Tencent, Alibaba, and others — have also begun designing their own chips, specialized to their processing needs, with a focus on artificial intelligence and machine learning.


    Master Chip Designer Jim Keller

    Gordon Moore’s famous law is only a prediction, not a fact of physics…At some point, the laws of physics will make it impossible to shrink transistors further. Even before then, it could become too costly to manufacture them. The rate of cost declines has already significantly slowed. The tools needed to make ever-smaller chips are staggeringly expensive, none more so than the EUV lithography machines that cost more than $100 million each.
    The end of Moore’s Law would be devastating for the semiconductor industry — and for the world. We produce more transistors each year only because it’s economically viable to do so.
    The durability of Moore's Law, in other words, has surprised even the person it's named after and the person who coined it. It may well surprise today's pessimists too. Jim Keller, the star semiconductor designer who is widely credited for transformative work on chips at Apple, Tesla, AMD, and Intel, has said he sees a clear path toward a fifty times increase in the density with which transistors can be packed on chips. "We're not running out of atoms," Keller has said. "We know how to print single layers of atoms."

  • Smart WIFI Home Thermostats and the Notorious C-Wire

    Sunday, December 4th, 2022

    This is a short note for anyone struggling to install a new “Smart” WIFI thermostat in their home.

    We installed a new Carrier heat pump, model 38YRA, in our home in 2001. I soon replaced the simple thermostat that came with it with a Lux 1500 5-2 day programmable thermostat so we could take advantage of SRP's (our local power monopoly) time-of-use savings plan.

    Numerous vendors started offering Black Friday specials on the Honeywell Smart Thermostat 9000 series with color touchscreen for $99 (list price $199). Then I noticed that many local power providers, including SRP, were offering a $50 incentive to install "Smart" thermostats, so I ordered a Honeywell unit. Included in the box was a C-Wire Addendum. It seems that to power the new touchscreen WIFI thermostat you need full-time 24V power, which is delivered on a wire labeled C. But my wiring at the thermostat had only 4 wires, connected to G, Y, B-O, and RC-RH on the old Lux 1500. There were 3 additional unused wires at the thermostat, but none provided 24V power. The 3 wires had not been connected at the Carrier unit in our attic during installation.

    So I climbed into the attic (difficult) and removed a large cover panel from the Carrier unit, revealing the transformer that converts 120V to 24V and a connecting terminal board with wires labeled R, C, O, W2, and Y (5 wires). Terminal C had never been connected to a thermostat-bound wire. So I arbitrarily picked the unused brown wire and connected it to the C terminal. Like magic, I now had continuous 24V power and the new "Smart" thermostat powered up.

    Why didn’t the original installers in 2001 attach one of the unused 3 wires to the C terminal and save me and millions of other installers a lot of trouble installing “Smart” thermostats? Clearly the C wire existed in 2001.

    Tech can regenerate Rural Heartland America’s Economy

    Friday, April 22nd, 2022

    Dignity in a Digital Age; Making Tech Work for all of Us, Ro Khanna, 2022

    (Frederick) Douglass's vision informs this book, which, at its heart, is an attempt to imagine how technology can advance democratic patriotism, which is predicated on respecting the dignity of every American. The book shares Douglass's faith that we can be a composite nation–that we can embrace a holistic, resplendent American identity that is more than just a formal contract among citizens. It offers a blueprint for structuring the technology revolution to empower left-behind Americans, regardless of their background, so they have a stronger voice in our economic and political life, build thriving communities, and are on more equal footing to participate in the dynamic process of developing our national culture.

    We have looked thus far at how to respect dignity in a digital age domestically by focusing on distributing jobs, empowering workers, cultivating our freedoms, protecting online rights, creating deliberative forums, and including a multiplicity of voices in science policy.

    This era calls for an Essential Workers Bill of Rights…The framework would promise livable wages, benefits, and bargaining rights for workers. It envisions giving employees a voice in shaping automation and pushing back against intrusive surveillance and abusive supervisors–a particular challenge in a remote and distributed workplace, which makes organizing difficult. Until all workers reap the benefits of their hard work and are treated with dignity, the promise of the digital age remains unfulfilled.

    …but nearly 80 percent of venture capital funding goes to only three states: California, New York, and Massachusetts. While California had nearly 4,000 VC deals and New York 1,400 in 2019, the state of West Virginia had only one.

    A policy of creating new tech hubs should begin with the most promising locations and then gradually expand…Each hub would specialize in a few technologies based on their regional assets and expertise. The cutting edge fields would include quantum computing, data science, clean energy, cybersecurity, robotics, electronics manufacturing, and synthetic biology.

    According to a Harlem Capital report, as of 2019, there are only two hundred startups led by Black or Brown founders that received more than $1 million in funding, and the total investment in these Black or Brown led start-ups is just $6 billion over nearly a decade. (Total VC funds deployed are $130 billion annually.) The capital wealth gap is not just based on legacy assets but on a current, ongoing racial wealth generation gap.

    The moral case for prioritizing workers is that they’ve been denied the gains they’ve helped create, and dignity that they’ve deserved, for multiple decades now. But the dollars and cents economic rationale is straightforward as well. While paying workers more may not maximize short-term shareholder value, in the way that stock buybacks or dividend payments do, it will lead to more consumer spending.

    What they (Silicon Valley techies) may see as mundane and easily automated tasks actually are skilled ones, requiring dexterity, balance, judgment, practice, patience, precision, mapping, and a specific attention to detail.

    Joel Rogers, a law professor at the University of Wisconsin, famously laid out the "high road" strategy, arguing that firms that foster worker participation and develop worker talent will generate more wealth compared to firms that use a top-down, command model to get the most out of their workers at the cheapest wage. Rogers argued that "shared prosperity", "environmental sustainability", and workplace democracy were "necessary complements, not tragic tradeoffs", in maximizing revenue growth.

    He (Gary Becker) said that the fuel for modern economic growth is investments in on-the-job training, health, information, and research and development. In fact, Becker argued that human capital, which is the "knowledge, information, ideas, skills, and health of individuals", is "over 70 percent of the total capital in the United States."

    The ultimate design of Apple and Google's Covid app is consistent with many of my Internet Bill of Rights principles, and it is an example of technology designed to meet these standards. The app requires a user to consent before any data is collected, and it offers an easy-to-understand explanation of how the data will be used. There is no centralized database, negating the major risk of data theft in a breach or the need for deletion. If someone is Covid positive, they receive a digital code from their local health department to enter into their phone, which sends an anonymous exposure alert to every user in contact with them while they were infectious. Apple and Google both prioritized the need for interoperability so the app can work across platforms on any smartphone in the world. They made a concerted effort to minimize the data collected and committed not to have access to the data.

    If tech companies are serious about promoting constructive dialogue, they should work with behavioral scientists, political theorists, psychologists, and mental health professionals to experiment with new designs. Software engineers should not only optimize for attention with like and share buttons, but also prioritize engagement with a wide range of perspectives. They could construct platforms, for instance, that incentivize users to participate in diverse online communities. Pushing opposing material to users can motivate them to become even stronger proponents of their ideologies…The key is to foster open communication, and not simply self-affirmation.

    The best we may be able to hope for in an imperfect democracy is a plurality of online forums for political conversations that (a) demonstrate a threshold respect for our agency as participants by not engaging in data extraction, (b) are transparent in terms of both their speech standards and the priorities of their algorithms, and (c) comply with legitimate restrictions on unlawful speech consistent with our First Amendment jurisprudence.

    An Indiana University study that analyzed millions of tweets during the 2016 presidential election concluded that bots are a large contributor to the spread of misinformation and conspiracies. They share fabricated stories, make salacious content trend, and overwhelm investigatory teams with volume that makes it hard to find the offending posts. Most shockingly, the presence of fake accounts helps explain why "over 86% of shares and 75% of comments on German political Facebook from October 2018 to May 2019 were Alternative für Deutschland content" even though the far-right party never "exceeded 15% of public support in polls during this time period".

    The Living Planet Earth – Gaia and Solaris

    Friday, February 18th, 2022

    The Nutmeg’s Curse; Parables for a Planet in Crisis, Amitav Ghosh, 2021

    Out of these processes of subduing and muting was born the idea of "nature" as an inert entity, a conception that would in time become a basic tenet of what might be called "official modernity". This metaphysic, fundamentally an ideology of conquest, would eventually become hegemonic in the West, and it is now shared by the entire global elite: within its parameters the idea that a volcano can produce meaning, or that a nutmeg can be a protagonist in history, can never be anything other than a delusion or a "primitive superstition". To envision the world in this way was a crucial step toward making an inert Nature a reality. As Ben Ehrenreich observes: "Only once we imagined the world as dead could we dedicate ourselves to making it so."


    Standing Rock Protest against the Dakota Access Pipeline – Camp became a prayer camp because its lands are a site of ancestral knowledge

    Now, as humanity faces the possibility of a future in which living will indeed have turned into a battle for survival, it is becoming increasingly clear that Indigenous understandings of terraforming were, in fact, far more sophisticated than those of today’s techno-futurists.

    Exhaustion is a metaphor that occurs often in science fiction stories about terraforming. Swarms of aliens go off to conquer another planet because their own is “exhausted”. It is the same presumption that impels billionaires to plan the conquest of Mars, now that the Earth is “exhausted”…what the Earth is really exhausted of is not its resources, what it has lost is meaning.

    He (Tennyson) sees the ascent of his “crowning race” as coming about by the severing of every kind of earthly tie, through the overcoming of everything that links humanity to other creatures and animals. He even proposes what we might call an “end of history” or an “end of the world”…This is man’s final ascent, when all creation ends and he is united with God.

    As we watch the environmental and biological disasters that are now unfolding across the Earth, it is becoming ever harder to hold onto the belief that the planet is an inert body that exists merely to provide humans with resources. Instead, the Earth’s responses are increasingly reminiscent of the imaginary planet after which the Polish science fiction writer Stanislaw Lem named his brilliant novel, Solaris: when provoked by humans, Solaris begins to strike back in utterly unexpected and uncanny ways.

    James Lovelock: "Long ago the Greeks…gave to the Earth the name Gaia or, for short, Ge. In those days science and technology were one, and science, although less precise, had soul. As time passed this warm relationship faded and was replaced by the frigidity of schoolmen. The life sciences, no longer concerned with life, fell to classifying dead things and even to vivisection…Now at last there are signs of a change. Science becomes holistic again and rediscovers soul, and theology, moved by ecumenical forces, begins to realize that Gaia is not to be subdivided for academic convenience and that Ge is much more than a prefix."

    Other-than-human beings, forces, and entities, both manmade and earthly, could be pursuing their own ends, of which humans know nothing. Gaia, at once bountiful and monstrous, has now assumed a new avatar: Solaris.

    It is this compressed time frame (the last 30 years) that has made sure that non humans too are no longer as mute as they once were. Other beings and forces–bacteria, viruses, glaciers, forests, the jet stream– have also unmuted themselves and are now thrusting themselves so urgently on our attention that they can no longer be ignored or treated as elements of an inert Earth.

    So the true question then is not whether non humans can communicate and make meaning; rather we must ask: When and how did a small group of humans come to believe that other beings, including the majority of their own species, were incapable of articulation and agency? How were they able to establish the idea that non humans are mute, and without minds, as the dominant wisdom of the time?

    The reason why coal-powered mills began to edge out their water-powered competitors in the early nineteenth century was not that coal was cheaper or more efficient. Water-powered mills were just as productive as coal-fired mills, and far cheaper to operate. It was for social rather than technical reasons that steam-powered machines prevailed: because, for example, coal-powered mills allowed mill owners to locate their factories in densely crowded cities, where cheap labor was easily available. "The steam engine," writes Malm, "was a superior medium for extracting surplus wealth from the working class, because, unlike the waterwheel, it could be put virtually anywhere"…In short, steam, and thus coal, won out over water precisely because it empowered the dominant classes and was better suited to their favored regime of property.

    The role that fossil fuels play in war making is another, monstrously vital, aspect of their enmeshment with structures of power and forms of violence…Today the Pentagon is the single largest consumer of energy in the United States–and probably in the world…A single non-nuclear aircraft carrier consumes 5,621 gallons of fuel per hour; in other words, these vessels burn up as much fuel in one day as a small midwestern town might use in a year…In the 1990s the three branches of the US military consumed approximately 25 billion tons of fuel per year. This was more than a fifth of the country’s total consumption, and “more than the total commercial energy consumption of nearly two thirds of the world’s countries.”…Indeed, the predicament of the US Department of Defense is a refraction of the quandary that now confronts the world’s status quo powers: how do you reduce your dependence on the very “resources” on which your geopolitical power is founded?

    Viewed from this perspective, climate change is but one aspect of a much broader planetary crisis: it is not the prime cause of dislocation, but rather a cognate phenomenon. In this sense climate change, mass dislocations, pollution, environmental degradation, political breakdown, and the Covid-19 pandemic are all cognate effects of the ever-increasing acceleration of the last three decades. Not only are all these crises interlinked–they are all deeply rooted in history, and they are all ultimately driven by the dynamics of global power.

    Just as the Lakota, repeatedly displaced by wars and the rising of dammed waters, were herded into ever-shrinking reservations, so too are the refugees of today's geopolitical wars being forced into zones of containment in North Africa, the Sahara, Mexico, Central America, and islands like Nauru. They too are casualties in a conflict that is not recognizable as war, in the sense defined by Western legal theorists. Yet the parallels with the biopolitical wars of the past are perfectly clear to many indigenous peoples–thus the title of Nick Estes's powerful account of the environmental struggles of the Lakota and their kin, Our History Is the Future.

    Since the adoption in 1989 of the Washington Consensus, the ideologies and practices of settler colonialism have been actively promoted, in their neoliberal guise, by the world's most powerful countries, and have come to be almost universally adopted by national and global elites. It is those settler-colonial practices that are now being implemented by China, in Xinjiang; by Indonesia, in Papua; and by India, in Kashmir and in many of its forest regions.

    And surely it is no accident that today there exists a technology of last resort that many believe will ultimately work in favor of the neo-Europes: geo-engineering. As novel as it may seem, geoengineering is nothing other than terraforming carried literally into the stratosphere; it should by rights be called “strato-forming”. Today some of the richest and most powerful people, and institutions, in the West are openly promoting geo-engineering. Their enthusiasm makes it impossible to forget that “from the mid-eighteenth century onward, modern science explicitly supported empire, defining strategies for colonization.”

    This is the great burden that now rests upon the writers, artists, filmmakers, and everyone else who is involved in the telling of stories: to us falls the task of imaginatively restoring agency and voice to non humans. As with all the most important artistic endeavors in human history, this is a task that is at once aesthetic and political–and because of the magnitude of the crisis that besets the planet, it is now freighted with the most pressing moral urgency.

    Pope Francis: “A true ecological approach always becomes a social approach; it must integrate questions of justice in debates on the environment, so as to hear both the cry of the earth and the cry of the poor.”

    Shattering the Mythical basis of Anthropology and Archeology – Imagining Alternate Futures and Recovering our Freedom

    Tuesday, February 8th, 2022

    The Dawn of Everything; A New History of Humanity, David Graeber & David Wengrow, 2021

    Father Lallemant: “From the beginning of the world to the coming of the French, the Savages have never known what it was so solemnly to forbid anything to their people, under any penalty, however slight. They are free people, each of whom considers himself of as much consequence as the others; and they submit to their chiefs only in as far as it pleases them.”

    The democratic governance of the Wendat and Five Nations of the Haudenosaunee, which so impressed later European readers, was an expression of the same principle: if no compulsion was allowed, then obviously such social coherence as did exist had to be created through reasoned debate, persuasive arguments, and the establishment of social consensus.

    An impoverished French aristocrat known as Lahontan recorded his conversations with a Wendat statesman named Kandiaronk: “In short, they neither quarrel, nor fight, nor slander one another…They brand us for slaves, and call us miserable souls, whose life is not worth having, alleging that we degrade ourselves in subjecting ourselves to one man [the king] who possesses all the power, and is bound by no law but his own will.”

    If human beings, through most of our history, have moved back and forth fluidly between different social arrangements, assembling and dismantling hierarchies on a regular basis, maybe the real question should be ‘how did we get stuck?’. How did we end up in one single mode? How did we lose that political self-consciousness, once so typical of our species? How did we come to treat eminence and subservience not as temporary expedients, or even the pomp and circumstance of some kind of grand seasonal theatre, but as inescapable elements of the human condition? If we started out just playing games, at what point did we forget that we were playing?

    As indigenous legal scholars have been pointing out for years, the ‘Agriculture Argument’ makes no sense, even on its own terms. There are many ways, other than European style farming, in which to care for and improve the productivity of land. What to a settler’s eye seemed savage, untouched wilderness usually turns out to be landscapes actively managed by indigenous populations for thousands of years through controlled burning, weeding, coppicing, fertilizing and pruning, terracing estuarine plots to extend the habitat of particular wild flora, building clam gardens in intertidal zones to enhance the reproduction of shellfish, creating weirs to catch salmon, bass, and sturgeon, and so on. Such procedures were often labour intensive, and regulated by indigenous laws governing who could access groves, swamps, root beds, grasslands and fishing grounds, and who was entitled to exploit what species at any given time of year.

    In fact land ownership illustrates perfectly what Rudolf von Ihering called the state’s monopoly of violence within a territory — just within a much smaller territory than a nation state.

    Just as access to violence, information, and charisma defines the very possibilities of social domination, so the modern state is defined as a combination of sovereignty, bureaucracy and a competitive political field.

    …in all parts of the world small communities formed civilizations in that true sense of extended moral communities. Without permanent kings, bureaucrats or standing armies they fostered the growth of mathematical and calendrical knowledge. In some regions they pioneered metallurgy, the cultivation of olives, vines and date palms, or the invention of leavened bread and wheat beer; in others they domesticated maize and learned to extract poisons, medicines and mind-altering substances from plants. Civilization, in this true sense, developed the major textile technologies applied to fabrics and basketry, the potter’s wheel, stone industries and beadwork, the sail and maritime navigation, and so on. A moment’s reflection shows that women, their work, their concerns and innovations are at the core of this more accurate understanding of civilization.

    It was precisely this combination of such conflicting ideological possibilities — and of course, the Iroquoian penchant for prolonged political argument — that lay behind what we have called the indigenous critique of European society. It would be impossible to understand the origins of its particular emphasis on individual liberty, for instance, outside that context. Those ideas about liberty had a profound impact on the world. In other words, not only did indigenous North Americans manage almost entirely to sidestep the evolutionary trap that we assume must always lead, eventually, from agriculture to the rise of some all-powerful state or empire; but in doing so they developed political sensibilities that were ultimately to have a deep influence on Enlightenment thinkers and, through them, are still with us today.

    We started out by observing that to inquire after the origins of inequality necessarily means creating a myth, a fall from grace, a technological transposition of the first chapters of the Book of Genesis — which, in most contemporary versions, takes the form of a mythical narrative stripped of any prospect of redemption. In these accounts, the best we humans can hope for is some modest tinkering with our inherently squalid condition — and hopefully, dramatic action to prevent any looming, absolute disaster. The only other theory on offer to date has been to assume that there were no origins of inequality, because humans are naturally somewhat thuggish creatures and our beginnings were a miserable, violent affair; in which case ‘progress’ or ‘civilization’ — driven forward largely by our own selfish and competitive nature — was itself redemptive. This view is extremely popular among billionaires but holds little appeal to anyone else, including scientists, who are keenly aware that it isn’t in accord with the facts…The more rosy, optimistic narrative — whereby the progress of Western civilization inevitably makes everyone happier, wealthier and more secure — has at least one obvious disadvantage. It fails to explain why that civilization did not simply spread of its own accord; that is, why European powers should have been obliged to spend the last 500 or so years aiming guns at people’s heads in order to force them to adopt it.

    …innovation in Neolithic societies was based on a collective body of knowledge accumulated over centuries, largely by women, in an endless series of apparently humble but in fact enormously significant discoveries. Many of those Neolithic discoveries had the cumulative effect of reshaping everyday life every bit as profoundly as the automatic loom or lightbulb.

    Over the course of these chapters we have instead talked about the basic forms of social liberty which one might actually put into practice: (1) the freedom to move away or relocate from one’s surroundings; (2) the freedom to ignore or disobey commands issued by others; and (3) the freedom to shape entirely new social realities, or shift back and forth between different ones…It is clear that something about human societies really has changed here, and quite profoundly. These three basic freedoms have gradually receded, to the point where a majority of people living today can hardly comprehend what it might be like to live in a social order based on them.

    There is nothing particularly primordial about such arrangements (murder of entire populations); certainly there is no reason to believe they are in any sense hardwired into the human psyche. On the contrary, it’s almost invariably necessary to employ some combination of ritual, drugs and psychological techniques to convince people, even adolescent males, to kill and injure each other in such systematic yet indiscriminate ways.

    Time and again we found ourselves confronted with writing which simply assumes that the larger and more densely populated the social group, the more complex the system needed to keep it organized. Complexity, in turn, is still often used as a euphemism for chains of command, which means that as soon as large numbers of people decide to live in one place or join a common project they must necessarily abandon the second freedom — to refuse orders — and replace it with legal mechanisms for, say, beating or locking up those who don’t do as they’re told. As we’ve seen, none of these assumptions are theoretically essential, and history tends not to bear them out…complex systems don’t have to be organized top-down, either in the natural world or in the social world.

    What is the purpose of all this new knowledge, if not to reshape our conceptions of who we are and what we might become? If not, in other words, to rediscover the meaning of our third basic freedom: the freedom to create new and different forms of social reality?…We know, now, that we are in the presence of myths.

    Pax Britannica, Pax Americana, Pax Zhōngguó, The Last World Empire?

    Friday, January 28th, 2022

    To Govern the Globe, World Orders & Catastrophic Change, Alfred W. McCoy, 2021

    Indeed, the Iberian (Spain and Portugal) vision of expansive sovereignty — acquisition of terrain by conquest and oceans by exploration — would continue under Dutch and British hegemony, illustrating the capacity of these global systems to survive the empires that created them…Thanks to the British and Dutch decisions to strip their colonial subjects of civil liberties and carry the transatlantic slave trade to new heights, the Iberian hierarchy of human inequality would, in all its cruelty and tragedy continue.

    In the years following the (Dutch) East India Company’s founding in 1602, the city’s (Amsterdam) dynamism led to a host of financial innovations that soon made it… “the clearinghouse of world trade”. The new Bank of Amsterdam took deposits, transferred funds transnationally, and later stored vast quantities of precious metals in its vaults, helping make the city “Europe’s reservoir of gold and silver coin.” The Chamber of Maritime Insurance offered coverage for dozens of dangerous destinations, while the newspaper Amsterdamsche Courant gave the city’s merchants critical information about the prices of goods arriving from those distant shores. Amsterdam also built the world’s first stock exchange, where up to five thousand people met to trade more than four hundred commodities around a central courtyard that became “the nerve center of the entire international economy.”

    William III and Mary II

    William’s (of Orange) reign also witnessed a modernization of the British economy along Dutch lines, exemplified by the founding of the Bank of England, the London Stock Exchange, and a profusion of private banks, insurance companies, and joint stock firms.

    Coal was the catalyst for an industrial revolution that fused steam technology with steel production to make Britain master of the world’s oceans.

    Each step in slavery’s eradication was foreshadowed by a new stage in Britain’s use of coal-fired energy–including the introduction of steam power in mills and mines by the time Parliament banned the slave trade in 1807; the development of mobile steam engines for land and sea transport prior to the abolition of West Indies slavery in 1833; and the adoption of coal-powered steam power in almost all British industries by the 1850s, when the Royal Navy’s anti-slavery patrols reached their coercive climax. Later, new forms of fossil energy — electricity and internal combustion engines — would render even the coerced labor of the imperial age redundant.

    By the end of the nineteenth century, the Swedish physicist Svante Arrhenius would publish the first report on the capacity of industrial emissions to cause global warming. Through countless hours of painstaking manual calculations, he predicted with uncanny prescience and considerable precision that “the temperature in the arctic regions would rise about 8 degrees to 9 degrees C., if the [carbon dioxide] increased 2.5 or 3 times its present value.”
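    Arrhenius’s figures rest on the logarithmic dependence of warming on CO2 concentration that he derived: roughly ΔT = S · log2(C/C0), where S is the warming per doubling of CO2. A minimal sketch of that arithmetic (the function name and the back-calculated sensitivities are illustrative, not figures from McCoy’s text):

    ```python
    import math

    def arrhenius_warming(co2_ratio, sensitivity_per_doubling):
        """Warming under the logarithmic CO2 law: dT = S * log2(C / C0)."""
        return sensitivity_per_doubling * math.log2(co2_ratio)

    # Back-calculate the climate sensitivity implied by his prediction of
    # 8-9 C of arctic warming for a 2.5-3x increase in CO2:
    s_low = 9 / math.log2(3.0)   # about 5.7 C per doubling
    s_high = 8 / math.log2(2.5)  # about 6.1 C per doubling
    ```

    Modern estimates put equilibrium sensitivity nearer 3 °C per doubling, so his arctic figure overshoots, but the logarithmic form of the relationship has held up.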

    Britain was the world’s preeminent power for more than a century, but its dominance nevertheless evolved through two distinct phases. From 1815 to 1880 it largely oversaw an “informal empire” with a loose hegemony over client states worldwide. In the period of “high imperialism” from 1880 to 1940, however, the empire combined informal controls in countries like China, Egypt, and Iran with direct rule over colonies in Africa and Asia to encompass a full half of all humanity.

    Parliament rescinded mercantilist laws that had protected British commerce for centuries, starting with the abolition of the (British) East India Company’s monopolies on Asian trade.

    British engineers built the world’s first major central power plant at Deptford, London in 1888, capable of lighting two million electric bulbs. As electrical plants spread quickly, their generators were powered by the first coal-fired steam turbines…tying a knot between coal and electricity that persists to this day.

    By century’s end (19th), discoveries (Iran, Indonesia, Burma) had created a sufficient supply of oil to enable a shift from steam to internal combustion engines in ships, trains, automobiles, and ultimately, aircraft.

    Eleanor Roosevelt with the Universal Declaration of Human Rights

    In fulfilling this commitment to human rights (The UN’s Universal Declaration of Human Rights, 1948), the United States would face some exceptional challenges. Unlike earlier imperial powers, it was, after all, a former colony with a long history of slavery and a succeeding system of racial segregation that would compromise its commitment to those principles at home. As its global power grew during these postwar decades, Washington would cultivate anti-Communist allies among authoritarian leaders in Asia, Africa, and Latin America, tacitly endorsing torture and repression in their lands. Even as the US practiced racial segregation at home and backed ruthless dictators abroad, civil society groups worldwide would continue to fight for human rights, just as African Americans would struggle for their civil rights at home, making this universal principle a defining attribute of Washington’s world order, almost in spite of itself.

    By the time Washington’s world order was fully formed in the late 1950s, the unequal power of its nuclear-armed bombers, its countless overseas military bases, and its covert interventions in the affairs of countless nations coexisted tensely with a new world order, epitomized by the UN, that was meant to protect the sovereignty of even small states and promote universal human rights. This underlying duality of Washington’s version of world power would manifest itself in numerous contradictions during its 70 years of global hegemony.

    Washington’s visionary world order took form at two major conferences — at Bretton Woods, New Hampshire, in 1944, where 44 Allied nations forged an international financial system exemplified by the World Bank, and at San Francisco in 1945, where they drafted a charter for the UN that created a community of nations. The old order of competing empires, closed imperial trade blocs, and secret alliances would soon give way to an international community of emancipated colonies, sovereign nations, free trade, and peace through law. In essence, the UN charter’s many clauses rested on just two foundational principles that would soon become synonymous with Washington’s world order: inviolable national sovereignty and universal human rights.

    Between 1945 and 2000, the US intervened in 81 consequential elections worldwide, including eight times in Italy, five in Japan, and many more in Latin America. Between 1958 and 1975, military coups, many of them American sponsored, changed governments in three dozen nations — a quarter of the world’s sovereign states — fostering a distinct “reverse wave” in the global trend toward democracy.

    George Kennan supported covert operations

    George Kennan, a State Department official, later called the creation of the CIA with authorization to conduct covert operations “the greatest mistake [he] ever made.”

    President Truman tried to limit the newly created CIA to intelligence gathering only, with no authorization for covert activities. Allen Dulles maneuvered Frank Wisner into position as OPC chief, and by 1952 the OPC was operating 47 overseas stations and employed 3,000 people. It specialized in the black arts of espionage, sabotage, subversion, and assassination. When Eisenhower became president in 1953, Kermit Roosevelt led the overthrow of Iran’s elected prime minister, Mohammad Mossadegh, and the installation of the Shah.

    Throughout its rise to world power from 1820 to 1870, Britain increased its share of gross world product by just 1 percent per decade, while America’s rose by 2 percent per decade during its ascent from 1900 to 1950. By contrast, China was increasing its slice of the world pie at an extraordinary pace of 5 percent per decade from 2000 to 2020.

    …the accounting firm PricewaterhouseCoopers calculated that China’s economic output had already surpassed America’s in 2014 and was on a trajectory to become 40 percent larger by 2030.

    Across Europe, hypernationalist parties like the French National Front, Greece’s Golden Dawn, Alternative for Germany, and the UK Independence Party won voters by cultivating nativist reactions to just such trends, often attacking the economic globalization that had become a hallmark of Washington’s world order. Simultaneously, a generation of populist demagogues won power in nominally democratic nations around the world — notably Viktor Orban in Hungary, Vladimir Putin in Russia, Recep Tayyip Erdogan in Turkey, Narendra Modi in India, Rodrigo Duterte in the Philippines, and of course, Donald Trump in the United States.

    While a weakening of Washington’s global reach seems likely, the future of its world order is still unclear. At present, China is the sole state to have most (but not all) of the requisites to become a new global hegemon. Its economic rise, coupled with its expanding military and growing technological prowess under the “Made in China 2025” program, has given it many of the elements fundamental to superpower status…Yet, as the 2020s began, no state seemed to have both the full panoply of power to supplant Washington’s world order and the skill to establish global hegemony. Indeed, apart from its rising economic and military clout, China has a self-referential culture, a recondite non-Roman script (requiring some 4,000 characters instead of 26 letters), nondemocratic political structures, and a subordinate legal system that will deny it some of the chief instruments for global leadership.

    Successful imperial transitions driven by the hard power of guns and money also require the soft power salve of cultural suasion if they are to achieve sustained and successful global domination. During its near century of hegemony from 1850 to 1940, Britain was the exemplar par excellence of soft power, espousing an enticing political culture of fair play and free markets that it propagated through the Anglican Church, the English language and its literature, mass media such as the British Broadcasting Corporation, and its virtual creation of modern athletics (including cricket, soccer, tennis, rugby, and rowing). Similarly, US military and economic domination after 1945 was made more palatable by the appeal of Hollywood films, civic organizations like Rotary International, and popular sports like basketball and baseball. On the higher plane of principle, Britain’s anti-slavery campaign invested its global hegemony with moral authority, just as Washington’s advocacy of human rights lent legitimacy to its world order…China still has nothing comparable. Both its communist ideology and its popular culture are avowedly particularistic.

    China has been a command economy state for much of the past century, and as such has developed neither the legal culture of an independent judiciary nor an autonomous rules-based order complementary with the web of law that undergirds the modern international system.

    Xi Jinping – Zhōngguó (China) translates as “Middle Kingdom”

    If, however, Beijing’s potentially immense infrastructure investments, history’s largest by far, succeed in unifying the commerce of three continents, then the currents of financial power and global leadership may indeed flow, as if by natural law, toward Beijing. But if that bold project falters or ultimately fails, then for the first time in five centuries, the world could face an imperial transition without a clear successor as global hegemon.

    From scientific evidence, it seems clear that, for the first time in seven hundred years, humanity is facing another cumulative, century-long catastrophe akin to the Black Death of 1350 to 1450 that could once again rupture a global order and set the world in motion…If the “Chinese century” does indeed start around 2030, it is unlikely to last long, ending perhaps sometime around 2050 when the impact of global warming becomes unmanageable. With its main financial center at Shanghai flooded and its agricultural heartland baking in insufferable heat, China’s days as a global power will be numbered.

    Given that Washington’s world system and Beijing’s emerging alternative are largely failing to limit carbon emissions, the international community will likely need a new form of collaboration to contain the damage. In the years following the Paris climate accord, the current world system — characterized by strong nation-states and weak global governance at the UN — has proven inadequate to the challenge of climate change. The 2019 Madrid climate summit failed to forge a collective agreement for emission reductions sufficient to cap global warming at 1.5°C, largely due to the obstruction of major emitters like Australia, Brazil, China, India, and the United States. Any world order, whether Washington’s or Beijing’s, that is based on the primacy of the nation-state will probably prove incapable of coping with the political and economic crises likely to arise from the appearance of some 275 million climate refugees by 2060 or 2070.

    The Laws of War: Nuremberg Trials, Vietnam, 9/11, Obama

    Saturday, November 27th, 2021

    Humane; How the United States Abandoned Peace and Reinvented War, Samuel Moyn, 2021


    Quincy Wright in 1933 and 1941:

    “The Pax Britannica had given Europe the best two centuries it has had — at least since the Pax Romana a millennium and a half earlier.” It was a long established fact: empires brought peace, too. “The excessively brutal civil and imperial wars which characterized the last century of the Roman Republic were followed by such a will to peace that most of the western world submitted to the Pax Romana of Augustus and his successors for two centuries.” (Quincy) Wright mused. Could the twentieth century offer something similar, he wondered, without requiring the humiliating subjugation of vassals and ceaseless violence at the savage frontiers of empire? Could a world organization under international law keep aggressors from bringing ruin to liberal democracies at peace? Would peace come, if it did, under the auspices of another empire or in some unprecedented guise?

    One of Wright’s first publications explained how it ought to be plausible under international law to hold (Kaiser) Wilhelm II accountable for his biggest crime, which was starting a war, with all the catastrophes to which that decision led…Most of the early public uses of the phrase “crimes against humanity”, now associated with grave atrocities during war, allocated responsibility for war itself…It became popular in 1918-1919 to call war itself, rather than its attendant cruelties, a “crime against humanity”. (The Kaiser fled to the Netherlands, which refused to extradite the queen’s “Uncle Willie”.)

    Home from Nuremberg, Wright definitely agreed it had been a good thing to rank aggression the premier evil. In effect, it was an auspicious sign for a federation to come that there was so much agreement to try individuals for war after the fact — as the Allies did in Tokyo for Japanese perpetrators, too. “Sanctions, to be effective, must operate on individuals rather than states,” Wright explained. “International law cannot survive in the shrinking world, threatened by military instruments of increasing destructiveness, if sanctioned only by the good faith and self-help of governments.”

    But at Nuremberg and Tokyo, the charge (“crimes against humanity”) was only allowed in connection with the primary infraction of aggressive war, which Americans were sure they did not fight (despite Dresden, Hiroshima, Nagasaki). As for aerial bombardment, all powers conducted it, and no one was punished.

    …It was the Nuremberg Trial veteran Telford Taylor (Counsel for the Prosecution) who went where Falk did not, and he framed the case against the Vietnam War exclusively in terms of war crimes…If one had to choose a single cultural document that marked the beginning of the coming of humane war in our time, Taylor’s bestselling and widely reviewed Nuremberg and Vietnam: An American Tragedy, which appeared in late 1970, is undoubtedly it. At the same time, Taylor epitomized how, after My Lai, atrocities became the index of consensus–belatedly mainstream–that the war had to end.

    …after receiving the Pentagon Papers from the dissident defense analyst Daniel Ellsberg, (Neil) Sheehan was preparing to publish them. His grave and wide-ranging New York Times Book Review essay on whether to hold war crimes tribunals for Americans normalized talk of national guilt. “Do you have to be a Hitlerian to be a war criminal?” Sheehan asked. “Or can you qualify as a well-intentioned President of the United States?”


    Dick Cavett Show 1971

    …(Telford) Taylor stated clearly on The Dick Cavett Show (Jan 8, 1971) that (General William) Westmoreland was liable for war crimes, and then he went further adding that, while he reserved judgement on such a tricky question, (President Lyndon) Johnson might be, too.

    From the ashes of Hanoi and the darkness of My Lai, the possibility of humane war would come into view.

    Forget talk of war crimes prosecution. Let’s just strive to make war more humane. The leaders and decision makers of Pax Americana cannot be held legally responsible for their past actions. Then came the Sept 11, 2001 attacks on the World Trade Center and Pentagon, and the lessons of Vietnam were lost to history. After George W Bush started wars in Afghanistan and Iraq, revelations of torture and abuse threatened a serious return of antiwar movements. Interestingly, Seymour Hersh broke both the My Lai massacre story and the Abu Ghraib prison abuse story.


    Benjamin confronts Obama

    After Obama’s election, support for antiwar politics cratered. (Medea) Benjamin promptly put her energies into finding new support for the cause of peace by attacking Obama’s drone empire. “Unless we shine a light on it,” she told one reporter of his drone mania, “we’re going to turn around and say, ‘How’d we get involved in all these wars without knowing about it?’”

    Within two days of his inauguration, Obama signed executive orders to ban torture and rescind all Bush-era legal directives governing the treatment of prisoners…Few noticed Obama’s own first drone strike, which took place that same third day of his administration…But the deepest and most enduring reality of Obama’s first phase in office was that by making other moves, he was engineering an unprecedented new era of global engagement that would blur the lines between war and policing. What had once been brutal, albeit with beginnings and conclusions, was becoming humane — but never ending.

    Obama turned to armed drones more times in his first year alone than Bush had in the entirety of his presidency. Almost from the start, Obama’s policy called for engaging in targeted killing with gusto, not only by drone but also with the Special Forces or standoff missiles sent from long distances. And as Obama re-created a war less bounded in space and let it bleed in time, his lawyers formalized the system…targeted killings transformed the war on terror so that it stretched across a widening arc of the earth. Soon it was to be advertised as a humane enterprise, conducted with concern for the innocent in harm’s way.

    By the end of Obama’s time in office, no-footprint drones had struck almost ten times more than under his predecessor’s watch, with many thousands dead. The air force now trained more drone operators than aircraft pilots, and the architecture of drone activity had been extended deep into the African continent, not merely across the Middle East and South Asia. The same trend line followed the deployment of the light-footprint Special Forces, which operated in or moved through 138 nations…Actual fighting took place in at least thirteen, and targeted killing in some of those…If no one was captured, no one could be mistreated…As the Obama administration continued, the abuses to the laws prohibiting force accumulated almost without counterexample.

    “The United States takes the legal position that–in accordance with international law–we have the authority to take action against al-Qaeda and its associated forces without doing a separate self-defense analysis each time”, (John) Brennan remarked in his 2011 speech, flashing an astonishing license to kill. In the spirit of the March 2009 brief, what began as a rationale for detention off hot battlefields became a justification for killing. Many of the individuals and groups in question had never struck at the United States, and the threat they posed was debatable. They died anyway.

    For another take on drones, see High-Tech Assassins

    For the year starting in the Summer of 2011, the drone program began to receive more intense scrutiny in the press. The Obama administration would lift secrecy partially and strategically over the period that followed. By doing so, it normalized targeted killing–not hard to do given the enthusiasm for the death of Osama Bin Laden in Pakistan on May 2, 2011, in a dramatic commando raid. At the same time it set out to demonstratively minimize collateral harm.

    Seymour Hersh reported on the actual events surrounding the killing of Osama Bin Laden in 2016.

    Obama offered something part way down a continuum between war and policing. Why not go all the way, critics like Philip Alston reasoned? If war was going to occur off battlefields and without time limit, so the impulse went, it really ought to resemble the permanent institution of policing with its far more stringent rules on killing, only on a global scale.

    (Medea) Benjamin intuited that drones without footprints were a sequel to the heavy-footprint wars of the Bush years. The technology was chosen for its difficulty to monitor but also its allegedly more humane precision. But she insisted that diplomacy was a better alternative to all forms of war: “I think it’s time to really reflect on the paths not chosen and those paths not chosen include policing instead of military focus…And focusing on the muscle that has been so deteriorated in the last ten years and that’s diplomacy.”

    Trump was to continue the Obama assassination program and dangerously escalate it when, on January 3, 2020, he ordered the drone killing of the popular Iranian general Qassem Soleimani as his motorcade left the Baghdad airport in Iraq.


    Leo Tolstoy and Gandhi

    In his concern that advocates for more humane war could help make it endless for a public that tolerates it, Leo Tolstoy fixated on corporal wrongs and physical violence. Advocacy aimed at humane war, he contended, was no more ethically plausible than agitation for humane slavery, with daily episodes of torture replaced by everlasting–but kind and gentle–direction of labor and service. Audiences who accept endless war out of the belief that its humanity excuses them, the truculent moralist inveighed, were fooling themselves. They were no better than those who rest content with more humane techniques of animal slaughter, leaving them to carve their steaks and fricassee their chickens with eager gusto in good conscience.

    From The Nuremberg Trials to State Sponsored Extrajudicial Assassination

    Tuesday, November 9th, 2021

    Kill Chain; The Rise of the High-Tech Assassins, Andrew Cockburn, 2015

    Daniel Reisner, former head of the IDF’s Legal Department:

    “If you do something for long enough the world will accept it. The whole of international law is now based on the notion that an act that is forbidden today becomes permissible if executed by enough countries…International law progresses through violations. We invented the targeted assassinations thesis and we had to push it. [Now] it is the center of the bounds of legality.”

    A former senior White House counterterrorism official:

    “The idea had its origins in the drug war. So that the precedent was already in the system as a shaper of our thinking…In addition, the success of the Israeli targeted-killing strategy was a major influence on us, particularly in the Agency (CIA) and in Special Ops. We had a high degree of confidence in the utility of targeted killing. There was a strong sense that this was a tool to be used.”

    The Predator drone became feasible only after the Internet and the full constellation of 24 GPS satellites were available in 1993. The Predator first flew in 1994 and was fitted with a Hellfire missile in 2001. “Given that 168 support staffers were required to keep one Predator 24-hour Combat Air Patrol in the air, this was clearly an expensive undertaking.” The drone program allowed live video connections to the entire military chain of command, up to and including the President. For the first time in history, each could be directly involved in a remote attack, making real-time kill decisions based on video images of dubious quality.

    President Obama as Assassin

    Two years into the (Obama) administration, everyone in the Ritz-Carlton ballroom knew that the bulky Irishman (John Brennan) was the most powerful man in U.S. intelligence as the custodian of the president’s kill list, on which the chief executive and former constitutional law professor insisted on reserving the last word, making his final selections for execution at regularly scheduled Tuesday afternoon meetings. “You know our president has his brutal side,” a CIA source cognizant of Obama’s involvement observed at the time.

    The 542 drone strikes that Obama authorized killed an estimated 3,797 people, including 324 civilians. As he reportedly told senior aides in 2011: “Turns out I’m really good at killing people. Didn’t know that was gonna be a strong suit of mine.”

    On May 2, 2011, a team of Navy SEALs killed Osama Bin Laden on Obama’s orders. Breaking his agreement with Pakistan, President Obama immediately announced the assassination in support of his reelection campaign.


    <> <> <> Daniel Hale

    The NSA features heavily in this book, and whistleblower Edward Snowden is included, but whistleblower Daniel Hale, whose leaked documents paint a far bleaker picture of the number of innocent casualties from drone strikes, is not. Daniel Hale is serving a 45-month sentence for his trouble. See Snowden, Cell Phone Privacy, and Targeted Assassinations.

    As originally written, President Dwight Eisenhower’s epochal 1961 farewell address had warned of the “military-industrial-congressional complex” and its “economic, political, and even spiritual” influence at every level of government.

    Much of this book deals with the ever-rising defense budget that even the fall of the Soviet Union couldn’t stop. It covers the inter-service rivalries that continue to this day, the competition for funding, the crazy high-tech ideas that will never work, the total lack of accountability for failure, the refusal even to admit failure, the corruption, the waste, the undermining of democracy, the use of classification to bury unwanted information, and more. Pretty bleak reading.