The Algorithmists’ Dilemma: Finding the Courage to Imbue Human Hearts in Digital Brains


Illustration: Cristian Carrara

It was the logical next step for Digital Society…

By the mid-2010s, computer scientists, data scientists, behavioral economists, statisticians, social psychologists, and policymakers in government, business, medicine, and finance had embraced algorithmic certainty, treating algorithmic outputs as mathematical proofs of their hypotheses.

Faithfully, as machines in tune, they reasoned that humans were computers that (1) interpret data from five bioengineered sensors (sight, sound, taste, smell, and touch), (2) store past computing cycles in memory, and (3) prioritize whether to trust or act on the computed results based on the emotions of (a) genetics and (b) social nudges.

As if mapping neural pathways, each academic, business, government, and financial discipline developed algorithms for explaining and tracing its significance in and for human life, economics, and society.

The Algorithmists—as they preferred to be called—styled their agnostic pursuits with a religious conviction whose goal was to know and encode computable truths beyond human questioning, accompanied by a ferocious human certainty. Along the way, their algorithms resembled mathematical proofs offered and dispelled for the physics of the universe, energy, and time. Rarely did tribes of Algorithmists from within or across different disciplines compare their formulae as capable of co-existing or of being truths describing the same nuclear family, small business, or neighborhood simultaneously. Each algorithm became like a Silicon Valley startup, whose value the marketplace highly prized for its monopoly of knowledge of others’ behaviors.

The Receivers, Amplifiers, and Tuners

By 2020, the truth before, or free of, manipulation by the Algorithmists was difficult to measure. Evidence-based studies became harder to replicate, even by machines learning to filter out or correct for the noise generated by computational influence, financial incentives, or contrived scarcities. Debates over whether one algorithm explained human, physical, or virtual phenomena better than another defined tribes clinging to emotional beliefs in the Algorithmist culture that generated each formula. With fierce peer commitment to the certainty of algorithmic science, universities ranged from inbred cultures competing to produce the greatest number of algorithms from graduate student thesis research to multidisciplinary cultures competing to assemble the greatest number of algorithms linked as a network of co-existent truths.

During the 2020s and 2030s, the Algorithmists sorted themselves into three branches of academic thought and industrial activities:

  • The Receivers developed algorithms for pulling in and harmonizing as much data as possible, supplying large standardized databases, with few assumptions or self-restraints as to the mega-databases’ commercial and government use;
  • The Amplifiers built algorithms that optimized any given commercial, environmental, economic, or social goal, without concern for how that goal depended on or impacted others’ goals;
  • The Tuners built algorithms for listening to and assessing the harmonics of how relevant algorithms degraded or improved various forms of capital: human, natural, intellectual, financial, and others.

    Illustration: Cristian Carrara

The Tuners were seen as the outliers, as rebels, as judges, on naive quests to weigh what data was ignored, used, and tweaked into conformity by the Receivers and what algorithmic goals were being under- or over-emphasized by the Amplifiers. In a real sense, the Tuners had trained themselves and their algorithmic assessment machines to listen with the ear of an expert Steinway piano tuner, hearing how the harmonics of all the notes (algorithms) playing on a given concert hall’s stage (a neighborhood, a family, or a business) could be balanced and padded to improve the performance (the holistic impacts of all the algorithms’ goaltending).

Very certainly, had the Tuners not arrived and been as effectively intuitive, our society and markets would have been optimized to create enormous gaps: The Receivers would have ignored or rejected data needed to assess the conditions of the poor, disabled, entrenched, and powerful, and the Amplifiers would have built algorithms exacerbating poverty, disability, insider knowledge, and power amongst the few.

Where We Sit Today in 2050

Looking back from 2050, most of the repetitive manual decisions of modern life have moved into a web of unseen networked computers operating at quantum speeds, using all-knowing and all-seeing algorithms to find patterns in vast economic, social, and natural systems’ relationships:

  • Housing – The housing units where we live depend on algorithms that assess our job and family prospects.
  • Jobs and Careers – Our job prospects are auctions for human talent mediated by algorithms tracking and optimizing our childhood and adult family lives, national education and occupational re-education performance, and income streams.
  • Consumer Prices – The retail prices we pay for goods and services are customized based on algorithms that compute the necessity and frivolity of the expense, based on the standard of living we have been algorithmically predicted to behaviorally want to achieve.
  • Nutrition and Health – Our food and water intake are algorithmically tuned to ingest printed and bioengineered foods conveying nutrients aimed to mitigate genetic and regional physical and mental health concerns, in order to optimize social cohesion and gross domestic production.
  • Government – Our government is elected by votes registered, tallied, and statistically error checked using algorithms that confirm the absence of tampering.
  • Banking – Our banks monitor for fraud and reduce the risk of “black swan” foreign exchange, stock, and bond price movements.
  • Media Access – Our media provide infinite channels to reconfirm our beliefs—and occasionally challenge those beliefs—based on algorithms predicting how the content will nudge us to spend and invest in futures that are more achievable in a mathematically stabilized socioeconomic paradigm.
  • Justice – Our courts use algorithms to adjust criminal and civil litigation outcomes to reduce the penalties for being poor, disabled, unlucky, and disadvantaged.

The secret sauce that runs our lives is now the algorithm, a currency of what matters, who matters, where they matter, and why.

Taken together, these algorithms are forms of artificial intelligence reflecting human and machine idealism, battling for balanced perfection in expert silos, unnaturally accessible to digital agents, maintained for wealth-building and other purposes inuring to individual or shared prosperity.

Who—other than computers once trained by human beings—builds our algorithms?  How can we check the fairness of the outcomes that algorithms achieve?

Asking and answering such questions differentiates the progressive from the dystopian futures of our society. The Tuners alone cannot assess the handiwork of the Receivers and Amplifiers nor be the primary filter on the quality of lives built by and for algorithmic certainty.

Towards an Algorithmist Code of Conduct

Medicine evolved its Hippocratic Oath and code of conduct in order to professionalize the scientific repeatability of its health outcomes and to minimize the risk of unprofessional behavior or, worse, the motivation to put health-provider profit ahead of patient quality of life.

Lawyers—yes, even lawyers—developed their code of conduct by setting two levels of professionalism. The barely legal minimal standard was set by Disciplinary Rules, enforced through professional misconduct committees. The aspirational standards, called Ethical Considerations, reflect the goals of the legal profession: equal access to the benefits of being represented by competent lawyers. Disciplinary Rules are the floors, and Ethical Considerations are the ceilings. Lawyers who look only at the floor to “not get caught” are snakes; lawyers who fly to the ceilings are the giraffes or the eagles who soar to the heights needed to improve the law and justice for all.

Computer scientists, engineers, mathematicians, statisticians, data scientists, social scientists, economists, and other professions have codes of conduct setting floors and ceilings on the uses to which their professional training can and should be put.

Over the past three decades, the algorithms that rule our daily lives have come from, and been steered to automatically apply, such professionals’ training and evolving body of scientific knowledge.

Important questions arise:

  • Are algorithms self-assessing their compliance with the codes of conduct of the professionals building or relying on them?
  • Are new codes of conduct needed to guide professionals practicing their skills across multiple domains, such as computer scientists working in medicine or law?

Vaults for Personal Data and the Algorithms that Use It

“Big data” is such an outdated term: today, data is ubiquitously secure, with each individual owning his or her own data vault into which daily transactions deposit new data on our behalf, as the price for tapping the vault’s knowledge of our creditworthiness, education level, or other characteristics. Blockchain technologies make our data vaults cyber-secure and ease tracing who used our personal data, when, for what limited purposes, and at what profit to themselves. Storing others’ personal data for profit from resale was outlawed in 2025, so all data stored about an individual is now owned and controlled by that individual.
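The tamper-evident tracing that such a vault relies on can be sketched as a hash-chained access log. Everything below (class and field names, the record structure) is hypothetical, a toy illustration of the blockchain idea rather than any real vault API:

```python
import hashlib
import json
import time


class VaultAccessLog:
    """Append-only, hash-chained log of who used a person's data, when,
    for what declared purpose, and at what profit (all fields illustrative)."""

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis link

    def record_access(self, accessor, purpose, profit):
        entry = {
            "accessor": accessor,
            "purpose": purpose,
            "profit": profit,
            "timestamp": time.time(),
            "prev_hash": self._last_hash,
        }
        # Hash a canonical serialization so any later edit is detectable.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(entry)
        self._last_hash = entry["hash"]
        return entry["hash"]

    def verify(self):
        """Recompute the chain; an edit to any entry breaks every later link."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev_hash"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

In this sketch, a lender's creditworthiness query deposits a record as it reads the vault, and a court (or the owner) can later run `verify()` to confirm that no intermediary quietly rewrote who accessed what, or at what profit.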

Illustration: Susy Morris

Alongside secure personal data vaults, governments established vaults to encrypt and store the algorithms that use our personal data. While government could not freely use the data in our personal vaults or directly use the algorithms developed by private-sector industries, in the event of a dispute over the consequences of algorithmically deterministic lives, digital-society specialty courts evolved with access to immutably transparent archives for finding and analyzing the digital evidence of data and algorithms whose existence was recorded in the blockchain.

Rather than the Big Brother of George Orwell’s 1984 futurescape, our digital society relies on legal technologies (legaltech) to continually test the fairness of algorithmic outcomes, and in cases of claimed unfairness, to pursue claims that remedy the harms.

Updating Implementation of Ethical Codes of Digital Conduct

Regulatory technologies (regtech) have long helped banks, hospitals, pharmaceutical companies, farms, and other regulated industries achieve compliance and avoid the risks of fines and de-licensure.

Only recently have Algorithmists—beyond the steadfastly pioneering work of the Tuners—begun to deploy regtech algorithms to implement applicable codes of conduct. These regtech algorithms use machine learning to play an important game, asking how the algorithms that seek access to our personal data vaults could conspire against our personal or societal best interests, leading to poverty, civil unrest, and worse.
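The conspiracy-checking move described above can be sketched very simply: no single algorithm may read a forbidden combination of fields, yet several algorithms pooling their reads might. The sensitive combinations and field names below are invented for illustration, not drawn from any real code of conduct:

```python
# Hypothetical field combinations that no pool of algorithms should
# jointly read from one person's vault (illustrative only).
SENSITIVE_COMBOS = [
    {"genetics", "income"},         # e.g. health-priced lending
    {"location", "voting_record"},  # e.g. targeted suppression
]


def conspiracy_risk(access_logs):
    """Given {algorithm_name: set of vault fields it read}, return the
    sensitive combinations that the algorithms jointly cover, even when
    no single algorithm reads a whole combination by itself."""
    pooled = set().union(*access_logs.values()) if access_logs else set()
    return [combo for combo in SENSITIVE_COMBOS if combo <= pooled]
```

Here a lender reading only `income` and an insurer reading only `genetics` each look harmless in isolation, but the pooled reads trip the first rule; a fuller regtech system would learn such combinations rather than enumerate them by hand.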

Final Thoughts

As we rely on computers to apply knowledge to all of our society, we have placed our trust in the data and algorithms that computers use for the separate tasks that caused them to exist.

As our digital societies grow in complexity, the people and computers that build algorithms will continue to develop the digital courts and regtech needed to apply traditional codes of ethical conduct to how the machines interact and collaborate to chart our careers, our neighborhoods, our economies, and our social structure, by stewarding scarce natural, financial, and other forms of capital.

Recalling the wisdom of The Wizard of Oz: The engineering to live on the Moon and Mars may seem child’s play compared to engineering a heart into the brains of our automated selves.

© Copyright 2016 Bruce Cahan. All rights reserved.

Acknowledgements

This article happened because of the persistent encouragement of Bruce Cooperstein as editor and ally in the visioning and revisioning process.