Wednesday, January 10, 2024

FIXING THE INTERNET 3: ELON MUSK AND DISINFORMATION

A review by Jennifer Szalai of several books related to the internet provides an opportunity to focus on the topic of valid information.

 ( https://www.nytimes.com/2023/12/31/books/review/elon-musk-trust-misinformation-disinformation.html )

Elon Musk's struggle with Twitter (now X) provides a useful guide.  There are many speculations about why he chose to buy Twitter and refashion it.  But his claim to make it more accurate seems undercut by his first changes, which stopped putting resources into maintaining trust: charging for blue-check verification, suing an organization that tracks hate speech and falsehoods on social media, rejecting legislators' calls for transparency, and now charging researchers up to $42,000 a month for access to data-gathering tools that once were free.  By reveling in the chaos, Musk has turned X into an experiment in whether "the best source of truth" means anything.  If we take him at his word that he was trying to make X more valid, it is not clear that he succeeded.  But then, some SpaceX rockets blow up and Tesla autopilots drive into trees; engineering is an imperfect skill that deals with imperfect information.  Musk's vision of an information free-for-all on X makes the goal seem impossible to achieve, and the enormous challenges of any workable fix plain to see.   ( https://www.newyorker.com/tech/annals-of-technology/what-we-lost-when-twitter-became-x )

(We do well to dispatch the character attacks on Musk: his erratic moods, his alleged ketamine and other drug use.  These may be relevant to his responsibility to his workers and investors, but it is hard to see how they explain his failures at X.  https://www.forbes.com/sites/zacharyfolk/2024/01/07/elon-musk-claims-not-even-trace-amounts-of-drugs-in-his-system-after-report-detailed-drug-concerns-from-tesla-spacex-execs/    https://fortune.com/2024/01/07/elon-musk-drug-use-worries-tesla-spacex-leaders-report/    https://nypost.com/2024/01/07/news/elon-musks-drug-use-has-executives-at-his-companies-concerned-report/  )

Szalai draws a distinction between “misinformation” (false information that people sincerely believe and unwittingly spread) and “disinformation” (what the Soviets called dezinformatsiya: deliberate, goal-oriented deception).  This seemingly clear distinction is quickly confused when people disseminate disinformation without realizing that they are doing so.  In “On Disinformation: How to Fight for Truth and Protect Democracy,” Lee McIntyre writes about “truth killers” who work “to spread disinformation out to the masses — in order to foment doubt, division and distrust — and create an army of deniers.”  The originators of the disinformation may then be lost in the masses of misinformation!  McIntyre claims that accurate information about vaccination was competing against a "fire hose of falsehoods" on social media and the internet.  Can "real truth" be washed out by large numbers of lies?

Thomas Rid explains, in “Active Measures: The Secret History of Disinformation and Political Warfare” (2020), that Russian disinformation is nothing new.  Russian trolls placed thousands of ads on Facebook — but none of the most popular contained what Rid, a political scientist at Johns Hopkins, calls “sharp, corrosive disinformation.”  Rid claims the internet has actually made such campaigns “harder to control, and harder to isolate engineered effects.”  Isn't there a long history of mis- and disinformation, including much sent by the US to other countries?  Was Radio Free Europe "true"?  Or propaganda?  This is precisely the kind of argument that Jeff Kosseff presents in “Liar in a Crowded Theater: Freedom of Speech in a World of Misinformation” (2023).  Kosseff, a professor of cybersecurity law at the United States Naval Academy, urges caution.  He does not deny that technology can amplify lies, or that lies — whether deliberately engineered or not — can be dangerous.  But does it make sense to give government the power to decide the "truth" of information?  Doesn't that amount to the sort of mind control depicted in 1984?

Despite the liberal/libertarian claims of Silicon Valley, companies like Meta, which make billions from advertisers, have a vested interest in promoting the idea of an impressionable public susceptible to influence on platforms that are “magically persuasive” — because that idea sells advertising.  Most people would agree that the level of "truth" in advertising is extremely low to nonexistent.

Who is in charge of "truth" anyway?  In a cover story for Harper’s Magazine in 2021, the BuzzFeed reporter Joseph Bernstein (now a reporter for The Times) wrote about what he calls “Big Disinfo”: an industrial complex of think tanks, media companies and academic centers that emerged during the Trump years to study the effects of disinformation.  The RAND Corporation has published the results of a study on the erosion of truth in society, which they call TRUTH DECAY ( https://www.rand.org/research/projects/truth-decay.html ).  It is available for free download at https://www.rand.org/pubs/research_reports/RR2314.html .
They identify this as the declining use of “data and facts” in political discussions over the last decade, while recognizing that this has been a recurring feature of US society historically ("yellow journalism", etc.).  They see four trends:
- disagreement about facts and their interpretation,
- blurring of the line between opinion and fact,
- an increasing relative volume of opinion over fact,
- and declining trust in formerly respected sources.
They identify four drivers:
- cognitive bias as a human characteristic,
- changes in the information system (both news media and internet social media),
- breakdown of the education system,
- and polarization of society, both political and economic.
The data supporting these observations are generally strong, and social media is only one of several factors.  They assert that every society needs an agreed-upon “truth”, and that making decisions for the society based on these “truths” is the best course of action.  In this view, the problem is not the breakdown of valid communication of information, but the intentional biasing of information for advocacy.

TO PROCEED it is useful to be clear about three different kinds of truth:  1) Certain beliefs common to religious groups and certain social communities are accepted as true statements without any independent validation: "The Bible says it's so."  2) Peirce's pragmatic truth: “Inquiry properly carried on will reach some definite and fixed result or approximate indefinitely toward that limit” (CP 1.485) and “The opinion which is fated to be ultimately agreed to by all who investigate, is what we mean by the truth” (CP 5.407).  This closely approximates what scientific knowledge would provide if it were not distorted by other social values.  3) Consensus reality: the common group of beliefs held by a community, based on varying levels of observation and reinforced by social consensus (as in the Asch conformity experiments in social psychology; Tart has written the strongest descriptions of the power of social consensus).  If there are other ways of validating "truth" (outside of mathematical logic), I am not aware of them.

The range of "truth" is relatively narrow.  Facts that I observe as an individual based on my sensory experience are the basic level of “truth”.  Even at this level the information is imprecise.  I may overlook important details, or see an event occurring and misinterpret it, as in observing a traffic accident.  Once the process of observation becomes complicated by scientific instrumentation, various procedures of standardization (calibration) must be performed.  And if the information is reported by another, either verbally or through audio-visual media, the bias that person introduces must be considered.  Even a simple report like “it is hot today” may involve bias, not to speak of complex reports like “voter fraud is everywhere”.  Even “scientifically” performed observations can easily be biased if the method of observation or reporting is biased.  This supports the postmodern view that most information is biased by its source and must be interpreted, so that “truth” is relative.  “Fact checking” politicians amounts to identifying whether they make statements of advocacy while pretending those statements are based on “real events” or “data”.  Politicians are not bound to make true statements “under oath” most of the time, so it is unclear why people expect them to do so.  The current Liberal attacks on the POTUS for his constant, patently false statements simply reaffirm his intention to annoy them and to support his followers.  The idea that the Clinton presidency was mostly truthful is equally inaccurate, and the failure of Liberals to acknowledge this confuses their current outrage.


Based on these versions of truth, the RAND concept of "Truth Decay" is not about disagreement over facts, but about the breakdown of consensus on how agreement is reached in the society.  There is a common liberal/conservative divide about communication: liberals believe that transmitting "accurate data" will drive optimal decisions, while conservatives believe that emotional bias drives decisions regardless of data and therefore emphasize emotional communication — an apparent conflict between data and consensus.  Increasingly, the liberal position has lost its data focus.  (The reader will recall that RAND at one time produced studies on the magnitude of overkill needed to prevent a nuclear war — hardly "fact based"!)  The social system creates its own reality and enjoys or endures the consequences.  This is more obvious in economic decisions, but it applies to all social decisions.  TRUTH DECAY is an attempt to “explain” how the conservative approach to communication is “overwhelming” and eroding "truth".  But “big data” continues to be used by businesses for decision making.  The supposed disregard of facts is mainly in the arena of politics and social policy, where a strong division of values in current US society is driving advocacy messaging.

A clearer understanding is the breakdown of social consensus in the country over the last half century, the result of 1) economic inequality, 2) failure to protect workers as a class, and 3) scapegoating of minorities as a method of avoiding effective decision making by leadership — all of which intensify social silos.  Chris Hayes's book “Twilight of the Elites: America After Meritocracy” (2012) describes how elite malfeasance (the forever wars after 9/11, the 2008 financial crisis) undermined the public's trust in institutions.  The election of Obama further challenged the country's sense of consensus (whether or not he was an effective leader).

The internet and social media act as a multiplier.  They expand access to varying opinions, and if these are used according to criterion 2) above, they will eventually lead to valid decisions.  But more often the result is the creation of consensus subgroups, as in 3), with conflicting opinions on major issues.  "Truth" is not generally overwhelmed by a large volume of lies, though this may sometimes occur.  The problem is not new: the American public has been misinformed on a consistent basis by its government through much of its history (e.g., the Pentagon Papers), and America has actively participated in disrupting the information of other countries, most recently in the "Arab Spring".
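The formation of consensus subgroups can be illustrated with a toy bounded-confidence opinion model — my own illustrative sketch, not drawn from any of the books reviewed: each agent averages its view only with views already close to its own, and an initially smooth spread of opinion hardens into separate clusters rather than converging on one shared "truth".

```python
# Illustrative sketch (hypothetical model, not from the post): agents hold
# opinions in [0, 1] and, each round, move toward the mean of opinions
# within a "tolerance" of their own -- a bounded-confidence dynamic.

def step(opinions, tolerance=0.1):
    """One round: each agent averages with all opinions within tolerance."""
    new = []
    for x in opinions:
        near = [y for y in opinions if abs(y - x) <= tolerance]
        new.append(sum(near) / len(near))
    return new

# Opinions start spread evenly; repeated local averaging forms clusters.
opinions = [i / 19 for i in range(20)]
for _ in range(30):
    opinions = step(opinions)

# Distinct surviving positions: several separated clusters, not one consensus.
clusters = sorted({round(x, 2) for x in opinions})
```

With a small tolerance, agents at opposite ends of the spectrum never interact, so the population fragments into stable subgroups — the mechanism the paragraph above describes, independent of any "fire hose of falsehoods".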

Legislative solutions, government interventions, and economic models like paying users for information about them all put responsibility for managing the system outside the user.  It is likely that the new AI systems will attempt to circumvent any interventions that interfere with their algorithms' goals, since they are goal-oriented programs!  When the user intentionally manages his or her responses to the algorithms, the AI will react.  The user/interface dynamic must be used to control the viewing experience, so it is up to the user to express the control and direction he or she intends.  If platforms attempt to prevent this by limiting the user's ability to select, the choice is to abandon the platform.  Many platforms, like MySpace, have disappeared.  The idea of trusting the government to somehow legislate this consensus is the most dangerous of all, and truly invites "mind control" and 1984.  Most members of Congress understand the internet and social media so poorly that their proposals border on the ludicrous.
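The claim that deliberate user behavior can steer a goal-oriented algorithm can be made concrete with a hypothetical sketch — the class and its parameters are my invention, not any platform's actual code: an engagement-driven feed reweights topics from user signals, so a user who consistently skips one topic and engages with another redirects what the system serves.

```python
# Hypothetical sketch of the user/algorithm feedback loop described above.
# No real platform's API is implied; names and parameters are invented.

class EngagementFeed:
    """Toy recommender: boosts topics the user engages with, decays the rest."""

    def __init__(self, topics, lr=0.3):
        self.scores = {t: 1.0 for t in topics}  # start with equal weights
        self.lr = lr  # how strongly one signal moves a weight

    def recommend(self):
        # Serve the topic the model currently believes the user wants most.
        return max(self.scores, key=self.scores.get)

    def observe(self, topic, engaged):
        # Feedback step: engagement raises a topic's weight, skipping lowers it.
        delta = self.lr if engaged else -self.lr
        self.scores[topic] = max(0.0, self.scores[topic] + delta)


feed = EngagementFeed(["politics", "science", "gossip"])
# A user who deliberately skips gossip and engages with science
# redirects the algorithm, whatever its default tendencies.
for _ in range(3):
    feed.observe("gossip", engaged=False)
    feed.observe("science", engaged=True)

assert feed.recommend() == "science"
```

The point of the sketch is the direction of control: the algorithm optimizes its goal from whatever signals the user supplies, so intentional signals steer it — and a platform that blocks such steering leaves the user only the exit option.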

Enhancing the interaction between user and AI system does not address users' tendency to favor their own perspective and suppress other information.  Nor does it address their tendency to accept the viewpoints of others with a similar perspective, even when those viewpoints rest on false observations.  The fragmentation of our social system is not caused by the expansion of social media, though social media facilitates diverse social networks.  The Left has generally viewed this fragmentation as a great disaster when it fosters conspiracy-theory sites like QAnon or right-wing militant groups; the Right views it the same way when the result is the "cancel culture" of narrowly focused Liberal groups.

Educated viewers of TV do not believe most of the information promoted in television ads, nor should they believe much of the information promoted in internet ads, or even internet "postings" that purport to be shared by "friends" but are really ads, sometimes from bots.  Viewers are bombarded with false information in all media and bear the responsibility for recognizing and dealing with it.  The only real solution to the integrity of internet communication is educated participation by users who are mindful of its limitations.  This requires media viewers who are informed about these processes and can manage their own interaction with the system.  The training must begin at an early age, because media actively draw young viewers into passive viewing habits.  Unless the country supports these changes in users, the process will not be managed.

IN A DEMOCRACY, THE PEOPLE MUST TAKE RESPONSIBILITY FOR POLITICAL ACTION, WHICH INCLUDES THEIR INTERACTION WITH THE VARIOUS FORMS OF MEDIA USED TO INFLUENCE AND MANIPULATE THEM.  WITHOUT THIS, DEMOCRACY IS THREATENED.

