Democracy in an age of algorithms


Defense Innovation Board: Richard Murray, Marne Levine, Eric Schmidt, Michael McQuade, Jennifer Pahlka, Milo Medin

Algorithms, our Standard Models, and Us

by Bruce Camber. Also see: Compilations of Concepts; Foundational Concepts & First Principles

Background: In the opening remarks of the Defense Innovation Board (DIB) public meeting* in the heart of Silicon Valley (USA), the question was asked, “How do we continue to be the arsenal of democracy in an age of algorithms?”

The simple answer is, “Know your algorithms.” It would also help to know how those algorithms work with each other within all the different models we use to make sense of this world and universe. We have models for just about every system involving people and things. Yet, although we have models, algorithms, and scales for almost everything, our understanding of algorithms is limited. Consider our two most pivotal intellectual constructions, our “Standard Models”: one for particle physics on the small-scale side of things, the other for cosmology on the very large scale. Both have rather limited algorithmic models that connect to the algorithms we use to define ourselves.

Natural Intelligence. The degree to which we have integrated all our models as a working system is a measure of our intelligence and the integrity of our knowledge systems.

Such an integration is pivotal to our understanding and to the defense of our people. Our two Standard Models do not yet share a known, consistent scale with each other, and no current scale runs from either Standard Model to our human-systems scales, where the greatest diversity of algorithms is at work. Add to that challenge the fact that, in deep learning, our artificial-intelligence systems now generate their own algorithms from the analysis of thousands, millions, and even billions upon billions of actions and records; today we appear to have no way to inspect the inner workings of those algorithms.

Achilles’ heel. This lack of integration of systems, and of our understanding of that integration (explainability), is a weakness within the defense industry; and in light of the work of the DIB (and of groups like Air Force Lt. Gen. Jack Shanahan’s Algorithmic Warfare Cross-Functional Team), it is a weakness for our nation’s defense. Eventually we will all have to engage quaternion-and-octonion space, yet for now, just learning a little about base-2 exponentiation might free us from being circumscribed by space and time.

Base-2. One of the simplest algorithms is base-2 exponentiation, yet most people know little about this simple growth of numbers. When applied to the Planck base units, the two Standard Models are incorporated, and that has the potential to open pathways to every algorithm in use today, the known and the unknown, including those with uncountably infinite sets of explanations.
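A minimal sketch of base-2 exponentiation as an algorithm: each step simply doubles the prior value, and the growth quickly becomes enormous.

```python
def doublings(start, steps):
    """Return the sequence start, 2*start, 4*start, ... after `steps` doublings."""
    values = [start]
    for _ in range(steps):
        values.append(values[-1] * 2)
    return values

# Ten doublings of 1 already exceed a thousand (2^10 = 1024);
# 202 doublings exceed 10^60.
print(doublings(1, 10)[-1])    # 1024
print(doublings(1, 202)[-1] > 10**60)
```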

To begin to get a sense of base-2 exponentiation, first look at its application using the two smallest units of measurement, Planck Length and Planck Time. Then include mass and charge. These units are all generated from dimensionless constants and represent baseline units that describe our reality. If one stops long enough to think about it, once Planck Mass and Planck Charge are added, this is, logically and mathematically, a possible starting point for the universe. Of course, many other factors are involved, but these four provide a baseline for analysis.
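As a hedged sketch of this starting point, the four Planck base units can be doubled together. The numeric values below are approximate CODATA figures, not part of the original text.

```python
# Approximate CODATA values for the four Planck base units.
PLANCK_UNITS = {
    "length (m)": 1.616e-35,
    "time (s)":   5.391e-44,
    "mass (kg)":  2.176e-8,
    "charge (C)": 1.876e-18,
}

def at_notation(n):
    """All four base units after n base-2 doublings (one 'notation' per doubling)."""
    return {name: value * 2 ** n for name, value in PLANCK_UNITS.items()}

for name, value in at_notation(10).items():
    print(f"notation 10, {name}: {value:.3e}")
```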

A big bang is replaced with a quiet expansion. This new model evolves naturally. It is initially superconductingly cold and smooth, densely packed like a neutron star; the volume and structure are defined by Planck Spheres (discussed in earlier articles). When these four Planck base units double, then double again, and again, and again, in just 202 doublings (also called notations or steps, each a discrete group) the current age and size of the universe emerge. By notation 67, the CERN scale begins its first measurements of a length. The first possible measurement of a unit of time begins within notation 84. The model runs from superconducting cold (below notation 107) to hot enough for the quark-gluon plasma (around notation 137). Within the 143rd notation the first second emerges, and the first year within the 169th doubling (at about the size of the solar system). Actual galaxy formation begins only at the 197th notation (the doubling that begins at about 343 million years). With only 202 doublings, this is primarily a scale of the very early universe, which is largely unknown and subject to a widely divergent array of guesses.
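The notation numbers above can be checked with a one-line calculation: the notation for a given duration is the base-2 logarithm of that duration divided by Planck Time. This sketch uses approximate values (Planck time, one Julian year, a 13.8-billion-year universe), which are assumptions, not figures from the text.

```python
import math

T_PLANCK = 5.391e-44   # Planck time in seconds (approximate CODATA value)
YEAR = 3.156e7         # one year in seconds (approximate)

def notation(duration_s):
    """How many base-2 doublings of Planck time reach the given duration."""
    return math.log2(duration_s / T_PLANCK)

print(f"one second : notation {notation(1.0):.2f}")          # ~143.7
print(f"one year   : notation {notation(YEAR):.2f}")         # ~168.7
print(f"13.8 Gyr   : notation {notation(13.8e9 * YEAR):.2f}")  # ~202.3
```

The first second is crossed just past notation 143, the first year during the 169th doubling, and 13.8 billion years falls within the 202nd to 203rd doubling, consistent with the landmarks in the paragraph above.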

The First 64-Doublings Matrix. Most significantly, the first 64 doublings of this scale have not been engaged by the academic, scientific, and scholarly communities. Here is our simplest scale or progression, and very few people know about it or understand it. Yet this scale is a logical, mathematical (algorithmic) ordering of basic information in a most fundamental way. One can make the argument that a finite-infinite bridge emerges here as the most basic three, space-time-light, become even more closely aligned. The perceptions of past-present-future become notation-centric, and all notations are active and available from any other notation. Yes, these 64 notations are the domain of the simplest mathematics, where algorithms logically begin to emerge.
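Why these 64 doublings go unengaged can be illustrated numerically: doubling an approximate Planck Length 64 times still yields a length below anything directly measured, while notation 67 lands near the femtometre (proton) scale referenced above as the CERN scale. The constant here is an assumption, not from the text.

```python
PLANCK_LENGTH = 1.616e-35  # metres (approximate CODATA value)

def length_at(n):
    """Planck Length after n base-2 doublings."""
    return PLANCK_LENGTH * 2 ** n

print(f"notation 64: {length_at(64):.3e} m")  # ~3.0e-16 m, below direct measurement
print(f"notation 67: {length_at(67):.3e} m")  # ~2.4e-15 m, roughly the proton scale
```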

This scale is important to know and to have within our arsenal in the defense of democracy.

The Challenge. Most of the problems that require our Department of Defense arise when the foundations of understanding are not known, not mutually shared, not specified, or not in sync. Almost all of these foundations appear to be relative and rather disconnected. The depth and breadth of our understanding are limited to the scale at which our algorithms are applied, and that scale itself is often poorly defined. Even within those two most important Standard Models, there is no defined or understood algorithmic scale that mathematically connects to any other system. When this base-2 scale is applied to these two major models, the first such algorithmic scale will be established, and others can follow.

How can we effectively communicate? We need to learn the nature of these scales and deepen our understanding of “mutual understanding.” And we need this base-2 application as an addendum to our Standard Models if we are to begin to grasp the fullness of understanding.


* Wednesday, July 11, 2018, at the DOD’s Defense Innovation Unit Experimental (DIUx), located right alongside NASA’s Ames Research Center in Mountain View, California.

This article is a follow-up to my comments during the Public Discussion; those who spoke were limited to two minutes. – BEC


The near-term challenges:

(these new pages are under construction)

1. Recruitment / recruit training / military-intelligence training. Using expanded gamification designs, those with the highest levels of integration across multiple levels of scores are identified while still in high school. Older people with very high levels of integration are actively engaged. All military personnel are engaged to begin creating a level set for basic knowledge within each grade of military engagement.
2. AI, More Natural than We Know: Understanding the foundations of algorithms beginning with the Euler Identity and the mechanisms for computing…
3. Quantum computing and natural bridging: Within the continuity equations of base-2, a current challenge is to identify an actual computational substrate such that quantum mechanics accesses the metrics that create inertial motion. It is believed that our higher level processes in physics can only access this substrate process through the mathematics of algorithms and the functions of prime numbers.
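Item 2 above invokes the Euler identity, e^(iπ) + 1 = 0, as a foundation for algorithms. A quick numerical check of the identity in Python (the identity itself is exact; the residual below is only floating-point error):

```python
import cmath

# Euler's identity: e^(i*pi) + 1 should equal 0 exactly.
residual = cmath.exp(1j * cmath.pi) + 1

# The magnitude of the residual is effectively zero (on the order of 1e-16).
print(abs(residual))
```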



Key Questions from the homepages from which this page has come:

  1. Is our intellectual depth being circumscribed by our two Standard Models?
  2. Shall we revisit our structure for scientific revolutions?
  3. Can these concepts be tested using rather simple formulas?
  4. Does measurement qua measurement actually begin with pure math and logic?
  5. Is “infinitely-hot, infinitely-dense, infinitely-small” the wrong place to start?
  6. What is the deep nature of growth?
  7. Are our imaginations working overtime?
  8. What is an inertial frame of reference in light of 202 notations?
  9. Are some concepts “first principles”?
  10. Can Turok, Arkani-Hamed and Tegmark open up new conceptual frames of reference?
  11. What is pi that we are mindful of it?
  12. Ask the penultimate questions:  What is finite? What is infinite?
  13. Are we asking enough “what if” questions?
  14. Who is on our team? To whom do we turn?
  15. What has been the driving vision?
  16. What is the fabric of the universe?
  17. Are there rules for our roads?  What are they?
  18. Is the universe exponential? Is Euler’s identity spot on?
  19. Is this model built on something even faster than exascale computing?
  20. Does the universe go on forever or just as far as the current expansion?
  21. Is there a better way to keep track of all these writings?
  22. Who among us is really and truly in a dialogue with the universe?
  23. Why?  Then as a child, ask the question again, Why? And again, ask, “Why?”
  24. Have there been summaries of these ideas? What have we missed?
  25. Are the 202 doublings still a virtually unexplored area for research?
  26. The arrogance of language: How do we know what we know and don’t know?
  27. What are the most important qualities of infinity?
  28. Does the original homepage (January 2012) anticipate the future?