Following the work of Barry N. Taylor, NIST, USA


National Institute of Standards and Technology (NIST): Barry N. Taylor of NIST’s Data Center joined David B. Newell and Peter J. Mohr of the Atomic Physics Division of NIST’s Physical Measurement Laboratory to prepare the “2014 CODATA recommended values.” These values are generally recognized worldwide for use in all fields of science and technology. They became available on 25 June 2015, replaced the 2010 CODATA set, and are based on all of the data available through 31 December 2014.

Second email: 12 May 2022 at 3:30 PM

Dear Dr. Barry Taylor:

Six years ago, I sent a note to you about our work. It was a plea for help to interpret our data. We thought, “Math is math and geometry is geometry and logic is logic, so what are we doing wrong?”

Our most recent homepage makes reference to you and your team, and I thought you’d want to know; it is the long-term URL.

If I could do it all over again, I would have requested a meeting with you all back in 2012, just weeks after we began our journey into this infinitesimal universe. It has been a long ten years of challenges and discoveries. Thank you.



First email: Jan 14, 2016, 5:42 PM

Dear Dr. Barry Taylor:

We are now working extensively with the NIST values for dimensionless constants. We found some very interesting work in which the values of the physical constants are generated using pi, cubic close packing, number density, and the isoperimetric quotient.

Now that may sound like we might know what we are talking about! 

We are just simple high school folks who have spent the last four years pondering constants and universals because we backed into it all through a very simple exercise while studying the Platonic solids. We divided all the edges of a simple tetrahedron in half, connected the new vertices, and found four half-sized tetrahedra in the corners and an octahedron in the middle. We then divided the octahedron the same way, and unlike Zeno, we knew we could stop at the Planck Length. To make things consistent, we started with the Planck base units and multiplied by 2 until we reached the Observable Universe and the Age of the Universe.
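The subdivision described above can be checked with a little arithmetic: halving the edges of a regular tetrahedron of edge 2 yields four corner tetrahedra of edge 1 plus one central octahedron of edge 1, and their volumes sum exactly to the original. A minimal sketch using the standard volume formulas (the function names are my own):

```python
import math

def tetra_volume(edge):
    # Volume of a regular tetrahedron: a^3 / (6 * sqrt(2))
    return edge ** 3 / (6 * math.sqrt(2))

def octa_volume(edge):
    # Volume of a regular octahedron: sqrt(2) * a^3 / 3
    return math.sqrt(2) * edge ** 3 / 3

# Halving the edges of a tetrahedron of edge 2 gives
# four corner tetrahedra of edge 1 and one central octahedron of edge 1.
whole = tetra_volume(2)
parts = 4 * tetra_volume(1) + octa_volume(1)
print(whole, parts)  # the two volumes agree
```

This confirms the decomposition accounts for the whole figure: the four corner tetrahedra fill half the volume and the octahedron fills the other half.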

There were just 202 notations to catalog everything, everywhere, for all time. We thought it was remarkable until we couldn’t find a professional version of our very rough model on the web or in textbooks. We discovered Kees Boeke’s base-10 work, but it had no lower or upper boundary, it didn’t expand the way bifurcation theory does, it had no geometry, and it didn’t use the Planck base units to define it all. Our first 67 notations became a truly small-scale universe. Although that domain is infinitesimally small and discounted by academia, we found many ways to impute meaning.
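The count of roughly 202 notations can be reproduced directly: it is the base-2 logarithm of the ratio of the age of the universe to the Planck time. A quick sketch, assuming commonly cited approximate values (Planck time about 5.39e-44 s, age of the universe about 13.8 billion years); the exact count shifts slightly with the inputs chosen:

```python
import math

PLANCK_TIME_S = 5.391e-44             # Planck time in seconds (approximate)
AGE_OF_UNIVERSE_S = 13.8e9 * 3.156e7  # ~13.8 billion years in seconds

# Number of base-2 doublings from the Planck time to the age of the universe
doublings = math.log2(AGE_OF_UNIVERSE_S / PLANCK_TIME_S)
print(round(doublings))  # about 202
```

The same doubling applied to the Planck length lands in the same neighborhood, which is why one chart of a couple hundred notations can span the entire observable universe.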

Freeman Dyson of the Institute for Advanced Study and Frank Wilczek of MIT have encouraged us. Perhaps it is better for the naive to make idiots of themselves. Yet, four years later, we are still at it, and the closest anybody has come to being harsh was to tell us that our work is idiosyncratic. We certainly knew that much.

Your thoughts would be profoundly appreciated.  Thank you.

Most sincerely,
Bruce Camber
New Orleans