# Dark matter and the Second Law

As you might or might not know, I should probably rightly be regarded as a “fringe” hobbyist quantum gravity researcher. I have attempted to publish a couple of papers on the topic of composite monopole quasi-gravitons, which have not gone over well in peer review. The latest attempt, which I think at least manages to distill my hypothesis down to its most fundamental exemplar mathematical form, was ultimately rejected largely on the basis (in my understanding) that my waves in the metric, while an independently confirmed solution to the Einstein field equations according to the reviewer, have vacuum Ricci scalar curvature (R = 0) and therefore must be regarded as a “coordinate artifact” rather than true gravity waves.

In great haste, I made a small but critical mathematical error in coordinate choice in reaching the above, which I must thank the reviewer for both correcting and looking past; however, I continue to assert that, with this error corrected, the argument of the paper is otherwise wholly unaffected, including my original preemptive response to the R = 0 criticism, which seems to have been largely ignored. Tidal forces have since been reported to carry the signature of gravity waves, and I am confident my waves, with R = 0, qualify as such a case, on the basis of my own substantial work.

In the course of the work referenced above, it quickly became apparent that my work relies (to my supreme dissatisfaction) on the second law of thermodynamics in order to explain how black holes can ostensibly exist for any longer than a moment before a (“ultraviolet…”) catastrophic breakdown, entirely, into monopole gravity wave radiation. Certainly, it follows that quantum monopole waves/particles with enough energy to fit into their own Schwarzschild radii, to be black holes themselves, would violate the black hole thermodynamics of Bekenstein and Hawking, so we “happily” assume the second law. From this, the (asymmetric) “arrow of time” seems to arise thermodynamically, “thankfully.”

In case my rhetorical parentheticals and quotations don’t make it clear, this assumption of thermodynamic law has been my biggest self-circumspect reservation about my own work. Thermodynamics is “**emergent**,” isn’t it? Unifying gravity with quantum mechanics, fully describing all known fundamental forces quantum mechanically, the laws of thermodynamics and statistical mechanics should follow in a manner “*quod erat demonstrandum*,” shouldn’t they?

One of my informal teachers (and favorite persons in the world) once repeated a point of confusion to me on the application of the black hole thermodynamics work of Bekenstein and Hawking, at its interface with quantum information theory and computer science: “A hard drive filled with ‘information’ should have more mass than a zeroed hard drive.” Really? I could almost believe you. I could believe you, if your hard drive were a thermodynamically “closed” system that approached or satisfied the Bekenstein bound before you manipulated it further, but it’s not, and it doesn’t.

The Bekenstein bound represents both a thermodynamic limit and a quantum information theoretic limit: for a given mass/energy and bounding radius, there is a maximum “entropy.” We cannot represent more “information” than this entropy limit with a fixed mass and geometric extent. The “densest” possible representation of “information,” of “entropy,” happens to be exactly saturated by the mass, radius, and thermodynamic entropy of a literal relativistic black hole.
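To put a sense of scale on this, here is a back-of-the-envelope sketch of the bound, S ≤ 2πkRE/(ħc), applied to a consumer hard drive. The drive mass of about 0.5 kg and the bounding radius of about 5 cm are my own illustrative assumptions, not canonical figures:

```python
import math

# Physical constants (SI units, CODATA values).
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

# Assumed, illustrative figures for a consumer hard drive.
mass_kg = 0.5     # assumption: drive mass
radius_m = 0.05   # assumption: bounding radius of an enclosing sphere

energy_j = mass_kg * C**2
# Bekenstein bound in bits: S / (k ln 2) = 2*pi*R*E / (hbar * c * ln 2)
max_bits = 2 * math.pi * radius_m * energy_j / (HBAR * C * math.log(2))

tb_bits = 8 * 1024**4  # bits actually stored on a 1 TB drive

print(f"Bekenstein bound: ~{max_bits:.2e} bits")
print(f"1 TB of storage:  ~{tb_bits:.2e} bits")
print(f"fraction of bound used: ~{tb_bits / max_bits:.1e}")
```

On these assumed figures, the terabyte of actual storage sits tens of orders of magnitude below the bound, which is the sense in which a real hard drive neither approaches nor satisfies it.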

“A hard drive filled with ‘information’ should have more mass than a zeroed hard drive.” …Really, there is no reason to suspect this at all, besides an overzealous over-extension of the interpretation of the work of Bekenstein and Hawking. Following that over-extension “faithfully,” we might expect a difference of one Planck mass (in a radius of one Planck length) per computational “bit,” or really, “qubit”! Canonically, for magnetic effects in a hard drive, we expect a small mass/energy difference, but nowhere close to this. Further, we can carry the “hard drive” analogy to a hypothetical model where even these tiny magnetic potential energy differences vanish virtually or exactly, as with a “Turing machine” “memory tape” model where both binary states of a “bit” of the tape exhibit no difference in electromagnetic or even gravitational potential energy at all.
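To put numbers on that “nowhere close,” here is a sketch comparing one Planck mass per bit against a canonical magnetic energy scale. The ~1 eV per bit figure is my own rough assumption for illustration, not a measured property of any drive:

```python
import math

C = 2.99792458e8          # speed of light, m/s
G = 6.67430e-11           # Newton's gravitational constant, m^3/(kg*s^2)
HBAR = 1.054571817e-34    # reduced Planck constant, J*s
EV = 1.602176634e-19      # one electron-volt, J

planck_mass = math.sqrt(HBAR * C / G)  # ~2.18e-8 kg

bits = 8 * 1024**4  # bits in 1 TB

# Naive over-extension: one Planck mass per bit.
naive_mass_kg = bits * planck_mass

# Canonical scale: assume ~1 eV of magnetic energy per bit (illustrative),
# converted to mass via E = m c^2.
magnetic_mass_kg = bits * EV / C**2

print(f"Planck-mass-per-bit total: {naive_mass_kg:.3e} kg")
print(f"~1 eV-per-bit total:       {magnetic_mass_kg:.3e} kg")
print(f"ratio: {naive_mass_kg / magnetic_mass_kg:.1e}")
```

At one Planck mass per bit, the “full” terabyte would weigh on the order of a couple hundred metric tons more; the assumed magnetic scale is dozens of orders of magnitude smaller.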

Is there any good, plausible theoretical reason to *question* whether my mentor’s offhand remark might actually be *right*? Can we raise any physically reasonable *doubt against his wrongness*?

Suppose I have two identical terabyte (“TB”) hard drives. I start by setting all bits on both drives to the “0” state, exactly. I isolate them both entirely from interaction with their environment, except for the bare minimum interaction necessary to complete this next step on one of them: I prepare a TB of (quantum computational) “qubits” in identical, exact “50/50” superposition (such as by Hadamard gates on the |0> state). I measure this TB of qubits, with equal, independent probability of measuring “0” or “1” in every case, and I write every one of these results to a “bit” on the TB hard drive. (Say I do this completely faithfully, without regard to any “coincidence” I might notice in the results measured, and with a bare minimum of interaction with my two hard drives.)
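The preparation step can be sketched classically: measuring H|0> yields each outcome with probability 1/2, so a fair coin flip stands in for each qubit measurement. The tape length and seed below are mine, chosen for a quick run rather than a full TB:

```python
import math
import random

def prepare_tape(n_bits: int, seed: int = 0) -> list[int]:
    """Stand-in for measuring n_bits copies of H|0>: each result is an
    independent fair coin flip, written to the 'tape' as 0 or 1."""
    rng = random.Random(seed)
    return [rng.randrange(2) for _ in range(n_bits)]

def shannon_entropy_per_bit(tape: list[int]) -> float:
    """Empirical Shannon entropy (bits per symbol) of the written tape."""
    p = sum(tape) / len(tape)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

tape = prepare_tape(1_000_000)  # a modest tape, not a full TB
print(f"fraction of ones: {sum(tape) / len(tape):.4f}")
print(f"entropy per bit:  {shannon_entropy_per_bit(tape):.6f}")
```

The only point of the stand-in is that the written tape carries very nearly one full bit of Shannon entropy per physical bit, which is what the Hadamard-and-measure preparation guarantees.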

Now, I squeeze both the identically zeroed (“no information”) hard drive and the other hard drive, filled to capacity with “information,” down to *black holes*. Here’s where things get “wonky.” What you expect to happen here might depend on which side you fall on in the black hole information paradox “war,” and, in case you couldn’t already guess, I throw my lot in with Hawking in that conflict.

I assert that we hit the point of minimum volume for the two drives (collapse into black holes) one TB “sooner” for the “full” disk compared to the “empty” one. That is, the “full” disk’s Schwarzschild black hole configuration should be 1 TB larger, at 1 Planck mass per “bit,” where this is *entirely and solely* a function of the *mass/energy* of the two disks as we vary the volume, according to general relativity. If Hawking is right, my mentor Peter Schmitt is a careless genius. The disk filled to capacity with computational information *would* be 8 × 1024 × 1024 × 1024 × 1024 (the bits in 1 TB) Planck masses more massive or energetic, by argument from black hole thermodynamics and quantum information theory, by no mechanism known or expected from materials science, electromagnetism, or any other canonical physical theory. *From where could the mass possibly come?!*
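To see what the hypothesized mass gap does to the collapse, here is a sketch of the two Schwarzschild radii. The 0.5 kg baseline drive mass is an assumed figure of mine:

```python
import math

C = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # Newton's gravitational constant, m^3/(kg*s^2)
HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def schwarzschild_radius(mass_kg: float) -> float:
    """r_s = 2GM/c^2: squeeze a mass below this radius and it is a black hole."""
    return 2 * G * mass_kg / C**2

planck_mass = math.sqrt(HBAR * C / G)
bits = 8 * 1024**4  # bits in 1 TB

empty_mass = 0.5                             # assumed empty-drive mass, kg
full_mass = empty_mass + bits * planck_mass  # plus one Planck mass per bit

r_empty = schwarzschild_radius(empty_mass)
r_full = schwarzschild_radius(full_mass)
print(f"empty drive: r_s = {r_empty:.3e} m")
print(f"full drive:  r_s = {r_full:.3e} m")
```

Under the hypothesis, squeezing hits the “full” drive’s horizon at a radius several orders of magnitude larger than the “empty” drive’s, entirely because of the extra mass, exactly as general relativity demands of any mass difference, whatever its origin.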

Consider: our most widely accepted models of cosmology already suppose that *most* of the mass and energy comprising our universe take weakly interacting forms of which we have *virtually no idea what they are*: “dark matter” and “dark energy.”

As I prefaced this post, and as I have failed to convince peer reviewers, I offer that I might know exactly the form these two substances take: a composite graviton quasi-particle gas (resulting from an asymmetric mode of the metric tensor of Einstein’s gravity) and its excitation of (primarily) baryonic and leptonic fundamental rest masses above the ground state of the Higgs field vacuum. Drawing a finite hypervolume surface bounding the preparation of our hard disk “saturated with information,” slightly more (composite graviton monopole) “dark energy” goes into the hypersurface than ultimately comes out, and slightly more (baryonic rest mass excitation above the Higgs field vacuum state) comes out than goes in. These substances must follow a continuity equation (i.e., must be “locally conserved”), cannot travel faster than the speed of light, and exhibit a “coupling constant” proportional to particle rest mass times Newton’s gravitational constant, in the ideal.
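For concreteness, the local-conservation requirement is just the standard continuity equation, written here in ordinary flat-space notation; the density ρ and flux **j** for the proposed quasi-particle gas are of course the speculative part:

```latex
\frac{\partial \rho}{\partial t} + \nabla \cdot \mathbf{j} = 0,
\qquad g_i \propto G \, m_i
```

where the second relation states the proposed ideal coupling: each species’ coupling constant proportional to its rest mass times Newton’s gravitational constant.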

This would be worthless, raving speculation, if it could not be tested; tell me it cannot be tested.