When I published my NP-Hardness preprint, the obvious question was: Is this a property of my specific architecture – or something deeper?
Now I have the answer.
In a new technical appendix, I applied a completely different system to the same data: a classical GMDH (Group Method of Data Handling) network based on the work of Alexei Ivakhnenko (1968), the forgotten "Father of Deep Learning" who developed self-organizing polynomial networks decades before backpropagation existed.
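For readers unfamiliar with GMDH: each layer enumerates pairwise combinations of inputs, fits a small polynomial (classically the Kolmogorov–Gabor quadratic) to each pair by least squares, and keeps only the candidates that survive an external validation criterion. Here is a minimal sketch in numpy; the quadratic form, MSE-based selection, and `keep` width are illustrative assumptions, not the exact configuration used in the appendix:

```python
import numpy as np

def fit_quadratic_pair(x1, x2, y):
    """Least-squares fit of the Kolmogorov-Gabor quadratic:
    y ~ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def gmdh_layer(X_train, y_train, X_val, y_val, keep=8):
    """Build one GMDH layer: fit all input pairs on the training split,
    rank them by validation MSE (the 'external criterion'), keep the best."""
    n_features = X_train.shape[1]
    candidates = []
    for i in range(n_features):
        for j in range(i + 1, n_features):
            coef = fit_quadratic_pair(X_train[:, i], X_train[:, j], y_train)
            A_val = np.column_stack([
                np.ones(len(X_val)), X_val[:, i], X_val[:, j],
                X_val[:, i] * X_val[:, j], X_val[:, i]**2, X_val[:, j]**2,
            ])
            mse = np.mean((y_val - A_val @ coef) ** 2)
            candidates.append((mse, i, j, coef))
    candidates.sort(key=lambda c: c[0])
    return candidates[:keep]  # survivors become the next layer's inputs
```

Stacking such layers until the external criterion stops improving is what makes the method self-organizing: depth and connectivity emerge from the data rather than from a fixed architecture.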
The result is striking:
❌ R² = -0.0055 across all 98 neurons (effectively zero predictive power: the fits do marginally worse than predicting the mean)
✅ Full structural connectivity (9 layers, complete pairwise coupling)
✅ Invariant activation (49-50 neurons active per iteration, near-zero variance)
✅ Uniform switching (mean: 48.0 switches per neuron)
Translation: The GMDH network finds nothing to learn – yet the system is fully coupled, perfectly stable, and globally coordinated.
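For concreteness, diagnostics of this kind could be computed roughly as follows. This is a sketch under stated assumptions: it takes a binary activation trace `states` of shape (iterations, neurons), and the exact definitions in the appendix may differ. The `r2_score` helper matches the standard coefficient of determination (equivalent to `sklearn.metrics.r2_score`):

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination; goes negative when the model is
    worse than simply predicting the mean of y_true."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def collapse_diagnostics(states):
    """states: binary array of shape (iterations, neurons).
    Returns activation and switching statistics per the text above."""
    active_per_iter = states.sum(axis=1)                      # ~49-50 if invariant
    switches = np.abs(np.diff(states, axis=0)).sum(axis=0)    # state flips per neuron
    return {
        "active_mean": active_per_iter.mean(),
        "active_var": active_per_iter.var(),   # near zero => invariant activation
        "switches_mean": switches.mean(),      # ~48.0 => uniform switching
    }
```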
This is the signature of geometric collapse: Structure without function. Coupling without causation. Order without computation.
The information-geometric manifold is not an artifact of my architecture. It is a property of the informational substrate itself.
Appendix DOI: 10.5281/zenodo.17849616
Appendix Link: https://doi.org/10.5281/zenodo.17849616
#Physics #ArtificialIntelligence #InformationGeometry #ComplexityTheory #Research #SpinGlass #TrauthResearch #GMDH #Ivakhnenko #DeepLearning #NPHardness



