
Monday, December 8, 2025

π†π„πŽπŒπ„π“π‘πˆπ‚ π‚πŽπ‹π‹π€ππ’π„ π‚πŽππ…πˆπ‘πŒπ„πƒ: 𝐍𝐏-π‡πšπ«π 𝐒𝐩𝐒𝐧-𝐆π₯𝐚𝐬𝐬 𝐜𝐨π₯π₯𝐚𝐩𝐬𝐞𝐝.

[Figure: GMDH network structure]


When I published my NP-Hardness preprint, the obvious question was: Is this a property of my specific architecture – or something deeper?

Now I have the answer.


In a new technical appendix, I applied a completely different system to the same data: a classical GMDH (Group Method of Data Handling) network based on the work of Alexei Ivakhnenko (1968) – the forgotten "Father of Deep Learning", who developed self-organizing polynomial networks decades before backpropagation existed.
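For readers who have not met GMDH before, here is a minimal sketch of one self-organizing layer in Python. It assumes the classic two-input Ivakhnenko polynomial and in-sample R² scoring; the function names and the `keep` parameter are illustrative simplifications, not the exact configuration used in the appendix.

import numpy as np
from itertools import combinations

def ivakhnenko_unit(x1, x2, y):
    """Fit the classic quadratic two-input polynomial by least squares."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ coef  # the unit's prediction of y

def gmdh_layer(Z, y, keep=8):
    """Build one layer: fit a unit per input pair, keep the best by R^2."""
    ss_tot = np.sum((y - y.mean()) ** 2)
    units = []
    for i, j in combinations(range(Z.shape[1]), 2):
        pred = ivakhnenko_unit(Z[:, i], Z[:, j], y)
        r2 = 1.0 - np.sum((y - pred) ** 2) / ss_tot
        units.append((r2, pred))
    units.sort(key=lambda u: -u[0])  # self-organization: the fittest pairs survive
    best = units[:keep]
    return np.column_stack([p for _, p in best]), [r2 for r2, _ in best]

Stacking such layers – each layer's surviving outputs become the next layer's inputs – is the deep polynomial construction Ivakhnenko described; a production run would score R² on a held-out validation split rather than in-sample.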



The result is striking:


❌ R² = -0.0055 across all 98 neurons (zero predictive power; see the formula after this list)

✅ Full structural connectivity (9 layers, complete pairwise coupling)

✅ Invariant activation (49-50 neurons active per iteration, near-zero variance)

✅ Uniform switching (mean: 48.0 switches per neuron)
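To read the first bullet precisely (the formula referenced in the list): R² compares a model's squared residuals with those of a constant mean predictor,

R^2 = 1 - \frac{\sum_i (y_i - \hat{y}_i)^2}{\sum_i (y_i - \bar{y})^2},

so any negative value – including -0.0055 – means the polynomial units predict marginally worse than simply outputting the mean: zero predictive power in the strict sense.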


Translation: The GMDH network finds nothing to learn – yet the system is fully coupled, perfectly stable, and globally coordinated.
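The activation and switching statistics above are straightforward to reproduce from an activation log. A minimal sketch, assuming the log is a binary (iterations × neurons) array – the array name and the random stand-in data here are hypothetical:

import numpy as np

# Hypothetical stand-in for the recorded activation log:
# A[t, n] = 1 if neuron n was active at iteration t.
rng = np.random.default_rng(0)
A = rng.integers(0, 2, size=(100, 98))

active_per_iter = A.sum(axis=1)  # neurons active at each iteration
switches_per_neuron = np.abs(np.diff(A, axis=0)).sum(axis=0)  # 0<->1 flips per neuron

print(f"active per iteration: mean={active_per_iter.mean():.1f}, var={active_per_iter.var():.3f}")
print(f"switches per neuron: mean={switches_per_neuron.mean():.1f}")

On the real log, near-zero variance in the first statistic and a tight, uniform distribution in the second are exactly the invariance and uniformity reported above.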

This is the signature of geometric collapse: Structure without function. Coupling without causation. Order without computation.


[Figure: Switching activity pattern]


The information-geometric manifold is not an artifact of my architecture. It is a property of the informational substrate itself.



[Figure: Switching frequency distribution]


📄 Appendix DOI: 10.5281/zenodo.17849616

📄 Appendix Link: https://doi.org/10.5281/zenodo.17849616


#Physics #ArtificialIntelligence #InformationGeometry #ComplexityTheory #Research #SpinGlass #TrauthResearch #GMDH #Ivakhnenko #DeepLearning #NPHardness
