# N-gram frequencies

I thought it might be interesting to see what the n-gram frequencies are for the tapes of various machines, i.e., how often each length-n sequence of symbols appears on the tape. After hacking around a bit, I ended up looking at windows of size 5 around the TM head (the current symbol plus 2 symbols of context in each direction), together with the TM state, and computing the entropy of that window over time. The entropy is roughly the number of bits needed to encode an average "window" under an optimal encoding. Basically, low entropy means the windows are more predictable (less variable) and high entropy means they are more "random" (more variable). Here are the results for Skelet's list, sorted from low entropy to high:
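For concreteness, the entropy here is the ordinary Shannon entropy of the empirical distribution of samples. A minimal sketch of that base computation (not the actual code, just the definition):

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy in bits of the empirical distribution of `samples`."""
    counts = Counter(samples)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A fully predictable stream has 0 bits of entropy; four equally likely
# samples need 2 bits each.
entropy(["A"] * 100)           # 0.0 bits (fully predictable)
entropy(["A", "B", "C", "D"])  # 2.0 bits
```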

``````
$ time python Code/ngram_entropy.py Machines/5x2-skelet | sort -nk2
1RB0LD_1RC0RE_1LA0RA_0LA1LD_0RD1RZ    0.065170   2.624923   2.885602
1RB0LE_0RC0LC_1LA0RD_1RC1RD_1LB1RZ    0.069403   2.531400   2.809011
1RB1LE_0RC0RB_1LD1RA_1LC1RZ_1RC0LA    0.073719   1.753842   2.048718
1RB1RD_1LC0RC_1RA1LD_0RE0LB_1RZ1RC    0.210259   3.032768   3.873805
1RB0RD_0LC1RA_0RA1LB_1LB0RE_1RZ1RD    0.220661   2.875455   3.758101
1RB0LE_0RC1LD_0LD1RB_1LB0LA_1RZ1LA    0.220662   2.875586   3.758235
1RB1RA_1LC0LC_0LE0RD_0RA1LB_1LD1RZ    0.244216   2.480900   3.457762
1RB1RZ_0LC1RD_1LD1LC_1RE0RE_0RA0LB    0.244739   2.483247   3.462203
1RB0RB_0RC0LD_1RD1RZ_0LE1RA_1LA1LE    0.251298   2.520131   3.525323
1RB1RZ_0LC1RE_0LD1LC_1RA1LB_0RB0RA    0.267918   2.732411   3.804085
1RB0LD_1RC1RA_1LA0RC_1RZ1LE_0LA1LB    0.275482   2.612768   3.714697
1RB0LC_1LC1RB_1RD1LA_1RE0RC_1RZ0RD    0.285419   3.021045   4.162722
1RB0LA_1LC0RD_1LA1LB_1RZ1RE_0RB1RC    0.287281   2.632414   3.781536
1RB0LE_0RC1RB_1RD0RB_1LA0LB_0LD1RZ    0.350166   2.870047   4.270711
1RB0RA_1LC1RZ_0LC0LD_1RE0RB_0RE1RA    0.350893   2.079874   3.483446
1RB0LE_1RC1RZ_0RD0RC_1LD0LA_0LB0LE    0.372895   2.376806   3.868385
1RB1RZ_0RC0RB_1LC0LD_1RA0LE_0LA0LE    0.373058   2.377073   3.869306
1RB0LB_1LC0RD_1RB0LD_1LA1RE_0RC1RZ    0.424470   3.039838   4.737720
1RB0LC_1LA0RC_1LD1RE_1RB0LB_0RA1RZ    0.424470   3.039838   4.737720
1RB1LD_1RC0RB_1LA1RC_1LE0LA_1LC1RZ    0.451514   2.975599   4.781655
1RB1RZ_1RC1LB_1LD1RE_1LB0LD_1RA0RC    0.460049   2.976033   4.816229
1RB0LA_1RC1RZ_0RD1LE_1RE0RA_1LC0LA    0.463559   2.981931   4.836166
1RB0RD_0LC1RA_1LA0LD_1LE0RD_1LB1RZ    0.463577   2.981866   4.836176
1RB0LE_1RC0LA_1LD0RB_1LB1LD_1RZ0LC    0.481513   2.949394   4.875448
1RB0LD_1LC0RA_1LA1LC_1RA0LE_1RZ0LB    0.485308   2.923904   4.865136
1RB1RA_1LC0RD_1RA0LB_1LB0RE_1RZ0RC    0.485423   2.923645   4.865335
1RB1RZ_0RC0LD_1RD0LE_1RE0RA_1LC0LD    0.486168   2.928137   4.872809
1RB1RD_1LC1RZ_1LE1LD_1RE0LC_1RA0RD    0.520271   2.903459   4.984542
1RB1RD_1LC0LD_1LE1LD_1LB0RA_1RA1RZ    0.520280   2.903392   4.984510
1RB0RE_1RC1RE_1LD1RZ_1LA1LE_1RA0LD    0.520460   2.903689   4.985530
1RB1LC_0RC0RB_1LD0LA_1LE1RZ_1LA0LA    0.521675   3.001241   5.087942
1RB0RA_0LC1RA_0LD1LC_1RA0LE_1RZ0LA    0.522168   2.969088   5.057759
1RB1LC_0RC0RB_1LD0LA_1LE1RZ_1LA1RE    0.529447   3.039388   5.157176
1RB1LC_1LA1RB_1LD0LA_1RE0RD_1RZ0RB    0.533572   2.932802   5.067089
1RB0LE_1RC0RB_0LD1RB_0LA1LD_1RZ0LB    0.539007   2.967105   5.123134
1RB1LC_0RC0RB_1LD0LA_1LE1RZ_1LA1RA    0.560370   2.994851   5.236332
1RB0LA_0RC1RZ_1LC1RD_1RE1LA_0RB0LD    0.564641   2.954846   5.213408
1RB0LD_0RC0RE_1LC0LA_1LA1RC_0RB1RZ    0.621502   2.897843   5.383851
1RB0RA_0LC1RA_1RE1LD_1LC0LD_1RZ0RB    0.628101   3.065073   5.577475
1RB0RE_1LC0RA_0LC1LD_1LA1RZ_0RB0LD    0.660916   2.793557   5.437221
1RB1RZ_1LC0LE_1RD0LB_0RD1RA_0LC0RA    0.661757   2.793634   5.440663
1RB0LA_1LC1RD_1LA0LC_0RD0LE_1RA1RZ    0.681545   3.027715   5.753897
1RB1RZ_0RC1RD_0LD1RC_1LE0RA_1RA0LE    0.710158   2.916590   5.757222

real	0m4.159s
user	0m4.124s
sys	0m0.025s
``````

Lowest entropy: https://bbchallenge.org/1RB0LD_1RC0RE_1LA0RA_0LA1LD_0RD1RZ
Highest entropy: https://bbchallenge.org/1RB1RZ_0RC1RD_0LD1RC_1LE0RA_1RA0LE

Details: there are three numbers listed after each machine. The second number `h1` is the entropy for window size 1 (i.e., the entropy of the current state and current symbol); its max is log2(9) ≈ 3.17 (attained if the machine used all 9 non-halt transitions equally often). The third number `h5` is the entropy for window size 5; its max is log2(9) + 4 ≈ 7.17. The first number is the average entropy increase per symbol of window (`(h5 - h1) / 4`), which I think is the most interesting number since it's not tied to a specific window size. It has a max of 1 (attained if knowing a window told you nothing about what comes next).
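The script itself isn't shown above, but the core computation might look something like the following hypothetical sketch (not the actual `Code/ngram_entropy.py`; it assumes a simulator produces `history`, a sequence of `(state, tape, head)` snapshots with `tape` a dict from position to symbol, defaulting to 0):

```python
import math
from collections import Counter

def window_entropy(history, radius=2):
    """Entropy in bits of (state, window) samples over a TM run.

    The window is the symbol under the head plus `radius` symbols of
    context on each side, so radius=2 gives a size-5 window.
    """
    counts = Counter()
    for state, tape, head in history:
        window = tuple(tape.get(head + i, 0) for i in range(-radius, radius + 1))
        counts[(state, window)] += 1
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# h1 would be window_entropy(history, radius=0) (window size 1) and
# h5 would be window_entropy(history, radius=2) (window size 5); the
# per-symbol rate reported above is then (h5 - h1) / 4.
```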


And here's what it looks like for 20 random machines from my holdouts:

``````
$ time python Code/ngram_entropy.py holdout.sample.txt | sort -nk2
1RB1LC_1LA1RB_0LB1LD_1LA0RE_1RZ1RD    0.060360   2.632477   2.873915
1RB1RE_0LC1RZ_1RE1LD_1LE0LD_0RA1RC    0.090153   2.631444   2.992057
1RB1LD_0RC1RB_1LD1RE_1LA0LB_1RZ1RD    0.136317   2.922965   3.468234
1RB0LC_1LC1RD_1LA0LC_1LE1RZ_0RA1RE    0.144981   2.931490   3.511415
1RB0LC_1RC1RZ_1RD1LC_1LC0RE_1LA1RE    0.192518   1.386243   2.156317
1RB1LA_0LC1RD_1LA0RB_1RC0RE_1RZ1LD    0.331947   2.970354   4.298141
1RB0LD_1LC0LA_1RZ1LD_0RE1LB_1RB1RE    0.397143   2.847294   4.435864
1RB1RC_1RC0LA_1LB1RD_1RZ1RE_0LB0RA    0.415552   2.707798   4.370007
1RB1RA_1LC1LA_1RA0LD_0RA1LE_1RZ0LC    0.433755   2.856580   4.591600
1RB1LA_1LC1RD_1RZ0RB_0LA1RE_1RB0RB    0.441370   2.324444   4.089925
1RB0RC_1LA0LC_1RZ1LD_0RE1LB_1LC1RE    0.441386   2.323689   4.089232
1RB1LB_0RC1LA_1RD0RE_0LB1RC_1RZ1LD    0.469857   3.000775   4.880204
1RB1LC_0LA0RE_1LD1RZ_1LA1LD_0LC0RB    0.488202   3.038504   4.991313
1RB1RD_0LC1RZ_0RE1LD_1RA0LC_1LC1RE    0.521987   2.265647   4.353595
1RB0RD_1LC0RE_0RA1LD_1RA0LD_1RZ1RC    0.538217   2.868764   5.021631
1RB1LB_1RC1RD_0RD1RZ_0LE0RD_1LA1LE    0.563444   2.500963   4.754739
1RB1LC_0LA0RE_1LD1LC_1RE1RZ_0LC0RB    0.572818   3.069518   5.360791
1RB0LA_1LC0RB_1RD1LA_0RE1RD_0LA1RZ    0.645341   2.965075   5.546438
1RB1LA_1RC0RB_1LD1RC_1LE0LD_1LA1RZ    0.661506   3.022060   5.668085
1RB1LA_0LC1RE_1RZ1LD_0RB1LC_0LA0RE    0.709517   2.999051   5.837117
``````

Interestingly, these cover a similar range (~0.06 to ~0.70).


Running on a sample of 20 TMs proven by CPS (with block size <=6):

``````
$ time python Code/ngram_entropy.py cps6.inf.sample.txt | sort -nk2
1RB0LD_1RC0RA_1LD0RE_1LA0LD_0LA1RZ    0.044593   2.147155   2.325528
1RB0LA_1LC1RE_1RZ1LD_1RE0LA_1RA0RB    0.053115   2.661584   2.874043
1RB0RA_1LC1RA_1RZ1LD_0LE1LD_1RA1LC    0.058281   2.651637   2.884763
1RB1RB_0LC0RA_1RE1LD_0LE1RZ_1LB0RC    0.063748   3.033164   3.288157
1RB1LC_1LC0RD_0LA1LA_1RZ1RE_1RB1RE    0.070964   1.695725   1.979580
1RB0RA_1LC0RE_0LD1LC_1RA1LC_1RZ1LD    0.072521   3.016125   3.306209
1RB1RA_1LC1LB_1RE1LD_0RD0LC_0RA1RZ    0.080887   1.330488   1.654034
1RB1RZ_1LC1RB_0RE0LD_0LB1RE_0RD1RA    0.090932   2.627635   2.991363
1RB1RD_1LC0LB_1RD1LB_1RZ1RE_0RA1LA    0.096941   2.663687   3.051449
1RB0RC_0RC1RE_0LD1LC_1LA0LD_0RA1RZ    0.104231   3.016161   3.433085
1RB1RE_0LC0RE_0RA1LD_0LC0LD_1RC1RZ    0.106467   2.666964   3.092833
1RB0RD_1LC0LB_1RA0LB_1RE1RZ_0RC1RD    0.222764   2.926871   3.817928
1RB0LA_0RC0RC_0LD1RE_1LA0RB_0RD1RZ    0.326563   2.889040   4.195293
1RB0LA_0RC0RD_1LA1RE_0LC1LB_1RB1RZ    0.339396   2.837638   4.195224
1RB0LA_0RC1LA_1RD1RE_0LB1RE_0RC1RZ    0.404764   2.907282   4.526337
1RB1RZ_1LC0LC_0RD1LB_1RE1RD_1RA1RC    0.565805   2.725739   4.988961
1RB0LA_1RC1LA_1LC0RD_1RZ1RE_1LB0RB    0.598393   3.014867   5.408437
1RB1RZ_0RC0RE_1LD0RA_1RB0LD_0LC1RA    0.623808   2.868375   5.363605
1RB0LA_0RC1RE_0LD0RB_1RC1LA_1RA1RZ    0.651181   2.953644   5.558369
1RB0RD_1LC1LB_1RA0LB_0RE1RD_1RZ1RA    0.674834   2.929456   5.628790
``````

It looks like the median entropy is way lower: ~0.1 (CPS) vs ~0.4 (holdouts). This makes some sense (I think), because CPS takes advantage of low entropy to build a limited model of possible configurations that doesn't include halting configs. But CPS still proved several reasonably high-entropy TMs.


And to round things out, here's the distribution for the top 100 halting 6-state TMs (5-state TMs are too small and halt before the simulation is done):

``````
$ time python Code/ngram_entropy.py halt6.txt | sort -nk2
1RB1RZ_1RC1RA_1RD0RB_1LE0RC_0LF0LD_0LB1LA    0.026431   2.053672   2.159395
1RB1RE_1RC0LD_1LB1RZ_0LE1LD_1RF0RA_1RA0LA    0.030465   2.096668   2.218530
1RB0RB_1LC1RE_0LD1LB_1LA0LF_0RA0RB_1LA1RZ    0.032283   2.079196   2.208327
1RB0RB_1LC1RE_0LD1LB_1LA0LF_0RA0RB_0RC1RZ    0.033201   2.095847   2.228653
1RB0RC_1LC0LD_1RA0LB_1LE0RA_0RA1LF_1LB1RZ    0.033496   2.125938   2.259924
1RB1RZ_1RC0RE_1LD0RB_1LB0LC_1RF0LD_0LD1RA    0.033689   2.125954   2.260709
1RB1RF_1RC0RA_1LD0RB_0LE0LC_0LA1LE_1RA1RZ    0.034318   2.121867   2.259138
1RB0RB_1LC1RE_0LD1LB_1LA0LF_0RA0RB_1LE1RZ    0.034960   2.071099   2.210939
1RB0LC_1LA1RZ_0LD1LC_1RE0RF_1RF0LF_1RA1RD    0.035018   2.109867   2.249940
1RB0LE_1RC0LB_1RD0RF_1RE1RB_1LA0RD_1RD1RZ    0.036139   2.132902   2.277456
1RB0LD_1RC0RF_1LC1LA_0LE1RZ_1LA0RB_0RC0RE    0.037035   1.844898   1.993040
1RB0RF_1LC0RA_1LD0LB_0RD1RE_1RF0LE_1RA1RZ    0.037217   2.045673   2.194543
1RB0RF_0LB1LC_1LD0RC_1LE1RZ_1LF0LD_1RA0LE    0.037219   2.045660   2.194534
1RB1LF_1LC0RD_1LA1LD_1LB0RE_0LA1RB_0LE1RZ    0.040646   2.630279   2.792863
1RB0LA_1RC1RZ_1RD0RB_1LE0RF_0LA0LD_1RD0RE    0.040797   2.047001   2.210191
1RB0LA_1RC1RZ_1RD0RB_1LE0RF_0LA0LD_1RD1RB    0.040797   2.047001   2.210191
1RB0RC_1RC1RB_1RD1RA_1RE0LF_1LD1RZ_0LA1LF    0.042673   1.298521   1.469212
1RB0LC_1LA1RF_1LD0LE_1RE0RA_1LC0RD_1RD1RZ    0.042831   2.166315   2.337641
1RB0RD_1LC0RA_1LA0LB_1RE0LC_0RA1RF_1RA1RZ    0.043483   2.155920   2.329851
1RB1RF_1RC0RA_1LD0RB_1LE0LC_1RA1LE_1RZ1RC    0.045629   2.162059   2.344574
1RB1RF_1RC0RA_1LD0RB_1RE0LC_1RA1LE_1RZ1RC    0.046920   2.177552   2.365232
1RB0RF_1RC0RA_1LD0RB_1LE0LC_1RA1LE_1RZ0LA    0.049461   2.167388   2.365232
1RB0RF_1RC0RA_1LD0RB_1RE0LC_1RA1LE_1RZ0LA    0.050305   2.182866   2.384088
1RB1RZ_1RC0RA_1LD0RE_0LF0LC_1RC0RD_1RA0LF    0.056691   2.152736   2.379500
1RB1RZ_1RC0RA_1LD0RE_0LF0LC_1RC1RA_1RA0LF    0.056777   2.151818   2.378927
1RB0RD_1LC0RA_1RD0LB_1RE0LD_1RA1RF_1RA1RZ    0.057247   2.076274   2.305261
1RB0RD_1LC0RA_1RD0LB_1RE0LD_1RA0RF_0LD1RZ    0.057972   2.076272   2.308159
1RB0RD_1LC0RA_1RD0LB_1RE0LD_0LE1RF_1RA1RZ    0.058512   2.102893   2.336941
1RB0LC_1LA1RZ_0LD1LC_1RE0RF_1RF1RE_1RA1RD    0.058782   1.488335   1.723465
1RB0LC_1RC0RA_1LA0LD_1LE0RB_0LC1LF_1LC1RZ    0.059025   2.203560   2.439659
1RB0RA_0LC0RC_1RA1LD_0LE1LC_1LB0RF_1RE1RZ    0.059073   2.670231   2.906524
1RB1RZ_1LC0RA_0LD0RD_1RF1LE_0LB1LD_1RC0RF    0.059073   2.670231   2.906524
1RB1LD_1RC0RB_0LA0RA_0LE1LA_1LC0RF_1RE1RZ    0.060692   2.689891   2.932661
1RB0LA_1RC1RZ_1RD0RB_1LE0RC_1LF0LD_0RF1RA    0.061039   2.159305   2.403462
1RB0LF_1RC0RA_0LC1LD_1LE0RD_1LF1RZ_1LA0LE    0.061039   2.159305   2.403462
1RB0RD_1LC0RA_1RD0LB_1RE0LD_1RA1RF_1LA1RZ    0.061044   2.081224   2.325400
1RB0LB_0RC1LB_1RD0LA_1LE1LF_1LA0LD_1RZ1LE    0.061607   2.088306   2.334734
1RB0LF_0RC0RD_1LD1RE_0LE0LD_0RA1RC_1LA1RZ    0.061995   2.758484   3.006466
1RB0LF_0RC0LC_1LD1RE_1LB0LD_0RA1RC_1LA1RZ    0.062605   2.682003   2.932424
1RB1RE_1RC0LD_1LB1RZ_0LE1LD_1RF0RA_1RA1RF    0.063947   1.491418   1.747206
1RB0LE_0RC0RA_0RD0RE_1LE0RF_1LA0LD_1RA1RZ    0.064137   2.151735   2.408284
1RB0LC_1LC1RF_1LE0LD_1LC0RE_1RD0RA_1RE1RZ    0.064524   2.211178   2.469272
1RB0RD_1LC0RA_1LA0LB_1RE0LC_1LC1RF_1RA1RZ    0.064524   2.211178   2.469272
1RB0LF_1LC0RA_0RD1LB_1RF1RE_1RZ1RF_1LA0RD    0.064733   2.588631   2.847565
1RB0LF_1RC0RA_1LD0RB_0LE0LC_0LA0LB_1LC1RZ    0.065176   2.159423   2.420128
1RB0LC_1RC0RA_1LA0LD_1LE0RB_1RB1LF_1LC1RZ    0.065932   2.210204   2.473932
1RB1LF_1RC0RD_1LD0LE_1RB0LC_1LA0RB_1LC1RZ    0.065932   2.210204   2.473932
1RB0RC_1RC0LC_1RD1RA_1RE0LF_1LD1RZ_0LA1LF    0.070030   2.212752   2.492873
1RB0RC_1LC0LD_1RA0LB_1LE0RA_1LB1LF_1LB1RZ    0.070354   2.218575   2.499991
1RB1RF_1RC0RE_1LD0RB_1LB0LC_1RA0LD_1RB1RZ    0.070354   2.218578   2.499996
1RB1RZ_1RC0RE_1LD0RB_1LB0LC_1RF0LD_1RB1RA    0.070354   2.218578   2.499996
1RB0RC_1LC0LD_1RA0LB_1LE0RA_0RE1LF_1LB1RZ    0.070355   2.218580   2.500000
1RB1RZ_1RC0RE_1LD0RB_1LB0LC_1RF0LD_0LF1RA    0.070356   2.218581   2.500004
1RB0LA_1RC0RA_1LD0RB_0LE0LC_1LA0LF_1LB1RZ    0.070526   2.235329   2.517434
1RB1RZ_1LC0LF_1RD0LB_0RE0RC_1RF0RA_1LB0RF    0.070526   2.235329   2.517434
1RB1RZ_1RC0RE_1LD0LE_0LF0LC_0LF0RA_1RA1LC    0.070626   2.661838   2.944342
1RB0RF_1RC0RE_1LD0RB_1LB0LC_1RA0LD_0LE1RZ    0.072765   2.270568   2.561630
1RB0RC_1LC0LD_1RA0LB_1LE0RA_1LB0LF_0RD1RZ    0.072766   2.270576   2.561640
1RB1RB_1LC0RA_1LE0RD_0LD1RC_1RB1LF_0LB1RZ    0.082939   3.028819   3.360577
1RB0LE_0RC1RF_1RD0RB_1LA1RB_1LA0LD_1RA1RZ    0.088249   2.808615   3.161613
1RB0RF_1LC0RA_1LD0LB_0RD0LE_0LF1RZ_1RA1LA    0.097094   2.233849   2.622226
1RB0LF_1RC0RA_0LC0RD_0RE1RZ_1LF1RF_1LA0LE    0.097885   2.303905   2.695446
1RB0RF_1LC0RA_1RD0LB_1RE0LD_1RA0LC_0RE1RZ    0.100928   2.377033   2.780746
1RB0LE_0RC0RF_1RD0LD_1LA0RB_1LC1LC_1RB1RZ    0.102999   2.700411   3.112409
1RB1RZ_0RC0RA_1RD0LD_1LE0RB_1RB0LF_1LC1LC    0.102999   2.700411   3.112409
1RB1LE_1RC1RF_1LD0RB_1RE0LC_1LA0RD_1RZ1RC    0.115459   2.672642   3.134479
1RB1RF_1LC1RF_1RZ1LD_1LE0LF_0RA1RC_1RE0LD    0.119426   3.091113   3.568817
1RB0LF_0RC1RE_1RD1RA_1LE1RA_1RZ1LF_1LB0LA    0.119441   3.091051   3.568814
1RB1LE_1RC0RF_1LD0RB_1RE0LC_1LA0RD_1RZ0LA    0.120785   2.693624   3.176763
1RB0LE_0RC1LD_1RD1RF_0LB1RA_1LA1LB_0RA1RZ    0.122799   3.133320   3.624516
1RB1RZ_1LC0RE_0LD0LB_1RE0LC_1RF1RD_1LD0RA    0.125159   2.758484   3.259120
1RB1RC_1LC0RF_1RA0LD_0LC0LE_1LD0RA_1RE1RZ    0.125174   2.760195   3.260890
1RB1RF_1LC0LE_1RD0LB_0RE0RC_0LA0RA_1RB1RZ    0.125625   2.479001   2.981499
1RB1RZ_1LC0LE_1RD0LB_0RE0RC_0LF0RF_1RB1RA    0.125625   2.479001   2.981499
1RB0RF_0RC0RA_1LD1RA_1LE1RZ_1LA0LF_0RC0LD    0.127971   2.814481   3.326366
1RB1LD_1RC1RZ_1RD0RF_1LE0LF_0LA0LD_0LA0RB    0.128984   2.806547   3.322482
1RB0LF_0RC1RD_1LD1RA_0RE1LF_1RZ1RC_1LB0LA    0.131986   3.138235   3.666179
1RB1LF_0LC1RD_1RZ1LA_1RE0RF_0LA1LB_1LE0RD    0.131986   3.138235   3.666179
1RB0LC_1RC1RA_0RD0LE_1LE1RF_1LB1LC_0RA1RZ    0.144468   2.392150   2.970023
1RB0LB_0LC0RD_1LA1LD_1RE0LF_0RB0LB_0LE1RZ    0.166164   2.738109   3.402765
1RB0LF_0RC0LC_0LD0RA_1LE1LA_1RC0LC_0LB1RZ    0.166470   2.738110   3.403991
1RB0LC_0LA1RE_0RD1LC_1RA1LB_1RF0LE_1RC1RZ    0.202301   2.808486   3.617691
1RB1LC_1RC0LD_0LB1RE_0RA1LD_1RF0LE_1RD1RZ    0.202301   2.808486   3.617691
1RB1RZ_0RC1LB_1RD1LE_1RE0LB_0LD1RF_1RA0LF    0.202301   2.808486   3.617691
1RB1RZ_0RC1RD_0LD0LA_1LE0RB_0LF1LD_1RA0LA    0.208159   2.528048   3.360684
1RB0LA_1RC1LD_1LB0RE_1LB0LB_0RF0RA_1RZ1LC    0.232516   3.372232   4.302297
1RB0LA_1RC1RZ_0RD1LC_1RE1LF_1RF0LC_0LE1RA    0.237930   2.864985   3.816705
1RB0LA_1RC1LD_1LB0RF_0RE0LB_1RZ1LC_0RE0RA    0.245105   3.376250   4.356670
1RB0LA_1RC1LD_1LB0RE_1LB0LB_0RF0RA_1RZ1LD    0.252166   3.293632   4.302297
1RB0LA_1RC1LE_1LD0RF_1RZ1LE_1LB0LB_0RD0RA    0.260386   3.306564   4.348107
1RB0RF_1LC0RB_0LF0LD_0LE1RZ_1LC1LA_0RA1LE    0.276841   2.748674   3.856037
1RB0LA_0RC0RF_0LD1RE_1LA0LC_1RB1RD_0RE1RZ    0.276901   2.748394   3.855998
1RB1RD_0RC0RF_0LD1RA_1LE0LC_1RB0LE_0RA1RZ    0.276901   2.748394   3.855998
1RB1LC_1LC0RB_0LF0LD_0LE1RZ_1LC1LA_0RA1LE    0.276960   2.748121   3.855960
1RB0LA_0RC0RF_0LD1RE_1LA1RB_1RB1RD_0RE1RZ    0.277003   2.747908   3.855918
1RB1RD_0RC0RF_0LD1RA_1LE1RB_1RB0LE_0RA1RZ    0.277003   2.747908   3.855918
1RB0LA_1RC1LD_1LB0RE_1LB0LB_0RF0RA_1RZ0RD    0.294786   3.308191   4.487336
1RB1LE_1LC0RB_0RA1LD_1LE1LA_0LC0LF_0LD1RZ    0.311036   2.953248   4.197392
1RB1RD_0RC0RF_0LD1RA_1LE1RB_1RC0LE_0RA1RZ    0.311523   2.954133   4.200225
1RB0LA_0LC1RE_1LA1RD_0RB0RF_1RD1RC_0RE1RZ    0.311635   2.954006   4.200547
``````

Entropy is even lower here. This is not surprising at all, given that all of these machines spend 99.99% of their time doing some basic bouncer behavior, which is quite predictable.
