have the feeling of being put under a magnifying glass or microscope sometimes at work and elsewhere (some recent joking about this on the physics stackexchange hbar chat room among longtime regulars). in the worst case scenario, as mentioned last month, it feels like being micromanaged, bullied, stalked, or hunted. maybe this is my coming-home-to-roost karma for having a cyber alter ego, decades old now. lets face it, no matter how human corporations pretend to be, theyre fundamentally soulless at heart.
its great that psychology is starting to understand the negative effects of the corporate world. could really relate to this latest headline/ study, "Greedy bosses are bad for business, study finds". but could it really be true? bet theres some other study out there claiming that soulless bosses help drive up the bottom line. but ah, also trying not to be egocentric and to put it all in perspective: this is a very old complaint in historical terms, ie roughly as old as capitalism, intensified with the industrial revolution and so-called late stage capitalism (luv that phrase! what does/ can it mean?! reminds me of the term postmodernism…). overall what one might call 1st world problems… time to take a vac…
this code has optimization and analysis sections. the optimization is to push down ‘mxl’, ‘mxr’, the max 0/1 lengths on the left/ right sides of the glide. then it looks at the binary structure of the largest trajectories: it takes the same # of iterates starting from the left and right sides of the glides, concatenates them in binary, and then analyzes the 0/1 runs. the 1st graph shows that the (4) histograms for left/ right 0/1 run lengths are nearly identical, as found awhile back with another generation scheme (cant recall exactly, maybe the long-examined ‘w’ widthdiff). in the 2nd graph there seems to be some slight differentiation, but this is a rough 1st cut at finding it. strangely there seems to be a difference between odd and even lengths, seen in the apparent alternating/ thrashing pattern in the graph. the 2nd graph is the 0 run length histogram difference and the 3rd is the 1 run lengths.
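(aside: a minimal python sketch of the run-length analysis just described; `collatz_step`, `run_lengths`, `run_histogram` are hypothetical names for illustration, not the actual experiment code, which also handles glide left/ right sides, the mxl/ mxr optimization, etc.)

```python
import re
from collections import Counter

def collatz_step(n):
    # one step of the Collatz map: 3n+1 on odd n, halving on even n
    return 3 * n + 1 if n % 2 else n // 2

def run_lengths(bits, digit):
    # lengths of the maximal runs of `digit` ('0' or '1') in a binary string
    return [len(r) for r in re.findall(digit + "+", bits)]

def run_histogram(n, steps, digit):
    # histogram of 0/1 run lengths over the binary forms of `steps` successive iterates
    h = Counter()
    for _ in range(steps):
        h.update(run_lengths(bin(n)[2:], digit))
        n = collatz_step(n)
    return dict(h)
```

comparing `run_histogram(...)` for iterate samples taken from the left vs right sides of a glide is the basic shape of the 1st-graph comparison.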
overall it needs some further investigation/ polishing but seems to be real. while working on this it occurred to me that on both the left and right sides there is no “control” for varying trajectories starting at the intermediate positions, and am wondering if thats causing the results to be more uniform/ undifferentiable. from prior experiments its known that many of the intermediate trajectories or “subtrajectories” (actually subglides) starting from the intermediate points tend to have a much different aspect, eg terminating quickly on the left side. another aspect that occurs to me is that the (“quick-and-dirty”) concatenation idea might bias the measurements slightly versus the alternative of analyzing all the separate 0/1 iterate string splits, because the start and end of a binary iterate are always 1s, and joining them makes a larger combined 1-run than either separately.
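to see the suspected concatenation bias concretely, a tiny illustrative demo (not the experiment code): joining two binary iterates fuses the trailing 1-run of the 1st with the leading 1-run of the 2nd.

```python
import re

def one_runs(bits):
    # lengths of the maximal 1-runs in a binary string
    return [len(r) for r in re.findall("1+", bits)]

# any binary representation starts with 1, and odd iterates also end with 1,
# so the quick-and-dirty join fuses a trailing 1-run with a leading 1-run
left, right = bin(27)[2:], bin(41)[2:]          # '11011' and '101001'
separate = one_runs(left) + one_runs(right)     # runs counted per string
joined = one_runs(left + right)                 # runs counted after the join
```

here the 2-run ending `left` and the 1-run starting `right` merge into a single 3-run, so the joined histogram is slightly shifted toward longer 1-runs.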
last installment ended looking at the “forward/ lookahead trend” concept. heres a presumably stronger optimization/ test building on that idea using the hybrid system, whipped off fairly quickly. the idea here is that the ~200 step lookahead function seems to be monotone (the last bitwise experiment suggested the max nonmonotone length is around ~200) and it would be useful to try to verify that as rigorously as possible. this calculates the ratio of the lookahead function at 200 steps fwd vs the initial point as ‘fr’, lightblue line, and tries to maximize it along with the other trajectory metrics, with no limit on starting iterate size. ‘fr’ is indeed bounded very close to, but below, 1, with one exception: an outlier point greater than 1 at about nl=235 bit size; even so, the fr vs nl graph seems to show a ceiling.
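a rough sketch of the ‘fr’ calculation under stated assumptions: the actual ~200 step lookahead function isnt spelled out here, so plain iterate bit width stands in for it, and `lookahead_ratio` is an invented name for illustration only.

```python
def collatz_step(n):
    # one step of the Collatz map: 3n+1 on odd n, halving on even n
    return 3 * n + 1 if n % 2 else n // 2

def lookahead_ratio(n, steps=200, f=int.bit_length):
    # 'fr': the (assumed) lookahead function f evaluated `steps` iterates
    # ahead, divided by f at the starting point; fr staying below 1 across a
    # large optimized sample would support the ~200 step monotonicity idea
    m = n
    for _ in range(steps):
        if m <= 1:
            break               # trajectory already collapsed to 1
        m = collatz_step(m)
    return f(m) / f(n)
```

the hybrid optimizer then tries to push `lookahead_ratio` up (along with the other metrics) and watches whether it ever exceeds 1.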
viktor frankl, an extraordinary figure, a psychiatrist who survived the nazi concentration camps, wrote the book “mans search for meaning”. read it last summer, was impressed. in contrast also am fond of the kurzgesagt video on “optimistic nihilism”. it seems one can get lost searching for meaning in certain math problems, the hard unsolvable ones. it appears that Church was involved in one of these “wild goose chases” (found that described in a paper somewhere, wanna find out more). finding a solution is a lot like the search for meaning in the problem. for me, as written at times, if something meaningful can be said about the problem, then feelings of despair about the unlikelihood of a solution can be held somewhat at arms length… like wandering around in underground caverns, its not entirely clear if one is any closer to an “exit,” but its clear that many diverse, even wondrous properties unseen by anyone else have been isolated over the years. the fractal nature in particular, a theme explored in great depth/ detail with this attack, is still barely remarked on/ noticed by other authorities. other concepts such as “mixing” and the pre vs postdetermined themes tie in closely with ideas outlined by authorities/ experts Lagarias and Terras.
hi all… last months installment was all about pushing the pre vs postdetermined concept as far as it could go and found that it was stretched thin. in sharp contrast to the rosetta diagram idea, relatively ingenious code reveals the postdetermined region can have major divergences. this broke a major conjecture and has left me somewhat thoughtful at best and reeling at worst. candidly, am feeling sometimes like maybe all the powerful ideas that have been exercised have not made any dent whatsoever in the problem. so yeah, a darker mood, and am wondering sometimes if there is anything that can be proven about the problem at all.
didnt have a lot of immediate ideas. but as time goes on without much experimenting (a few weeks), am feeling maybe some ideas coalescing from the emptiness.
the last installment found a way to construct large “ufos” in the postdetermined region and in crisp clinical terms refuted some longrunning hypotheses, in more informal pov “throwing a wrench in the works” of big overall proof strategies painstakingly built up over years, and showing both limitations and strengths of the overall empirical/ datamining approach. however, maybe there is a silver lining; there was a larger hypothesis that maybe escapes largely unscathed, ie in more dramatic terms can yet be rescued amidst some of the substantial wreckage/ smoking ruins (aka “easy come, easy go”™). a recent experiment found that measuring “slope correlation” in the postdetermined region gives high adherence to linearity (ie close to 1) even as iterates get larger. this was discovered with bitwise optimization; never did convert that finding to the stronger/ more rigorous hybrid optimization, and it was in the back of my mind to do that.
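for concreteness, a small sketch of what “slope correlation” could look like, assuming ‘sc’ is a pearson correlation of step index vs log2 iterate size; the function names are invented for illustration and this is not the actual bitwise/ hybrid code.

```python
import math

def trajectory_bit_widths(n, steps):
    # bit widths of successive Collatz iterates, ie the trajectory on a log2 scale
    ys = []
    for _ in range(steps):
        ys.append(n.bit_length())
        n = 3 * n + 1 if n % 2 else n // 2
    return ys

def slope_correlation(ys):
    # 'sc': pearson correlation of (step index, value); |sc| near 1 means the
    # trajectory hugs a straight line on the semilog plot
    k = len(ys)
    mx, my = (k - 1) / 2, sum(ys) / k
    sxy = sum((x - mx) * (y - my) for x, y in enumerate(ys))
    sxx = sum((x - mx) ** 2 for x in range(k))
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)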
looked over the hybrid code and had the urge to refactor it. the bit vector initialization was serviceable but really funky. also had an idea to extend vectors at a few bit positions at a time, not just at the msbs. also came up with the idea of a corresponding/ symmetric shortening operator.
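a quick sketch of what the paired extend/ shorten operators could look like on a 0/1 vector (msb first); illustrative only, with assumed details: random fill bits and position-based splicing.

```python
import random

def extend(bits, k, pos=0):
    # lengthening operator: splice k random bits in at position `pos`
    # (pos=0 is the msb end; interior positions extend "not just at the msbs")
    return bits[:pos] + [random.randint(0, 1) for _ in range(k)] + bits[pos:]

def shorten(bits, k, pos=0):
    # corresponding/ symmetric shortening operator: drop k bits starting at `pos`
    return bits[:pos] + bits[pos + k :]
```

the symmetry means a shorten at the same position undoes an extend, which keeps the search able to both grow and shrink candidate iterates.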
the basic experiment here was trying to minimize slope correlation ‘sc’ for larger iterates. however the naive code simply found a significantly low ‘sc’ for some small iterates and then didnt get past those smaller iterates; the search got stuck, so to speak. while this finding is consistent with the hypothesis, my real question was what the trend in ‘sc’ is for higher iterates. so then bit the bullet and did multiparameter optimization for this hybrid code, which is also novel wrt prior code/ experiments. the multiparameter optimization is based on gaussian normalization recalculated every 100 samples. the optimization is to push up ‘nw’ the iterate bit width and push down ‘sc’ the slope correlation. the code reaches very high iterate sizes ~4.5K and yet cant push ‘sc’ below ~0.9684. the multioptimization means that there are smaller iterates with smaller slope correlations, but the algorithm moves past those as its pushed to search higher iterates. so this general observation/ trend still seems “robust”.
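a compact sketch of the gaussian-normalization multiparameter idea, hedged: assuming it amounts to z-scoring each metric with stats recalculated every 100 samples and combining them into one score that rewards larger ‘nw’ and smaller ‘sc’; the class/ method names are invented for illustration.

```python
import math

class GaussianNormalizer:
    # combines multiple metrics into one score via z-scores ("gaussian
    # normalization"); means/ stds are recalculated every `window` samples
    def __init__(self, window=100):
        self.window = window
        self.samples = []       # (nw, sc) pairs seen so far
        self.stats = None       # ((mean_nw, sd_nw), (mean_sc, sd_sc))

    def add(self, nw, sc):
        self.samples.append((nw, sc))
        if self.stats is None or len(self.samples) % self.window == 0:
            self.stats = (self._mean_sd(0), self._mean_sd(1))

    def _mean_sd(self, i):
        vals = [s[i] for s in self.samples]
        m = sum(vals) / len(vals)
        sd = math.sqrt(sum((v - m) ** 2 for v in vals) / len(vals)) or 1.0
        return m, sd

    def score(self, nw, sc):
        # push up 'nw' (iterate bit width), push down 'sc' (slope correlation)
        (mn, sn), (ms, ss) = self.stats
        return (nw - mn) / sn - (sc - ms) / ss
```

the optimizer then keeps the candidates with the highest combined score, which is how it can move past small-iterate/ low-‘sc’ local optima toward larger iterates.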