:star: :idea: :!: :o :cool: :D <3 hi all! am very pleased to “announce” (or at least feature) here the GI quasipolynomial time result by Babai. many in the TCS and math communities have already heard of it. this post collects/ consolidates many of the links on the topic circulating so far.
it was very difficult to time this post (maybe about one iota of the difficulty of the proof, lol). it reminds me somewhat of the Zhang breakthrough a few years ago. there has been a long dribble of different articles and reactions, so its hard to figure out an opportune moment/ trigger to post. my final trigger for this post is the Nature article,[a1] and the video of the 1st lecture is now available.[a12]
as the RJLipton blog calls it, its “the world series of complexity theory”, or as aaronson puts it, “the TCS breakthrough of the decade”. having a result written up in Nature is nearly the pinnacle of achievement in the area as far as media exposure. Nature only occasionally covers theoretical computer science, and typically only on very big results. (it would be very interesting to see the list of CS topics covered in Nature and Science over the years.) and this one is very big by TCS standards/ metrics: while the problem has been heavily researched for decades, this is the first improvement in over 3 decades on the best known upper bound for its runtime— beating Babais own record, the 1983 Babai–Luks bound!
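for a concrete sense of scale, heres a minimal sketch comparing the growth of the two bounds. caveat: all constants are dropped and the quasipolynomial exponent c=3 is purely illustrative (Babais exact exponent is not pinned down here) — this is growth-rate intuition only, not the actual algorithm:

```python
import math

def log_old_bound(n):
    # log of the old-style exp(sqrt(n log n)) runtime bound (constants dropped)
    return math.sqrt(n * math.log(n))

def log_quasipoly_bound(n, c=3):
    # log of a quasipolynomial exp((log n)^c) bound; c=3 is illustrative only
    return math.log(n) ** c

# comparing logs avoids overflow; the quasipoly bound only wins for large n
print(log_quasipoly_bound(100) < log_old_bound(100))      # → False (small n: old bound smaller)
print(log_quasipoly_bound(10**6) < log_old_bound(10**6))  # → True (large n: quasipoly wins)
```

(as the crossover shows, the improvement is asymptotic — for small instances the old bound is numerically smaller.)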
the breakthru is also already covered in Science, Science News, New Scientist, and MIT Tech review.[a5][a3][a4][a2] could Babai have envisioned such a massive reaction already? probably not. he turned down an interview with Nature. wow! and who does that? the pressure is really on now. since he hasnt published a paper yet, the result rests largely on his own extremely sterling reputation and long track record of prior meticulous work in the area (over 3 decades of research).
this is a result that reverberated thru social media very quickly (aka “spreading like wildfire”) with a lot of interest even on reddit.[a8][a10][a11] Gowers cites it on google+ and Tao commented.[b8] there are many blogs on it already by top experts[b] and significant detail in Kuns blog.[b6]
hi all, its a little over 3 yrs since the facebook IPO and we are sure awash in digital/ cyber culture these days. years ago it was tentative, and now its an utterly full blown case/ outbreak. years ago it was possible to wonder if facebook would be a fad, and even this cyber aficionado thought that might be the case. but nothing has come along to replace it. google+ is doing ok, but its now clear it will never unseat facebook. years ago it was possible to talk about cyber culture as distinct from regular culture; now the two are utterly tightly coupled, or as they say in psychology, “enmeshed”…
generally think that the world is a better place with widespread cyber, but it has a tremendous pandoras box effect. its full implications are slowly being baked in, unwrapped and unravelled. these links were just impartially collected, but it turns out over 50 of them are about “negative/ bad/ alarmist” stuff. wow!
imagined and near-predicted such a scenario long ago from dabbling in cyberspace, but again, “be careful what you wish for, you might get it” and “the devil is in the details”. the world has rarely ever gone thru a social transformation on the level of (mass/ worldwide) social networking, and it seems to still be playing out. twitter is showing a few signs of peaking or plateauing, but worldwide use of smartphones is only increasing.
:star: :o :D :cool: hi all, its a very lively time for physics news and breakthroughs, its hard to keep up. its been over half a year since the last physics post in this blog and am swimming, almost drowning in topics/ links.
the very big news is a so-called “loophole free” Bell entanglement test, freshly published in Nature and referred to in NYT.[a][a5][a9] and of course lauded almost effusively by the experts/ authorities.[a4] :roll:
they say that they have closed the detector efficiency loophole aka the “fair sampling” assumption.
but wikipedia (currently) says under “fair sampling assumption”:
The “detection efficiency”, or “fair sampling” problem is the most prevalent loophole in optical experiments.
There is no way to test experimentally whether a given experiment does fair sampling, as the number of emitted but undetected pairs is by definition unknown.
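as a quick refresher on what these Bell/ CHSH experiments actually measure (textbook material, nothing specific to the new experiment): the quantum correlation for the singlet state at analyzer angles a, b is −cos(a−b), and at the standard optimal angle choices the CHSH combination S reaches 2√2 ≈ 2.83, beating the local-hidden-variable limit of 2:

```python
import math

def E(a, b):
    # quantum correlation of the singlet state at analyzer angles a, b
    return -math.cos(a - b)

# standard optimal CHSH angle choices
a1, a2 = 0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(round(S, 3))  # → 2.828, i.e. 2*sqrt(2), above the classical bound of 2
```

the loopholes (detection efficiency/ fair sampling among them) are about whether a real experiment can legitimately claim the measured S exceeds 2, not about this ideal calculation.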
my last post on subj is over 1mo old so am starting this new one. (in contrast to last time…) nothing earth shattering this time! (honestly, with 20/20 hindsight, now thinking the distinct mania of the last post on subj might have been influenced by the Tao zeitgeist and blogging about his breakthru…) research in this area goes thru a bipolar-like cycle and am currently in a low :( :'( :P
…oh but as they say in kindergarten teachings, every cloud has a silver lining right? so lets turn that frown upside down! :roll:
:star: :idea: update/ announcement/ highlight: two cstheory stackexchange questions based on some ideas relating collatz to theorem proving and undecidability wrt FSM transducers did quite well on the site at 10 and 6 points, with 3 great answers total, including a clever one by familiar associate Marzio, who has also long been linked on this blogs sidebar. thx man! :D :cool:
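for readers new to the topic, the collatz map itself is trivial to state/ code. (the stackexchange questions were about encoding it via FSM transducers — thats not attempted here, this is just the plain iteration:)

```python
def collatz_steps(n):
    # count iterations of the 3n+1 map until n reaches 1
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(6))   # → 8
print(collatz_steps(27))  # → 111 (famously long trajectory for a small start)
```

the (open) conjecture is that this loop terminates for every positive integer n — deceptively simple to state, yet connected to deep undecidability questions.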
hi all. the age of internet celebrity is truly wild at times. it reminds me of that old zen-like saying “be careful what you wish for because you just might get it.” 14-yr old “high school dropout” ahmed mohamed, clock engineer (covered here 1mo ago), made it to astronomy night at the whitehouse and did meet obama in the crowd. but not before meeting the president of Sudan a few weeks ago, who has been accused of human rights abuses/ war crimes.
he also met up with google cofounder Brin at the google science fair. whoa! a lot sure can happen in a month! (wikipedia entry [a22])