Archive for November, 2017

Air Travelers Resisting the ‘Incredible Shrinking Airline Seat’ – The New York Times

Tuesday, November 14th, 2017

https://www.nytimes.com/2017/11/06/business/airline-seat.html?mc=aud_dev&mcid=tw-nytimes&mccr=NovTwitter&mcdt=2017-11&subid=NovTwitter&ad-keywords=AudDevGate

Heptadecagon – Wikipedia

Monday, November 13th, 2017

https://en.wikipedia.org/wiki/Heptadecagon

Can Ketones Rev Up Our Workouts? – The New York Times

Monday, November 13th, 2017

https://www.nytimes.com/2017/11/08/well/move/can-ketones-rev-up-our-workouts.html?partner=rss&emc=rss&smid=tw-nythealth&smtyp=cur

New Theory Cracks Open the Black Box of Deep Learning | Quanta Magazine

Monday, November 13th, 2017

New Theory Cracks Open the Black Box of #DeepLearning
https://www.QuantaMagazine.org/new-theory-cracks-open-the-black-box-of-deep-learning-20170921/ Highlights the importance of a compression phase for generalization

QT:{{”
“Then learning switches to the compression phase. The network starts to shed information about the input data, keeping track of only the strongest features — those correlations that are most relevant to the output label. This happens because, in each iteration of stochastic gradient descent, more or less accidental correlations in the training data tell the network to do different things, dialing the strengths of its neural connections up and down in a random walk. This randomization is effectively the same as compressing the system’s representation of the input data. As an example …”
“}}
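
A note on the mechanism in the quote: for a deterministic network, a hidden layer T is a function of the input X, so once the activations are discretized, I(X; T) reduces to the entropy H(T) of the binned activation patterns, and watching that entropy fall across training epochs is one concrete way to see the compression phase. Below is a minimal Python sketch of such a binning estimator; the tanh activation range, bin count, and function names are my own illustrative assumptions, not taken from the article or the underlying paper.

import numpy as np

def bin_activations(acts, n_bins=30):
    # Discretize tanh-range activations into equal-width bins over [-1, 1].
    edges = np.linspace(-1.0, 1.0, n_bins + 1)
    return np.digitize(acts, edges)

def layer_entropy_bits(acts, n_bins=30):
    # H(T) in bits, treating each sample's binned activation vector as one symbol.
    # For a deterministic network H(T|X) = 0, so this also estimates I(X; T).
    binned = bin_activations(acts, n_bins)
    _, counts = np.unique(binned, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Usage idea: log layer_entropy_bits(hidden_acts) once per epoch; the
# compression phase shows up as this number falling while accuracy holds.
hidden_acts = np.tanh(np.random.randn(1000, 8))  # stand-in for real activations
print(layer_entropy_bits(hidden_acts))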

Alignment-free sequence comparison: benefits, applications, and tools

Monday, November 13th, 2017

Might be useful for noncoding comparisons

Alignment-free seq. comparison: benefits, apps & tools
https://GenomeBiology.biomedcentral.com/articles/10.1186/s13059-017-1319-7 Great tidbits, viz: Shannon asked von Neumann what to call his info measure – “Why don’t you call it entropy…no one understands entropy…so in any discussion, you’ll be in a position of advantage.”

QT:{{”
“Reportedly, Claude Shannon, who was a mathematician working at Bell Labs, asked John von Neumann what he should call his newly developed measure of information content; “Why don’t you call it entropy,” said von Neumann, “[…] no one understands entropy very well so in any discussion you will be in a position of advantage […]” []. The concept of Shannon entropy came from the observation that some English words, such as “the” or “a”, are very frequent and thus unsurprising” ….
“The calculation of a distance between sequences using complexity (compression) is relatively straightforward (Fig. ). This procedure takes the sequences being compared (x = ATGTGTG and y = CATGTG) and concatenates them to create one longer sequence (xy = ATGTGTGCATGTG). If x and y are exactly the same, then the complexity (compressed length) of xy will be very close to the complexity of the individual x or y. However, if x and y are dissimilar, then the complexity of xy (length of compressed xy) will tend to the cumulative complexities of x and y.”

“Intriguingly, BLOSUM matrices, which are the most commonly used substitution matrix series for protein sequence alignments, were found to have been miscalculated years ago and yet produced significantly better alignments than their corrected modern version (RBLOSUM) []; this paradox remains a mystery.”
“}}
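
The compression-based distance in the second quoted paragraph is essentially the normalized compression distance: compare the compressed length of the concatenation xy with the compressed lengths of x and y on their own, so that similar sequences score near 0 and unrelated ones near 1. A minimal Python sketch follows; using zlib as the compressor, and the function names, are my choices for illustration, not something prescribed by the review.

import zlib

def c_len(s: str) -> int:
    # Compressed length of a string, used as a stand-in for its complexity.
    return len(zlib.compress(s.encode()))

def ncd(x: str, y: str) -> float:
    # Normalized compression distance: near 0 when x and y are very similar,
    # approaching 1 when compressing them together saves almost nothing.
    cx, cy, cxy = c_len(x), c_len(y), c_len(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# The toy sequences from the quoted example:
print(ncd("ATGTGTG", "CATGTG"))
# Repetition makes the effect clearer, since compressor overhead dominates
# on strings this short:
print(ncd("ATGTGTG" * 50, "CATGTG" * 50))

For sequences as short as the toy example, zlib’s fixed overhead swamps the signal, so the numbers are only illustrative; the idea pays off on longer sequences and with compressors suited to genomic data.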

NIH awards to test ways to store, access, share, and compute on biomedical data in the cloud

Sunday, November 12th, 2017

NIH awards to test ways to store, access, share, and compute on biomedical data in the cloud
https://www.nih.gov/news-events/news-releases/nih-awards-test-ways-store-access-share-compute-biomedical-data-cloud

Why these powerful health care gurus left the East Coast for California

Sunday, November 12th, 2017

Why these powerful health care gurus left the East Coast for California https://www.StatNews.com/2017/11/06/california-startup-culture/ Quotes T Insel, ex-director of NIMH: “What I was doing was providing lots of additional funding to people whose major goal was to get a paper in Nature & get tenure.”

“What I was doing was providing lots of additional funding to people whose major goal was to get a paper in Nature and get tenure.” – Dr. Tom Insel, Mindstrong Health co-founder

Reading by the Numbers: When Big Data Meets Literature

Sunday, November 12th, 2017

Reading by the Numbers: When #BigData Meets Literature
https://www.NYTimes.com/2017/10/30/arts/franco-moretti-stanford-literary-lab-big-data.html Distant reading as a complement to close reading for literary texts. Perhaps a useful dichotomy for biosequences too!

QT:{{”
“Literary criticism typically tends to emphasize the singularity of exceptional works that have stood the test of time. But the canon, Mr. Moretti argues, is a distorted sample. Instead, he says, scholars need to consider the tens of thousands of books that have been forgotten, a task that computer algorithms and enormous digitized databases have now made possible.

“We know how to read texts,” he wrote in a much-quoted essay included in his book “Distant Reading,” which won the 2014 National Book Critics Circle Award for Criticism. “Now let’s learn how to not read them.””

“}}

Carl Friedrich Gauss – Wikipedia

Saturday, November 11th, 2017

https://en.wikipedia.org/wiki/Carl_Friedrich_Gauss

QT:{{"
Another story has it that in primary school after the young Gauss misbehaved, his teacher, J.G. Büttner, gave him a task: add a list of integers in arithmetic progression; as the story is most often told, these were the numbers from 1 to 100. The young Gauss reputedly produced the correct answer within seconds, to the astonishment of his teacher and his assistant Martin Bartels.

"}}

Heat equation – Wikipedia

Saturday, November 11th, 2017

https://en.wikipedia.org/wiki/Heat_equation