Posts Tagged ‘quote’

Future Opportunities for Genome Sequencing and Beyond

Monday, March 2nd, 2015

Workshop Report

http://www.genome.gov/27559219

Some quoted snippets that people felt were interesting:

QT:{{”

NHGRI has been a leader in creating ontologies and standards, and should continue to lead in this area. Given the large amount of sequencing occurring in the world, NHGRI could undertake steps to catalog different sequencing efforts, and to ensure that data are shared in truly useful ways. Given that raw data cannot always be shared, it is also important to improve interoperability and collaboration. Through these types of efforts the collective sequence data become a much more powerful resource for everyone.

The goal of identifying the mutations underlying all Mendelian conditions – the great majority of which are loss-of-function – is the human equivalent of the mouse knockout project, and is of great importance to understanding of both human biology and genomic medicine.

As discussed in the NHGRI Strategic Plan, continued acquisition of knowledge on genome function is valuable for understanding the biology of genomes and the genomic basis of disease. Integrating variant discovery with analysis of genome function can help with prediction of the following: causal variants based on identified tag variants; target genes and cell types based on disease associations; and mechanisms by which pathogenic variants act.

A paradox of precision medicine is that sequencing data needs to be generated in large numbers of subjects to interpret what is seen in individual patients.

Sequencing and alignment of primate and vertebrate genomes has led to identification of >3 million evolutionarily conserved elements and improved understanding of the origin of human and mammalian lineages. …
The amount of exome and genome data generated in research and clinical settings is expected to continue to increase significantly. NHGRI cannot expect to be a primary driver of sequence production and capacity. However, the Institute can take a catalytic lead in setting standards, improving and implementing new test methods, disseminating and integrating information, and serving as a model for other groups. A suggestion is for NHGRI to fund multiple pilot activities to explore how to organize capabilities and resources.

Consideration should be given to implementing foundational resources that make sequencing broadly useful for discovery and clinical applications. NHGRI needs to enhance public awareness of the process, progress and success of our programs. As one example, projects “building” for the future should be complemented by projects that can have fruitful products on a shorter timescale. Demonstrating impact and showing objective measures of progress are needed to maintain support of the rest of the community.

Functional Genomics

Large-scale centers with generalized capacity for assessing variant function are likely premature, because the most universally valuable data types and methods to take to scale are not yet known. For functional studies, there may be an advantage to leveraging existing consortia with expertise in specific tools, assays and approaches. Activity — particularly standards, quality control, resource development and data sharing — can be managed without being prematurely or overly prescriptive.

NHGRI needs to take the title of the workshop to heart and expand beyond genome sequencing. This will include functional studies, as discussed, as well as opportunities for large-scale efforts in epigenomics and metabolomics.

NHGRI should strive to put the “W” back in whole genome sequencing (WGS). This includes considering how the WGS as done today differs from perfection (i.e., phased telomere to telomere contiguity), articulating what is missing in current technologies, and specifying what could be enabled by alternative technologies. This topic is sometimes overlooked, since it is viewed as solved, or passé, but is an area that could benefit from more investment and creativity. Cases who currently go undiagnosed through routine sequencing for Mendelian disorders and other clinical phenotypes should be assessed with respect to the potential role of structural variants and other missing sequence data.

NHGRI programs all require advances and investment in bioinformatics, biocomputing and data science. Needs in this area should be considered in the design of research programs and funding opportunity announcements. Projects need to have the right size and balance of expertise, along with sufficient resources and support, in bioinformatics, biocomputing and data science.

Human disease is a perturbation of systems. Systems biology approaches should be incorporated into studies. The role of genetic variants needs to be appreciated in the larger context of the cell, the individual and the environment, which can be achieved through continued studies of gene-gene and gene-environment interactions and pathways.

“}}

An hereditary meritocracy

Monday, March 2nd, 2015

An hereditary meritocracy http://www.economist.com/news/briefing/21640316-children-rich-and-powerful-are-increasingly-well-suited-earning-wealth-and-power The rich gaming college admissions? Public good in progressive aid stemming from a $1M gift

QT:{{"
The fierce competition between universities to build endowments makes doing such favours for alumni enticing. And there is a public-good argument for it: a student who comes with $1m attached can pay for financial aid for many others. But in practice this is not how the system works. While it is true that some elite universities are rich enough to give out a lot of financial support, people who can pay the full whack are still at the centre of the business model for many. Mitchell Stevens, a Stanford sociologist who spent a year working in the admissions office of an unnamed liberal arts college in the north-east, found that the candidate the system most prized was one who could pay full tuition and was just good enough to make one of the higher-profile sports teams but had a strong enough academic record not to eat into the annual allocation reserved for students whose brains work best when encased in a football helmet.

"}}

The Trip Treatment – The New Yorker

Saturday, February 21st, 2015

Annals of Medicine
FEBRUARY 9, 2015 ISSUE
The Trip Treatment
Research into psychedelics, shut down for decades, is now yielding exciting results.
BY MICHAEL POLLAN

The Trip Treatment
http://www.newyorker.com/magazine/2015/02/09/trip-treatment A new idea to relieve cancer suffering & depression, revisiting #psychedelics banned for >40 years

QT:{{”

After the screening, Mettes was assigned to a therapist named Anthony Bossis, a bearded, bearish psychologist in his mid-fifties, with a specialty in palliative care. Bossis is a co-principal investigator for the N.Y.U. trial.

After four meetings with Bossis, Mettes was scheduled for two dosings—one of them an “active” placebo (in this case, a high dose of niacin, which can produce a tingling sensation), and the other a pill containing the psilocybin.

“I felt a little like an archeologist unearthing a completely buried body of knowledge,” he said. Beginning in the nineteen-fifties, psychedelics had been used to treat a wide variety of conditions, including alcoholism and end-of-life anxiety. The American Psychiatric Association held meetings centered on LSD. “Some of the best minds in psychiatry had seriously studied these compounds in therapeutic models, with government funding,” Ross said.

“I’m personally biased in favor of these type of studies,” Thomas R. Insel, the director of the National Institute of Mental Health (N.I.M.H.) and a neuroscientist, told me. “If it proves useful to people who are really suffering, we should look at it. Just because it is a psychedelic doesn’t disqualify it in our eyes.”

I was struck by how the descriptions of psychedelic journeys differed from the typical accounts of dreams. For one thing, most people’s recall of their journey is not just vivid but comprehensive, the narratives they reconstruct seamless and fully accessible, even years later.

This might help explain why so many cancer patients in the trials reported that their fear of death had lifted or at least abated: they had stared directly at death and come to know something about it, in a kind of dress rehearsal.

The default-mode network was first described in 2001, in a landmark paper by Marcus Raichle, a neurologist at Washington University, in St. Louis, and it has since become the focus of much discussion in neuroscience. The network comprises a critical and centrally situated hub of brain activity that links parts of the cerebral cortex to deeper, older structures in the brain, such as the limbic system and the hippocampus.

The network, which consumes a significant portion of the brain’s energy, appears to be most active when we are least engaged in attending to the world or to a task. It lights up when we are daydreaming, removed from sensory processing, and engaging in higher-level “meta-cognitive” processes such as self-reflection, mental time travel, rumination, and “theory of mind”—the ability to attribute mental states to others. Carhart-Harris describes the default-mode network variously as the brain’s “orchestra conductor” or “corporate executive” or “capital city,” charged with managing and “holding the entire system together.” It is thought to be the physical counterpart of the autobiographical self, or ego.

“The brain is a hierarchical system,” Carhart-Harris said. “The highest-level parts”—such as the default-mode network—“have an inhibitory influence on the lower-level parts, like emotion and memory.” He discovered that blood flow and electrical activity in the default-mode network dropped off precipitously under the influence of psychedelics, a finding that may help to explain the loss of the sense of self that volunteers reported…. Just before Carhart-Harris published his results, in a 2012 paper in Proceedings of the National Academy of Sciences, a researcher at Yale named Judson Brewer, who was using fMRI to study the brains of experienced meditators, noticed that their default-mode networks had also been quieted relative to those of novice meditators. It appears that, with the ego temporarily out of commission, the boundaries between self and world, subject and object, all dissolve. These are hallmarks of the mystical experience.


Carhart-Harris doesn’t romanticize psychedelics, and he has little patience for the sort of “magical thinking” and “metaphysics” they promote. In his view, the forms of consciousness that psychedelics unleash are regressions to a more “primitive style of cognition.” Following Freud, he says that the mystical experience—whatever its source—returns us to the psychological condition of the infant, who has yet to develop a sense of himself as a bounded individual. The pinnacle of human development is the achievement of the ego, which imposes order on the anarchy of a primitive mind buffeted by magical thinking. (The developmental psychologist Alison Gopnik has speculated that the way young children perceive the world has much in common with the psychedelic experience. As she puts it, “They’re basically tripping all the time.”) The psychoanalytic value of psychedelics, in his view, is that they allow us to bring the workings of the unconscious mind “into an observable space.”

In “The Doors of Perception,” Aldous Huxley concluded from his psychedelic experience that the conscious mind is less a window on reality than a furious editor of it. The mind is a “reducing valve,” he wrote, eliminating far more reality than it admits to our conscious awareness, lest we be overwhelmed. “What comes out at the other end is a measly trickle of the kind of consciousness which will help us to stay alive.” Psychedelics open the valve wide, removing the filter that hides much of reality, as well as dimensions of our own minds, from ordinary consciousness.
“}}

Hysteresis in a quantized superfluid atomtronic circuit : Nature : Nature Publishing Group

Tuesday, February 10th, 2015

Hysteresis in a quantized #superfluid atomtronic circuit
http://www.nature.com/nature/journal/v506/n7487/full/nature12958.html Moving atoms instead of electrons for future storage devices

Stephen Eckel,
Jeffrey G. Lee,
Fred Jendrzejewski,
Noel Murray,
Charles W. Clark,
Christopher J. Lobb,
William D. Phillips,
Mark Edwards
& Gretchen K. Campbell

Nature 506, 200–203 (13 February 2014) doi:10.1038/nature12958

QT:{{”
Atomtronics [1, 2] is an emerging interdisciplinary field that seeks to develop new functional methods by creating devices and circuits where ultracold atoms, often superfluids, have a role analogous to that of electrons in electronics. Hysteresis is widely used in electronic circuits—it is routinely observed in superconducting circuits [3] and is essential in radio-frequency superconducting quantum interference devices [4]. Furthermore, it is as fundamental to superfluidity [5] …. “}}

Why Most Published Research Findings Are False

Saturday, February 7th, 2015

Why Most Published Research Findings Are False http://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.0020124 Evaluating 2×2 confusion matrix, effects of bias & multiple studies

PLoS Medicine | www.plosmedicine.org | August 2005 | Volume 2 | Issue 8 | e124

QT:{{"
Published research findings are sometimes refuted by subsequent evidence, with ensuing confusion and disappointment. Refutation and controversy is seen across the range of research designs, from clinical trials and traditional epidemiological studies [1–3] to the most modern molecular research [4,5]. There is increasing concern that in modern research, false findings may be the majority or even the vast majority of published research claims [6–8]. However, this should not be surprising. It can be proven that most claimed research findings are false. Here I will examine the key …

Research findings are defined here as any relationship reaching formal statistical significance, e.g., effective interventions, informative predictors, risk factors, or associations. “Negative” research is also very useful. “Negative” is actually a misnomer, and the misinterpretation is widespread. However, here we will target relationships that investigators claim exist, rather than null findings. As has been shown previously, the probability that a research finding is indeed true depends on the prior probability of it being true (before doing the study), the statistical power of the study, and the level of statistical significance [10,11]. Consider a 2 × 2 table in which research findings are compared against the gold standard of true relationships in a scientific field. In a research field both true and false hypotheses can be made about the presence of relationships. Let R be the ratio of the number of “true relationships” to “no relationships” among those tested in the field. R is characteristic of the field and can vary a lot depending on whether the field targets highly likely relationships or searches for only one or a few true relationships among thousands and millions of hypotheses that may be postulated. Let us also consider, for computational simplicity, circumscribed fields where either there is only one true relationship (among many that can be hypothesized) or the power is similar to find any of the several existing true relationships. The pre-study probability of a relationship being true is R/(R + 1). The probability of a study finding a true relationship reflects the power 1 − β (one minus the Type II error rate). The probability of claiming a relationship when none truly exists reflects the Type I error rate, α. Assuming that c relationships are being probed in the field, the expected values of the 2 × 2 table are given in Table 1. After a research finding has been claimed based on achieving formal statistical significance, the post-study probability that it is true is the positive predictive value, PPV. The PPV is also the complementary probability of what Wacholder et al. have called the false positive report probability [10]. According to the 2 × 2 table, one gets PPV = (1 − β)R/(R − βR + α). A research finding is thus …
"}}

The Long Road to Maxwell’s Equations – IEEE Spectrum

Sunday, February 1st, 2015

The Long Road to #Maxwell’s Equations
http://spectrum.ieee.org/telecom/wireless/the-long-road-to-maxwells-equations Heaviside simplified the original 20 eqns. to the current 4 w. vector fields

Also, Hertz’s 2 “loop” experiments were key!

A great grave to visit.

QT:{{”

Should you wish to pay homage to the great physicist James Clerk Maxwell, you wouldn’t lack for locales in which to do it. There’s a memorial marker in London’s Westminster Abbey, not far from Isaac Newton’s grave. A magnificent statue was recently installed in Edinburgh, near his birthplace. Or you can pay your respects at his final resting place near Castle Douglas, in southwestern Scotland, a short distance from his beloved ancestral estate.

You could start the clock in 1800, when physicist Alessandro Volta reported the invention of a battery, which allowed experimenters to begin working with continuous direct current. Some 20 years later, Hans Christian Ørsted obtained the first evidence of a link between electricity and magnetism, by demonstrating that the needle of a compass would move when brought close to a current-carrying wire. Soon after, André-Marie Ampère showed that two parallel current-carrying wires could be made to exhibit a mutual attraction or repulsion depending on the relative direction of the currents. And by the early 1830s, Michael Faraday had shown that just as electricity could influence the behavior of a magnet, a magnet could affect electricity, when he showed that drawing a magnet through a loop of wire could generate current.

A major seed was planted by Faraday, who envisioned a mysterious, invisible “electrotonic state” surrounding the magnet—what we would today call a field. He posited that changes in this electrotonic state are what cause electromagnetic phenomena.

The net result of all of this complexity is that when Maxwell’s theory made its debut, almost nobody was paying attention.

But a few people were. And one of them was Oliver Heaviside. Once described by a friend as a “first rate oddity,” Heaviside, who was raised in extreme poverty and was partially deaf, never attended university.

Heaviside ended up reproducing a result that had already been published by another British physicist, John Henry Poynting. But he kept pushing further, and in the process of working through the complicated vector calculus, he happened upon a way to reformulate Maxwell’s score of equations into the four we use today.

Now confident that he was generating and detecting electromagnetic waves, Lodge planned to report his astounding results at a meeting of the British Association, right after he returned from a vacation in the Alps. But while reading a journal on the train out of Liverpool, he discovered he’d been scooped. In the July 1888 issue of Annalen der Physik, he found an article entitled “Über elektrodynamische Wellen im Luftraum und deren Reflexion” (“On electrodynamic waves in air and their reflection”) written by a little-known German researcher, Heinrich Hertz.

Hertz’s … noticed that something curious happened when he discharged a capacitor through a loop of wire. An identical loop a short distance away developed arcs across its unconnected terminals. Hertz recognized that the sparks in the unconnected loop were caused by the reception of electromagnetic waves that had been generated by the loop with the discharging capacitor.

Inspired, Hertz used sparks in such loops to detect unseen
radio-frequency waves. He went on to conduct experiments to verify that electromagnetic waves exhibit lightlike behaviors of reflection, refraction, diffraction, and polarization.
“}}

Distributed Information Processing in Biological and Computational Systems

Monday, January 26th, 2015

Distributed Info. Processing in Biological & Computational #Systems http://cacm.acm.org/magazines/2015/1/181614-distributed-information-processing-in-biological-and-computational-systems/fulltext Contrasts in strategies to handle node failures

QT:{{"
While both computational and biological systems need to address these similar types of failures, the methods they use to do so differs. In distributed computing, failures have primarily been handled by majority voting methods [37], by using dedicated failure detectors, or via cryptography. In contrast, most biological systems rely on various network topological features to handle failures. Consider for example the use of failure detectors. In distributed computing, these are either implemented in hardware or in dedicated additional software. In contrast, biology implements implicit failure detector mechanisms by relying on backup nodes or alternative pathways. Several proteins have paralogs, that is, structurally similar proteins that in most cases originated from the same ancestral protein (roughly 40% of yeast and human proteins have at least one paralog). In several cases, when one protein fails or is altered, its paralog can automatically take its place [24] or protect the cell against the mutation [26]. Thus, by preserving backup functionality in the protein interaction …


While we discussed some reoccurring algorithmic strategies used within both types of systems (for example, stochasticity and feedback), there is much more to learn in this regard. From the distributed computing side, new models are needed to address the dynamic aspects of communication (for example, nodes joining and leaving the network, and edges added and being subtracted), which are also relevant in mobile computing scenarios. Further, while the biological systems we discussed all operate without a single centralized controller, there is in fact a continuum in the term “distributed.” For example, hierarchical distributed models, where higher layers “control” lower layers with possible feedback, represent a more structured type of control system than traditional distributed systems without such a hierarchy. Gene regulatory networks and neuronal networks (layered columns) both share such a hierarchical structure, and this structure has been well-conserved across many different species, suggesting their importance to computation. Such models, however, have received less attention in the distributed computing literature.

"}}

The Many Guises of Aromaticity » American Scientist

Friday, January 23rd, 2015

The Many Guises of Aromaticity
http://www.americanscientist.org/issues/pub/2015/1/the-many-guises-of-aromaticity #Resonance is hyped; hence, many proposals for compounds w/ it that aren’t bench-stable

QT:{{”
Today, an inflation of hype threatens this beautiful concept. Molecules constructed in silico are extolled as possessing surfeits of aromaticity—“doubly aromatic” is a favorite descriptor. Yet the molecules so dubbed have precious little chance of being made in bulk in the laboratory. One can smile at the hype, a gas of sorts, were it not for its volume. A century and a half after the remarkable suggestion of the cyclic structure of benzene, the conceptual value of aromaticity—so useful, so chemical—is in a way dissolving in that hype. Or so it seems to me.

Bench-Stable, Bottleable

Computers made the determination of the structure of molecules in crystals easy—what took half a year in 1960 takes less than an hour today. They also made computations of the stability of molecules facile.

Whoa! What do you mean by stability? Usually what’s computed is stability with respect to decomposition to atoms. But that is pretty meaningless; for instance, of the four homonuclear diatomic molecules (composed of identical atoms) that are most stable with respect to atomization, N2, C2, O2, and P2, two (C2 and P2) are not persistent. You will never see a bottle of them. Nor the tiniest crystal. They are reactive, very much so. In chemistry it’s the barriers to reaction that provide the opportunity to isolate a macroscopic amount of a compound. Ergo the neologism, “bench-stable.” “Bottleable” is another word for the idea. A lifetime of a day at room temperature allows a competent graduate student at the proverbial bench to do a crystal structure and take an NMR scan of a newly made compound. Or put it into a bottle and keep it there for a day, not worrying that it will turn into brown gunk.

“}}

The two cultures of mathematics and biology | Bits of DNA

Tuesday, January 20th, 2015

QT:{{”
What biologists should appreciate, what was on offer in Mumford’s obituary, and what mathematicians can deliver to genomics that is special and unique, is the ability to not only generalize, but to do so “correctly”. The mathematician Raoul Bott once reminisced that “Grothendieck was extraordinary as he could play with concepts, and also was prepared to work very hard to make arguments almost tautological.” In other words, what made Grothendieck special was not that he generalized concepts in algebraic geometry to make them more abstract, but that he was able to do so in the right way.
“}}

http://liorpachter.wordpress.com/2014/12/30/the-two-cultures-of-mathematics-and-biology/

WASP: allele-specific software for robust discovery of molecular quantitative trait loci | bioRxiv

Monday, January 19th, 2015

WASP: allele-specific software for robust discovery of molecular quantitative trait loci
Bryce van de Geijn, Graham McVicker, Yoav Gilad, Jonathan Pritchard

doi: http://dx.doi.org/10.1101/011221
http://biorxiv.org/content/early/2014/11/07/011221

QT:{{”
Mapping of reads to a reference genome is biased by sequence polymorphisms [6]. Reads which contain the non-reference allele may fail to map uniquely or map to a different (incorrect) location in the genome [6].
“}}
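
To illustrate the quoted point about reference bias, here is a toy sketch of my own (not the WASP pipeline; the reference sequence, het position and mismatch budget are made-up values): a read carrying the non-reference allele necessarily incurs one extra mismatch against the reference, so under a fixed mismatch budget it is the read more likely to be lost.

# Toy sketch (my illustration, not WASP): why reads carrying the
# non-reference allele map less well to the reference genome.
REFERENCE = "ACGTACGTACGTACGTACGT"   # toy reference sequence (assumed)
HET_POS   = 10                        # heterozygous site: REF 'G', ALT 'T'

def read_from(allele, start=5, length=12):
    """Simulate a perfect read from one haplotype spanning the het site."""
    hap = REFERENCE[:HET_POS] + allele + REFERENCE[HET_POS + 1:]
    return hap[start:start + length]

def mismatches(read, start=5):
    return sum(r != g for r, g in zip(read, REFERENCE[start:start + len(read)]))

MAX_MISMATCHES = 0   # a strict aligner threshold, purely for illustration
for allele in ("G", "T"):            # reference vs alternate allele
    read = read_from(allele)
    n = mismatches(read)
    status = "maps" if n <= MAX_MISMATCHES else "dropped"
    print(f"allele {allele}: {n} mismatch(es) vs reference -> {status}")
# The ALT-allele read carries an extra mismatch by construction; with real
# reads, which already carry sequencing errors, that extra mismatch pushes
# them over the aligner's budget more often -- the reference bias that
# allele-aware tools such as WASP are designed to correct for.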