Scale, proportion, and quantity make up one of the crosscutting concepts in the Next Generation Science Standards (NGSS). According to Volume 2 of the NGSS, "in engineering, no structure could be conceived much less constructed without the engineer's precise sense of scale." The authors go on to note that scale and proportion are best understood using the scientific practice of working with models.
When scientists and engineers work with these concepts at a molecular scale, new kinds of technologies can be created to advance our understanding of the natural world. One example is DNA ...
A few weeks back, we published a review about the development and role of the human reference genome. A key point of the reference genome is that it is not a single sequence. Instead, it is an assembly of consensus sequences designed to deal with variation in the human population and uncertainty in the data. The reference is a map, and like a geographical map, it evolves through increased understanding over time.
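For readers unfamiliar with the term, here is a toy sketch of what a consensus sequence is (my illustration, using made-up sequences rather than actual reference data): at each aligned position, the most common base wins.

```python
# Minimal sketch of a "consensus sequence" (illustration only): take the
# most common base at each aligned position across several sequences.
from collections import Counter

aligned = ["ACGTAC",   # hypothetical aligned sequences
           "ACGTAA",
           "ACGAAC"]

consensus = "".join(Counter(col).most_common(1)[0][0]
                    for col in zip(*aligned))
print(consensus)  # -> ACGTAC
```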
In our series on why $1000 genomes cost $2000, I raised the issue that the $1000 genome is a value based on simplistic calculations that do not account for the costs of confirming the results. Next, I discussed how errors are a natural result of the many processing steps required to sequence DNA and why results need to be verified. In this and follow-on posts, I will discuss the four ways (oversampling, technical replicates, biological replicates, and cross-platform replicates) that results can be verified, as recommended by Robasky et al.
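To get a feel for why oversampling helps, here is a minimal sketch (my illustration, not a calculation from the Robasky et al. paper): it assumes each read independently miscalls a base with some probability and that the consensus base is a simple majority vote across overlapping reads, then shows how quickly the consensus error rate falls as read depth grows.

```python
# Illustrative sketch (assumed model, not from the post): how oversampling
# suppresses per-base errors. Assumes each read independently miscalls a
# base with probability e, and the consensus is a simple majority vote
# across `depth` overlapping reads -- both simplifying assumptions.
from math import comb

def consensus_error(depth: int, e: float) -> float:
    """Probability that more than half of `depth` reads miscall a base."""
    threshold = depth // 2 + 1  # reads needed for a wrong majority
    return sum(comb(depth, k) * e**k * (1 - e) ** (depth - k)
               for k in range(threshold, depth + 1))

for depth in (1, 5, 15, 30):
    print(f"depth {depth:>2}: consensus error ~ {consensus_error(depth, 0.01):.2e}")
```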
Previously, I introduced the idea that the $1000 genome has not been achieved because it is defined in simplistic terms that ignore many aspects of data completeness and verification. In that analysis, I cited a recent perspective by Robasky, Lewis, and Church to present concepts related to the need to verify results and the general ways in which this is done. In this and the next few posts I will dig deeper into the elements of sequence data uncertainty and discuss how results are verified.
Getting an accurate genome sequence requires collecting the data at least twice, argue Robasky, Lewis, and Church in their recent opinion piece in Nat. Rev. Genetics. The DNA sequencing world kicked off 2014 with an audacious start.
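As a rough back-of-the-envelope illustration of why collecting the data twice matters (my numbers, not the authors' calculation): if two independent replicates each miscall a base at rate e, and a substitution error lands on one of the three wrong bases at random, the chance that both replicates report the identical wrong base is roughly e squared divided by 3.

```python
# Back-of-the-envelope sketch (assumed numbers, not from the paper):
# probability that two independent replicates make the identical miscall.
e = 1e-3                      # hypothetical per-base error rate of one run
same_wrong_call = e**2 / 3    # both err AND pick the same wrong base (1 of 3)
print(f"single run error rate:        {e:.0e}")
print(f"identical error in both runs: {same_wrong_call:.1e}")
```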
In simple Mendelian genetics, a single change in one gene can produce a large change in mortality. The National Human Genome Research Institute (NHGRI) will be funding genomics studies on Mendelian traits using a similar strategy.
NHGRI will fund a small number of centers, dominant centers you might say, and look for large changes. The sequencing centers that will benefit are the Broad Institute, Washington University, and Baylor College of Medicine. For the next four years, the big three will be dividing $86 million a year according to a press ...
You might think the coolest thing about the Next Generation DNA Sequencing technologies is that we can use them to sequence long-dead mammoths, entire populations of microbes, or bits of bone from Neanderthals.