When will the first Jehovah's Witness be sequenced?

An amusing article, "Dr Atta-ur-Rehman, first Pakistani to have genome mapped":

Prof Dr M Iqbal Choudhary, Director International Centre for Chemical and Biological Sciences (ICCBS), Karachi University (KU), disclosed on Thursday that former Chairman, Higher Education Commission (HEC) Prof Dr Atta-ur-Rehman is the first Pakistani whose genome has been mapped by Pakistani scientists at a cost of $40,000 in just 10 months.

China has contributed $20,000 in the total cost of the genome project. Pakistani and Indian genomes have similarities compared to others, he said, while speaking at a press conference, held on Thursday at Dr Panjwani Center for Molecular Medicine & Drug Research (PCMD), Karachi University (KU).

“Dr Atta has become the first Muslim man with this distinction, while he is the third one among a list of renowned people in the world whose genomes have been mapped by scientists. The names of the first two persons are Prof Watson and Dr Venter (2007), while others are unnamed.”

Why hominin fossils matter

Yesterday Dienekes had a post up, Homo erectus soloensis fades into the past…. In it he states:

Every year or so there seems to be a redating of a key fossil in human evolution. It’s nice to see scientific self-correction in action, and soon after Neandertals got a little older, casting doubt on their supposedly long co-existence with modern humans, we now have a redating of Homo erectus soloensis from Java to about 150-550 thousand years ago, but certainly long before there were any anatomically modern humans in the area.

I think Dienekes is jumping the gun a bit on how solidly any given new finding knocks down the prior consensus. That being said, the very young ages for Southeast Asian H. erectus, on the order of ~30-50,000 years B.P., always seemed strange to me. The paper Dienekes is referring to, The Age of the 20 Meter Solo River Terrace, Java, Indonesia and the Survival of Homo erectus in Asia, is rather technical on the earth-science side, as it involves dating and interpreting confounds in the stratigraphy. But this section of the discussion gets to the gist of the matter if you can’t follow the details of fossil dating:

If the middle Pleistocene 40Ar/39Ar ages better reflect the age of the Solo River 20 meter terrace deposits and hominins, the site of Ngandong remains a relatively late source of H. erectus; however, these H. erectus would not be the contemporaries of Neandertals and modern humans, and their chronology would widen the gap between the last surviving H. erectus and the population from Flores – whose source population has been argued to be Indonesian H. erectus…although this point is contested…Instead, the Ngandong hominins would be contemporaries of the H. heidelbergensis from Atapuerca, Spain and elsewhere in Europe, and, possibly the archaic H. sapiens specimen from Bodo (Ethiopia), which might favor arguments that they are more closely affiliated with these taxa and differ from H. erectus…Such ages for Ngandong would suggest that a series of geographically relatively isolated lineages of hominins lived during the middle Pleistocene.

Google+, not Wave or Buzz

I’ve been playing around with Google+ a little today. Farhad Manjoo no like, More Like Google Minus:

… First, I don’t know whom the company thinks it’s kidding; Google+ is obviously a direct competitor to Facebook. Given the large overlap in functionality, I can’t imagine that many people will use Google+ and Facebook simultaneously. For most of us, it will be one or the other. Google+’s success, then, will rest in large part on Google’s ability to convince people to ditch Facebook for the new site. For that, Google+ will have to offer some compelling view of social networking that’s substantially different from what’s available on Facebook. And that’s where Google+ baffles me. What is so compelling about Google+ that I can’t currently get on Facebook or Twitter? Or Gmail, for that matter? At the moment, I can’t tell….

But circles are nothing new. Facebook has offered several ways to break your network into smaller chunks for many years now, and it has worked constantly to refine them. And you know what? Almost no one uses those features. Only 5 percent of Facebookers keep “Lists,” Facebook’s first attempt for people to categorize their friends. Recognizing that “Lists” weren’t great, last year the site unveiled a new way to manage your friends, called “Groups.” I was optimistic that “Groups” would help to compartmentalize Facebook, but from what I can tell, few people use that feature, either.

Since Google+ is not yet “prime time” I’m not going to judge it too much. The interface feels a lot zippier and more fluid than Facebook’s, but that might just be because there are hundreds of millions of people using Facebook. Unlike Manjoo I do think that the idea of “circles” is not without merit. I tried Facebook’s Lists, and they just plain didn’t work the way they were supposed to, so I gave up. Right now I, along with others, slice and dice my online voice across different platforms: some venues for public interaction, others for semi-public interaction.

When you have friends you know through science blogging, transhumanism, right-wing politics, and high school, not to mention cousins who were raised in the Tablighi subculture, Facebook’s one-size-fits-all tendency to throw them all into one big pot has been kind of suboptimal. Then again, most people probably don’t manifest as much dilettantism as I do, so their social sets are probably much better “sorted.”

I will say though that Google+ doesn’t seem as patently useless as Wave and Buzz were. But if you haven’t gotten an invite, you aren’t missing out on much. There is no way this should warrant the hysteria which was the norm when Gmail first rolled out and required invites.

No bookstores in Nashville?

That’s what Ann Patchett is claiming. More specifically, there are no brick-and-mortar institutions in Nashville which specialize in selling new books, though there are places in the city where you can buy used books. To remedy the situation Patchett is opening up a bookstore herself. She asserts that “…we’ve got to get back to a 3000-square-foot store and not 30,000. Amazon is always going to have everything – you can’t compete with that. But there is, I believe, still a place for a store where people read books.”

I recall going to a Barnes & Noble when I was in Nashville in the summer of 2004. Here’s some demographic data: “As of the 2010 census, the balance population was 601,222. The 2000 population was 545,524.” The details here are a bit muddy because parts of Davidson County are included with the Nashville total, but you get a general sense of how substantial the population of this city is. As a point of comparison Eugene, OR, has a population of 156,185 and 29 Yelp hits for bookstores; Nashville returns 46 results.
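
To put those listings on the same footing, here is a quick back-of-the-envelope per-capita comparison, using only the population and Yelp figures quoted above (and keeping in mind that Yelp’s “bookstores” category lumps new and used sellers together):

```python
# Rough bookstores-per-capita comparison using the figures cited in the post.
cities = {
    "Eugene, OR": {"population": 156_185, "yelp_bookstore_hits": 29},
    "Nashville, TN": {"population": 601_222, "yelp_bookstore_hits": 46},
}

for name, d in cities.items():
    per_100k = d["yelp_bookstore_hits"] / d["population"] * 100_000
    print(f"{name}: {per_100k:.1f} bookstore listings per 100,000 residents")

# Eugene, OR: ~18.6 listings per 100,000 residents
# Nashville, TN: ~7.7 listings per 100,000 residents
```

By that crude measure Eugene has well over twice Nashville’s bookstore density, which fits Patchett’s complaint even before filtering out the used-book dealers.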

Back to Patchett’s claim: I think there is something there, though I don’t know how it will shake out in the details. Consider coffee: it is far cheaper to brew your own at home, yet more and more people frequent shops which sell it at a much higher per-unit cost. Obviously people are going for the experience. The main issue for bookstores is that the per-unit cost of a book is higher than even a fancy drink at most coffee shops.

The punctuated equilibrium of culture


[Image: John Winthrop, ~1600; Mitt Romney, 2008 – image credit, Jessica Rinaldi]

Recently Megan McArdle had a post up where she expressed curiosity as to why “futurists” circa 1900 had a tendency not to imagine revolutions in clothing style which might have been anticipated to occur over the next few decades. You also see this in Star Trek in the 1960s, where faux-future fashion was clearly based on the trends of the day, from beehive hair to miniskirts. So I thought this comment was of interest:

I don’t know the answer, but I don’t know that they were wrong to do it. Keeping fashions exactly the same as the present generally winds up with more in common with the actual future than deliberate “future” fashions. A fair number of men still wear ties, and on rare occasions a few even wear tailcoats; rather fewer wear silver jumpsuits.

There have been a few counters to extreme fashions in media SF: “Blade Runner”’s lead wore the same trenchcoat as his noir forebears; “Babylon 5” went for modified business suits and moderate variations on military uniforms; the “Battlestar Galactica” reimagining was pretty much straight conservative turn-of-the-millennium wear despite being in a far different time. How have those worn versus the approach taken by “Star Trek” or the 2015 segment of the “Back to the Future” movies?

I’m not sure I accept this case as airtight, but it is certainly true in the specifics. Though I only just saw clips of Running Man for the first time on YouTube, I watched Blade Runner a few years back for the second time and was struck by how little its fashion sense had dated, at least in any very noticeable way. It got me thinking about the nature of cultural evolution even then.

"What if you're wrong" – haplogroup J

Back when this sort of thing was cutting edge, mtDNA haplogroup J was a pretty big deal. This was the haplogroup often associated with the demic diffusion of Middle Eastern farmers into Europe; it was the “Jasmine” clade in Bryan Sykes’ The Seven Daughters of Eve. A new paper in PLoS ONE makes an audacious claim: that J is not a lineage which underwent recent demographic expansion, but rather one which has been subject to a specific set of evolutionary dynamics which have skewed the interpretations due to a false “molecular clock” assumption. By this assumption, I mean that mtDNA, which is passed down in an unbroken chain from mother to daughter, is by and large neutral to forces like natural selection and subject to a constant mutational rate which can serve as a calibration clock to the last common ancestor of two different lineages. Additionally, mtDNA has a high mutational rate, so it accumulates lots of variation to sample, and it is copious, so it is easy to extract. What’s not to like?
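
To make the “molecular clock” assumption concrete, here is a minimal sketch of the logic (not the paper’s method): if mutations accumulate neutrally at a constant per-site rate, the number of differences between two lineages converts directly into a date. The mutation rate and the pairwise difference count below are hypothetical placeholders chosen purely for illustration.

```python
# Toy strict-molecular-clock calculation: two lineages separated for t years
# are expected to differ at roughly 2 * mu * t substitutions per site, so t
# can be read off the observed divergence. Numbers are illustrative only.

mu = 1.7e-8          # assumed substitutions per site per year (hypothetical)
seq_length = 16_569  # length of the human mtDNA genome, in base pairs
observed_diffs = 30  # hypothetical pairwise differences between two lineages

divergence_per_site = observed_diffs / seq_length
tmrca_years = divergence_per_site / (2 * mu)
print(f"Implied time to common ancestor: ~{tmrca_years:,.0f} years")
```

The paper’s argument is precisely that selection on haplogroup J violates the constant-rate premise, which would systematically bias exactly this kind of calculation.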

First, the paper, Mutation Rate Switch inside Eurasian Mitochondrial Haplogroups: Impact of Selection and Consequences for Dating Settlement in Europe:

Complex interactions among epilepsy genes

A debate has been raging over the last few years over the nature of the genetic architecture of so-called “complex” disorders. These are disorders – such as schizophrenia, epilepsy, type II diabetes and many others – which are clearly heritable across the population, but which do not show simple patterns of inheritance. A new study looking at the profile of mutations in hundreds of genes in patients with epilepsy dramatically illustrates this complexity. The possible implications are far-reaching, especially for our ability to predict risk based on an individual’s genetic profile, but do these findings apply to all complex disorders?

Complex disorders are so named because, while it is clear that they are highly heritable (risk to an individual increases the more closely related they are to someone who has the disorder), their mode of inheritance is far more difficult to discern. Unlike classical Mendelian disorders (such as cystic fibrosis or Huntington’s disease), these disorders do not show simple patterns of segregation within families that would peg them as recessive or dominant, nor can they be linked to mutations in a single gene. This has led people to propose two very different explanations for how they are inherited.

One theory is that such disorders arise due to unfortunate combinations of large numbers of genetic variants that are common in the population. Individually, such variants would have little effect on the phenotype, but collectively, if they surpass some threshold of burden, they could tip the balance into a pathological state. This has been called the common disease/common variant (CD/CV) model.
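
To illustrate the CD/CV logic, here is a minimal simulation sketch of a liability-threshold model. Every number in it (variant count, allele frequency, effect size, prevalence cutoff) is an arbitrary choice for illustration, not an estimate for any real disorder.

```python
# A minimal sketch of the liability-threshold logic behind the CD/CV model.
# Many common variants each nudge an underlying "liability" score a little;
# disease appears only in individuals whose total liability crosses a
# threshold. All numbers are arbitrary illustrations.
import numpy as np

rng = np.random.default_rng(0)

n_individuals = 10_000
n_variants = 500            # common risk variants (hypothetical)
risk_allele_freq = 0.3
effect_size = 0.05          # tiny per-allele effect on liability

# 0, 1 or 2 risk alleles per person at each locus
genotypes = rng.binomial(2, risk_allele_freq, size=(n_individuals, n_variants))
liability = genotypes.sum(axis=1) * effect_size
liability = liability + rng.normal(0, 1, n_individuals)  # non-genetic component

threshold = np.quantile(liability, 0.99)  # call the top 1% "affected"
affected = liability > threshold

print(f"Mean risk-allele count, affected:   {genotypes[affected].sum(axis=1).mean():.1f}")
print(f"Mean risk-allele count, unaffected: {genotypes[~affected].sum(axis=1).mean():.1f}")
```

In a model like this, affected individuals carry more risk alleles on average than unaffected ones, but no single variant is either necessary or sufficient.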

The alternative model is that these “disorders” are not really single disorders at all – rather they are umbrella terms for collections of a large number of distinct genetic disorders, which happen to result in a similar set of symptoms. Within any individual or family, the disorder may indeed be caused by a particular mutation. Because many of the disorders in question are very severe, with high mortality and reduced numbers of offspring, these mutations will be rapidly selected against in the population. They will therefore remain very rare and many cases of the disorder may arise from new, or de novo, mutations. This has therefore been called the multiple rare variants (MRV) model.

Lately, a number of mixed models have been proposed by various researchers, including myself. Even classical Mendelian disorders rarely show strictly Mendelian inheritance – instead the effects of the major mutations are invariably affected by modifiers in the genetic background. (These are variants with little effect by themselves but which may have a strong effect in combination with some other mutation). If this sounds like a return to the CD/CV model, there are a couple of important distinctions to keep in mind. One is the nature of the mutations involved – the mixed model would still invoke some rare mutation that has a large effect on protein function. It may not always cause the disorder by itself (i.e., not everyone who carries it will be affected), but could still be called causative in the sense that if the affected individual did not carry it one would expect they would not suffer from the disorder. The other is the number of mutations or variants involved – under the CD/CV model this could number in the thousands (a polygenic architecture), while under the mixed model one could expect a handful to be meaningfully involved (an oligogenic architecture – see the review in Current Opinion in Neurobiology).

The new study, from the lab of Jeff Noebels, aimed to test these models in the context of epilepsy. Epilepsy is caused by an imbalance in excitation and inhibition within brain circuits. This can arise due to a large number of different factors, including alterations in the structural organisation of the brain, which may be visible on magnetic resonance imaging. Many neurodevelopmental disorders are therefore associated with epilepsy as a symptom (usually one of many). But it can also arise due to more subtle changes, not in the gross structure of the brain or the physical wiring of different circuits, but in the way the electrical activity of individual neurons is controlled.

The electrical properties of any neuron – how excitable it is, how long it remains active, whether it fires a burst of action potentials or single ones, what frequency it fires at and many other important parameters – are determined in large part by the particular ion channel proteins it expresses. These proteins form a pore crossing the membrane of the cell, through which electrically charged ions can pass. Different channels are selective for sodium, potassium or calcium ions and can be activated by different types of stimuli – binding a particular neurotransmitter or a change in the cell’s voltage for example. Many channels are formed from multiple subunits, each of which may be encoded by a different gene. There are hundreds of these genes in several large families, so the resultant complexity is enormous.
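
As a toy illustration of how channel expression sets excitability, here is a minimal leaky integrate-and-fire sketch in which the only thing that changes is the amount of a potassium-like leak conductance. The parameters are generic textbook-style values, not a model of any particular channel or cell type, and this is not taken from the study discussed here.

```python
# Toy leaky integrate-and-fire neuron: the same input current produces very
# different firing depending on how much leak conductance the cell expresses.
def spike_count(g_leak, i_input, t_sim=0.5, dt=1e-4):
    """Count spikes over t_sim seconds of simulated time (simple Euler steps)."""
    C = 200e-12          # membrane capacitance (F)
    E_leak = -70e-3      # leak reversal potential (V)
    v_thresh = -50e-3    # spike threshold (V)
    v_reset = -65e-3     # reset potential after a spike (V)
    v = E_leak
    spikes = 0
    for _ in range(int(t_sim / dt)):
        dv = (-g_leak * (v - E_leak) + i_input) / C
        v += dv * dt
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes

for g in (5e-9, 10e-9, 20e-9):   # leak conductance in siemens
    print(f"g_leak = {g*1e9:.0f} nS -> {spike_count(g, i_input=250e-12)} spikes")
```

With the smallest leak the cell fires briskly, doubling the leak slows the firing, and quadrupling it means the same input never reaches threshold at all.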

Many familial cases of epilepsy have been found to be caused by mutations in ion channel genes. However, most epilepsy patients outside these families do not carry these particular mutations. Therefore, despite these findings and despite the demonstrated high heritability, the particular genetic cause of the vast majority of cases of epilepsy has remained unknown. Large genome-wide association studies have looked for common variants that are associated with risk of epilepsy but have turned up nothing of note. The interpretation has been that common variants do not play a major role in the etiology of idiopathic epilepsy (epilepsy without a known cause).

The rare variants model suggests that many of these cases are caused by single mutations in any of the very large number of ion channel genes. A straightforward experiment to test that would be to sequence all these candidate genes in a large number of epilepsy patients. The hope is that it would be possible to shake out the “low hanging fruit” – obviously pathogenic mutations in some proportion of cases. The difficulty lies in recognising such a mutation as pathogenic when one finds it. This generally relies on some statistical evidence – any individual mutation, or such mutations in general, should be more frequent in epilepsy patients than in unaffected controls. The experiment must therefore involve as large a sample as possible and a control comparison group as well as patients.

Klassen and colleagues sequenced 237 ion channel genes in 152 patients with idiopathic epilepsy and 139 healthy controls. What they found was surprising in several ways. They did find lots of mutations in these genes, but they found them at almost equal frequency in controls and patients. Even the mutations predicted to have the most severe effects on protein function were not significantly enriched in patients. Indeed, mutations in genes already known to be linked to epilepsy were found in patients and controls alike (96% of patients had such a mutation, but so did 67% of controls). Either these specific mutations are not pathogenic or their effects can be strongly modified by the genetic background.

More interesting results emerged from looking at the occurrence of multiple mutations in these genes in individuals. 78% of patients vs 30% of controls had two or more mutations in known familial epilepsy genes. A similar trend was observed when looking at specific ion channel gene families, such as GABA receptors or sodium channels.
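
To get a rough sense of how striking that difference is, one can compare those proportions with a simple 2×2 test. The counts below are reconstructed from the percentages quoted above, so treat this as an illustration of the kind of burden comparison involved rather than a reanalysis of the actual data.

```python
# Back-of-the-envelope burden comparison using the percentages quoted in the
# post; counts are approximated from those percentages, so this is
# illustrative rather than a reanalysis of the published data.
from scipy.stats import fisher_exact

n_patients, n_controls = 152, 139
patients_with_2plus = round(0.78 * n_patients)   # ~119 of 152
controls_with_2plus = round(0.30 * n_controls)   # ~42 of 139

table = [
    [patients_with_2plus, n_patients - patients_with_2plus],
    [controls_with_2plus, n_controls - controls_with_2plus],
]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio ~{odds_ratio:.1f}, p ~{p_value:.1e}")
```

The odds ratio comes out around eight with a vanishingly small p-value, which is why the multiple-hit pattern, rather than the presence of any single mutation, is the headline result.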

These data would seem to fit with the idea that an increasing mutational load pushes the system over a threshold into a pathological state. The reality seems more complicated, however, and far more nuanced. Though the average load was lower, many controls had a very high load and yet were quite healthy. It seems that the specific pattern of mutations is far more important than the overall number. This fits very well with the known biology of ion channels and previous work on genetic interactions between mutations in these genes.

Though one might expect a simple relationship between number of mutations and severity of phenotype, that is unlikely to be the case for these genes. It is well known that the effects of a mutation in one ion channel gene can be suppressed by mutation in another gene – restoring the electrical balance in the cell, at least to a degree sufficient for performance under normal conditions. The system is so complex, with so many individual components, that these interactions are extremely difficult to predict. This is complicated further by the fact that there are active processes within the system that act to normalise its function. It has been very well documented, especially by Eve Marder and colleagues, that changes to one ion channel in a neuron can be compensated for by homeostatic mechanisms within the cell that aim to readjust the electrical set-points for optimal physiological function. In fact, these mechanisms do not just happen within one cell, but across the circuit.
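
A deliberately oversimplified sketch of that compensation idea: in a passive membrane, whether a steady input can reach spike threshold depends on the ratio of depolarizing drive to leak conductance, so a “mutation” halving the drive can be masked by a second “mutation” halving the leak. This is a toy of the principle only, with made-up numbers, not a model of any real channel pair.

```python
# Toy compensation example: one loss-of-function change silences the cell,
# and a second loss-of-function change in an opposing conductance restores firing.
E_leak = -70e-3    # resting / leak reversal potential (V)
v_thresh = -50e-3  # spike threshold (V)

def can_fire(i_drive, g_leak):
    """A steady drive reaches threshold iff E_leak + I/g exceeds v_thresh."""
    return E_leak + i_drive / g_leak > v_thresh

print(can_fire(i_drive=250e-12, g_leak=10e-9))  # "wild type": True, fires
print(can_fire(i_drive=125e-12, g_leak=10e-9))  # drive halved: False, silenced
print(can_fire(i_drive=125e-12, g_leak=5e-9))   # leak also halved: True again
```

Scale that kind of interaction up to hundreds of channel genes and dozens of variants per person, add homeostatic tuning on top, and it becomes clear why the specific combination, rather than the raw count, is what matters.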

The upshot of the study is that, though some of the mutations they discovered are indeed likely to be the pathogenic culprits, it is very difficult to discern which ones they are. It is very clear that there is at least an oligogenic architecture for so-called “channelopathies” – the phenotype is determined by several mutations in each individual. (Note that this is not evidence for a highly polygenic architecture involving hundreds or thousands of genetic variants with tiny individual effects). The important insight is that it is not the overall number or mutational load that matters but the pattern of specific mutations in any individual that is crucial. Unfortunately, given how complicated the system is, this means it is currently not possible to predict an individual’s risk, even with this wealth of data. This will likely require a lot more biological information on the interactions between these mutations from experimental approaches and computational modelling.

What are the implications for other complex disorders? Should we expect a similarly complicated picture for diseases like schizophrenia or autism? Perhaps, though I would argue against over-extrapolating these findings. For the reasons described above, mutations in ion channel genes will show especially complex genetic interactions – it is, for example, even possible for two mutations that are individually pathogenic to suppress each other’s effects in combination. This is far less likely to occur for classes of mutations affecting processes such as neurodevelopment, many of which have been implicated in psychiatric disorders. Though by no means unheard of, it is far less common for the effects of one neurodevelopmental mutation to be suppressed by another – it generally just makes things worse. So, while modifying effects of genetic background will no doubt be important for such mutations, there is some hope that the interactions will be more straightforward to elucidate (mostly enhancing, far fewer suppressing). Others may see it differently of course (and I would be pleased to hear from you if you do); similar sequencing efforts currently underway for these disorders may soon tell whether that prediction is correct.

Klassen T, Davis C, Goldman A, Burgess D, Chen T, Wheeler D, McPherson J, Bourquin T, Lewis L, Villasana D, Morgan M, Muzny D, Gibbs R, & Noebels J (2011). Exome sequencing of ion channel genes reveals complex profiles confounding personal risk assessment in epilepsy. Cell, 145 (7), 1036-1048. PMID: 21703448

Kasperaviciute, D., Catarino, C., Heinzen, E., Depondt, C., Cavalleri, G., Caboclo, L., Tate, S., Jamnadas-Khoda, J., Chinthapalli, K., Clayton, L., Shianna, K., Radtke, R., Mikati, M., Gallentine, W., Husain, A., Alhusaini, S., Leppert, D., Middleton, L., Gibson, R., Johnson, M., Matthews, P., Hosford, D., Heuser, K., Amos, L., Ortega, M., Zumsteg, D., Wieser, H., Steinhoff, B., Kramer, G., Hansen, J., Dorn, T., Kantanen, A., Gjerstad, L., Peuralinna, T., Hernandez, D., Eriksson, K., Kalviainen, R., Doherty, C., Wood, N., Pandolfo, M., Duncan, J., Sander, J., Delanty, N., Goldstein, D., & Sisodiya, S. (2010). Common genetic variation and susceptibility to partial epilepsies: a genome-wide association study. Brain, 133 (7), 2136-2147. DOI: 10.1093/brain/awq130

Mitchell KJ (2011). The genetics of neurodevelopmental disease. Current Opinion in Neurobiology, 21 (1), 197-203. PMID: 20832285

Mirrored from http://wiringthebrain.blogspot.com

Reify my genes!

BEHOLD, REIFICATION!


In the comments below Antonio pointed me to this working paper, What Do DNA Ancestry Tests Reveal About Americans’ Identity? Examining Public Opinion on Race and Genomics. I am perhaps being a bit dull, but I can’t figure out where its latest version is found online (I stumbled upon what looks like another working-paper version on one of the authors’ websites). Here’s the abstract:

Genomics research will soon have a deep impact on many aspects of our lives, but its political implications and associations remain undeveloped. Our broad goal in this research project is to analyze what Americans are learning about genomic science, and how they are responding to this new and potentially fraught technology.

We pursue that goal here by focusing on one arena of the genomics revolution — its relationship to racial and ethnic identity. Genomic ancestry testing may either blur racial boundaries by showing them to be indistinct or mixed, or reify racial boundaries by revealing ancestral homogeneity or pointing toward a particular geographic area or group as likely forebears. Some tests, or some contexts, may permit both outcomes. In parallel fashion, genomic information about race can emphasize its malleability and social constructedness or its possible biological bases. We posit that what information individuals choose to obtain, and how they respond to genomic information about racial ancestry will depend in part on their own racial or ethnic identity.

We evaluate these hypotheses in three ways. The first is a public opinion survey including vignettes about hypothetical individuals who received contrasting DNA test results. Second is an automated content analysis of about 5,500 newspaper articles that focused on race-related genomics research. Finally, we perform a finer-grained, hand-coded, content analysis of about 700 articles profiling people who took DNA ancestry tests.

Three major findings parallel the three empirical analyses. First, most respondents find the results of DNA ancestry tests persuasive, but blacks and whites have very different emotional responses and effects on their racial identity. Asians and Hispanics range between those two poles, while multiracials show a distinct pattern of reaction. Second, newspaper articles do more to teach the American reading public that race has a genetic component than that race is a purely social construction. Third, African Americans are disproportionately likely to react with displeasure to tests that imply a blurring of racial classifications. The paper concludes with a discussion, outline of next steps, and observations about the significance of genomics for political science and politics.

First Farmers Facing the Ocean

The image above is adapted from the 2010 paper A Predominantly Neolithic Origin for European Paternal Lineages, and it shows the frequencies of Y chromosomal haplogroup R1b1b2 across Europe. As you can see, as you approach the Atlantic the frequency converges upon ~100%. Interestingly, the fraction of R1b1b2 is highest among populations such as the Basque and the Welsh. This was taken by some researchers in the late 1990s and early 2000s as evidence that the Welsh adopted a Celtic language, prior to which they spoke a dialect distantly related to Basque. Additionally, the assumption was that the Basques were the ur-Europeans: descendants of the Paleolithic populations of the continent both biologically and culturally, so that the peculiar aspects of the Basque language were attributed by some to its ancient Stone Age origins.

As indicated by the title, the above paper overturned such assumptions, and rather implied that the R1b1b2 haplogroup originated in the Near East and was associated with the expansion of Middle Eastern farmers from the eastern Mediterranean toward western Europe ~10,000 years ago. Instead of the high frequency of R1b1b2 being a confident peg for the Paleolithic rootedness of contemporary Europeans, as well as for the spread of farming mostly through cultural diffusion, it now became a linchpin for the case that Europe had seen one, and perhaps more than one, demographic revolution over the past 10,000 years.

This is made very evident in the results from ancient DNA, which are hard to superimpose upon a simplistic model of a two-way admixture between a Paleolithic substrate and a Neolithic overlay. Rather, it may be that there were multiple pulses into a European cul-de-sac since the rise of agriculture, from different starting points. We need to be careful of overly broad pronouncements at this point because, as they say, this is a “developing” area. But I want to go back to the western European fringe for a moment.

The impact of genetic ancestry testing

Attitudes on DNA ancestry tests:

The DNA ancestry testing industry is more than a decade old, yet details about it remain a mystery: there remain no reliable, empirical data on the number, motivations, and attitudes of customers to date, the number of products available and their characteristics, or the industry customs and standard practices that have emerged in the absence of specific governmental regulations. Here, we provide preliminary data collected in 2009 through indirect and direct participant observation, namely blog post analysis, generalized survey analysis, and targeted survey analysis. The attitudes include the first available data on attitudes of individuals who have and have not had their own DNA ancestry tested as well as individuals who are members of DNA ancestry-related social networking groups. In a new and fluid landscape, the results highlight the need for empirical data to guide policy discussions and should be interpreted collectively as an invitation for additional investigation of (1) the opinions of individuals purchasing these tests, individuals obtaining these tests through research participation, and individuals not obtaining these tests; (2) the psychosocial and behavioral reactions of individuals obtaining their DNA ancestry information with attention given both to expectations prior to testing and the sociotechnical architecture of the test used; and (3) the applications of DNA ancestry information in varying contexts.

If anyone wants the paper, email me and I can send you a copy. But really it’s just kind of dated, because the information was collected in 2009, before the massive increase in 23andMe’s customer base which began in the spring of 2010. Additionally, “genome blogging” really hadn’t started much at that point.

In terms of reactions to ancestry analysis, my personal experience after doing analyses on hundreds of people (most in public for AAP, but some in private) is that most are pretty calm about whatever they find out. On occasion you run into a stubborn person who is going to fix upon a really implausible explanation for a particular ancestral slice rather than the lowest-hanging fruit. But there was one individual who freaked out when their results were published, because the results did not accord with family beliefs. I was kind of confused, and checked their results against their self-reported ethnicity. Weirdly, the results were exactly what I would have expected from the self-reported ethnicity, so it was a really strange reaction.