1. Researcher discovers brain measurements that can help with the search for genes that affect shyness

    February 1, 2018 by Ashley

    From the University of Leiden press release:

    Previous research has shown that extreme shyness is hereditary, but because shyness is such a broad concept it is difficult to identify specific genes. Anita Harrewijn has discovered particular brain measurements that can help. PhD defence 18 January.

    Almost ten per cent of Dutch people will suffer from an anxiety disorder at some point in their lives. These people are afraid that others will think them strange or weird, or they are afraid of how they will react in particular social situations, such as when they have to face a group of people, which might make them tremble or blush.

    Similarities in brain activity

    Previous research has already shown that this kind of social anxiety disorder — or extreme shyness — is hereditary. Anyone who has extremely shy parents has a greater chance of developing the same symptoms. However, it is difficult to identify specific genes because social anxiety is such a broad term. In her PhD research, developmental psychologist Harrewijn shows that brain activity during an ‘attack’ of extreme shyness is also hereditary and is related to social anxiety. Clear similarities can be seen in the brain activity of family members who suffer from extreme shyness.

    Triggering shyness

    For her PhD research Harrewijn studied a total of 134 people from nine different families where some members have a social phobia. Before the EEG scans of the brains were made, the researcher triggered a feeling of extreme shyness in the people taking part in the experiment. For the first task she told them that a photo of themselves would be judged by someone from their age group, and for the second they had to make a film about their good and bad characteristics; the film would be judged by someone of their own age.

    Cortical and subcortical areas

    In fact, this judgement by peers never actually took place, but the threat alone was enough to raise the stress levels of the participants considerably. This could be seen in their brain activity shortly before making the film, which showed there was a raised level of activity between the cortical and subcortical areas in the participants with social anxiety. Harrewijn: ‘Whereas the cortical area mainly regulates control, the subcortical area deals with emotion. The two areas seem to be competing for attention.’

    Copycat behaviour

    ‘These findings will help future research on the genetic background to social anxiety,’ Harrewijn explains. ‘This brain activity is more specific than the disorder, and is therefore probably influenced by fewer genes. Although the brain activity may be a matter of copycat behaviour rather than genetic similarities, it is quite probable that a child of extremely shy parents can also learn that behaviour. Follow-up research will be needed to show whether this brain activity can be seen in children before they develop a social anxiety disorder, so that we can help them at as early a stage as possible.’


  2. Family study emphasizes distinct origins for bipolar disorder subtypes

    January 29, 2018 by Ashley

    From the Elsevier press release:

    The most common subtypes of bipolar disorder, bipolar I and bipolar II, stem — at least in part — from different biological causes, according to a new study published in Biological Psychiatry. Despite genetic overlap between the two subtypes, each subtype tended to cluster within families, suggesting a distinction between bipolar disorders I and II.

    The study, by Dr. Jie Song of the Department of Clinical Neuroscience, Karolinska Institutet, Sweden, and colleagues, helps settle a controversy over the relationship between bipolar I and bipolar II disorders. Although genetic similarities indicate overlap between the subtypes, the new findings emphasize different origins. According to Dr. Song, this runs contrary to a notion common among clinicians that bipolar II disorder is merely a milder form of bipolar I disorder.

    “We have tended to view the two forms of bipolar disorder as variants of the same clinical condition. However, this new study highlights important differences in the heritable risk for these two disorders,” said Dr. John Krystal, Editor of Biological Psychiatry.

    The study is the first nationwide family study to explore the difference between the two main subtypes of bipolar disorder. Dr. Song and colleagues analyzed the occurrence of the bipolar disorder subtypes in families from the Swedish national registers. Although a strong genetic correlation between bipolar I and bipolar II disorder suggests that they are not completely different, the family occurrence for each subtype was stronger than co-occurrence between the subtypes, indicating that bipolar I and bipolar II disorders tend to “run” in families separately, rather than occurring together.

    “Within the context of our emerging appreciation of polygenic risk, where gene variations are implicated in several disorders, the new findings point to only partial overlap in the risk mechanisms for these two forms of bipolar disorder,” said Dr. Krystal.

    The study also provided some additional clues that bipolar I and II disorders have distinct origins. Only bipolar II disorder showed gender differences: the proportion of females to males was higher in bipolar II disorder but not in bipolar I disorder. In addition, bipolar I disorder clustered in families together with schizophrenia, a pattern not seen for bipolar II disorder.

    “Hopefully, our findings increase awareness of the need for refined distinctions between subtypes of mood disorder,” said Dr. Song. The distinction between the subtypes also has implications for treatment strategies for patients. Dr. Song added that future research is warranted to characterize new biomarkers to improve treatment and prognosis.


  3. Gene variant influences performance of sleep-deprived people

    January 10, 2018 by Ashley

    From the Washington State University press release:

    Washington State University researchers have discovered a genetic variation that predicts how well people perform certain mental tasks when they are sleep deprived.

    Their research shows that individuals with a particular variation of the DRD2 gene are resilient to the effects of sleep deprivation when completing tasks that require cognitive flexibility, the ability to make appropriate decisions based on changing information.

    Sleep-deprived people with two other variations of the gene tend to perform much more poorly on the same kinds of tasks, the researchers found.

    The DRD2 dopamine receptor gene influences the processing of information in the striatum, a region of the brain that is known to be involved in cognitive flexibility.

    “Our work shows that there are people who are resilient to the effects of sleep deprivation when it comes to cognitive flexibility. Surprisingly, these same people are just as affected as everyone else on other tasks that require different cognitive abilities, such as maintaining focus,” said Paul Whitney, a WSU professor of psychology and lead author of the study, which appeared in the journal Scientific Reports. “This confirms something we have long suspected, namely that the effects of sleep deprivation are not general in nature, but rather depend on the specific task and the genes of the person performing the task.”

    Why sleep loss affects us differently

    When deprived of sleep, some people respond better than others. Scientists have identified genes associated with this, but they have wondered why the effects of sleep loss tend to vary widely across both individuals and cognitive tasks. For example, after a day without sleep, some people might struggle with a reaction time test but perform well on decision-making tasks, or vice versa.

    In the current study, Whitney, along with colleagues John Hinson, WSU professor of psychology, and Hans Van Dongen, director of the WSU Sleep and Performance Research Center at WSU Spokane, compared how people with different variations of the DRD2 gene performed on tasks designed to test both their ability to anticipate events and their cognitive flexibility in response to changing circumstances.

    Forty-nine adults participated in the study at the WSU Spokane sleep laboratory. After a 10-hour rest period, 34 participants were randomly selected to go 38 hours without sleep while the other participants were allowed to sleep normally.

    Before and after the period of sleep deprivation, subjects were shown a series of letter pairings on a computer screen and told to click the left mouse button for a certain letter combination (e.g., an A followed by an X) and the right mouse button for all other letter pairs. After a while, both the sleep-deprived group and the rested group were able to identify the pattern and click correctly for various letter pairs.

    Then came the tricky part: in the middle of the task, researchers told the participants to now click the left mouse button for a different letter combination. The sudden switch confounded most of the sleep-deprived participants, but those who had a particular variation of the DRD2 gene handled the switch as well as they did when well-rested.
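
    The rule-and-switch structure of the task can be sketched in a few lines of Python. This is a hypothetical reconstruction for illustration only; the press release specifies just the “A followed by X” example, and every other letter pair below is invented:

```python
def response(pair, target=("A", "X")):
    """Return the correct mouse button for a letter pair under the current rule."""
    return "left" if pair == target else "right"

trials = [("A", "X"), ("B", "X"), ("A", "Y")]

# Initial rule: left button only for A-then-X
print([response(p) for p in trials])                     # ['left', 'right', 'right']

# Mid-task rule switch: a different combination now takes the left button,
# so a pair that previously required "right" suddenly requires "left" --
# the moment that tests cognitive flexibility
print([response(p, target=("B", "X")) for p in trials])  # ['right', 'left', 'right']
```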

    “Our research shows this particular gene influences a person’s ability to mentally change direction when given new information,” Van Dongen said. “Some people are protected from the effects of sleep deprivation by this particular gene variation but, for most of us, sleep loss does something to the brain that simply prevents us from switching gears when circumstances change.”

    Training to cope with sleep loss

    Sleep deprivation’s effect on cognitive flexibility can have serious consequences, especially in high stakes, real-world situations like an emergency room or military operations where the ability to respond to changing circumstances is critical. For example, after a night without sleep, a surgeon might notice a spike in a patient’s vital signs midway through a procedure but be unable to use this information to decide on a better course of action.

    The WSU research team is currently applying what they learned from their study to develop new ways to help surgeons, police officers, soldiers and other individuals who regularly deal with the effects of sleep deprivation in critical, dynamic settings cope with the loss of cognitive flexibility.

    “Our long-term goal is to be able to train people so that no matter what their genetic composition is, they will be able to recognize and respond appropriately to changing scenarios, and be less vulnerable to sleep loss,” Whitney said. “Of course, the more obvious solution is to just get some sleep, but in a lot of real-world situations, we don’t have that luxury.”


  4. Study suggests mutations in neurons accumulate with age; may explain normal cognitive decline and neurodegeneration

    January 2, 2018 by Ashley

    From the Boston Children’s Hospital press release:

    Scientists have wondered whether somatic (non-inherited) mutations play a role in aging and brain degeneration, but until recently there was no good technology to test this idea. A study published online today in Science, led by researchers from Boston Children’s Hospital and Harvard Medical School, used whole-genome sequencing of individual neurons and found strong evidence that brain mutations accumulate as we age. They also found that mutations accumulate at a higher rate in people with genetic premature aging disorders causing early brain degeneration.

    “It’s been an age-old question as to whether DNA mutations can accumulate in neurons — which usually don’t divide — and whether they are responsible for the loss of function that the brain undergoes as we get older,” says Christopher A. Walsh, MD, PhD, chief of the Division of Genetics and Genomics at Boston Children’s and co-senior author on the paper. “It hasn’t been possible to answer this question before, because we couldn’t sequence the genome of a single cell, and each mutation accumulated is unique to each cell.”

    Testing neurons one by one

    The research team tested DNA from 161 single neurons, taken from postmortem samples from the NIH NeuroBioBank. The neurons came from 15 neurologically normal people of different ages (4 months to 82 years) and from nine people with one of two accelerated-aging, early-onset neurodegenerative disorders: Cockayne syndrome and xeroderma pigmentosum.

    Using the latest experimental and data analysis techniques, the team was able to detect mutations as small as single-letter changes in each neuron’s genetic code. Each cell had to have its genome amplified — by generating a multitude of copies — before its DNA sequence could be determined, and a large amount of data had to be analyzed.

    “Because many experimental artifacts arise during the single-cell experiments, a new computational method that can distinguish true mutations from the experimental noise was critical to the success of the project,” says Peter J. Park, PhD, of Harvard Medical School’s Department of Biomedical Informatics (DBMI), the paper’s other co-senior author.
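
    The release doesn’t detail the team’s computational method, but one simple idea from single-cell variant calling illustrates the problem: in a diploid, non-dividing neuron, a true heterozygous somatic mutation should appear in roughly half of the amplified reads, whereas amplification artifacts often show skewed allele fractions. The sketch below is a minimal illustration under that assumption, not the authors’ actual pipeline, and all thresholds and read counts are invented:

```python
def plausible_mutation(alt_reads, total_reads, min_depth=20, low=0.3, high=0.7):
    """Crude filter: keep candidate variants whose allele fraction looks heterozygous."""
    if total_reads < min_depth:
        return False  # too little coverage at this site to judge
    vaf = alt_reads / total_reads  # variant allele fraction
    return low <= vaf <= high

# (alt reads, total reads) for four hypothetical candidate sites
candidates = [(25, 50), (3, 60), (12, 25), (5, 8)]
print([plausible_mutation(a, n) for a, n in candidates])  # [True, False, True, False]
```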

    The neurons tested came from two areas of the brain implicated in age-related cognitive decline: the prefrontal cortex (the part of the brain most highly developed in humans) and the dentate gyrus of the hippocampus (a focal point in age-related degenerative conditions like Alzheimer’s).

    In neurons from neurologically normal people, the number of genetic mutations increased with age in both brain areas. However, mutations accumulated at a higher rate in the dentate gyrus. The researchers think this may be because the neurons have the ability to divide, unlike their counterparts in the prefrontal cortex.

    In neurons from people with Cockayne syndrome and xeroderma pigmentosum, mutations in the prefrontal cortex accumulated over time at more than twice the normal rate. Additionally, working with collaborators at WuXi NextCODE, the researchers found that the portions of the genome neurons use the most accumulated mutations at the highest rate.

    The aging genome

    The researchers coined the term “genosenium” — combining the concepts of genome and senescence/senility — to capture the idea of gradual and inevitable accumulation of mutations contributing to brain aging.

    The mutations themselves fell into three categories. “We were able to take all the mutations we found and use mathematical techniques to deconstruct them into different types of DNA changes,” says Michael Lodato, PhD, one of six co-first authors on the paper. “It’s like hearing an orchestra and teasing out the different instruments.”


  5. Holding infants – or not – can leave traces on their genes

    December 19, 2017 by Ashley

    From the University of British Columbia press release:

    The amount of close and comforting contact between infants and their caregivers can affect children at the molecular level, an effect detectable four years later, according to new research from the University of British Columbia and BC Children’s Hospital Research Institute.

    The study showed that children who had been more distressed as infants and had received less physical contact had a molecular profile in their cells that was underdeveloped for their age — pointing to the possibility that they were lagging biologically.

    “In children, we think slower epigenetic aging might indicate an inability to thrive,” said Michael Kobor, a Professor in the UBC Department of Medical Genetics who leads the “Healthy Starts” theme at BC Children’s Hospital Research Institute.

    Although the implications for childhood development and adult health have yet to be understood, this finding builds on similar work in rodents. This is the first study in humans to show that the simple act of touching, early in life, has deeply rooted and potentially lifelong consequences for gene expression.

    The study, published last month in Development and Psychopathology, involved 94 healthy children in British Columbia. Researchers from UBC and BC Children’s Hospital asked parents of 5-week-old babies to keep a diary of their infants’ behavior (such as sleeping, fussing, crying or feeding) as well as the duration of caregiving that involved bodily contact. When the children were about 4 1/2 years old, their DNA was sampled by swabbing the inside of their cheeks.

    The team examined a biochemical modification called DNA methylation, in which some parts of the chromosome are tagged with small molecules made of carbon and hydrogen. These molecules act as “dimmer switches” that help to control how active each gene is, and thus affect how cells function.

    The extent of methylation, and where on the DNA it specifically happens, can be influenced by external conditions, especially in childhood. These epigenetic patterns also change in predictable ways as we age.

    Scientists found consistent methylation differences between high-contact and low-contact children at five specific DNA sites. Two of these sites fall within genes: one plays a role in the immune system, and the other is involved in metabolism. However, the downstream effects of these epigenetic changes on child development and health aren’t known yet.

    The children who experienced higher distress and received relatively little contact had an “epigenetic age” that was lower than would be expected, given their actual age. Such a discrepancy has been linked to poor health in several recent studies.
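
    The notion of an “epigenetic age” can be made concrete with a toy clock: a weighted sum of methylation levels at a handful of CpG sites, compared against chronological age. All site names, weights, and values below are invented; real epigenetic clocks are trained on hundreds of sites in large reference datasets:

```python
# Hypothetical clock weights per CpG site (methylation given as beta values, 0-1)
cpg_weights = {"cg_site_1": 20.0, "cg_site_2": -15.0, "cg_site_3": 10.0}
baseline_age = 3.0  # intercept in years (also hypothetical)

def epigenetic_age(methylation):
    """Predicted age from a linear combination of methylation beta values."""
    return baseline_age + sum(cpg_weights[s] * m for s, m in methylation.items())

# A hypothetical 4.5-year-old whose predicted epigenetic age lags behind
chronological_age = 4.5
child = {"cg_site_1": 0.25, "cg_site_2": 0.40, "cg_site_3": 0.18}

age_acceleration = epigenetic_age(child) - chronological_age
print(round(age_acceleration, 1))  # -0.7: epigenetically "younger" than actual age
```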

    “We plan on following up to see whether the ‘biological immaturity’ we saw in these children carries broad implications for their health, especially their psychological development,” says lead author Sarah Moore, a postdoctoral fellow. “If further research confirms this initial finding, it will underscore the importance of providing physical contact, especially for distressed infants.”


  6. Gene expression study may provide insights into autism, other neurodevelopmental disorders

    December 18, 2017 by Ashley

    From the University of California – San Francisco press release:

    The human brain has been called the most complex object in the cosmos, with 86 billion intricately interconnected neurons and an equivalent number of supportive glial cells. One of science’s greatest mysteries is how an organ of such staggering complexity — capable of producing both love poetry and scientific discovery — builds itself from just a handful of stem cells in the early embryo.

    Now researchers at UC San Francisco have taken the first step towards a comprehensive atlas of gene expression in cells across the developing human brain, making available new insights into how specific cells and gene networks contribute to building this most complex of organs, and serving as a resource for researchers around the world to study the interplay between these genetic programs and neurodevelopmental disorders such as autism, intellectual disability, and schizophrenia.

    The work described in the new paper — published December 8, 2017, in Science — was led by three young UCSF researchers: Tomasz Nowakowski, PhD, an assistant professor of anatomy; Alex Pollen, PhD, an assistant professor of neurology; and Aparna Bhaduri, PhD. All three began the work as postdoctoral researchers in the UCSF lab of Arnold Kriegstein, MD, PhD, the new paper’s senior author.

    “It’s critically important to be able to look at questions of brain development in real human tissue when you’re trying to study human disease. Many of the insights we’re able to gain with this data can’t be seen in the mouse,” said Kriegstein, a professor of neurology and director of the Eli and Edythe Broad Center of Regeneration Medicine and Stem Cell Research at UCSF.

    Previously, Pollen and Nowakowski had developed techniques for analyzing distinctive patterns of DNA activity in individual cells extracted from human brain tissue. The approach enabled a wide range of studies of human brain development, including implicating a new class of neural stem cell recently discovered by the lab in the evolutionary expansion of the human brain and identifying how the mosquito-borne Zika virus may contribute to microcephaly in infants infected in utero.

    Working with Bhaduri, who has a background in statistics and bioinformatics, Pollen and Nowakowski began exploring how specific classes of neurons and stem cells in the developing brain contribute to normal brain growth as well as to neurodevelopmental disease, and have begun to build a comprehensive, open-source atlas of gene expression across the developing brain, which they hope will serve as a resource for other scientists.

    “This is an attempt to generate an unbiased view of what genes are expressed in every cell type in the developing human brain in order to highlight potential cellular vulnerabilities in patient-relevant mutations,” said Nowakowski.

    “Identifying gene variants that are general risk factors for neurological and psychiatric disease is important, but understanding exactly which cell types in the developing brain are compromised and what the consequences are is still extremely challenging,” Pollen added. “A cell atlas could serve as a bridge to help us to do this with more confidence.”

    In their new Science paper, the researchers analyzed gene expression in single cells across key developmental time points and from different regions of the brain. Bhaduri then used statistical algorithms to cluster different cells based on their patterns of gene expression.
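
    The clustering idea can be illustrated with a toy k-means run on a simulated expression matrix. Everything here is synthetic and greatly simplified relative to the study’s actual single-cell analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated log-expression: 60 cells x 5 genes from two hypothetical cell
# types with distinct mean expression profiles
type_a = rng.normal(loc=[5, 1, 4, 0, 2], scale=0.3, size=(30, 5))
type_b = rng.normal(loc=[0, 4, 1, 5, 0], scale=0.3, size=(30, 5))
X = np.vstack([type_a, type_b])

def kmeans(X, k, iters=20):
    # Deterministic initialization for the sketch: first and last cells as seeds
    centers = X[[0, -1]].copy()
    for _ in range(iters):
        # Assign each cell to its nearest center (Euclidean distance)
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean profile of its cluster
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(X, k=2)
# Cells of the same simulated type land in the same cluster
print(sorted(set(labels[:30].tolist())), sorted(set(labels[30:].tolist())))  # [0] [1]
```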

    This analysis allowed the team to trace the genetic signals driving brain development at a much finer level, both regionally and over time, than had previously been possible. For example, the researchers were able to identify previously unknown gene expression differences between the neural stem cells that give rise to the brain’s deep structures versus its neocortical surface, and to show that molecular signatures of different neural cell types arise much earlier in brain development than previously realized.

    “I was excited to come into this project with an incredibly rich dataset to analyze,” Bhaduri said. “By analyzing this dataset in new ways, we were able to discover early molecular distinctions across areas and over time that begin to specify the astonishing diversity of neurons in the cerebral cortex.”

    One of the team’s most exciting new observations suggests a link between autism and a type of neural stem cell called outer-radial glia (oRGs), discovered by the Kriegstein lab in 2010. These stem cell populations are greatly expanded in primates and may be responsible for the radical expansion of the cortex that gave rise to human intelligence. In the new study, the researchers discovered that during the second trimester of human brain development, oRG cells express genes related to a fundamental signaling pathway called mTOR, defects in which have previously been implicated in autism and several other psychiatric disorders. This finding suggests that future studies should also consider the role mTOR-expressing oRG cells may play in the origins of these disorders.

    Another provocative observation from the new study was that transient gene expression events during brain development set up broad distinctions in neural fate between cells in different areas in the cerebral cortex. This contrasts with an idea that has been dominant in neuroscience for many years: that the neocortex is made up of nearly identical “cortical columns” — a standard circuit of distinct cell types that connect across the cortex’s six layers — which tile the cortical surface like the hexagons in a honeycomb. In that model, the cell types and their local connections are generally similar everywhere — it’s just the inputs and outputs to the column that vary from place to place in the cortex. In contrast, the new data suggest that neurons in different parts of the brain express fundamentally different genetic programs during development, while nearby neurons in different layers of the cortex are surprisingly similar in their gene expression.

    The authors say the new paper is the first step in a larger effort to build a comprehensive atlas of genetically-defined cell types in the human brain. Kriegstein’s team recently received a 5-year $5 million grant from the National Institutes of Health (NIH) BRAIN Initiative to expand this effort, and to make the resulting data available publicly to all in the field through an interactive data browser built in collaboration with colleagues at UC Santa Cruz.

    “This study focused on the development of the neocortex, but we aim to analyze multiple brain regions and developmental stages to achieve a more comprehensive atlas of cell types in the developing human brain,” Kriegstein said. “It’s still a fundamentally open question how many cell types there are in the brain, which clearly has more cellular diversity than any other organ.”

    “We hope this atlas will be a roadmap for the field to explore the relationship between specific cell types, signaling pathways, receptors, and the physiological function of brain circuits,” Kriegstein added. “For example, there is a huge amount of interest and excitement globally in growing cerebral organoids” — miniature brain-like organs that can be studied in laboratory experiments — “from stem cells to model human brain development and disease mechanisms. Our brain development atlas will serve as a much-needed framework to calibrate these organoids against the real human brain.”


  7. Study suggests genes behind higher education linked to lower risk of Alzheimer’s

    December 15, 2017 by Ashley

    From the Karolinska Institutet press release:

    Using genetic information, researchers at Karolinska Institutet in Sweden provide new evidence that higher educational attainment is strongly associated with a lower risk of Alzheimer’s disease.

    The causes of Alzheimer’s disease are largely unknown and treatment trials have been disappointing. This has led to increasing interest in the potential for reducing the disease by targeting modifiable risk factors. Many studies have found that education and vascular risk factors are associated with the risk of Alzheimer’s disease, but whether these factors actually cause Alzheimer’s has been difficult to disentangle.

    Mendelian randomisation is a method that uses genetic information to draw causal inferences about the relationship between potential risk factors and disease. If genetic variants with a specific impact on a risk factor are also associated with the disease, this indicates that the risk factor itself contributes to causing the disease.
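
    The core calculation behind this kind of analysis can be sketched as follows. All effect sizes are invented for illustration; real Mendelian randomisation studies add extensive sensitivity analyses on top of this basic estimator:

```python
import numpy as np

# Hypothetical per-variant estimates (not from the study): effect of each SNP
# on the exposure (years of education) and on the outcome (log odds of
# Alzheimer's disease), with standard errors for the outcome effects
beta_exposure = np.array([0.12, 0.08, 0.15, 0.10])
beta_outcome = np.array([-0.030, -0.018, -0.041, -0.024])
se_outcome = np.array([0.010, 0.009, 0.012, 0.011])

# Wald ratio per variant: outcome effect per unit change in the exposure
wald = beta_outcome / beta_exposure

# Inverse-variance weighted (IVW) estimate combining all variants
weights = (beta_exposure / se_outcome) ** 2
ivw_estimate = np.sum(wald * weights) / np.sum(weights)

print(round(ivw_estimate, 3))  # negative: more education, lower predicted risk
```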

    Susanna C. Larsson, associate professor at the Institute of Environmental Medicine at Karolinska Institutet, and colleagues in Cambridge and Munich, used the Mendelian randomisation approach to assess whether education and different lifestyle and vascular risk factors are associated with Alzheimer’s disease. The analysis included more than 900 genetic variants previously shown to be associated with the risk factors. Comparisons of these genetic variants among 17,000 patients with Alzheimer’s disease and 37,000 healthy controls revealed a strong association for genetic variants that predict education.

    “Our results provide the strongest evidence so far that higher educational attainment is associated with a lower risk of Alzheimer’s disease. Therefore, improving education may substantially decrease the number of people developing this devastating disease,” says Susanna C. Larsson.

    According to the researchers, one possible explanation for this link is ‘cognitive reserve’, which refers to the ability to recruit and use alternative brain networks or structures not normally used in order to compensate for brain ageing.

    “Evidence suggests that education helps improve brain networks and thus could increase this reserve,” says Susanna C. Larsson.

    The study was financed by the European Union’s Horizon 2020 research and innovation programme and the Swedish Brain Foundation.


  8. Study identifies gene variant that protects against Alzheimer’s disease

    December 11, 2017 by Ashley

    From the Brigham Young University press release:

    Research published Wednesday in Genome Medicine details a novel and promising approach in the effort to treat Alzheimer’s disease.

    Brigham Young University professors Perry Ridge and John Kauwe led the discovery of a rare genetic variant that provides a protective effect for high-risk individuals — elderly people who carry known genetic risk factors for Alzheimer’s — who never acquired the disease.

    In other words, there’s a specific reason why people who should get Alzheimer’s remain healthy. Study authors believe this genetic function could be targeted with drugs to help reduce the risk of people getting the disease.

    “Instead of identifying genetic variants that are causing disease, we wanted to identify genetic variants that are protecting people from developing disease,” said Ridge, assistant professor of biology at BYU. “And we were able to identify a promising genetic variant.”

    The former approach has generally been effective at producing a list of genes that might affect risk for the disease, but it leaves researchers with little indication of what to do next. With this new approach, Ridge and Kauwe worked out the biological mechanism by which a genetic variant actually affects Alzheimer’s disease.

    Using data from the Utah Population Database — a 20-million-record database of the LDS Church’s genealogical records combined with historical medical records from Utah — Ridge and Kauwe first identified families that had a large number of resilient individuals: those who carried the main genetic risk factor for Alzheimer’s (E4 Allele) but remained healthy into advanced age.

    Using whole genome sequencing and a linkage analysis methodology, they then looked for the DNA that those resilient individuals shared with each other that they didn’t share with loved ones who died of Alzheimer’s. They discovered the resilient subjects shared a variant in the RAB10 gene while those who got the disease did not share the genetic variant.

    Once the researchers identified the potentially protective gene variant, they overexpressed it in some cells and underexpressed it in others to see the impact on proteins related to Alzheimer’s disease. They found that reduced expression of this gene has the potential to reduce the risk of Alzheimer’s.

    “There are currently no meaningful interventions for Alzheimer’s disease: no prevention, no modifying therapies, no cure,” Kauwe said. “The discoveries we’re reporting in this manuscript provide a new target with a new mechanism that we believe has great potential to impact Alzheimer’s disease in the future.”


  9. Tracking down genetic influences on brain disorders

    December 8, 2017 by Ashley

    From the Universität Basel press release:

    New findings will help to identify the genetic causes of brain disorders: researchers at the Universities of Basel, Bonn and Cologne have presented a systematic catalog of specific variable locations in the genome that influence gene activity in the human hippocampus, as they report in the journal Nature Communications.

    Individual differences in gene regulation contribute to the development of numerous multifactorial disorders. Researchers are therefore attempting to clarify the influence of genetic variants (single-nucleotide polymorphisms, or SNPs) on gene expression and on the epigenetic modification of regulatory sections of the genome (DNA methylation). The German-Swiss team has now studied the genetic determinants of gene expression, as well as the process of DNA methylation in the human hippocampus.

    Three million genomic locations analyzed

    The researchers have presented an extensive catalog of variable locations in the genome — that is, of SNPs — that affect the activity of genes in the human hippocampus. Specifically, they have analyzed the influence of more than three million SNPs, spread throughout the genome, on activity in nearby genes and the methylation of adjacent DNA sections.

    What sets their work apart is that the researchers used fresh-frozen hippocampus tissue obtained during surgery on 110 treatment-resistant epilepsy patients. They extracted DNA and RNA from the tissue and, for every sample, used microarrays to genotype several hundred thousand SNPs and to determine the degree of methylation at several hundred thousand genomic locations (known as CpG dinucleotides). Among other analyses, they measured the expression of over 15,000 genes using RNA microarrays.
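    The SNP-to-expression part of such an analysis is, in essence, an eQTL test: for each SNP–gene pair, regress expression on genotype dosage (0, 1, or 2 copies of the minor allele) and ask whether the slope differs from zero. A minimal sketch on synthetic data (this is the generic technique, not the study’s actual pipeline or statistical model):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 110                                   # samples, as in the study
    dosage = rng.integers(0, 3, size=n)       # genotype: copies of minor allele
    # synthetic expression with a true additive genotype effect plus noise
    expression = 0.8 * dosage + rng.normal(0.0, 1.0, size=n)

    # simple linear regression of expression on genotype dosage
    res = stats.linregress(dosage, expression)
    print(f"slope={res.slope:.2f}, p={res.pvalue:.1e}")
    ```

    The same regression applied to methylation levels instead of expression gives a methylation QTL test; run genome-wide across millions of SNPs, this is the kind of catalog the team assembled.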

    Development of schizophrenia

    The researchers also identified the genomic regions in which variably methylated CpG dinucleotides preferentially occur and were able to assign these to specific regulatory elements, revealing a link to brain disorders: a significant proportion of the identified SNPs that individually influence DNA methylation and gene expression in the hippocampus also contribute to the development of schizophrenia. This underlines the potentially significant role of regulatory SNPs in the development of brain disorders.

    The study’s findings will make it considerably easier to interpret genetic associations with brain disorders in the future. Many of the SNPs implicated in brain disorders in recent years are located in the non-coding part of the genome, so their functional effect in cells is largely unclear.

    An important factor in the project’s success was the close cooperation between the Universities of Basel, Bonn and Cologne. This collaboration is supported by the IntegraMent Consortium, which is sponsored by Germany’s Federal Ministry of Education and Research and coordinated by Professor Markus Nöthen of the University of Bonn.


  10. Genetic study links tendency to undervalue future rewards with ADHD

    December 5, 2017 by Ashley

    From the University of California – San Diego press release:

    Researchers at University of California San Diego School of Medicine have found a genetic signature for delay discounting — the tendency to undervalue future rewards — that overlaps with attention-deficit/hyperactivity disorder (ADHD), smoking and weight.

    In a study published December 11 in Nature Neuroscience, the team used data from 23andMe customers who consented to participate in research and answered survey questions assessing delay discounting. In all, the study drew on data from more than 23,000 people to show that approximately 12 percent of the person-to-person variation in delay discounting can be attributed to genetics: not a single gene, but numerous genetic variants that also influence several other psychiatric and behavioral traits.

    “Studying the genetic basis of delay discounting is something I’ve wanted to do for the entirety of my 20 years of research, but it takes a huge number of people for a genetics study to be meaningful,” said senior author Abraham Palmer, PhD, professor of psychiatry and vice chair for basic research at UC San Diego School of Medicine. “By collaborating with a company that already has the genotypes for millions of people, all we needed was for them to answer a few questions. It would have been difficult to enroll and genotype this many research participants on our own in academia — it would’ve taken years and been cost prohibitive. This is a new model for science.”

    According to Palmer, every complicated nervous system needs a way of assessing the value of current versus delayed rewards. Most people think of the “marshmallow experiment,” he said, referring to the classic experiment where children were tested for their ability to delay gratification by giving them the choice between one marshmallow now or two marshmallows a few minutes later.

    “A person’s ability to delay gratification is not just a curiosity, it’s integrally important to physical and mental health,” Palmer said. “In addition, a person’s economic success is tied to delay discounting. Take seeking higher education and saving for retirement as examples — these future rewards are valuable in today’s economy, but we’re finding that not everyone has the same inclination to achieve them.”

    For the study, the team looked at data from 23andMe research participants who answered survey questions that could be used to assess delay discounting. For example, customers were asked to choose between two options: “Would you rather have $55 today or $75 in 61 days?”
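    Choices like this are conventionally fit with a hyperbolic discounting model, V = A / (1 + kD), where A is the delayed amount, D the delay, and a larger k means steeper discounting. At the point of indifference between $55 now and $75 in 61 days, the implied k can be solved directly. A quick sketch (the hyperbolic model is standard in this literature; the press release does not specify which model the study fit):

    ```python
    def indifference_k(immediate, delayed, delay_days):
        """Solve immediate = delayed / (1 + k * delay_days) for k."""
        return (delayed / immediate - 1) / delay_days

    # A participant indifferent between $55 today and $75 in 61 days
    k = indifference_k(55, 75, 61)
    print(f"implied k = {k:.4f} per day")
    # choosing the $55 implies discounting at least this steeply
    ```

    Presenting many such pairs with varying amounts and delays lets researchers bracket each participant’s k, which then serves as the quantitative phenotype for the genetic analysis.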

    “In less than four months, we had responses from more than 23,000 research participants,” said Pierre Fontanillas, PhD, a senior statistical geneticist at 23andMe. “This shows the power of our research model to quickly gather large amounts of phenotypic and genotypic data for scientific discovery.”

    By comparing participants’ survey responses to their corresponding genotypes and complementary data from other studies, Palmer’s team found a number of genetic correlations.

    “We discovered, for the first time, a genetic correlation between ADHD and delay discounting,” said first author Sandra Sanchez-Roige, PhD, a postdoctoral researcher in Palmer’s lab. “People with ADHD place less value in delayed rewards. That doesn’t mean that everyone with ADHD will undervalue future rewards or vice versa, just that the two factors have a common underlying genetic cause.”

    The researchers also found that delay discounting is genetically correlated with smoking initiation. In other words, people who undervalue future rewards may be more likely to start smoking and less likely to quit if they did.

    Body weight, as determined by body mass index (BMI), was also strongly correlated with delay discounting, suggesting that people who don’t place a high value on future rewards tend to have a higher BMI.

    The team determined that delay discounting negatively correlated with three cognitive measures: college attainment, years of education and childhood IQ. In other words, the genetic factors that predict delay discounting also predict these outcomes.

    In many studies that rely on surveys, particularly those in which participants are paid to fill them out, there is always a chance that some people answer randomly or carelessly. Palmer’s survey included three questions to assess how carefully the research participants were answering. For example, one asked, “Would you rather have $60 today or $20 today?” There is only one sensible answer, and the team saw only 2.1 percent of participants get even one of the three questions wrong, reassuring them that the vast majority were answering carefully.

    “We are very thankful to the 23andMe research participants who took the time to complete our survey — they weren’t paid to do it, they are citizen-scientists volunteering their help,” Palmer said. “It’s quite a feeling to think that so many people were willing to help out in this interest of ours.”

    Palmer hopes to expand the study to a larger and more diverse population to strengthen their findings.

    “An even larger study would help start identifying specific genes with a higher level of confidence,” he said. “Then we can do hypothesis-driven studies of this trait with animal or cellular models.”

    While most research studies begin in test tubes, cells grown in the laboratory and animal models before moving to humans, the opposite is true here. After starting with these human observations, Palmer’s team is now studying the same delay discounting-related genetic traits in rodent models. They want to determine if changing those genes experimentally changes rodent behavior as expected. If it does, they will be able to use the animals to study how those delay discounting-related genes lead to those behaviors, at a molecular level.