1. Too much TV can impact primary school readiness for some kids

    March 24, 2017 by Ashley

    From the Concordia University press release:

    We interrupt this program for an important message: Watching television for more than a couple of hours a day has been linked to lower school readiness skills in kindergarteners.

    That’s according to a new study by researchers from Concordia’s PERFORM Centre and New York University’s Steinhardt School of Culture, Education, and Human Development, which shows a particularly pronounced impact on children from low-income families.

    The findings, published in the Journal of Developmental & Behavioral Pediatrics, reinforce the need for limits on screen time, such as those laid out by the American Academy of Pediatrics. The organization recently changed its norms to recommend that children between two and five watch no more than one hour daily — down from its previous recommendation of two hours.

    Given the current prevalence of smartphones and tablets, screens play a bigger role in many families’ lives now than ever before.

    “Research has shown that watching TV is negatively associated with early academic skills, but little is known about how socio-economic status influences viewing,” says study co-author and PERFORM Centre researcher Caroline Fitzpatrick.

    “We wanted to examine whether the negative relationship between watching TV and school readiness varied by family income.”

    Lower math skills and executive functioning

    Fitzpatrick and NYU Steinhardt co-authors Andrew Ribner and Clancy Blair looked at data from 807 kindergarteners of diverse backgrounds. Parents of participants reported family income, as well as the hours of TV their children watch on a daily basis. Video game, tablet and smartphone use was not included in the measurement.

    To determine school readiness, the study measured the children’s math skills and knowledge of letters and words.

    The researchers also assessed executive functions, which are key cognitive and social-emotional competencies, including working memory, cognitive flexibility and inhibitory control. Executive functions are essential for everyday problem-solving and self-control of behaviours and emotional responses.

    “We found that the number of hours young children watch TV is related to decreases in their school readiness, particularly when it comes to math and executive function,” confirms Ribner, a doctoral candidate in the Department of Applied Psychology at NYU Steinhardt and the study’s lead author.

    “This association was strongest when children watched more than two hours daily.”

    Poorer families take the hardest hit

    As family incomes decreased, the link between TV watching and school readiness grew.

    Those at or near the poverty line — an annual income of around $21,200 for a family of four — saw the largest drop in school readiness when children watched more than two hours of TV a day.

    The study noted a more modest drop among middle-income families, measured as $74,200 per year for a family of four. And researchers found no link between school readiness and TV viewing in high-income homes, which were measured as around $127,000 per year for a family of four.

    “Our results suggest that the circumstances that surround child screen time can influence its detrimental effects on learning outcomes,” says Fitzpatrick, who also teaches at Université Sainte-Anne.

    “We recommend that pediatricians and child care centres help parents limit the amount of TV children watch to less than two hours a day, especially those from middle- to lower-income families.”


  2. Tablet devices show promise in managing agitation among patients with dementia

    January 9, 2017 by Ashley

    From the McLean Hospital media release:

    A new pilot study led by Ipsit Vahia, MD, medical director of Geriatric Psychiatry Outpatient Services at McLean Hospital, suggests that the use of tablet computers is both a safe and a potentially effective approach to managing agitation among patients with dementia.

    “Tablet use as a nonpharmacologic intervention for agitation in older adults, including those with severe dementia, appears to be feasible, safe, and of potential utility,” said Vahia. “Our preliminary results are a first step in developing much-needed empirical data for clinicians and caregivers on how to use technology such as tablets as tools to enhance care and also for app developers working to serve the technologic needs of this population.”

    “Use of Tablet Devices in the Management of Agitation Among Inpatients with Dementia: An Open Label Study” was recently published in the online version of The American Journal of Geriatric Psychiatry. This research builds upon previous studies demonstrating that art, music, and other similar therapies can effectively reduce symptoms of dementia without medication. By using tablet devices to employ these therapies, however, patients and providers also benefit from a computer’s inherent flexibility.

    “The biggest advantage is versatility,” said Vahia. “We know that art therapy can work, music therapy can work. The tablet, however, gives you the option of switching from one app to another easily, modifying the therapy seamlessly to suit the individual. You don’t need to invest in new equipment or infrastructure.”

    Researchers loaded a menu of 70 apps onto the tablets for the study. The apps were freely available on iTunes and varied greatly in their cognitive complexity — from an app that displayed puppy photos to one that featured Sudoku puzzles.

    The researchers found that tablet use was safe for every patient, regardless of the severity of their dementia, and that with proper supervision and training, the engagement rate with the devices was nearly 100 percent. The study also found that the tablets demonstrated significant effectiveness in reducing symptoms of agitation, particularly — but not exclusively — among patients with milder forms of dementia.

    Vahia cited several examples of the tablet’s potential to improve a patient’s condition. One particular patient, who only spoke Romanian, was very withdrawn and irritable, and medications were ineffective in controlling his symptoms.

    “We started showing him Romanian video clips on YouTube, and his behavior changed dramatically and instantaneously,” said Vahia. “His mood improved. He became more interactive. He and his medical support team also started using a translation app so that staff could ask him simple questions in Romanian, facilitating increased interaction. These significant improvements are a clear testament of the tablet’s potential as a clinical tool.”

    Based on such promising outcomes, the Geriatric Psychiatry Outpatient Services clinical team is expanding the use of tablet devices as a means to control agitation in dementia patients at McLean. This will allow researchers to develop more robust data and expand the scope of the study, including a focus on specific clinical factors that may impact how patients with dementia engage with and respond to apps.


  3. You’ve got mail: Personality differences in email use

    by Ashley

    From the British Psychological Society media release:

    A new study shows that while many of us cannot do our jobs without email, it can stress us out — and that personality differences affect how we use email and what we find stressful.

    The results of the study are being presented on January 6, 2017, at the British Psychological Society’s Division of Occupational Psychology annual conference in Liverpool by John Hackston from OPP Ltd.

    Data was collected via an online survey of 368 people, all of whom had already completed a personality type questionnaire.

    The results showed that those of us with a big-picture focus are more likely to check our emails on holiday, at the weekend and before and after work than our more matter-of-fact counterparts. Unfortunately, sending emails outside of work hours leads to stress, as does the number of emails we send and receive. Managers, regardless of personality type, are more likely to feel that they waste time on email and to find it overwhelming and stressful.

    People with different personality preferences found different aspects of using email stressful, allowing the researchers to compile guidance to help individuals cope with email more effectively.

    Hackston commented: “Our research shows that while there are some general guidelines for using email, everyone is different. Knowing your personality type can help you to avoid stress and communicate better with others.”


  4. Gaming your brain to treat depression

    January 6, 2017 by Ashley

    From the University of Washington Health Sciences/UW Medicine media release:

    Researchers have found promising results for treating depression with a video game interface that targets underlying cognitive issues associated with depression rather than just managing the symptoms.

    “We found that moderately depressed people do better with apps like this because they address or treat correlates of depression,” said Patricia Areán, a UW Medicine researcher in psychiatry and behavioral sciences.

    The first study enrolled older adults diagnosed with late-life depression into a treatment trial where they were randomized to receive either a mobile, tablet-based treatment technology developed by Akili Interactive Labs called Project: EVO or an in-person therapy technique known as problem-solving therapy (PST).

    Project: EVO runs on phones and tablets and is designed to improve focus and attention at a basic neurological level. The results, published Jan. 3 in the journal Depression and Anxiety, showed that the group using Project: EVO demonstrated specific cognitive benefits (such as attention) compared to the behavioral therapy, and saw similar improvements in mood and self-reported function. Joaquin A. Anguera, a University of California, San Francisco (UCSF), researcher in neurology and psychiatry, is the lead author, and Areán is the senior author. The researchers have no commercial interests in the intervention manufactured by Akili Interactive Labs in Boston.

    “While EVO was not directly designed to treat depressive symptoms, we hypothesized that there may indeed be beneficial effects on these symptoms by improving cognitive issues with targeted treatment, and so far, the results are promising,” said Anguera.

    People with late-life depression (60+) are known to have trouble focusing their attention on personal goals and report trouble concentrating because they are so distracted by their worries. Akili’s technology was designed to help people better focus their attention and to prevent people from being easily distracted.

    Areán said most of the participants had never used a tablet, let alone played a video game, but compliance was more than 100 percent. The participants were required to play the game five times a week for 20 minutes, but many played it more. Participants in this arm of the study also attended weekly meetings with a clinician. The meetings served as a control for the fact that participants in the problem-solving therapy arm were seen in person on a weekly basis, and social contact of this nature can have a positive effect on mood.

    Second study

    A second study, which was another joint effort by UW and UCSF, randomized more than 600 people across the United States assessed as moderately or mildly depressed to one of three interventions: Akili’s Project: EVO; iPST, an app deployment of problem-solving therapy; or a placebo control (an app called Health Tips, which offered healthy suggestions).

    Areán, the lead researcher on the study published Dec. 20 in the Journal of Medical Internet Research (JMIR), found that people who were mildly depressed were able to see improvements in all three groups, including the placebo. However, those individuals who were more than mildly depressed showed a greater improvement in their symptoms following their use of Project: EVO or iPST versus the placebo.

    Areán said much of her research is aimed at providing effective treatment to people who need it, and these results hold great potential for helping people who don’t have the resources to access effective problem-solving therapy. But, she stressed, the apps should be used under clinical supervision, because without a human interface people were not as motivated to use them. In the JMIR study, 58 percent of participants did not download the app.

    Akili’s technologies are based on a proprietary neuroscience approach developed to target specific neurological systems through sensory and digital mechanics. The company’s technology platform used in this trial is based on cognitive science exclusively licensed from the lab of Dr. Adam Gazzaley at UCSF, and proprietary adaptive algorithms developed at Akili, which are built into action video game interfaces. The technology targets an individual’s core neurological ability to process multiple streams of information.

    Project: EVO is undergoing multiple clinical trials for use in cognitive disorders, including Alzheimer’s disease, traumatic brain injury and pediatric attention deficit hyperactivity disorder (ADHD), and the company is on a path toward potential FDA clearance for the game’s use to treat pediatric ADHD.


  5. How to avoid feeling depressed on Facebook

    December 9, 2016 by Ashley

    From the Lancaster University media release:

    Comparing yourself with others on Facebook is more likely to lead to feelings of depression than making social comparisons offline.

    That’s one of the findings from a review of all the research on the links between social networking and depression by David Baker and Dr Guillermo Perez Algorta from Lancaster University.

    They examined studies from 14 countries with 35,000 participants aged between 15 and 88.

    There are around 1.8 billion people on online social networking sites worldwide, with Facebook alone having more than 1 billion active users.

    Concerns over the effect on mental health led the American Academy of Pediatrics in 2011 to define “Facebook depression” as a “depression that develops when preteens and teens spend a great deal of time on social media sites, such as Facebook, and then begin to exhibit classic symptoms of depression.”

    The Lancaster University review of existing research found that the relationship between online social networking and depression may be very complex and associated with factors like age and gender.

    In cases where there is a significant association with depression, this is because comparing yourself with others can lead to “rumination” or overthinking.

    • Negative comparison with others when using Facebook was found to predict depression via increased rumination
    • Frequent posting on Facebook was found to be associated with depression via rumination

    However, the frequency, quality and type of online social networking are also important.

    Facebook users were more at risk of depression when they:

    • Felt envy triggered by observing others
    • Accepted former partners as Facebook friends
    • Made negative social comparisons
    • Made frequent negative status updates

    Gender and personality also influenced the risk, with women and people with neurotic personalities more likely to become depressed.

    But the researchers stressed that online activity could also help people with depression who use it as a mental health resource and to enhance social support.


  6. Online social media use does not impair our ability to concentrate

    October 19, 2016 by Ashley

    From the Inderscience Publishers media release:

    Using online social media does not lead to long-term problems with our ability to concentrate, according to new research published in the International Journal of Social Media and Interactive Learning Environments.

    We are social animals, so it is really no surprise that billions of us now use online tools to communicate, educate and inform each other. The advent of social media and social networking has nevertheless been phenomenally rapid. “These networks have become an imprint of our everyday life and part of pop culture, revolutionizing the way people communicate and in the way organizations act,” says Deborah Carstens of the Florida Institute of Technology. “With the abundance of technological devices, an increasing number of users of all ages rely on technology and specifically social media.”

    There are, however, worries about the impact such tools have on our psyche and our ability to concentrate, for instance. Now research from Carstens’ team and their colleagues at Barry University, also in Florida, demonstrates that despite the often skittish and transient nature of online social interactions, there is no difference to be seen in the attention span or "offline" sociability of occasional users and frequent users of online social media. These modern communication tools do not, it seems, interfere with our primal instincts, such as long-term attitudes, time appreciation, and concentration, in the way that many critics have suggested in recent years.

    “Social media is not a fad as it continues to play an increasing role in the individuals’ lives. Understanding how to utilize this social media epidemic to enhance learning, relationships and business knowledge is essential as individuals are spending an increasing amount of time on these networks,” the researchers conclude.


  7. Brain waves can be used to detect potentially harmful personal information

    October 5, 2016 by Ashley

    From the Texas Tech University media release:

    Cyber security and authentication have been under attack in recent months as, seemingly every other day, a new report of hackers gaining access to private or sensitive information comes to light. Just recently, Yahoo revealed that data from more than 500 million user accounts had been stolen in a security breach.

    Securing systems has gone beyond simply coming up with a clever password that could prevent nefarious computer experts from hacking into your Facebook account. The more sophisticated the system, or the more critical and private the information it holds, the more advanced the identification system protecting it becomes.

    Fingerprint scans and iris identification are just two types of authentication methods, once thought of as science fiction, that are now in wide use by the most secure systems. But fingerprints can be stolen and iris scans can be replicated. Nothing has proven foolproof against computer hackers.

    “The principal argument for behavioral, biometric authentication is that standard modes of authentication, like a password, authenticate you once before you access the service,” said Abdul Serwadda, a cybersecurity expert and assistant professor in the Department of Computer Science at Texas Tech University.

    “Now, once you’ve accessed the service, there is no other way for the system to still know it is you. The system is blind as to who is using the service. So the area of behavioral authentication looks at other user-identifying patterns that can keep the system aware of the person who is using it. Through such patterns, the system can keep track of some confidence metric about who might be using it and immediately prompt for reentry of the password whenever the confidence metric falls below a certain threshold.”
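
    To make that idea concrete, the monitoring loop Serwadda describes can be sketched in a few lines of Python. This is purely illustrative and not from the study: the scoring function, window size and threshold are hypothetical stand-ins for whatever behavioral model (keystrokes, EEG features and so on) a deployed system would actually use.

    ```python
    # Illustrative continuous-authentication loop (hypothetical, not the study's code):
    # average a rolling confidence score over recent behavioral observations and
    # prompt for the password again whenever it drops below a threshold.
    from collections import deque

    CONFIDENCE_THRESHOLD = 0.6   # assumed cutoff for "probably still the same user"
    WINDOW_SIZE = 10             # number of recent observations to average

    def match_score(observation, user_template):
        """Placeholder similarity in [0, 1] between a new behavioral observation
        (e.g., an EEG feature vector) and the enrolled user's template."""
        return observation.get("similarity", 0.0)

    def monitor_session(observations, user_template, reauthenticate):
        """Track a rolling confidence metric and re-authenticate when it falls."""
        recent = deque(maxlen=WINDOW_SIZE)
        for obs in observations:
            recent.append(match_score(obs, user_template))
            confidence = sum(recent) / len(recent)
            if confidence < CONFIDENCE_THRESHOLD:
                reauthenticate()   # e.g., prompt for the password again
                recent.clear()     # start fresh after a successful re-login
    ```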

    One of those patterns that is growing in popularity within the research community is the use of brain waves obtained from an electroencephalogram, or EEG. Several research groups around the country have recently showcased systems which use EEG to authenticate users with very high accuracy.

    However, those brain waves can tell more about a person than just his or her identity. They could reveal medical, behavioral or emotional aspects of a person that, if brought to light, could be embarrassing or damaging to that person. And with EEG devices becoming much more affordable, accurate and portable, and applications being designed that allow people to more readily read an EEG scan, the likelihood of that happening is dangerously high.

    “The EEG has become a commodity application. For $100 you can buy an EEG device that fits on your head just like a pair of headphones,” Serwadda said. “Now there are apps on the market, brain-sensing apps where you can buy the gadget, download the app on your phone and begin to interact with the app using your brain signals. That led us to think: now we have these brain signals that were traditionally accessed only by doctors being handled by regular people. Now anyone who can write an app can get access to users’ brain signals and try to manipulate them to discover what is going on.”

    That’s where Serwadda and graduate student Richard Matovu focused their attention: attempting to see if certain traits could be gleaned from a person’s brain waves. They recently presented their findings at the Institute of Electrical and Electronics Engineers (IEEE) International Conference on Biometrics.

    Brain waves and cybersecurity

    Serwadda said the technology is still evolving in terms of being able to use a person’s brain waves for authentication purposes. But it is a heavily researched field that has drawn the attention of several federal organizations. The National Science Foundation (NSF) funds a three-year project in which Serwadda and others from Syracuse University and the University of Alabama at Birmingham are exploring how several behavioral modalities, including EEG brain patterns, could be leveraged to augment traditional user authentication mechanisms.

    “There are no installations yet, but a lot of research is going on to see if EEG patterns could be incorporated into standard behavioral authentication procedures,” Serwadda said.

    In a system that uses EEG as the modality for user authentication, the design variables are typically optimized to maximize authentication accuracy. A selection of such variables (illustrated in the sketch after this list) would include:

    • The features used to build user templates.

    • The signal frequency ranges from which features are extracted.

    • The regions of the brain on which the electrodes are placed, among other variables.
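
    As a rough illustration of the kind of template these variables define (an assumed sketch, not the system used in the study), the snippet below builds a feature vector of per-channel band powers. The frequency bands, channel set and feature choice are exactly the sort of design variables listed above, and changing them changes both authentication accuracy and what else the template might reveal.

    ```python
    # Assumed, minimal sketch of an EEG "template": band-power features per channel.
    # The bands, channels and features are illustrative design choices, not the
    # authors' actual implementation.
    import numpy as np
    from scipy.signal import welch

    BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # Hz (assumed)

    def band_powers(channel, fs):
        """Mean spectral power of one EEG channel in each frequency band."""
        freqs, psd = welch(channel, fs=fs, nperseg=int(fs) * 2)
        return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

    def build_template(eeg, fs):
        """eeg: array of shape (n_channels, n_samples); returns one feature vector."""
        return np.concatenate([band_powers(ch, fs) for ch in eeg])
    ```

    An authentication check would then compare a fresh template against the enrolled one, for example with a distance threshold. The privacy concern arises because the same features can also feed a classifier trained to infer unrelated traits, such as the alcoholism marker examined below.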

    Under this assumption of a finely tuned authentication system, Serwadda and his colleagues tackled the following questions:

    • If a malicious entity were to somehow access templates from this authentication-optimized system, would he or she be able to exploit these templates to infer non-authentication-centric information about the users with high accuracy?

    • In the event that such inferences are possible, which attributes of template design could reduce or increase the threat?

    It turns out that EEG authentication systems do indeed give away non-authentication-centric information. Using an authentication system from UC-Berkeley and a variant of another from a team at Binghamton University and the University at Buffalo, Serwadda and Matovu tested their hypothesis, using alcoholism as the sensitive private information which an adversary might want to infer from EEG authentication templates.

    In a study involving 25 formally diagnosed alcoholics and 25 non-alcoholic subjects, the lowest error rate obtained when identifying alcoholics was 25 percent, meaning a classification accuracy of approximately 75 percent.

    When they tweaked the system and changed several variables, they found that the ability to detect alcoholic behavior could be tremendously reduced at the cost of slightly reducing the performance of the EEG authentication system.

    Motivation for discovery

    Serwadda’s motivation for proving brain waves could be used to reveal potentially harmful personal information wasn’t to improve the methods for obtaining that information. It’s to prevent it.

    To illustrate, he gives an analogy using fingerprint identification at an airport. Fingerprint scans read ridges and valleys on the finger to determine a person’s unique identity, and that’s it.

    In a hypothetical scenario where such systems could only function accurately if the user’s finger was pricked and some blood drawn from it, this would be problematic, because the blood drawn by the prick could be used to infer things other than the user’s identity, such as whether the person suffers from a disease like diabetes.

    Given the amount of extra information that EEG authentication systems are able to glean about the user, current EEG systems could be likened to the hypothetical fingerprint reader that pricks the user’s finger. Serwadda wants to drive research that develops EEG authentication systems that serve their intended purpose while revealing minimal information about traits other than the user’s identity.

    Currently, in the vast majority of studies on the EEG authentication problem, researchers primarily seek to outdo each other in terms of the system error rates. They work with the central objective of designing a system having error rates which are much lower than the state-of-the-art. Whenever a research group develops or publishes an EEG authentication system that attains the lowest error rates, such a system is immediately installed as the reference point.

    A critical question that has not seen much attention up to this point is how certain design attributes of these systems, in other words the kinds of features used to formulate the user template, might relate to their potential to leak sensitive personal information. If, for example, a system with the lowest authentication error rates comes with the added baggage of leaking a significantly higher amount of private information, then such a system might, in practice, not be as useful as its low error rates suggest. Users would only accept, and get the full utility of, such a system if the potential privacy breaches associated with it are well understood and appropriate mitigations are undertaken.

    But, Serwadda said, while the EEG is still being studied, the next wave of invention is already beginning.

    “In light of the privacy challenges seen with the EEG, it is noteworthy that the next wave of technology after the EEG is already being developed,” Serwadda said. “One of those technologies is functional near-infrared spectroscopy (fNIRS), which has a much higher signal-to-noise ratio than an EEG. It gives a more accurate picture of brain activity given its ability to focus on a particular region of the brain.”

    The good news, for now, is that fNIRS technology is still quite expensive; however, there is every likelihood that prices will drop over time, potentially opening the way to civilian applications of this technology. Thanks to the efforts of researchers like Serwadda, minimizing the leakage of sensitive personal information through these technologies is beginning to gain attention in the research community.

    “The basic idea behind this research is to motivate a direction of research which selects design parameters in such a way that we not only care about recognizing users very accurately but also care about minimizing the amount of sensitive personal information it can read,” Serwadda said.


  8. New neuroendovascular technique shows promise in stroke patients with large-vessel clots

    June 30, 2016 by Ashley

    From the Medical University of South Carolina media release:

    In an article published online April 16, 2016 by the Journal of Neurointerventional Surgery, investigators at the Medical University of South Carolina (MUSC) report promising 90-day outcomes for stroke patients with large-vessel clots who underwent thrombectomy or clot removal using the direct-aspiration, first pass technique (ADAPT). Approximately 58% of stroke patients with a large-vessel clot removed using the technique achieved a good outcome at 90 days, defined as a Modified Rankin Score (mRS) of 0 to 2.

    ADAPT aims to remove the clot in its entirety with a large-diameter aspiration catheter in a single pass. In contrast, stent retrievers, currently considered standard of care, frequently fragment the clot for removal and can require several passes.

    ADAPT was developed by MUSC Health neuroendovascular surgeons M. Imran Chaudry, M.D., Alejandro M. Spiotta, M.D., Aquilla S. Turk, D.O., and Raymond D. Turner, M.D., all co-authors on the April 2016 Journal of Neurointerventional Surgery article. MUSC Health neurosurgery resident Jan Vargas, M.D., is first author on the article.

    “The goal in ADAPT is to take the largest-bore catheter available up to the blood clot and put suction where it’s blocked and pull it out of the head to reestablish flow in that blood vessel,” said Turk. If the first-pass attempt is unsuccessful, stent retrievers can still be used to remove the clot.

    In the article, the investigators report the results of a retrospective study of 191 consecutive patients with acute ischemic stroke who underwent ADAPT at MUSC Health. In 94.2% of patients, blood vessels were successfully opened: by direct aspiration alone in 145 cases and by the additional use of stent retrievers in another 43 cases. Good outcomes at 90 days (mRS, 0-2) were achieved in 57.7% of patients who were successfully revascularized with aspiration alone and in 43.2% of those who also required a stent retriever. The average time required to reopen the blocked blood vessels was 37.3 minutes: 29.6 minutes for direct aspiration alone and 61.4 minutes for cases that also required stent retrievers. Patients presented for thrombectomy on average 7.8 hours after stroke onset.

    These results confirm the promise of ADAPT, which was first described by the MUSC Health team in a seminal 2014 article in the Journal of Neurointerventional Surgery. Since the publication of that article, a number of single-center series studies have reported impressive recanalization times (the time it takes to open the blood vessel) and good neurological outcomes with ADAPT using a large-bore catheter, suggesting that it could offer an alternative approach to stent retrievers for mechanical thrombectomy.

    Stent retrievers have been considered standard of care for stroke patients since the publication in the October 2015 issue of Stroke of a scientific statement on thrombectomy by the American Heart Association. That statement recommended rapid clot removal in addition to tissue plasminogen activator (tPA), a clot-busting drug that can minimize stroke complications if administered in a tight time window. The recommendation was based on the promising findings of five large clinical trials comparing treatment with tPA alone versus treatment with tPA plus thrombectomy using stent retrievers in large-vessel clots: MR CLEAN, EXTEND-IA, ESCAPE, SWIFT PRIME, and REVASCAT.

    A definitive answer as to whether ADAPT could likewise become standard of care for stroke patients with large-vessel clots will require clinical trials comparing the efficacy of the direct aspiration technique versus stent retrievers in this population of stroke patients.

    The MUSC Health neuroendovascular surgery team is currently running the COMPASS trial (COMParison of ASpiration vs Stent retriever as first-line approach; Clinicaltrials.gov identifier NCT02466893) in conjunction with colleagues Dr. J. Mocco of Mount Sinai and Dr. Adnan Siddiqui of the University at Buffalo. The trial is randomizing patients to either ADAPT or a stent retriever as the initial thrombectomy technique. The trial, scheduled to enroll 270 patients, has enrolled 90 patients in the past year at ten sites in the United States.


  9. Why hierarchy exists in biological networks: New insight will aid development of artificial intelligence

    June 17, 2016 by Ashley

    From the PLOS media release:

    New research explains why so many biological networks, including the human brain (a network of neurons), exhibit a hierarchical structure, and will improve attempts to create artificial intelligence. The study, published in PLOS Computational Biology, demonstrates this by showing that the evolution of hierarchy — a simple system of ranking — in biological networks may arise because of the costs associated with network connections.

    Like large businesses, many biological networks, such as gene, protein, neural, and metabolic networks, are hierarchically organised. This means they have separate units that can each be repeatedly divided into smaller and smaller subunits. For example, the human brain has separate areas for motor control and tactile processing, and each of these areas consists of sub-regions that govern different parts of the body.

    But why do so many biological networks evolve to be hierarchical? The results of this paper suggest that hierarchy evolves not because it produces more efficient networks, but instead because hierarchically wired networks have fewer connections. This is because connections in biological networks are expensive — they have to be built, housed, maintained, etc. — and there is therefore an evolutionary pressure to reduce the number of connections.

    In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings may also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics.

    Researchers from the University of Wyoming and INRIA (France) led by Henok S. Mengistu simulated the evolution of computational brain models, known as artificial neural networks, both with and without a cost for network connections. They found that hierarchical structures emerge much more frequently when a cost for connections is present.
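
    A toy sketch of that comparison, assumed purely for illustration and not the authors’ actual code, is to score each candidate network on the task and, in the condition with a connection cost, subtract a penalty for every connection it keeps:

    ```python
    # Toy illustration (assumed, not the authors' implementation): a fitness score
    # that rewards task performance and optionally penalizes every connection.
    # With a nonzero connection cost, sparser wirings score higher, which is the
    # selective pressure the study links to the emergence of hierarchy.
    import numpy as np

    def task_error(weights, inputs, targets):
        """Mean squared error of a simple linear network on a toy task."""
        predictions = inputs @ weights
        return float(np.mean((predictions - targets) ** 2))

    def fitness(weights, inputs, targets, connection_cost=0.0):
        """Higher is better: negative error minus a per-connection penalty."""
        n_connections = np.count_nonzero(weights)
        return -task_error(weights, inputs, targets) - connection_cost * n_connections
    ```

    Running an evolutionary search against this score with connection_cost set to zero versus a positive value mirrors the "without and with a cost for network connections" comparison described above.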

    Author Jeff Clune says, “For over a decade we have been on a quest to understand why networks evolve to have the properties of modularity, hierarchy, and regularity. With these results, we have now uncovered evolutionary drivers for each of these key properties.” Mengistu notes: “The findings not only explain why biological networks are hierarchical, they might also give an explanation for why many human-made systems such as the Internet and road systems are also hierarchical.”

    Author Joost Huizinga adds: “The next step is to harness and combine this knowledge to evolve large-scale, structurally organized networks in the hopes of creating better artificial intelligence and increasing our understanding of the evolution of animal intelligence, including our own.”


  10. Study offers explanation for why women leave engineering

    June 16, 2016 by Ashley

    From the Massachusetts Institute of Technology media release:

    Women who go to college intending to become engineers stay in the profession less often than men. Why is this? While multiple reasons have been offered in the past, a new study co-authored by an MIT sociologist develops a novel explanation: The negative group dynamics women tend to experience during team-based work projects make the profession less appealing.

    More specifically, the study finds, women often feel marginalized, especially during internships, other summer work opportunities, or team-based educational activities. In those situations, gender dynamics seem to generate more opportunities for men to work on the most challenging problems, while women tend to be assigned routine tasks or simple managerial duties.

    In such settings, “It turns out gender makes a big difference,” says Susan Silbey, the Leon and Anne Goldberg Professor of Humanities, Sociology, and Anthropology at MIT, and co-author of a newly-published paper detailing the study.

    As a result of their experiences at these moments, women who have developed high expectations for their profession — expecting to make a positive social impact as engineers — can become disillusioned with their career prospects.

    “It’s a cultural phenomenon,” adds Silbey, regarding the way this group-dynamics problem crops up at a variety of key points during students’ training.

    The paper, titled “Persistence is Cultural: Professional Socialization and the Reproduction of Sex Segregation,” appears in the latest print issue of the journal Work and Occupations. The co-authors are Silbey, who is the corresponding author; Carroll Seron, a professor at the University of California at Irvine; Erin Cech, an assistant professor at the University of Michigan; and Brian Rubineau, an associate professor at McGill University.

    “Menial tasks” instead of “all the fun”

    Overall, about 20 percent of undergraduate engineering degrees are awarded to women, but only 13 percent of the engineering workforce is female. Numerous explanations have been offered for this discrepancy, including a lack of mentorship for women in the field; a variety of factors that produce less confidence for female engineers; and the demands for women of maintaining a balance between work and family life.

    The current study does not necessarily preclude some of those other explanations, but it adds an additional element to the larger discussion.

    To conduct the study, the researchers asked more than 40 undergraduate engineering students to keep twice-monthly diaries. The students attended four institutions in Massachusetts: MIT, the Franklin W. Olin College of Engineering, Smith College, and the University of Massachusetts at Amherst. That generated more than 3,000 individual diary entries that the scholars systematically examined.

    What emerges is a picture in which female engineering students are negatively affected at particular moments of their educational terms — especially when they engage in team-based activities outside the classroom, where, in a less structured environment, older gender roles re-emerge.

    This crops up frequently in the diary entries. To take an example, one student named Kimberly described an episode in a design class in which “two girls in a group had been working on the robot we were building in that class for hours, and the guys in their group came in and within minutes had sentenced them to doing menial tasks while the guys went and had all the fun in the machine shop. We heard the girls complaining about it. … ”

    Or, as the paper puts it, “Informal interactions with peers and everyday sexism in teams and internships are particularly salient building blocks of [gender] segregation.” The researchers add: “For many women, their first encounter with collaboration is to be treated in gender stereotypical ways.” And by contrast, as the researchers note in the paper, “Almost without exception, we find that the men interpret the experience of internships and summer jobs as a positive experience.”

    Such experiences lead to a problem involving what the researchers call “anticipatory socialization.” The women in the study, Silbey and her colleagues observed, are more likely than men to say they are entering the field of engineering with the explicit idea that it will be a “socially responsible” profession that will “make a difference in people’s lives.” But group dynamics seem to affect this specific expectation in two ways: by leading women to question whether other professions could be a better vehicle for affecting positive social change, and by leading them to question if their field has a “commitment to a socially conscious agenda that … was a key motivator for them in the first place.”

    Changes beyond the classroom

    As Silbey observes, the findings suggest that engineering’s gender gap is not precisely rooted in the engineering curriculum or the classroom, which have often been the focus of past scrutiny in this area.

    “We think engineering education is quite successful by its own standards,” Silbey says. Moreover, she adds, “The teaching environment is for the most part very successful.”

    That means some new kinds of remedies could be explored, which might have a positive impact on women’s experiences as engineers in training. For instance, as Silbey has previously recommended, institutions could develop “directed internship seminars,” in which student internship experiences could be dissected to help many people grasp and learn from the problems women face.

    In this vein, Silbey adds, whatever other components education may have, it is useful to remember that “education is a process of socialization.”