First year, through the eyes of a baby bird

From my journal, April 2014, 6 months into my first year of grad school.

This is what I feel like. Vulnerable, awkward, feeling tentative about leaving the comfort of my nest, beak wide open hoping to consume as much as possible.

I still feel like that sometimes. I think baby birds usually learn how to fly pretty quickly, but becoming a researcher is not so quick. I spent a while early in my grad career flapping my wings frantically – I was doing the activities that I saw everyone else doing, but I felt like I still wasn’t getting it in the way that they were. They’d flap and fly. I’d flap and stay grounded.

But gradually, my flapping started to lift me off the ground. Initially, I’d be airborne only briefly. Over time, I spent longer in the air. I’m still on the ground flapping some days, but I now spend much more time actually flying. I probably couldn’t yet withstand a full-blown winter migration, but I can get from place to place. The real miraculous thing is that some days I don’t even have to flap my wings so hard to fly. I flap a little, and with way less effort than I used to expend, I can soar.

But we all start as baby birds.

CogSci 2016 Day 3 Personal Highlights

  • There is more to gesture than meets the eye: Visual attention to gesture’s referents cannot account for its facilitative effects during math instruction (Miriam Novack, Elizabeth Wakefield, Eliza Congdon, Steven Franconeri, Susan Goldin-Meadow): Earlier work has shown that gestures can help kids learn math concepts, and this work explores one possible explanation for why: that gestures attract and focus visual attention. To test this, kids watched a video in which someone explained how to do a mathematical equivalence problem (a problem like 5 + 6 + 3 = __ + 3). For some kids, the explainer gestured by pointing to relevant parts of the problem as she explained; for others, she just explained (using the exact same speech as for the gesture-receiving kids). The researchers used eye tracking while the kids watched the videos and found that those who watched the video with gestures looked more at the problem (and less at the speaker) than those who watched the video sans gesture. More importantly, those who watched the gesture video did better on a posttest than those who didn’t. The main caveat was that the kids’ eye patterns did not predict their posttest performance; in other words, looking more at the problem and less at the speaker may play some role in learning, but visual attention alone cannot explain the benefit; other mechanisms must also underlie gesture’s effect on learning.

    But in case you started to think that gestures are a magic learning bullet:

  • Effects of Gesture on Analogical Problem Solving: When the Hands Lead You Astray (Autumn Hostetter, Mareike Wieth, Keith Moreno, Jeffrey Washington): There’s a pretty famous problem in cognitive science for studying people’s analogical abilities, referred to as Duncker’s radiation problem: A person has a tumor and needs radiation. A strong beam will kill the healthy tissue it passes through. A weak beam won’t be strong enough to kill the tumor. What to do? The reason this problem is used as a test of analogical reasoning is that participants are first presented with a different story – an army wants to attack a fortress (which sits at the intersection of a bunch of roads), but there are mines placed on the roads leading up to it, so the whole army can’t march down a single road. Yet if they only send a small portion of the army down a road, the attack will be too weak. They solve this by splitting up and all converging on the fortress at the same time. Now can you solve the radiation problem? Even though the solution is analogous (target the tumor with weak rays coming from different directions), people (college undergrads) usually still struggle. It’s a testament to how hard analogical reasoning is.
    But that’s just background leading to the current study, where the researchers asked: if people gesture while retelling the fortress story, will they have more success on the radiation problem? To test this, they had one group of participants whom they explicitly told to gesture, one group whom they told not to gesture, and a final group whom they gave no instructions about gesture at all. The gesturers in fact did worse than the non-gesturers. After analyzing what people actually talked about in the different conditions, the researchers discovered that when people gestured, they tended to talk more about concrete details of the situation – for example, the roads and the fortress – and this focus on the perceptual features of the fortress story actually inhibited their ability to apply its analogical relations to the radiation case.
    Taking this study into consideration with the previous one, it’s clear that gesture is not all good or all bad; there are lots of nuances of a situation that need to be taken into account and lots of open questions ripe for research.
  • tDCS to premotor cortex changes action verb understanding: Complementary effects of inhibitory and excitatory stimulation (Tom Gijssels, Daniel Casasanto): We know the premotor cortex is involved when we execute actions, and there’s quite a bit of debate about the extent to which it’s involved in using language about actions. The researchers used transcranial direct current stimulation – a method that applies a small electrical current to a targeted area of the brain – over the premotor cortex (PMC) to test for its involvement in processing action verbs (specifically, seeing a word or a non-word and indicating whether it’s a real English word). People who received PMC inhibitory stimulation (which decreases the likelihood of PMC neurons firing) were more accurate in their responses to action verbs, while those who received PMC excitatory stimulation (which increases the likelihood of PMC neurons firing) were less accurate. This at first seems paradoxical – inhibiting the motor area helps performance and exciting it hurts performance – but there are some potential explanations for this finding. One that seems intriguing to me is that since the PMC is also responsible for motor movements, inhibiting the area helped people suppress the inappropriate motor action (for example, actually grabbing if they read the verb grab), and as a consequence facilitated their performance on the word task; excitatory stimulation over the same area had the opposite effect. Again, this study makes it clear that something cool is going on in the parts of our brain responsible for motor actions when we encounter language about actions… but as always, more research is needed.


  • Tacos for dinner. After three days of long, stimulating conference days, the veggie tacos at El Vez were so good that they made the conference highlight list.

For every cool project I heard about, there were undoubtedly many more that I didn’t get to see. Luckily, the proceedings are published online, giving us the printed version of all the work presented at the conference. Already looking forward to next year’s event in London!

How We Learn: A Guest Review

I mentioned in a previous post that I have some stellar undergraduate Research Assistants. I neglected to mention that this summer I also have some stellar high school assistants. Juliette Hill is a rising senior whose main goal for her time in the lab was to learn what it’s like to be a cognitive science grad student. She worked on some open-ended and exploratory questions as well as some very detailed data collection. She also read and thought about cognitive science ideas beyond the specific ones we’re addressing in the lab. Here are her thoughts on How We Learn, a book by Benedict Carey:


Like most of us, Benedict Carey grew up with the belief that in order to learn best, one had to find a quiet, designated study space. Practice was the only path to perfection. The Internet and all other electronic devices should be turned off lest they disturb your concentration. Highlighting and rereading notes, if done frequently, will improve your test scores. Forgetting is the enemy of learning.

Yet most of these adages are far from the truth.

Distractions can actually aid learning in ways that remaining focused cannot. Studying in the same spot repeatedly may weaken your grasp on the subject. After an intense study session of revising notes, we feel confident we know our subject inside out, but we still barely manage a B on the test. Why?

Even with modern science, we have barely scratched the surface of the cognitive processes behind learning. In his book How We Learn, Benedict Carey walks the reader through a multitude of discoveries that may revolutionize the way we perceive the learning process. Here are some of the findings he explains:

Distraction can aid learning. While this is not an absolute (checking Facebook during a lecture does not help you learn what the teacher is presenting), it holds real potential, especially in today’s society. When stuck on a difficult math problem or a similar rut, taking a study break can definitely boost your ability to solve the problem the second time around. Does this mean taking an hour-long nap will have similar effects? Absolutely! It can possibly help even more than a simple distraction.

Sleep is your friend. Most people know that sleep can help consolidate learned facts and motor skills, but few people know when in the night those benefits occur. Each night consists of several cycles, alternating between deeper and more wakeful stages of sleep. You sleep the deepest during roughly the first two to three hours of the night, and this deep sleep has been found to reinforce the learning of rote facts. Yet if you are preparing for a music recital (which involves motor learning), the most beneficial sleep of the night comes slightly later.

Highlighting and rereading notes will not carry you far. In fact, you will feel as if you know the subject matter by heart, but will be disappointed when you see an unexpected score on your test. What happened? You knew the content so well, right? The danger of highlighting and rereading is that it gives you the impression that you know the material, when you are actually only familiar with it. The best way to review content is to maintain a “desirable difficulty” (a term coined by Dr. Robert Bjork) in your studying. This means that testing yourself (as opposed to just reading the content) will help you retain the material much better. So you can dig up those flashcards you never thought you’d use again. This applies to preparing a speech too, in that you will be better prepared if you practice reciting your speech instead of just rereading your notes.

Interleaving helps retain information best. If you are asked to memorize the styles of 12 different artists from different eras, do you think you would do best by studying all the works done by each artist one at a time (a method called “blocking”) or by mixing up the artists? If you are like most, you may choose to study by blocking. However, this has been shown to be significantly less effective than mixing up the artists (interleaving). Ever noticed that when you do your math problems section by section, you understand right away and feel like you’ve mastered the skill, yet come test time, you are confused about which equations to use? This can easily be avoided with interleaving, which would mean, in this case, including problems from previous sections along with the night’s homework.

Your study corner is a trap. Several studies have looked at the effect of location on retention and found that if you study certain information in a particular spot and are tested on it at that same location, you do better than if you study in one place and are tested in another. The same is true for body states (hunger, influence of drugs, mood…) or for listening to music: you do best when these stay consistent. Yet it is often impossible to study and test in the same location. If you only study in one location, the information will unconsciously (though not strongly) be tied to that location, which means that if you are tested in another spot, your recall will not be at its best. The answer is to vary your location when studying. By alternating your study spots, you can avoid this dependence on your surroundings and possibly increase your score on the next test.

These are just a few of the topics Carey explains in his book, and many more findings have emerged since its publication. I highly recommend that you look into this book and share what you learn with others. It’s a shame so few people know about the science of learning, despite the fact that their lives revolve around it.

Back to school inspiration

The beginning of September marks the traditional start of a new school year, even if in reality many schools start earlier or later. A few pieces of back-to-school inspiration:

The first is a blog post, How to learn anything better by tweaking your mindset. The post describes a study in which two groups were taught the exact same information, but one group was told ahead of time that they’d later need to teach the information to someone, and the other group was told they’d be tested on the material. In actuality, no one had to teach the information to someone new, and participants in both groups received the same post-learning test. Those who had been planning to teach the new info, however, did significantly better on the test than those who were planning on being tested. The bottom line is that when we learn something with the intent of teaching it, we actually synthesize the information more and mentally organize it better than when we believe we’re learning for a test.

Anecdotally, I find this true. The classes I’ve TA’ed in the past year have been outside my realm of knowledge, but I knew I’d have to get up in front of a group of students just a few days after hearing the professor’s lecture and help the students synthesize the information presented and answer questions about it. I’d never have a written test on the material, as the students would, but I’d have an oral one when leading discussion. Technically, the stakes were low for me – I wasn’t going to get a bad grade or lose my job as a TA, but learning the information in order to be a competent teacher seemed crucial. As a result, I went into sponge mode right before every lecture, and I believe that I sopped up much more information and made stronger connections among the things being taught than if I had been a student expecting to be tested on it later.

On a related note, Khan Academy reminds us that You can learn anything. Even though we often have to fail before we can succeed, “thankfully, we’re built to learn.”

Exponential Learning

We toss around the phrase “learn something new every day” jokingly, but in reality, we learn much more than one thing per day. Many of these things are implicit, so we don’t realize we’re learning, but each experience we have makes its mark on our cognition. Many other things we learn, though, are explicit – we’re consciously learning in an effort to get better at something. Before we can master a skill or knowledge set, we often have to learn how to learn that thing. What strategies facilitate optimal learning? Which are ineffective? A recent NYT column by David Brooks highlights some overarching differences in the learning processes of different domains.

In some domains, progress is logarithmic. This means that for every small increase in x (input, or effort), there is a disproportionately large increase in y (output, or skill) early on. Over time, the same increases in x will no longer yield the same return, and progress will slow. Running and learning a language are two examples of skills that show logarithmic learning processes.


Other domains have exponential learning processes. Early on, large increases in effort are needed to see even minimal progress. Eventually, though, progress accelerates and might continue to do so without substantial additional effort.
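The contrast between the two curves can be made concrete with a toy sketch (the specific functions and scaling here are my own illustrative choices, not Brooks’s):

```python
import math

def logarithmic_progress(effort):
    """Toy logarithmic curve: big early gains that flatten out over time."""
    return math.log(1 + effort)

def exponential_progress(effort):
    """Toy exponential curve: slow early gains that eventually accelerate."""
    return math.exp(effort / 4) - 1

# Compare the payoff from one extra unit of effort early vs. late.
early_log_gain = logarithmic_progress(2) - logarithmic_progress(1)
late_log_gain = logarithmic_progress(9) - logarithmic_progress(8)

early_exp_gain = exponential_progress(2) - exponential_progress(1)
late_exp_gain = exponential_progress(9) - exponential_progress(8)

print(early_log_gain > late_log_gain)  # diminishing returns: True
print(late_exp_gain > early_exp_gain)  # accelerating returns: True
```

In the logarithmic domain, the same unit of effort buys less and less skill as you go; in the exponential domain, the early units of effort look wasted until the curve takes off.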

Mastering an academic discipline is an exponential domain. You have to learn the basics over years of graduate school before you internalize the structures of the field and can begin to play creatively with the concepts.

My advisor has also told me a version of this story. She’s said that working hard in grad school (specifically I think she phrased it as “tipping the work-life balance in favor of work”) is an investment in my career. Just as monetary investments become exponentially more valuable over time, intense work early in my career will be exponentially more valuable in the long run than trying to compensate by working extra later on.


Even in my first year of grad school, I developed a clear sense that learning how the field works and what good questions to ask takes time. When I wrote my progress report for my first year, I concluded that most of what I learned this year has been implicit. I can’t point to much technical knowledge that I’ve acquired, but I can say that I’ve gained a much better idea of what cognitive science is about as a field. I’ve gained this by talking (and especially by listening) to others’ ideas, by attending talks, and by reading as much as I could. This implicit knowledge doesn’t necessarily advance my “PhD Progress Meter” (a meter that exists only in my mind), but acquiring at least some of it is necessary before I’ll see any real progress on that meter. Once the PhD meter is complete, I will merely have built the foundation for my career, and will probably still have much learning to do before I reach the steepest and most gratifying part of the learning curve.

Brooks points out that many people quit the exponential domains early on. He treats being “bullheaded” as a requirement for anyone who wants to stick with one of these domains, since you must be able to continually put in work while receiving no glory. I think that understanding where you are on the curve at any given time is crucial for sticking with one of these fields: it lets you recognize that the return on effort will eventually accelerate, and that the many hours (tears, complaints, whatever) that went into mastering the domain early on were not in vain. Where I stand right now, progress is pretty flat… so I must be doing something right.

Review of Roger Schank’s “Teaching Minds”


Along the lines of yesterday’s post evaluating the efficacy of higher education, here’s my review of Schank’s book, Teaching Minds:

In Teaching Minds: How Cognitive Science can Save our Schools, Roger Schank critically evaluates current methods of education in light of what is known about how we learn. The core of his argument is that in order to be successful in the world post school, students must master cognitive abilities, or processes, instead of subjects and the fact-based knowledge that subjects often entail. Specifically, Schank identifies twelve processes which he claims underlie learning, all of which fall under the categories of conceptual, analytic, or social processes. They include abilities such as experimentation, evaluation, planning, causation, teamwork, and negotiation. Schank’s view is that mastery of the twelve cognitive processes is crucial for success in life after school.

Because subjects, such as math and history, are the core of our current education system, Schank argues that we are doing a disservice to 98% of our students. At the university level, he claims that all institutions attempt to emulate Yale, whose curriculum is ideal for future scholars. However, most students receiving a college education aren’t going to be scholars, so they shouldn’t attend colleges that only teach skills useful for a career in academia. And because colleges are focused on training scholars, and high schools are focused on getting their students ready for college, high schools too end up teaching students useless knowledge and subjects. Again, what is necessary is a shift in focus from subjects to cognitive processes.

Another tenet that Schank relies on is that students need to want to learn what they’re learning; they need motivation. Young children have great motivation to learn to walk and talk, and not surprisingly, he argues, they master these skills relatively quickly and are generally successful. By learning through real-life projects, students will be more engaged and will consequently gain more from their studies. In order to provide the real-scenarios from which students would optimally learn, Schank advocates for a shift to more online learning. Online curricula would allow dissemination to a greater number of students and would also allow students to make more choices in what they learn, thus ensuring they are motivated to learn what they are studying. The book concludes with examples of lessons aimed at imparting the cognitive processes that would be effective online.

Schank’s argument that our education system over-emphasizes subject knowledge and under-emphasizes the cognitive skills underlying all professions is compelling. Instead of being able to recite the preamble to the Declaration of Independence, shouldn’t students be able to use logic to diagnose problems in the real world, understand their causes, and make hypotheses about potential solutions? However, the program outlined in Teaching Minds is radically different from the current education system, and would therefore be met with resistance if it were proposed as an alternative.

Schank’s twelve cognitive processes are one source of skepticism: why did he choose those particular skills and not others? Why are there no physical skills, such as bodily awareness or deep breathing, included in his list? If the processes on which his proposed program is based seem arbitrary, how can we avoid being skeptical of his entire program?

It is also surprising to read an education advocate who would like to overhaul the current system in favor of online curricula. Schank does not address the educational costs of online learning. For example, schools are dynamic environments that foster spontaneous learning, an environment that would disappear if students learned entirely from pre-planned online curricula. In a similar vein, the time students spend at school outside of the classroom, such as in the cafeteria or even in the halls between classes, is also valuable. Students encounter peer pressure and bullying, and have the opportunity to engage in relationships that would be fundamentally altered if they communicated with each other solely through their computers. Might this type of learning hinder students’ social skills, which are already under strain in today’s video game-filled society?

A final concern of mine that Schank does not address is the danger of allowing students to learn only what they choose. Although Schank does not believe there is a compelling reason that students must be “well-rounded” in their knowledge, it seems intuitive to me that exposure to many different fields and ideas is beneficial. Further, it seems likely that students, in choosing only subjects that initially interest them, may miss out on interests they didn’t realize they had, interests that could have become evident had the students been exposed to them.

Overall, Teaching Minds is a thought-provoking read, especially for those of us who feel the current state of education leaves something to be desired. While his main point is one that few people would argue with, his radical proposal for acting on it is likely to keep it from ever being realized.