Smart People Go to College, and Other Twists in Measuring the Value of a Degree

07/02/14 | by Training Games | Categories: Play

By Beckie Supiano - The Chronicle of Higher Education - June 19th, 2014

It is well established that, on average, people with college degrees earn quite a bit more over the course of their careers than do those without. That earnings premium is one of higher education’s major selling points. A slew of studies—especially recently—have sought to quantify the return on investment, examining annual or lifetime earnings by attainment level or subject studied.

But people who go to college and people who don't aren't otherwise identical. And even those who do go self-select into different majors.

In a new paper on the college payoff, Douglas Webber, an assistant professor of economics at Temple University, tries to take all of that into account. Mr. Webber spoke with The Chronicle about how prospective students and policy makers should think about the value of a degree. What follows is an edited transcript of that conversation.

Q. You look at the lifetime-earnings premium a little bit differently than some others have. For one, you try to eliminate selection bias. Why is that important?

A. There are two main reasons that people who go to college earn more than everyone else. One is that they are hopefully learning something in college that is going to help them in their future careers. Another is that smarter people tend to go to college, and they were going to be more productive regardless.

So when you’re trying to measure the college premium, you have to look a little deeper than just saying, How much do people who go to college make versus how much do people who don’t go to college make? It’s not an apples-to-apples comparison.

Let’s say college graduates make 40 percent more than high-school graduates. If we were to randomly pick a high-school graduate to receive a college degree, we wouldn’t expect that person to actually get 40 percent more in earnings.

There’s also selection into majors. On average, smarter people tend to major in certain fields over others. Doctors tend to be smarter than the rest of us, and they also tend to earn more money. They probably majored in some science field as undergrads. So is the reason they’re earning more money because they’re smarter, is it because of the training they got, or is it some combination?

Q. So how did you account for selection bias?

A. I use data from the National Longitudinal Survey of Youth, which has detailed information on individuals—everything from their standardized-test scores to noncognitive skills (like, say, self-esteem or locus of control, which is how much you think your actions have an impact on your life).

These things wouldn’t be captured in a standard IQ test, but they’re probably very important for determining whether you go to college—and also your future wages. If you’re kind of fatalistic, and you don’t think what you do is going to make any difference, then you’re probably not going to invest heavily in your education.

The NLSY allows me to control for a very wide range of characteristics, and I can follow people over time.

Q. Did any of the findings surprise you?

A. I was a little surprised at how, even after controlling for as wide a range of selection biases as possible, there were still such big differences in lifetime earnings across majors. I definitely expected some differences, but I didn’t expect them to be as big as they were.

Q. What implications do you see here for prospective students?

A. They should absolutely take into account the big differences in lifetime earnings. That said, it should be only part of the puzzle. They shouldn’t make all of their decisions based solely on which major has the highest potential earnings. They have to take into account things like what they enjoy, what they’re good at. Those all matter a lot in terms of lifetime satisfaction. Money isn’t everything. But money is important.

Q. What implications do you see for policy makers?

A. Universities and policy makers should make it known to 18-year-olds—who may not know how to find these data, who may not even be thinking about their market prospects later on—that decisions you’re making when you’re 18 can have a big impact.

A. I am absolutely not saying that universities should cut (I really hate to pick out any major, but let's say art history). But there should be at least some attention paid to graduates in these fields. Are they able to make a good living when they graduate?

Q. On the broader conversation going on about whether college is worth it, do you think the traditional ways economists tackle that question provide good answers?

A. There’s a big piece of the puzzle that many studies and many articles in the popular press miss. Not all colleges are created equal. There’s a big difference if you’re talking about going to Harvard or to a random college no one’s ever heard of. And people also miss that less than 60 percent of 18-year-olds starting college full time are actually going to earn a degree within six years.

Now everything I’m talking about, I’m using average returns. When I said that higher-ability people tend to go into certain majors, I’m saying that on average. So there are many, many absolutely brilliant people who major in art history, and there are many not-so-brilliant people who major in engineering.

A lot of times people put too much weight on outliers. They see someone who is really successful, and they think that’s a good path to take. But if you are an average person, then you should be looking at the average return.

Mick Jagger—and the world—would be much worse off if he had stayed at the London School of Economics and gotten an econ degree instead of dropping out to hang out with Keith Richards. But you know, he’s an extreme outlier.

Q. So we shouldn’t all drop out of school and start a band.

A. Exactly.

A Learning Secret: Don’t Take Notes with a Laptop

06/20/14 | by Training Games | Categories: Play

Students who used longhand remembered more and had a deeper understanding of the material

Jun 3, 2014, Scientific American/Mind Matters | By Cindi May

“More is better.” From the number of gigs in a cellular data plan to the horsepower in a pickup truck, this mantra is ubiquitous in American culture. When it comes to college students, the belief that more is better may underlie their widely held view that laptops in the classroom enhance their academic performance. Laptops do in fact allow students to do more, like engage in online activities and demonstrations, collaborate more easily on papers and projects, access information from the internet, and take more notes. Indeed, because students can type significantly faster than they can write, those who use laptops in the classroom tend to take more notes than those who write out their notes by hand. Moreover, when students take notes using laptops they tend to take notes verbatim, writing down every last word uttered by their professor.

Obviously it is advantageous to draft more complete notes that precisely capture the course content and allow for a verbatim review of the material at a later date. Only it isn’t. New research by Pam Mueller and Daniel Oppenheimer demonstrates that students who write out their notes on paper actually learn more. Across three experiments, Mueller and Oppenheimer had students take notes in a classroom setting and then tested students on their memory for factual detail, their conceptual understanding of the material, and their ability to synthesize and generalize the information. Half of the students were instructed to take notes with a laptop, and the other half were instructed to write the notes out by hand. As in other studies, students who used laptops took more notes. In each study, however, those who wrote out their notes by hand had a stronger conceptual understanding and were more successful in applying and integrating the material than those who took notes with their laptops.

What drives this paradoxical finding? Mueller and Oppenheimer postulate that taking notes by hand requires different types of cognitive processing than taking notes on a laptop, and these different processes have consequences for learning. Writing by hand is slower and more cumbersome than typing, and students cannot possibly write down every word in a lecture. Instead, they listen, digest, and summarize so that they can succinctly capture the essence of the information. Thus, taking notes by hand forces the brain to engage in some heavy “mental lifting,” and these efforts foster comprehension and retention. By contrast, when typing students can easily produce a written record of the lecture without processing its meaning, as faster typing speeds allow students to transcribe a lecture word for word without devoting much thought to the content.


Cindi May is a Professor of Psychology at the College of Charleston. She explores mechanisms for optimizing cognitive function in college students, older adults, and individuals with intellectual disabilities. She is also the project director for a TPSID grant from the Department of Education, which promotes the inclusion of students with intellectual disabilities in postsecondary education.

Learning games take a step up

06/20/14 | by Training Games | Categories: Play


By Peggy Walsh-Sarnecki - Detroit Free Press - January 2, 2007

Kids trying to sell Mom and Dad on video games have a new angle -- some of those hot titles may be educational. Long snubbed by game-savvy kids who prefer action and horror titles like Gears of War, educational computer games are making a comeback. The problem in recent years has been finding educational games that can match the graphic quality and speed of other video games. But that's changing.

"They're using techniques that mainstream commercial games are using in order to catch kids' interest and hold their interest, but coupling that with educational theory and educational content," said Ethan Watrall, an assistant professor of information studies and media at Michigan State University who's a researcher with MSU's Games for Entertainment and Learning Lab.

For older children, experts say games that put abstract concepts into real-world situations, making the subject much easier to learn, can be good choices.

Why Games and Learning

06/03/14 | by Training Games | Categories: Play

The meaning of knowing today has shifted from being able to recall and repeat information to being able to find it, evaluate it and use it compellingly at the right time and in the right context.

Education in the early part of the twentieth century tended to focus on the acquisition of basic skills and content knowledge, like reading, writing, calculation, history or science. Many experts believe that success in the twenty-first century depends on education that treats higher order skills, like the ability to think, solve complex problems or interact critically through language and media.

Games naturally support this form of education. They are designed to create a compelling complex problem space or world, which players come to understand through self-directed exploration. They are scaffolded to deliver just-in-time learning and to use data to help players understand how they are doing, what they need to work on and where to go next. Games create a compelling need to know, a need to ask, examine, assimilate and master certain skills and content areas. Some experts argue that games are, first and foremost, learning systems, and that this accounts for the sense of engagement and entertainment players experience.

There are other attributes of games that facilitate learning. One of these is the state of being known as play. Much of the activity of play consists in failing to reach the goal established by a game’s rules. And yet players rarely experience this failure as an obstacle to trying again and again, as they work toward mastery. There is something in play that gives players permission to take risks considered outlandish or impossible in “real life.” There is something in play that activates the tenacity and persistence required for effective learning.

There are three key moments in game play with important implications for learning. The first is when a would-be player approaches a game and expresses a wish to participate: “Can I try? Can I join in?” The second moment comes when a player asks, “Can I save it?” In other words, “I’m deeply invested in this experience, which has value and meaning, and I’d like to pick up where I left off.” The third moment comes when a player attains a level of mastery and offers advice to a novice: “Want me to show you how?” A corollary to this moment occurs in the community of practice that arises around games, when one player asks another, “How did you do that? Will you teach me?”

We are happy to observe the public discourse around games and learning moving beyond the polemics which have tended to cast digital games―on the one hand―as a scourge on civil society and―on the other hand―as a Holy Grail in the quest to keep kids in school and on track. Games are already widely used by teachers, parents, schools and other institutions with an interest in learning. They function as doorways into content areas, introductions into specific skill sets and/or nodes in larger knowledge networks. In fact, games and learning have enjoyed an association that predates digital technology by thousands of years. That’s why when we discuss the properties of games, we mean to refer to games of all types: board games, physical games, puzzle games, online games, console games, mobile games, etc.

The Institute is most interested in games as complex eco-systems extending beyond the game space to involve networks of people in a variety of roles and rich interactions. Learning represents just one activity within this larger, highly engaging system.

From the Institute of Play

Does Playing Football Hurt the Brain?

06/03/14 | by Training Games | Categories: Play

Scientific American Mind - Vol. 25, Issue 1 - Dec 19, 2013
By Jacqueline C. Tanaka and Gregg B. Wells
(Partial Article)

The rise of chronic traumatic encephalopathy among some athletes suggests that repeated blows to the head may trigger the brain's unraveling

Repeated traumatic brain injuries increase a person's chances of developing a neurodegenerative disorder called chronic traumatic encephalopathy (CTE).

Players of many sports, but most notably football, appear to be especially vulnerable.

New techniques for observing the disorder's warning signs promise to help coaches and clinicians identify vulnerable players before they fall victim to CTE.

Mike Webster played for 17 seasons in the National Football League (NFL). He was instrumental to the Pittsburgh Steelers' four Super Bowl victories during his career. In 2002 he died of heart failure in the coronary care unit of Allegheny General Hospital at age 50. His medical history included serious neuropsychiatric problems beginning around the time he left the NFL.

After Webster retired at age 38, his family watched him disintegrate into a tormented, wandering soul living out of his Chevrolet S-10 pickup truck. After his death, an astute neuropathologist at the University of Pittsburgh, Bennet Omalu, performed an autopsy on Webster and preserved regions of his brain for later microscopic analysis.

When Omalu examined the specimens, he observed atrophy similar to that seen in Alzheimer's disease patients—but in different areas of the brain. Omalu recognized the abnormalities as chronic traumatic encephalopathy (CTE), a form of brain deterioration previously reported in boxers and associated with the repeated traumatic brain injuries experienced in that sport. The 2005 report that Omalu published on Webster's brain was the first known case of CTE in a professional NFL player.

In the eight years since, the number of reports of the behavioral and cognitive changes experienced by NFL players has exploded. And the athletes themselves have taken notice. When Chicago Bears player Dave Duerson committed suicide in 2011, he shot himself in the chest and left a note requesting that his brain be donated to science. Analyses revealed that he, too, had developed CTE. The year of Duerson's death, approximately 4,500 players sued the NFL for concealing information about the dangers of traumatic brain injuries. Last August the league agreed to an out-of-court settlement for $765 million. Since then, former players have launched new suits against the NFL, the National Collegiate Athletic Association (NCAA) and a helmet manufacturer, Riddell.

The legal furor has been matched by a frenzy of activity on the scientific side. More than 100 NFL players and athletes from other sports have pledged their brains to the study of CTE. So far few of the mysteries of this disorder have been solved, but scientists have nonetheless gleaned compelling insights. Participating in contact sports and sustaining brain trauma raise a person's risk of several forms of cognitive impairment and dementia, not only CTE. Yet the neuropathology of CTE is distinct, and its link to sports raises important questions regarding athletes' safety. Science is progressing rapidly, and its message is clear: to preserve the game and its players, the culture of football must change.
