What Straight-A Students Get Wrong

If you always succeed in school, you’re not setting yourself up for success in life.

By Adam Grant

Dr. Grant is an organizational psychologist and a contributing opinion writer.

A decade ago, at the end of my first semester teaching at Wharton, a student stopped by for office hours. He sat down and burst into tears. My mind started cycling through a list of events that could make a college junior cry: His girlfriend had dumped him; he had been accused of plagiarism. “I just got my first A-minus,” he said, his voice shaking.

Year after year, I watch in dismay as students obsess over getting straight A’s. Some sacrifice their health; a few have even tried to sue their school after falling short. All have joined the cult of perfectionism out of a conviction that top marks are a ticket to elite graduate schools and lucrative job offers.

I was one of them. I started college with the goal of graduating with a 4.0. It would be a reflection of my brainpower and willpower, revealing that I had the right stuff to succeed. But I was wrong.

The evidence is clear: Academic excellence is not a strong predictor of career excellence. Across industries, research shows that the correlation between grades and job performance is modest in the first year after college and trivial within a handful of years. For example, at Google, once employees are two or three years out of college, their grades have no bearing on their performance. (Of course, it must be said that if you got D’s, you probably didn’t end up at Google.)

Academic grades rarely assess qualities like creativity, leadership and teamwork skills, or social, emotional and political intelligence. Yes, straight-A students master cramming information and regurgitating it on exams. But career success is rarely about finding the right solution to a problem — it’s more about finding the right problem to solve.

In a classic 1962 study, a team of psychologists tracked down America’s most creative architects and compared them with their technically skilled but less original peers. One of the factors that distinguished the creative architects was a record of spiky grades. “In college our creative architects earned about a B average,” Donald MacKinnon wrote. “In work and courses which caught their interest they could turn in an A performance, but in courses that failed to strike their imagination, they were quite willing to do no work at all.” They paid attention to their curiosity and prioritized activities that they found intrinsically motivating — which ultimately served them well in their careers.

Getting straight A’s requires conformity. Having an influential career demands originality. In a study of students who graduated at the top of their class, the education researcher Karen Arnold found that although they usually had successful careers, they rarely reached the upper echelons. “Valedictorians aren’t likely to be the future’s visionaries,” Dr. Arnold explained. “They typically settle into the system instead of shaking it up.”

This might explain why Steve Jobs finished high school with a 2.65 G.P.A., J.K. Rowling graduated from the University of Exeter with roughly a C average, and the Rev. Dr. Martin Luther King Jr. got only one A in his four years at Morehouse.

If your goal is to graduate without a blemish on your transcript, you end up taking easier classes and staying within your comfort zone. If you’re willing to tolerate the occasional B, you can learn to program in Python while struggling to decipher “Finnegans Wake.” You gain experience coping with failures and setbacks, which builds resilience.

Straight-A students also miss out socially. More time studying in the library means less time to start lifelong friendships, join new clubs or volunteer. I know from experience. I didn’t meet my 4.0 goal; I graduated with a 3.78. (This is the first time I’ve shared my G.P.A. since applying to graduate school 16 years ago. Really, no one cares.) Looking back, I don’t wish my grades had been higher. If I could do it over again, I’d study less. The hours I wasted memorizing the inner workings of the eye would have been better spent trying out improv comedy and having more midnight conversations about the meaning of life.

So universities: Make it easier for students to take some intellectual risks. Graduate schools can be clear that they don’t care about the difference between a 3.7 and a 3.9. Colleges could just report letter grades without pluses and minuses, so that any G.P.A. above a 3.7 appears on transcripts as an A. It might also help to stop the madness of grade inflation, which creates an academic arms race that encourages too many students to strive for meaningless perfection. And why not let students wait until the end of the semester to declare a class pass-fail, instead of forcing them to decide in the first month?

Employers: Make it clear you value skills over straight A’s. Some recruiters are already on board: In a 2006 study of over 500 job postings, nearly 15 percent of recruiters actively selected against students with high G.P.A.s (perhaps questioning their priorities and life skills), while more than 40 percent put no weight on grades in initial screening.

Straight-A students: Recognize that underachieving in school can prepare you to overachieve in life. So maybe it’s time to apply your grit to a new goal — getting at least one B before you graduate.

Adam Grant, an organizational psychologist at Wharton and contributing opinion writer, is the author of “Originals” and “Give and Take” and is the host of the podcast “WorkLife.”

How to Raise a Creative Child. Step One: Back Off

The New York Times
By Adam Grant, Jan. 30, 2016

THEY learn to read at age 2, play Bach at 4, breeze through calculus at 6, and speak foreign languages fluently by 8. Their classmates shudder with envy; their parents rejoice at winning the lottery. But to paraphrase T. S. Eliot, their careers tend to end not with a bang, but with a whimper.

Consider the nation’s most prestigious award for scientifically gifted high school students, the Westinghouse Science Talent Search, called the Super Bowl of science by one American president. From its inception in 1942 until 1994, the search recognized more than 2,000 precocious teenagers as finalists. But just 1 percent ended up making the National Academy of Sciences, and just eight have won Nobel Prizes. For every Lisa Randall who revolutionizes theoretical physics, there are many dozens who fall far short of their potential.

Child prodigies rarely become adult geniuses who change the world. We assume that they must lack the social and emotional skills to function in society. When you look at the evidence, though, this explanation doesn’t suffice: Less than a quarter of gifted children suffer from social and emotional problems. A vast majority are well adjusted — as winning at a cocktail party as in the spelling bee.

What holds them back is that they don’t learn to be original. They strive to earn the approval of their parents and the admiration of their teachers. But as they perform in Carnegie Hall and become chess champions, something unexpected happens: Practice makes perfect, but it doesn’t make new.

The gifted learn to play magnificent Mozart melodies, but rarely compose their own original scores. They focus their energy on consuming existing scientific knowledge, not producing new insights. They conform to codified rules, rather than inventing their own. Research suggests that the most creative children are the least likely to become the teacher’s pet, and in response, many learn to keep their original ideas to themselves. In the language of the critic William Deresiewicz, they become the excellent sheep.

In adulthood, many prodigies become experts in their fields and leaders in their organizations. Yet “only a fraction of gifted children eventually become revolutionary adult creators,” laments the psychologist Ellen Winner. “Those who do must make a painful transition” to an adult who “ultimately remakes a domain.”

Most prodigies never make that leap. They apply their extraordinary abilities by shining in their jobs without making waves. They become doctors who heal their patients without fighting to fix the broken medical system or lawyers who defend clients on unfair charges but do not try to transform the laws themselves.

So what does it take to raise a creative child? One study compared the families of children who were rated among the most creative 5 percent in their school system with those who were not unusually creative. The parents of ordinary children had an average of six rules, like specific schedules for homework and bedtime. Parents of highly creative children had an average of fewer than one rule.

Creativity may be hard to nurture, but it’s easy to thwart. By limiting rules, parents encouraged their children to think for themselves. They tended to “place emphasis on moral values, rather than on specific rules,” the Harvard psychologist Teresa Amabile reports.

Even then, though, parents didn’t shove their values down their children’s throats. When psychologists compared America’s most creative architects with a group of highly skilled but unoriginal peers, there was something unique about the parents of the creative architects: “Emphasis was placed on the development of one’s own ethical code.”

Yes, parents encouraged their children to pursue excellence and success — but they also encouraged them to find “joy in work.” Their children had freedom to sort out their own values and discover their own interests. And that set them up to flourish as creative adults.

When the psychologist Benjamin Bloom led a study of the early roots of world-class musicians, artists, athletes and scientists, he learned that their parents didn’t dream of raising superstar kids. They weren’t drill sergeants or slave drivers. They responded to the intrinsic motivation of their children. When their children showed interest and enthusiasm in a skill, the parents supported them.

Top concert pianists didn’t have elite teachers from the time they could walk; their first lessons came from instructors who happened to live nearby and made learning fun. Mozart showed interest in music before taking lessons, not the other way around. Mary Lou Williams learned to play the piano on her own; Itzhak Perlman began teaching himself the violin after being rejected from music school.

Even the best athletes didn’t start out any better than their peers. When Dr. Bloom’s team interviewed tennis players who were ranked in the top 10 in the world, they were not, to paraphrase Jerry Seinfeld, doing push-ups since they were a fetus. Few of them faced intense pressure to perfect the game as Andre Agassi did. A majority of the tennis stars remembered one thing about their first coaches: They made tennis enjoyable.

SINCE Malcolm Gladwell popularized the “10,000-hour rule” suggesting that success depends on the time we spend in deliberate practice, debate has raged about how the hours necessary to become an expert vary by field and person. In arguing about that, we’ve overlooked two questions that matter just as much.

First, can practice itself blind us to ways to improve our area of study? Research reveals that the more we practice, the more we become entrenched — trapped in familiar ways of thinking. Expert bridge players struggled more than novices to adapt when the rules were changed; expert accountants were worse than novices at applying a new tax law.

Second, what motivates people to practice a skill for thousands of hours? The most reliable answer is passion — discovered through natural curiosity or nurtured through early enjoyable experiences with an activity or many activities.

Evidence shows that creative contributions depend on the breadth, not just depth, of our knowledge and experience. In fashion, the most original collections come from directors who spend the most time working abroad. In science, winning a Nobel Prize is less about being a single-minded genius and more about being interested in many things. Relative to typical scientists, Nobel Prize winners are 22 times more likely to perform as actors, dancers or magicians; 12 times more likely to write poetry, plays or novels; seven times more likely to dabble in arts and crafts; and twice as likely to play an instrument or compose music.

No one is forcing these luminary scientists to get involved in artistic hobbies. It’s a reflection of their curiosity. And sometimes, that curiosity leads them to flashes of insight. “The theory of relativity occurred to me by intuition, and music is the driving force behind this intuition,” Albert Einstein reflected. His mother enrolled him in violin lessons starting at age 5, but he wasn’t intrigued. His love of music only blossomed as a teenager, after he stopped taking lessons and stumbled upon Mozart’s sonatas. “Love is a better teacher than a sense of duty,” he said.

Hear that, Tiger Moms and Lombardi Dads? You can’t program a child to become creative. Try to engineer a certain kind of success, and the best you’ll get is an ambitious robot. If you want your children to bring original ideas into the world, you need to let them pursue their passions, not yours.

Adam Grant is a professor of management and psychology at the Wharton School of the University of Pennsylvania, and a contributing opinion writer. This essay is adapted from his new book “Originals: How Non-Conformists Move the World.”

A New Paradigm for Accountability: The Joy of Learning

Now that we have endured more than a dozen long years of No Child Left Behind and five fruitless, punitive years of Race to the Top, it is clear that they both failed. They relied on carrots and sticks and ignored intrinsic motivation. They crushed children’s curiosity instead of cultivating it.* They demoralized schools. They disrupted schools and communities without improving children’s education.

We did not leave no child behind. The same children who were left behind in 2001-02 are still left behind. Similarly, Race to the Top is a flop. The Common Core tests are failing most students, and we are nowhere near whatever the “Top” is. If a teacher gave a test and 70% of the students failed, we would say she was not competent, had tested what was not taught, or didn’t know her students. The Race turns out to be NCLB with a mask. NCLB on steroids. NCLB 2.0.

Whatever you call it, RTTT has hurt children, demoralized teachers, closed community schools, fragmented communities, increased privatization, and doubled down on testing.

I have an idea for a new accountability system that relies on different metrics. We begin by dropping standardized test scores as measures of quality or effectiveness. We stop labeling, ranking, and rating children, teachers, and schools. We use tests only when needed for diagnostic purposes, not for comparing children to their peers, not to find winners and losers. We rely on teachers to test their students, not corporations.

The new accountability system would be called No Child Left Out. The measures would be these:

How many children had the opportunity to learn to play a musical instrument?

How many children had the chance to play in the school band or orchestra?

How many children participated in singing, either individually or in the chorus or a glee club or other group?

How many public performances did the school offer?

How many children participated in dramatics?

How many children produced documentaries or videos?

How many children engaged in science experiments? How many started a project in science and completed it?

How many children learned robotics?

How many children wrote stories of more than five pages, whether fiction or nonfiction?

How often did children have the chance to draw, paint, make videos, or sculpt?

How many children wrote poetry? Short stories? Novels? History research papers?

How many children performed service in their community to help others?

How many children were encouraged to design an invention or to redesign a common item?

How many students wrote research papers on historical topics?

Can you imagine an accountability system whose purpose is to encourage and recognize creativity, imagination, originality, and innovation? Isn’t this what we need more of?

Well, you can make up your own metrics, but you get the idea. Setting expectations in the arts, in literature, in science, in history, and in civics can change the nature of schooling. It would require far more work and self-discipline than test prep for a test that is soon forgotten.

My paradigm would dramatically change schools from Gradgrind academies to halls of joy and inspiration, where creativity, self-discipline, and inspiration are nurtured, honored, and valued.

This is only a start. Add your own ideas. The sky is the limit. Surely we can do better than this era of soul-crushing standardized testing.

*Kudos to Southold Elementary School in Long Island, where these ideas were hatched as I watched the children’s band playing a piece they had practiced.

Science Shows How People With Messy Desks Are Actually Different Than Everyone Else

News.Mic

Are you too messy? Instead of a filing cabinet, do you have piles of folders bursting at the seams? Is your Rolodex covered with doodles, while your drawers are full of loose business cards? Do memos arrive at your desk only to be tossed in an overstuffed trash can or linger for eternity amid a heap of their forgotten brethren?

We’re trained to think that messiness is evil and unproductive. But there might be a method to all that madness.

It turns out science can explain. There’s fairly robust psychological evidence that messiness isn’t just symptomatic of poor standards or effort, but might actually provoke creativity.

That’s the hypothesis that, as psychologist Kathleen Vohs writes in The New York Times, “being around messiness would lead people away from convention, in favor of new directions.” To test this hypothesis, Vohs invited 188 adults to rooms that were either tidy or “messy, with papers and books strewn around haphazardly.”

Each adult was then presented with one of two menus from a deli that served fruit smoothies, with half of the subjects seeing a menu with one item billed as “classic” and another billed as “new.” The results (published in Psychological Science), Vohs reports, were enlightening:

As predicted, when the subjects were in the tidy room they chose the health boost more often — almost twice as often — when it had the “classic” label: that is, when it was associated with convention. Also as predicted, when the subjects were in the messy room, they chose the health boost more often — more than twice as often — when it was said to be “new”: that is, when it was associated with novelty. Thus, people greatly preferred convention in the tidy room and novelty in the messy room.

A second experiment with 48 adults found that subjects in a messy environment came up with ideas that were rated “28% more creative” while listing unconventional uses for ping pong balls, even though the two groups came up with the same number of ideas. Vohs argues the results are clear: Messiness actually spurs creativity.

Columbia Business School professor Eric Abrahamson notes that the debate on messiness can overlook the crucial fact that order has opportunity costs, like forcing employees to devote valuable time to maintaining an orderly environment that could otherwise be spent on projects. He argues:

Creativity is spurred when things that we tend not to organize in the same category come together. When you allow some messiness into a system, new combinations can result. If you keep all your tools in the tool shed and all your kitchen utensils in the kitchen, you might never think of using a kitchen utensil as a tool or vice-versa.

Of course, messiness doesn’t necessarily refer to how many coffee cups on your desk need to be thrown out. Abrahamson adds that “the best studies on strategic planning indicate that firms with elaborate strategic planning systems do no better than firms that don’t have them,” possibly because an emphasis on order can reduce the flexibility of some companies.

On the micro level, in 2007 Abrahamson and fellow researcher David H. Freedman wrote that a messy desk could actually be a “highly effective prioritizing and accessing system” that quickly sorts items according to their importance. Piles of clutter that amass on unkempt desks may just be repositories for “safely ignorable stuff.” In other words, if it looks like trash, perhaps that’s because it wasn’t important enough to waste time filing.

On the other hand, as Freedman told the New York Times, “almost anything looks pretty neat if it’s shuffled into a pile.” Order doesn’t necessarily have inherent benefits in every space.

The takeaway: none of this is necessarily an argument for messiness (and nothing here justifies leaving underwear on the floor). But Your Story’s Malavika Velayanikal argued that entrepreneurs can draw two lessons. One: employers shouldn’t overvalue orderliness in a work setting, because disorder might help trigger creative solutions to problems in the workplace. Two: employers should harness the creative energy of disorderly environments by creating “varied office spaces” instead of minimalist ones, to help employees “break free from conventional thinking.”

Basically, the emphasis on order and efficiency in work settings can be misplaced. Instead of maximizing efficiency, a rigid focus on routine can force employees to waste time on minutiae. Office Space had this lesson down.

Creativity vs. Quants

The New York Times

By Timothy Egan

March 21, 2014

Here’s how John Lennon wrote “Nowhere Man,” as he recalled it in an interview that ran just before he was murdered in 1980: After working five hours trying to craft a song, he had nothing to show for it. “Then, ‘Nowhere Man’ came, words and music, the whole damn thing as I lay down.”

Here’s how Steve Jobs came up with the groundbreaking font selection when Apple designed the Mac: He had taken a class in the lost art of calligraphy and found it “beautiful, historical, artistically subtle in a way that science can’t capture.” Ten years later, it paid off when Apple ushered in a typeface renaissance.

And here’s how Oscar Wilde defined his profession: “A writer is someone who has taught his mind to misbehave.”

We’ve bottled lust. We’ve refined political analysis so that nearly every election can be accurately forecast. And we’ve compressed the sum of education for an average American 17-year-old into the bloodless numbers of standardized test scores. What still eludes the captors of knowledge is creativity, even though colleges are trying to teach it, corporations are trying to own it, and Apple has a “creativity app.”

But perhaps because creativity remains so unquantifiable, it’s still getting shortchanged by educators, new journalistic ventures, Hollywood and the company that aspires to be the earth’s largest retailer, Amazon.com.

An original work, an aha! product or a fresh insight is rarely the result of precise calculation at one end producing genius at the other. You need messiness and magic, serendipity and insanity. Creativity comes from time off, and time out. There is no recipe for “Nowhere Man,” other than showing up, and then, maybe lying down.

The push for Common Core standards in the schools came from colleges and employers who complained that high schools were turning out too many graduates unprepared for the modern world. That legitimate criticism prompted a massive overhaul affecting every part of the country. Now, the pushback, in part, is coming from people who feel that music, art and other unmeasured values got left behind — that the Common Core stifles creativity. Educators teach for the test, but not for the messy brains of the kids in the back rows.

In relaunching his data-driven FiveThirtyEight website this week, Nate Silver took a swipe at old-school commentators. He recalled the famously off prediction of Peggy Noonan, who criticized people “too busy looking at data on paper” to pick up on the “vibrations” of a Mitt Romney victory in 2012. “It’s time for us to start making the news nerdier,” Silver wrote in his manifesto.

Data journalism has certainly done much to clean up the guesswork in a profession still struggling to find its way in the digital age. On election eve, it’s far better to look at the aggregate of all scientific polls than to listen to a pundit’s hunch. But numbers, as Silver himself acknowledged, are not everything in the information game. Satire, journalism’s underappreciated sibling, belongs to the creative realm. And there are no quants on the planet who could write Jonathan Swift’s “Modest Proposal,” or a decent episode of “The Daily Show.”

Nor could they produce an original film. Sure, they’ve tried. Most of Hollywood’s big-budget, so-called tent-pole openings are the net result of exhaustive crunching of the elements of a hit. A robot can write a screenplay — about robots fighting one another! — that is just as effective at the box office as the fart-joke formula of an Adam Sandler movie. Before a major release, audiences are tested and polled, and producers fix and calibrate.

In the end, it’s just product, matching audience preferences. So it was encouraging to see a big-name Hollywood director, Darren Aronofsky, the filmmaker behind the upcoming epic “Noah,” show some defiance against the numbers men. “Ten men in a room trying to come up with their favorite ice cream are going to agree on vanilla,” he said in The New Yorker. “I’m the rocky road guy.”

Book publishers, cowering in the shadow of Amazon.com, deserved their kick to the head when the online company forced them to drag their archaic business practices into the 21st century. But they can take heart that Amazon, trying to crowdsource and metrically mold its way into producing its own “content,” has stumbled. Amazon works by gathering data on millions of readers and then giving the same thing back to them. The oldest tale of publishing, or filmmaking for that matter, is the orphaned, oddball story that became a smash. Everyone rejected it because, well, it wasn’t like anything else.

At Amazon, the quants rule. Daydreaming, pie-in-the-sky time and giving people room to fail — the vital ingredients of creativity — are costly, the first things to go at a data-driven company. As a business model, Amazon is a huge success. As a regular generator of culture-altering material, it’s a bit player. Why? It has marginalized messiness.

Learning to Think Outside the Box, Creativity Becomes an Academic Discipline

The New York Times

An interesting article related to the “Maker Movement.” We hope to open our Sacred Heart “Maker Lab” in the near future.

By LAURA PAPPANO, Feb. 5, 2014

Students in creative studies at Buffalo State College posted key points to being a creative thinker. Brendan Bannon for The New York Times

IT BOTHERS MATTHEW LAHUE and it surely bothers you: enter a public restroom and the stall lock is broken. Fortunately, Mr. Lahue has a solution. It’s called the Bathroom Bodyguard. Standing before his Buffalo State College classmates and professor, Cyndi Burnett, Mr. Lahue displayed a device he concocted from a large washer, metal ring, wall hook, rubber bands and Lincoln Log. Slide the ring in the crack and twist. The door stays shut. Plus, the device fits in a jacket pocket.

The world may be full of problems, but students presenting projects for Introduction to Creative Studies have uncovered a bunch you probably haven’t thought of. Elie Fortune, a freshman, revealed his Sneaks ’n Geeks app to identify the brand of killer sneakers you spot on the street. Jason Cathcart, a senior, sported a bulky martial arts uniform with sparring pads he had sewn in. No more forgetting them at home.

“I don’t expect them to be the next Steve Jobs or invent the flying car,” Dr. Burnett says. “But I do want them to be more effective and resourceful problem solvers.” Her hope, she says, is that her course has made them more creative.

Cyndi Burnett teaches Introduction to Creative Studies at Buffalo State College. Brendan Bannon for The New York Times

Once considered the product of genius or divine inspiration, creativity — the ability to spot problems and devise smart solutions — is being recast as a prized and teachable skill. Pin it on pushback against standardized tests and standardized thinking, or on the need for ingenuity in a fluid landscape.

“The reality is that to survive in a fast-changing world you need to be creative,” says Gerard J. Puccio, chairman of the International Center for Studies in Creativity at Buffalo State College, which has the nation’s oldest creative studies program, having offered courses in it since 1967.

“That is why you are seeing more attention to creativity at universities,” he says. “The marketplace is demanding it.”

Critical thinking has long been regarded as the essential skill for success, but it’s not enough, says Dr. Puccio. Creativity moves beyond mere synthesis and evaluation and is, he says, “the higher order skill.” This has not been a sudden development. Nearly 20 years ago “creating” replaced “evaluation” at the top of Bloom’s Taxonomy of learning objectives. In a 2010 I.B.M. survey of 1,500 chief executives in 33 industries, “creativity” was ranked the factor most crucial for success. And these days “creative” is the most used buzzword in LinkedIn profiles, two years running.

Traditional academic disciplines still matter, but as content knowledge evolves at lightning speed, educators are talking more and more about “process skills,” strategies to reframe challenges and extrapolate and transform information, and to accept and deal with ambiguity.

Annoyed by restroom doors that are always broken? Matthew Lahue, a junior, designed the Bathroom Bodyguard. Jim Lahue

Creative studies is popping up on course lists and as a credential. Buffalo State, part of the State University of New York, plans a Ph.D. and already offers a master’s degree and undergraduate minor. Saybrook University in San Francisco has a master’s and certificate, and added a specialization to its psychology Ph.D. in 2011. Drexel University in Philadelphia has a three-year-old online master’s. St. Andrews University in Laurinburg, N.C., has added a minor. And creative studies offerings, sometimes with a transdisciplinary bent, are new options in business, education, digital media, humanities, arts, science and engineering programs across the country.

Suddenly, says Russell G. Carpenter, program coordinator for a new minor in applied creative thinking at Eastern Kentucky University, “there is a larger conversation happening on campus: ‘Where does creativity fit into the E.K.U. student experience?’ ” Dr. Carpenter says 40 students from a broad array of fields, including nursing and justice and safety, have enrolled in the minor — a number he expects to double as more sections are added to introductory classes. Justice and safety? Students want tools to help them solve public safety problems and deal with community issues, Dr. Carpenter explains, and a credential to take to market.

The credential’s worth is apparent to Mr. Lahue, a communication major who believes that a minor in the field carries a message. “It says: ‘This person is not a drone. They can use this skill set and apply themselves in other parts of the job.’ ”

On-demand inventiveness is not as outrageous as it sounds. Sure, some people are naturally more imaginative than others. What’s igniting campuses, though, is the conviction that everyone is creative, and can learn to be more so.

Just about every pedagogical toolbox taps similar strategies, employing divergent thinking (generating multiple ideas) and convergent thinking (finding what works). The real genius, of course, is in the how.

Edwin Perez’s FaceSaver keeps your phone from falling. Cyndi Burnett

Dr. Puccio developed an approach that he and partners market as FourSight and sell to schools, businesses and individuals. The method, which is used in Buffalo State classrooms, has four steps: clarifying, ideating, developing and implementing. People tend to gravitate to particular steps, suggesting their primary thinking style. Clarifying — asking the right question — is critical because people often misstate or misperceive a problem. “If you don’t have the right frame for the situation, it’s difficult to come up with a breakthrough,” Dr. Puccio says. Ideating is brainstorming and calls for getting rid of your inner naysayer to let your imagination fly. Developing is building out a solution, and maybe finding that it doesn’t work and having to start over. Implementing calls for convincing others that your idea has value.

Jack V. Matson, an environmental engineer and a lead instructor of “Creativity, Innovation and Change,” a MOOC that drew 120,000 in September, teaches a freshman seminar course at Penn State that he calls “Failure 101.” That’s because, he says, “the frequency and intensity of failures is an implicit principle of the course. Getting into a creative mind-set involves a lot of trial and error.”

His favorite assignments? Construct a résumé based on things that didn’t work out and find the meaning and influence these have had on your choices. Or build the tallest structure you can with 20 Popsicle sticks. The secret to the assignment is to destroy the sticks and reimagine their use. “As soon as someone in the class starts breaking the sticks,” he says, “it changes everything.”

Dr. Matson also asks students to “find some cultural norms to break,” like doing cartwheels while entering the library. The point: “Examine what in the culture is preventing you from creating something new or different. And what is it like to look like a fool because a lot of things won’t work out and you will look foolish? So how do you handle that?”

It’s a lesson that has been basic to the ventures of Brad Keywell, a Groupon founder and a student of Dr. Matson’s at the University of Michigan. “I am an absolute evangelist about the value of failure as part of creativity,” says Mr. Keywell, noting that Groupon took off after the failure of ThePoint.com, where people were to organize for collective action but instead organized discount group purchases. Dr. Matson taught him not just to be willing to fail but that failure is a critical avenue to a successful end. Because academics run from failure, Mr. Keywell says, universities are “way too often shapers of formulaic minds,” and encourage students to repeat and internalize fail-safe ideas.

Chanil Mejia and Yasmine Payton present their big idea, a campus chill spot, in Introduction to Creative Studies. Brendan Bannon for The New York Times

Bonnie Cramond, director of the Torrance Center for Creativity and Talent Development at the University of Georgia, is another believer in taking bold risks, which she calls a competitive necessity. Her center added an interdisciplinary graduate certificate in creativity and innovation this year. “The new people who will be creative will sit at the juxtaposition of two or more fields,” she says. When ideas from different fields collide, Dr. Cramond says, fresh ones are generated. She cites an undergraduate class that teams engineering and art students to, say, reimagine the use of public spaces. Basic creativity tools used at the Torrance Center include thinking by analogy, looking for and making patterns, playing, literally, to encourage ideas, and learning to abstract problems to their essence.

In Dr. Burnett’s Introduction to Creative Studies survey course, students explore definitions of creativity, characteristics of creative people and strategies to enhance their own creativity. These include rephrasing problems as questions, learning not to instinctively shoot down a new idea (first find three positives), and categorizing problems as needing a solution that requires either action, planning or invention. A key objective is to get students to look around with fresh eyes and be curious. The inventive process, she says, starts with “How might you…”

Dr. Burnett is an energetic instructor with a sense of humor — she tested Mr. Cathcart’s martial arts padding with kung fu whacks. Near the end of last semester, she dumped Post-it pads (the department uses 400 a semester) onto a classroom desk with instructions: On pale yellow ones, jot down what you learned; on rainbow colored pads, share how you will use this learning. She then sent students off in groups with orders that were a litany of brainstorming basics: “Defer judgment! Strive for quantity! Wild and unusual! Build on others’ ideas!”

As students scribbled and stuck, the takeaways were more than academic. “I will be optimistic,” read one. “I will look at tasks differently,” said another. And, “I can generate more ideas.”

Asked to elaborate, students talked about confidence and adaptability. “A lot of people can’t deal with things they don’t know and they panic. I can deal with that more now,” said Rony Parmar, a computer information systems major with Dr. Dre’s Beats headphones circling his neck.

Mr. Cathcart added that, given tasks, “you think of other ways of solving the problem.” For example, he streamlined the check-in and reshelving of DVDs at the library branch where he works.

The view of creativity as a practical skill that can be learned and applied in daily life is a 180-degree flip from the thinking that it requires a little magic: Throw yourself into a challenge, step back — pause — wait for brilliance to spout.

The point of creative studies, says Roger L. Firestien, a Buffalo State professor and author of several books on creativity, is to learn techniques “to make creativity happen instead of waiting for it to bubble up. A muse doesn’t have to hit you.”

Laura Pappano is writer in residence at the Wellesley Centers for Women at Wellesley College and the author of several books, including “Inside School Turnarounds.”

Far From My Tree

Motherlode - Adventures in Parenting

By SUE ROBINS

My eldest son is 20 years old, lives in a house crammed with seven scrabbly roommates, works part time in a restaurant kitchen, doesn’t drive, is a vegetarian, and has homemade tattoos etched into his thighs.

He’s firmly a musician – a drummer in a loud punk band, and he loves nothing more than to tour across North America, playing gigs in sketchy houses in Oakland, Calif., and south Chicago.

He appears to have only one pair of pants – dirty, black cutoff jeans, and his shirts are also of the ripped-off-arms variety. I’m not sure who has been ripping up all his clothes. Maybe there’s a wild dog living in his house.

I’m both proud of and horrified for my boy. His jaw is squarely set, and he’s acutely committed to what he wants to do. And that is to tour with his band in their black-panel van, crisscrossing borders, dodging death in dubious neighborhoods, sleeping on strangers’ couches, and eating vegetarian burritos.

As my children traveled through their teenage years, I emphasized to them: Find your passion and follow it. What I really meant was: Find your passion, but do it in the way I did it. That is, go to college first, get a liberal arts degree, meander through your 20s, and then supplement your undergraduate degree with graduate studies. All while wearing clean, intact clothing.

But what if, as Andrew Solomon so eloquently addresses in his masterpiece, “Far From the Tree,” your child ends up so very different from you? I read “Far From the Tree” because it speaks of children with disabilities (and my youngest son has Down syndrome), but I gained a deeper knowledge of all children who stray from their parents. If we face reality squarely, and give our children the space to be who they want to be, every single child should be different from his parents, and should be allowed and even encouraged to fall far from our trees.

My oldest boy does not show up to family events in his collared shirt and pressed pants. In fact, he rarely shows up at all. He doesn’t respond to calls from grandparents, although he will send thank you texts for birthday gifts, so he still has a sliver of decorum. He’s proudly anti-establishment, and my current lifestyle with my husband (and his stepfather) – living in the suburbs and driving a BMW – clearly disgusts him.

I watch my friends’ children embarking on their second year of college, most of them still living at home with their parents. They are clean-cut, unfailingly polite, sit quietly at dinner parties and patiently dole out answers to questions from adults. Inevitably, someone asks me, “What’s your son doing?” and then I feel a strange mix of pride and apology. “He’s living his life,” I say. “But what graduate program? What path is he taking?” “He’s not in a program,” I say. “He’s working and playing in a band.” They take a deep gulp of wine and look down at their expensive shoes.

I read a biography of Dave Grohl, the former drummer for Nirvana. In it, Mr. Grohl’s mother – a teacher herself – agreed to let him drop out of high school so he could tour with his band. She said that he was good at a lot of things, but school was not one of them. Clearly Mr. Grohl’s path did not include the traditional, go-to-college-get-a-job trajectory.

My son is teaching me that there isn’t just one way to live life. Yes, I wish he would go to college so he doesn’t live below the poverty line and reside in a house of squalor.

But that’s what I want for him. That’s not what he wants for himself. He is not the male version of me. He’s a musician, and the creative life means a guaranteed amount of struggle and heartache. Every time we meet for lunch, I tell him that I love him, and that I’m proud of him.

Even if my boy’s path never rises out of moshing in the basements of America, that’s got to be O.K., too. There are no conditions placed on unconditional love.

Is There an App for That?

Harvard Magazine

THE LOST GENERATION. The Greatest Generation. Generation X. And now…the App Generation.

“Are kids growing up in the digital age really different?” asks Howard Gardner, Hobbs professor of cognition and education. Six years ago, he and then-student Katie Davis, Ed.D. ’11 (now an assistant professor at the University of Washington) set out to explore the question, and in their new book, The App Generation: How Today’s Youth Navigate Identity, Intimacy, and Imagination in a Digital World (Yale), they argue that the answer is unambiguously yes.

“This is a generation that expects and wants to have applications,” says Gardner. Applications, more commonly known as apps, are shortcuts designed for accomplishing specific tasks. They’re ubiquitous, powerful, and strongly structured, and the authors argue that they’re changing the way we think. “Young people growing up in our time are not only immersed in apps,” they write, “they’ve come to think of the world as an ensemble of apps, to see their lives as a string of ordered apps, or perhaps, in many cases, a single, extended, cradle-to-grave app.”

The app mindset, they say, motivates youth to seek direct, quick, easy solutions—the kinds of answers an app would provide—and to shy away from questions, whether large or small, when there’s no “app for that.” In a wide-ranging cultural critique, the authors identify myriad resulting effects loosely structured around three of the stages of psychosocial development proposed by Gardner’s mentor Erik Erikson in 1950—here called identity, intimacy, and imagination.

They investigated the first two themes primarily through interviews with adolescents and focus groups of adults who work with teens. In terms of identity, Gardner and Davis argue that youth today are polished and packaged, in line with the cool, suave look of online profiles. In “Reflecting on Your Life” sessions with Harvard freshmen (see “The Most Important Course,” May-June 2011, page 56), Gardner writes, he encountered students “with their lives all mapped out—a super-app.” But the external polish often hides deep-seated anxiety, outwardly expressed as a need for approval. In their conversations with camp counselors and teachers, Gardner and Davis were repeatedly told that youth today are risk-averse; the app generation, said one focus group participant, is “scared to death.”

In exploring intimacy, Gardner and Davis saw repeated signs of greater isolation. Although social media can enhance friendships and family relationships, digital media can give the impression of closeness while promoting only shallow connections. Online relationships are often conducted at arm’s length, allowing youth to avoid the deeper emotional investment and vulnerability of more complicated, in-person relationships. (This emotional distance can also facilitate racist and sexist language that would be unacceptable in person.)

The book’s most unexpected results come from its study of imagination. Prompted by Gardner’s curiosity about how his high-school literary magazine might have changed in the 50 years since he was editor, the authors examined hundreds of samples of adolescent visual art and fiction between 1990 and 2010. Using a blind coding scheme to measure changes in topics such as subject, composition, and narrative flow, the authors concluded that graphic art has become more imaginative and diverse in the past 20 years, whereas creative writing has shown the opposite trend.

Though they acknowledged that all of their work is correlative, not causative, they speculated that the difference may reflect the emergence of online communities like deviantART and tools like Photoshop that increase amateur engagement with graphic media; in contrast, instant messaging and texting have largely supplanted more formal, written communications. The authors suggest that digital tools promote what they call “middle c” creativity, between the “little c” creativity of everyday problem-solving and the “Big C” of groundbreaking achievements. Though software may lower the bar for creative engagement, they write, users may never move beyond the tools’ inherent limitations.

“When do things that are optional become blinkers on how we see the world?” asks Gardner. He and Davis argue that people can be app-enabled, using apps as tools to eliminate tedious tasks and catalyze new forms of exploration, or app-dependent, relying heavily on the available tools as a substitute for skill and reflection. And the authors argue that automation itself is a double-edged sword. “Who decides what is important?” they write. “And where do we draw the line between an operation”—using a GPS to navigate to Boston’s North End, for instance—“and the content on which the operation is carried out?”—orienting oneself in the city. Gardner points out that many of today’s teens have never been lost, either literally or metaphorically, and that many don’t even see the point of a “random walk,” an experience that he argues can build independence and resilience.

Apps are here to stay, the authors make clear, and the question now is how to make use of them in a productive, creative way. As an educator, Gardner favors what he calls a “constructivist” approach to learning—in which knowledge is acquired through exploration—and he believes that apps, by shortcutting discovery, can diminish this engagement with the world. Before downloading an app, he says, people should ask themselves what they would do without it: if they had to obtain directions or contact a friend, for instance, without a smartphone. “Even though a well-demonstrated toy or well-designed app has its virtues,” he and Davis write, “there is also virtue—and even reward—in figuring out things for yourself on your own time, in your own way.”

Minecraft, an Obsession and an Educational Tool

The New York Times

By NICK BILTON
Luca Citrone, 8, and his sister Willow play Minecraft before they go to bed. Michael Citrone

If you were to walk into my sister’s house in Los Angeles, you’d hear a bit of yelling from time to time. “Luca! Get off Minecraft! Luca, are you on Minecraft again? Luca! Enough with the Minecraft!”

Luca is my 8-year-old nephew. Like millions of other children his age, Luca is obsessed with the video game Minecraft. Actually, obsessed might be an understated way to explain a child’s idée fixe with the game. And my sister, whom you’ve probably guessed is the person doing all that yelling, is a typical parent of a typical Minecraft-playing child: she’s worried it might be rotting his brain.

For those who have never played Minecraft, it’s relatively simple. The game looks a bit crude because it doesn’t have realistic graphics. Instead, it’s built in 16-bit, a computer term that means the graphics look blocky, like giant, digital Lego pieces.

Unlike other video games, there are few if any instructions in Minecraft. Instead, like the name suggests, the goal of the game is to craft, or build, structures in these 16-bit worlds, and figuring things out on your own is a big part of it. And parents, it’s not terribly violent. Sure, you can kill a few zombies while playing in the game’s “survival mode.” But in its “creative mode,” Minecraft is about building, exploration, creativity and even collaboration.

The game was first demonstrated by Markus Persson, a Swedish video game programmer and designer known as Notch, in 2009 and released to the public in November 2011. Today, the game runs on various devices, including desktop computers, Google Android smartphones, Apple iOS and the Microsoft Xbox. There are thousands of mods, or modifications, for the game that allow people to play in prebuilt worlds, like a replica of Paris (Eiffel Tower included) or an ancient Mayan civilization.

While parents — my sister included — might worry that all these pixels and the occasional zombie might be bad for children, a lot of experts say they shouldn’t fret.

Earlier this year, for example, a school in Stockholm made Minecraft compulsory for 13-year-old students. “They learn about city planning, environmental issues, getting things done, and even how to plan for the future,” said Monica Ekman, a teacher at the Viktor Rydberg school.

Around the world, Minecraft is being used to educate children on everything from science to city planning to speaking a new language, said Joel Levin, co-founder and education director at the company TeacherGaming. TeacherGaming runs MinecraftEdu, which is intended to help teachers use the game with students.

A history teacher in Australia set up “quest missions” where students can wander through and explore ancient worlds. An English-language teacher in Denmark told children they could play Minecraft collectively in the classroom but with one caveat: they were allowed to communicate both orally and through text only in English. A science teacher in California has set up experiments in Minecraft to teach students about gravity.

Mr. Levin said that in addition to classroom exercises, children were learning the digital skills they would need as they got older.

“Kids are getting into middle school and high school and having some ugly experiences on Facebook and other social networks without an understanding of how to interact with people online,” he said. “With Minecraft, they are developing that understanding at a very early age.”

While there are no known neuroscience studies of Minecraft’s effect on children’s brains, research has shown video games can have a positive impact on children.

A study by S.R.I. International, a Silicon Valley research group that specializes in technology, found that game-based play could raise cognitive learning for students by as much as 12 percent and improve hand-eye coordination, problem-solving ability and memory.

Games like Minecraft also encourage what researchers call “parallel play,” where children are engrossed in their game but are still connected through a server or are sharing the same screen. And children who play games could even become better doctors. No joke. Neuroscientists performed a study at Iowa State University that found that surgeons performed better, and were more accurate on the operating table, when they regularly played video games.

“Minecraft extends kids’ spatial reasoning skills, construction skills and understanding of planning,” said Eric Klopfer, a professor and the director of the Massachusetts Institute of Technology’s Scheller Teacher Education Program. “In many ways, it’s like a digital version of Lego.”

Professor Klopfer suggested that if parents were worried about the game, they should simply play it with their children. He said he set up a server in his house so his children’s friends could play together and he could monitor their behavior and then explain that some actions, even in virtual worlds, are unethical — like destroying someone’s Minecraft house, or calling them a bad name.

But Professor Klopfer warned that, as with anything, there was — probably to my nephew’s chagrin — such a thing as too much Minecraft.

“While the game is clearly good for kids, it doesn’t mean there should be no limits,” he said. “As with anything, I don’t want my kids to do any one thing for overly extended periods of time. Whether Legos or Minecraft, having limits is an important part of their learning.”

Many children would happily ignore that little warning if their parents let them.

Last weekend, my sister saw Luca on his computer with what appeared to be Minecraft on the screen. “Luca, I told you, you can’t play Minecraft anymore,” she said.

“I’m not playing Minecraft, mama,” he replied. “I’m watching videos on YouTube of other people playing Minecraft.”

New Middle School “MakerBot” 3D Printer

In support of our efforts to foster student innovation and create STEM opportunities, the Middle School recently purchased a “MakerBot” 3D printer. Watch the following videos for more information:

http://www.youtube.com/watch?v=O3NGzO_pPQY

http://www.youtube.com/watch?v=GMLIiJFRinY