All across America, students are anxiously finishing their “What I Want To Be …” college application essays, advised to focus on STEM (Science, Technology, Engineering, and Mathematics) by pundits and parents who insist that’s the only way to become workforce ready. But two recent studies of workplace success contradict the conventional wisdom about “hard skills.” Surprisingly, this research comes from the company most identified with the STEM-only approach: Google.
Sergey Brin and Larry Page, both brilliant computer scientists, founded their company on the conviction that only technologists can understand technology. Google originally set its hiring algorithms to sort for computer science students with top grades from elite science universities.
In 2013, Google decided to test its hiring hypothesis by crunching every bit and byte of hiring, firing, and promotion data accumulated since the company’s incorporation in 1998. Project Oxygen shocked everyone by concluding that, among the eight most important qualities of Google’s top employees, STEM expertise comes in dead last. The seven top characteristics of success at Google are all soft skills: being a good coach; communicating and listening well; possessing insights into others (including others’ different values and points of view); having empathy toward and being supportive of one’s colleagues; being a good critical thinker and problem solver; and being able to make connections across complex ideas.
Those traits sound more like what one gains as an English or theater major than as a programmer. Could it be that top Google employees were succeeding despite their technical training, not because of it? After bringing in anthropologists and ethnographers to dive even deeper into the data, the company enlarged its previous hiring practices to include humanities majors, artists, and even the MBAs that, initially, Brin and Page viewed with disdain.
Project Aristotle, a study released by Google this past spring, further supports the importance of soft skills even in high-tech environments. Project Aristotle analyzed data on inventive and productive teams. Google takes pride in its A-teams, assembled from top scientists, each with the most specialized knowledge and able to throw down one cutting-edge idea after another. Its data analysis revealed, however, that the company’s most important and productive new ideas come from B-teams composed of employees who don’t always have to be the smartest people in the room.
Project Aristotle shows that the best teams at Google exhibit a range of soft skills: equality, generosity, curiosity toward the ideas of your teammates, empathy, and emotional intelligence. And topping the list: emotional safety. No bullying. To succeed, each and every team member must feel confident speaking up and making mistakes. They must know they are being heard.
Google’s studies concur with others trying to understand the secret of a great future employee. A recent survey of 260 employers by the nonprofit National Association of Colleges and Employers, which includes both small firms and behemoths like Chevron and IBM, also ranks communication skills among the three qualities most sought after by job recruiters. Employers prize both an ability to communicate with one’s fellow workers and an aptitude for conveying the company’s product and mission outside the organization. Or take billionaire venture capitalist and “Shark Tank” TV personality Mark Cuban: He looks for philosophy majors when he’s investing in sharks most likely to succeed.
STEM skills are vital to the world we live in today, but technology alone, as Steve Jobs famously insisted, is not enough. We desperately need the expertise of those who are educated in the human, cultural, and social as well as the computational.
No student should be prevented from majoring in an area they love based on a false idea of what they need to succeed. Broad learning skills are the key to long-term, satisfying, productive careers. What helps you thrive in a changing world isn’t rocket science. It may well be social science, and, yes, even the humanities and the arts, that contribute to making you not just workforce ready but world ready.
Like a lot of children, my sons, Toby, 7, and Anton, 4, are obsessed with robots. In the children’s books they devour at bedtime, happy, helpful robots pop up more often than even dragons or dinosaurs. The other day I asked Toby why children like robots so much.
“Because they work for you,” he said.
What I didn’t have the heart to tell him is, someday he might work for them — or, I fear, might not work at all, because of them.
It is not just Elon Musk, Bill Gates and Stephen Hawking who are freaking out about the rise of invincible machines. Yes, robots have the potential to outsmart us and destroy the human race. But first, artificial intelligence could make countless professions obsolete by the time my sons reach their 20s.
You do not exactly need to be Marty McFly to see the obvious threats to our children’s future careers.
Say you dream of sending your daughter off to Yale School of Medicine to become a radiologist. And why not? Radiologists in New York typically earn about $470,000, according to Salary.com.
But that job is suddenly looking iffy as A.I. gets better at reading scans. A start-up called Arterys, to cite just one example, already has a program that can perform a magnetic-resonance imaging analysis of blood flow through a heart in just 15 seconds, compared with the 45 minutes required by humans.
Maybe she wants to be a surgeon, but that job may not be safe, either. Robots already assist surgeons in removing damaged organs and cancerous tissue, according to Scientific American. Last year, a prototype robotic surgeon called STAR (Smart Tissue Autonomous Robot) outperformed human surgeons in a test in which both had to repair the severed intestine of a live pig.
So perhaps your daughter detours to law school to become a rainmaking corporate lawyer. Skies are cloudy in that profession, too. Any legal job that involves lots of mundane document review (and that’s a lot of what lawyers do) is vulnerable.
Software programs are already being used by companies including JPMorgan Chase & Company to scan legal papers and predict what documents are relevant, saving lots of billable hours. Kira Systems, for example, has reportedly cut the time that some lawyers need to review contracts by 20 to 60 percent.
As a matter of professional survival, I would like to assure my children that journalism is immune, but that is clearly a delusion. The Associated Press already has used a software program from a company called Automated Insights to churn out passable copy covering Wall Street earnings and some college sports, and last year awarded the bots the minor league baseball beat.
What about other glamour jobs, like airline pilot? Well, last spring, a robotic co-pilot developed by the Defense Advanced Research Projects Agency, known as Darpa, flew and landed a simulated 737. I hardly count that as surprising, given that pilots of commercial Boeing 777s, according to one 2015 survey, only spend seven minutes during an average flight actually flying the thing. As we move into the era of driverless cars, can pilotless planes be far behind?
Then there is Wall Street, where robots are already doing their best to shove Gordon Gekko out of his corner office. Big banks are using software programs that can suggest bets, construct hedges and act as robo-economists, using natural language processing to parse central bank commentary to predict monetary policy, according to Bloomberg. BlackRock, the biggest fund company in the world, made waves earlier this year when it announced it was replacing some highly paid human stock pickers with computer algorithms.
So am I paranoid? Or not paranoid enough? A much-quoted 2013 study by the University of Oxford Department of Engineering Science — surely the most sober of institutions — estimated that 47 percent of current jobs, including insurance underwriter, sports referee and loan officer, are at risk of falling victim to automation, perhaps within a decade or two.
Just this week, the McKinsey Global Institute released a report that found that a third of American workers may have to switch jobs in the next dozen or so years because of A.I.
I know I am not the only parent wondering if I can robot-proof my children’s careers. I figured I would start by asking my own what they want to do when they grow up.
Toby, a people pleaser and born entertainer, is obsessed with cars and movies. He told me he wanted to be either an Uber driver or an actor. (He is too young to understand that those jobs are usually one and the same).
As for Uber drivers, it is no secret that they are headed to that great parking garage in the sky; the company recently announced plans to buy 24,000 Volvo sport utility vehicles to roll out as a driverless fleet between 2019 and 2021.
And actors? It may seem unthinkable that some future computer-generated thespian could achieve the nuance of expression and emotional depth of, say, Dwayne Johnson. But Hollywood is already Silicon Valley South. Consider how filmmakers used computer graphics to reanimate Carrie Fisher’s Princess Leia and Peter Cushing’s Grand Moff Tarkin as they appeared in the 1970s (never mind that Mr. Cushing died in 1994) for “Rogue One: A Star Wars Story.”
My younger son Anton, a sweetheart, but tough as Kevlar, said he wanted to be a football player. Robot football may sound crazy, but come to think of it, a Monday night battle between the Dallas Cowdroids and Seattle Seabots may be the only solution to the sport’s endless concussion problems.
He also said he wanted to be a soldier. If he means foot soldier, however, he might want to hold off on enlistment. Russia recently unveiled Fedor, a humanoid robot soldier that looks like RoboCop after a Whole30 crash diet; this space-combat-ready android can fire handguns, drive vehicles, administer first aid and, one hopes, salute. Indeed, the world’s armies are in such an arms race developing grunt-bots that one British intelligence expert predicted that American forces will have more robot soldiers than humans by 2025.
And again, all of this stuff is happening now, not 25 years from now. Who knows what the jobs marketplace might look like by then? We might not even be the smartest beings on the planet.
Ever heard of the “singularity”? That is the term that futurists use to describe a potentially cataclysmic point at which machine intelligence catches up to human intelligence, and likely blows right past it. They may rule us. They may kill us. No wonder Mr. Musk says that A.I. “is potentially more dangerous than nukes.”
But is it really that dire? Fears of technology are as old as the Luddites, those machine-smashing British textile workers of the early 19th century. Usually, the fears turn out to be overblown.
The rise of the automobile, to cite the obvious example, did indeed put most manure shovelers out of work. But it created millions of jobs to replace them, not just for Detroit assembly line workers, but for suburban homebuilders, Big Mac flippers and actors performing “Greased Lightnin’” in touring revivals of “Grease.” That is the process of creative destruction in a nutshell.
But artificial intelligence is different, said Martin Ford, the author of “Rise of the Robots: Technology and the Threat of a Jobless Future.” Machine learning does not just give us new machines to replace old machines, pushing human workers from one industry to another. Rather, it gives us new machines to replace us, machines that can follow us to virtually any new industry we flee to.
Since Mr. Ford’s book sent me down this rabbit hole in the first place, I reached out to him to see if he was concerned about all this for his own children: Tristan, 22, Colin, 17, and Elaine, 10.
He said the most vulnerable jobs in the robot economy are those involving predictable, repetitive tasks, however much training they require. “A lot of knowledge-based jobs are really routine — sitting in front of a computer and cranking out the same application over and over, whether it is a report or some kind of quantitative analysis,” he said.
Professions that rely on creative thinking enjoy some protection (Mr. Ford’s older son is a graduate student studying biomedical engineering). So do jobs emphasizing empathy and interpersonal communication (his younger son wants to be a psychologist).
Even so, the ability to think creatively may not provide ultimate salvation. Mr. Ford said he was alarmed in May when Google’s AlphaGo software defeated a 19-year-old Chinese master at Go, considered the world’s most complicated board game.
“If you talk to the best Go players, even they can’t explain what they’re doing,” Mr. Ford said. “They’ll describe it as a ‘feeling.’ It’s moving into the realm of intuition. And yet a computer was able to prove that it can beat anyone in the world.”
In one recent talk, Albert Wenger, an influential tech investor, promoted the Basic Income Guarantee concept. Also known as Universal Basic Income, this sunny concept holds that a robot-driven economy may someday produce an unlimited bounty of cool stuff while simultaneously releasing us from the drudgery of old-fashioned labor, leaving our government-funded children to enjoy bountiful lives of leisure as interpretive dancers or practitioners of bee-sting therapy, as touted by Gwyneth Paltrow.
The idea is all the rage among Silicon Valley elites, who not only understand technology’s power, but who also love to believe that it will be used for good. In their vision of a post-A.I. world without traditional jobs, everyone will receive a minimum weekly or monthly stipend (welfare for all, basically).
Another talk by David Autor, an economist, argued that reports of the death of work are greatly exaggerated. Almost 50 years after the introduction of the A.T.M., for instance, more humans actually work as bank tellers than ever. The computers simply freed the humans from mind-numbing work like counting out 20-dollar bills to focus on more cognitively demanding tasks like “forging relationships with customers, solving problems and introducing them to new products like credit cards, loans and investments,” he said.
Computers, after all, are really good at some things and, for the moment, terrible at others. Even Anton intuits this. The other day I asked him if he thought robots were smarter or dumber than humans. “Sdumber,” he said after a long pause. Confused, I pushed him. “Smarter and dumber,” he explained with a cheeky smile.
He was joking. But he also happened to be right, according to Andrew McAfee, a management theorist at the Massachusetts Institute of Technology whom I interviewed a short while later.
Discussing another of Anton’s career aspirations — songwriter — Dr. McAfee said that computers were already smart enough to come up with a better melody than a lot of humans. “The things our ears find pleasant, we know the rules for that stuff,” he said. “However, I’m going to be really surprised when there is a digital lyricist out there, somebody who can put words to that music that will actually resonate with people and make them think something about the human condition.”
Not everyone, of course, is cut out to be a cyborg-Springsteen. I asked Dr. McAfee what other jobs may exist a decade from now.
“I think health coaches are going to be a big industry of the future,” he said. “Restaurants that have a very good hospitality staff are not about to go away, even though we have more options to order via tablet.
“People who are interested in working with their hands, they’re going to be fine,” he said. “The robot plumber is a long, long way away.”
Mountain View, Calif. — THE humanities are kaput. Sorry, liberal arts cap-and-gowners. You blew it. In a software-run world, what’s wanted are more engineers.
At least, so goes the argument in a rising number of states, which have embraced a funding model for higher education that uses tuition “bonuses” to favor hard-skilled degrees like computer science over the humanities. The trend is backed by countless think pieces. “Macbeth does not make my priority list,” wrote Vinod Khosla, a co-founder of Sun Microsystems and the author of a widely shared blog post titled “Is Majoring in Liberal Arts a Mistake for Students?” (Subtitle: “Critical Thinking and the Scientific Process First — Humanities Later”).
The technologist’s argument begins with a suspicion that the liberal arts are of dubious academic rigor, suited mostly to dreamers. From there it proceeds to a reminder: Software powers the world, ergo, the only rational education is one built on STEM. Finally, lest he be accused of making a pyre of the canon, the technologist grants that yes, after students have finished their engineering degrees and found jobs, they should pick up a book — history, poetry, whatever.
As a liberal-arts major who went on to a career in software, I can only scratch my head.
Fresh out of college in 1993, I signed on with a large technology consultancy. The firm’s idea was that by hiring a certain lunatic fringe of humanities majors, it might cut down on engineering groupthink. After a six-week programming boot camp, we were pitched headfirst into the deep end of software development.
My first project could hardly have been worse. We (mostly engineers, with a spritzing of humanities majors) were attached to an enormous cellular carrier. Our assignment was to rewrite its rating and billing system — a thing that rivaled maritime law in its complexity.
I was assigned to a team charged with one of the hairier programs in the system, which concerned the movement of individual mobile subscribers from one “parent” account plan to another. Each one of these moves caused an avalanche of plan activations and terminations, carry-overs or forfeitures of accumulated talk minutes, and umpteen other causal conditionals that would affect the subscriber’s bill.
This program, thousands of lines of code long and growing by the hour, was passed around our team like an exquisite corpse. The subscribers and their parent accounts were rendered on our screens as a series of S’s and A’s. After we stared at these figures for weeks, they began to infect our dreams. (One I still remember. I was a baby in a vast crib. Just overhead, turning slowly and radiating malice, was an enormous iron mobile whose arms strained under the weight of certain capital letters.)
Our first big break came from a music major. A pianist, I think, who joined our team several months into the project. Within a matter of weeks, she had hit upon a method to make the S’s hold on to the correct attributes even when their parent A was changed.
We had been paralyzed. The minute we tweaked one bit of logic, we realized we’d fouled up another. But our music major moved freely. Instead of freezing up over the logical permutations behind each A and S, she found that these symbols put her in the mind of musical notes. As notes, they could be made to work in concert. They could be orchestrated.
On a subsequent project, our problem was pointers. In programming languages, a pointer is an object that refers to some master value stored elsewhere. This might sound straightforward, but pointers are like ghosts in the system. A single misdirected one can crash a program. Our pointer wizard was a philosophy major who had no trouble at all with the idea of a named “thing” being a transient stand-in for some other unseen Thing. For a Plato man, this was mother’s milk.
I’ve worked in software for years and, time and again, I’ve seen someone apply the arts to solve a problem of systems. The reason for this is simple. As a practice, software development is far more creative than algorithmic.
The developer stands before her source code editor in the same way the author confronts the blank page. There’s an idea for what is to be created, and the (daunting) knowledge that there are a billion possible ways to go about it. To proceed, each relies on one part training to three parts creative intuition. They may also share a healthy impatience for the ways things “have always been done” and a generative desire to break conventions. When the module is finished or the pages complete, their quality is judged against many of the same standards: elegance, concision, cohesion; the discovery of symmetries where none were seen to exist. Yes, even beauty.
To be sure, each craft also requires a command of the language and its rules of syntax. But these are only starting points. To say that more good developers will be produced by swapping the arts for engineering is like saying that to produce great writers, we should double down on sentence diagraming.
Here the technologists may cry foul, say I’m misrepresenting the argument, that they’re not calling to avoid the humanities altogether, but only to replace them in undergraduate study. “Let college be for science and engineering, with the humanities later.” In tech speak, this is an argument for the humanities as plug-in.
But if anything can be treated as a plug-in, it’s learning how to code. It took me 18 months to become proficient as a developer. This isn’t to pretend software development is easy — those were long months, and I never touched the heights of my truly gifted peers. But in my experience, programming lends itself to concentrated self-study in a way that, say, “To the Lighthouse” or “Notes Toward a Supreme Fiction” do not. To learn how to write code, you need a few good books. To enter the mind of an artist, you need a human guide.
For folks like Mr. Khosla, such an approach is dangerous: “If subjects like history and literature are focused on too early, it is easy for someone not to learn to think for themselves and not to question assumptions, conclusions, and expert philosophies.” (Where some of these kill-the-humanities pieces are concerned, the strongest case for the liberal arts is made just in trying to read them.)
How much better is the view of another Silicon Valley figure, who argued that “technology alone is not enough — it’s technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing.”
An interesting article from USA Today. Did you know that girls who attend single-sex high schools are six times more likely to consider majoring in math or the sciences than girls who attend co-ed high schools?
Nenad Tadic, USA TODAY, November 2, 2013
Getting young women interested and immersed in computer science programs comes at a time when one million new jobs in tech-related fields will be created in the next decade.
Just 37% of this year’s freshman class at Georgia Tech is female.
And that’s an increase over previous years, thanks in part to the school’s dedicated women’s recruitment team. Composed of 75 current Georgia Tech students and an advisor, the team’s initiatives include speaking at high schools, hosting online chats and setting up campus visit events.
“It’s a whole, broad push,” says Laura Diamond, spokesman for Georgia Tech. We want girls to be thinking about STEM (science, technology, engineering, mathematics) overall, she explains.
At the Illinois Institute of Technology (IIT), several of the school’s upper-level leaders are women, including the dean of the engineering school, which means the school has a responsibility to recruit talented young women, says April Welch, acting director of graduate admissions.
Currently, a quarter of all Americans in computer-related occupations are women; compare that figure with countries like Oman and Qatar, whose governments emphasize girls’ education and STEM fields.
How can American colleges and universities get women interested in computer science and tech — and how can schools ensure their success?
These are questions Harvey Mudd College President Maria Klawe has been tackling for the past few years.
In 2005 when Klawe became Harvey Mudd’s president, 10% of graduates with computer science degrees were women. In 2011, that figure went up to 40%.
Klawe says Harvey Mudd fosters a collaborative and supportive environment, one that starts the minute students are enrolled.
Klawe explains that because “women are raised to be helpful and nurturing,” they tend to be interested in programs that can be framed in a real world approach to solve problems.
Consultation and advice from her school have helped Sabina Nilakhe, a senior computer science major at DePaul University.
“At first, it’s intimidating being the only girl in a class of 20-plus guys,” says Nilakhe.
But because of personalized classroom attention and a number of programs for women that the school provides — like tutoring by graduate students or weekly chats over lunch — she says it has become a lot less intimidating, and she has thoroughly enjoyed her experience.
At Columbia University, the Women in Computer Science (WiCS) organization hosts campus speakers who talk about what it’s like to be a woman in a top tech position. The group also runs a graduate-undergraduate mentorship program to help younger women with anything from study methods to applying for jobs.
Jiaqi Liu, president of the organization and a Columbia senior, says the group does a lot of outreach to members of the school’s freshman and sophomore classes before they are required to declare a major. They want women to know that computer science is a field they can flourish in.
At Harvey Mudd, support for its female students has led to outstanding graduation rates, Klawe explains, and gainful employment.
Harvey Mudd sends graduates to companies like Yelp and Microsoft; the former has 18 graduates working there, and the latter has hired 30 over the past four years, Klawe explains.
“It’s not surprising women work well here,” she says.
When I was in middle school, I was an unapologetic dork. I played the saxophone, practiced Spanish with zest, was always picked last for sports teams and couldn’t be bothered with typical “girly” pursuits. As a hobby, I created a newsletter (there were no blogs in those days), deciphering and analyzing lyrics of popular music such as New Edition, Prince and Culture Club. I also had a “band” called Rachel Goes To Epcot Center — I recorded multi-tracks of original songs using a Casio keyboard, a boom box and cassette tapes. This type of eccentricity did not make me popular with the “in crowd,” and I didn’t care.
But in high school, fitting in started to matter more. My self-esteem plummeted as other girls teased me for being different, and I struggled with peer pressure. By the time I was 14, I was nearly failing my biology and geometry courses because I was so consumed with “my outfit and boys,” as my math teacher put it at the time.
I am extremely lucky that my single mother — alarmed at that math teacher’s keen observation — forced me back into study hours and the performing arts. Once again, academics became a priority for me, and I was too focused on performing after school to become distracted. By my senior year, I was engaging with boys by tutoring them all in calculus.
I was almost a casualty of the “leaky pipeline” for girls in STEM. My concepts of femininity and being accepted by my peers temporarily interfered with my self-esteem and my grades. And, as I look at young girls today, I’m afraid that, 30 years later, these threats still exist for young women. We see this through continually decreasing numbers of girls pursuing education in math and science. And cutting budgets for the arts (my saving grace) doesn’t help.
Research shows that parental attitudes can play a role in preventing girls from dropping out of STEM education. Today, I’m a proud dork working in technology: My early newsletters are now my blog, and my recordings have transformed into a podcast. I’m surrounded by amazing smart people — many of whom are also proud dorks — and I founded L’Oreal’s Women In Digital program to celebrate the importance of women working in technology-related fields. And the person I have to thank for my success is my mom.
If you are a parent of a young girl, I urge you to support your daughters who love math, science, technology and the arts and encourage them to stay on their paths. (I am a supporter of what I like to call “STE[A]M,” as the arts helped me get through many tough times). Here are some tips to help them along the way.
Cheer Them On. My mother encouraged me to keep singing, dancing and performing — keeping me out of trouble and involved in my community. She always came to my crazy performances. If you ever meet her, ask her to tell you about my voice recital where I sang a song called “If My Dog Were Green” and the time I joined a Christian performance church group (I’m Jewish.) This encouragement helped me get my self-esteem in check and taught me to be proud of what I loved to do.
Teach Them To Code. Encourage your girls to responsibly express themselves through technology. These skills can provide them with economic viability and a career. My grandfather once told me, “If you learn to type, you will always have a job.” Today, girls need to learn how technology works to build a foundation for their future endeavors.
Encourage Failure. I know that it’s okay to fail, to make mistakes, to not know all of the answers and to doubt your own judgments. My mother instilled this in me. When I didn’t get the part I wanted in a play, she’d always ask me if I did the best I could and that was all that mattered.
Don’t Force A Plan. I see so many young girls under heaps of pressure, and I truly believe that sometimes honing your critical thinking skills is more important than focusing on a career path. I sure didn’t know where I’d be today when I was in college — it took me many years (and mistakes) before I felt like I was “in the right job.” But my professional skills included problem-solving, writing, speaking and math — these skills can be used for any successful career.