The New Work Summit, convened earlier this week by The New York Times, featured panel discussions about the opportunities and risks that are emerging as the use of artificial intelligence accelerates across industries. Here are some excerpts. They have been edited and condensed.
Can Technology Save the World (Before It Destroys It)?
Sam Altman
Co-founder and chairman of OpenAI; president of Y Combinator
One of the things that surprises me most about sort of the criticism of the tech industry right now is a belief that tech should be the one to decide what you can and can’t say and how these algorithms work. And that, to me, sounds very bad. I think I’m totally cool with, for example, restrictions on free speech where it’s hurting people. And I think we’ve always had, you know, I can’t yell “Fire!” in this room. So there’s always been free speech with an asterisk.
But the idea that these companies who are not accountable to us or elected by us should get to decide sort of the new safeguards of society, that seems like the wrong way to do it. And I think we should let our — flawed as they may be — democratically elected and enforced institutions update the rules for the world. The world has changed a lot. Tech has changed the world a lot in a very short time. And it’s going to change it much more.
In the Picture
Evan Spiegel
Co-founder and chief executive, Snap Inc.
For example, just like a newspaper page, there really wasn’t a personalized version of the internet 20 years ago. There was just one page that everyone got — the same Yahoo home page, or something like that. And so this idea that the internet could be personalized by your friends was a huge breakthrough.
But this also came with some side effects. Because the content that is distributed by your friends from all over the internet, and that’s voted on by your friends in terms of likes and comments, sometimes that means that — because we’re human beings and we click on things that are outrageous or offensive sometimes — things that are negative actually spread faster and further than things that are positive.
And so we very quickly saw media start to change to fit this new distribution mechanism that was based on what your friends were sharing. So I think if we look at the evolution of social media, now people are thinking through sort of the ramifications of a lot of what’s happened as a result of that new way of distributing content. But I certainly think there’s a lot of opportunity to sort of course-correct here.
Full Speed Ahead
John Donovan
Chief executive, AT&T Communications
We paused for a period of time before we went into deployment on robotics, process automation, machine learning, to step back and build the Ten Commandments. And we came back with, I think, a view that was a human-centric set of policies.
And that is that everything in our business, every outcome, is owned by a human being. No one can say an algorithm did it or a machine did it. So everything that is launched algorithmically or robotically is owned by a human being. A machine can’t control a machine without a process that’s involved.
Algorithms are like children. If you change jobs, you have to turn that algorithm over to the new person. Everything that’s put in has to have a rewind button. We have a Ph.D. in psychology who makes sure that they have a red button they can press when they feel like manipulation is happening in any way, shape, or form.
So we put a structure and a process in place to make sure that what we do programmatically, what we do algorithmically, and what we do with a machine doesn’t change the fundamentals of our accountability within the business.
Sebastian Thrun
Chief executive, Kitty Hawk
I look at A.I. as a tool, very much like a shovel or a kitchen knife. And when it comes to ethics, I think there’s ethical ways to use a kitchen knife, and there’s unethical ways to use a kitchen knife, and they’ve been around forever.
What, really, is A.I.? I think that’s what people are somewhat divided on. I think it’s something very, very simple. First of all, we talk about A.I., we talk about machine learning; we don’t talk about real intelligence. And machine learning is an innovation in computers.
Computers are dumb. To make computers do the right thing in the past, someone had to write down just an elaborate kitchen recipe of every possible step that the computer should do. And the computer would blindly follow those rules. The innovation in machine learning is that now, instead of giving a computer these rules, you can give it examples.
And the computer derives its own rules from those examples. So computer programming has become easier as a result. Children can now program computers by just giving examples. The implications, in my opinion, are groundbreaking for society.
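To make that contrast concrete, here is a minimal sketch in Python. It is purely illustrative and not drawn from any panelist’s actual systems: the first function encodes a hand-written rule, while the second derives a very crude rule from labeled examples. The spam-filter framing, the function names and the keyword heuristic are all assumptions made for this example.

```python
# A toy contrast between hand-written rules and rules learned from examples.
# Purely illustrative; the spam-filter framing and all names are assumptions.

# The "old" way Thrun describes: a programmer writes the rule down explicitly.
def is_spam_rule_based(subject: str) -> bool:
    # The knowledge lives in the code itself.
    return "free money" in subject.lower() or "winner" in subject.lower()

# The machine-learning way: supply labeled examples and let the program
# derive its own (very crude) rule from them.
def train_keyword_classifier(examples):
    spam_words, ham_words = set(), set()
    for subject, is_spam in examples:
        words = set(subject.lower().split())
        (spam_words if is_spam else ham_words).update(words)
    # Keep only words seen in spam examples but never in non-spam ones.
    learned_keywords = spam_words - ham_words

    def classify(subject: str) -> bool:
        return bool(learned_keywords & set(subject.lower().split()))

    return classify

if __name__ == "__main__":
    examples = [
        ("Claim your free money now", True),
        ("You are a lottery winner", True),
        ("Lunch meeting moved to noon", False),
        ("Can you review this document", False),
    ]
    classify = train_keyword_classifier(examples)
    print(classify("free money inside"))  # True -- the rule was learned, not written
    print(classify("see you at lunch"))   # False
```

The point of the sketch is Thrun’s: in the second case the behavior comes from the examples supplied, not from any rule the programmer typed in.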
Kent Walker
Senior vice president, global affairs, and chief legal officer, Google
The tech sector has always had lots of different regulations, from privacy, to copyright, to safe harbors and the like. We’re fans of regulation when it’s smart regulation.
And what do I mean by that? So, regulation that starts out with a really crisp definition of the problem you’re trying to solve. That is then narrowly tailored to solve that problem and minimize blowback and side effects. And then third, a thoughtful analysis of what the second- and third-order implications of those rules might be.
When you apply that to artificial intelligence, I think it’s most likely that we will start with the applications. It’s rare that you regulate basic research, because there’s so much risk of deterring innovation that could be really incredibly productive along the lines that Sebastian was talking about.
But as you get into applications — whether it’s financial services, or health care, or agricultural productivity — we already have a thoughtful series of laws and rules around that, and it would be natural to expand those into the issues that are raised by A.I.
Cutting Through the Hype
Deep Nishar
Senior managing partner, SoftBank Vision Fund
So, you start at a national level and say, “What should the new-age curriculum look like?” I started my work career in the heartland of America. It used to be an apprenticeship business. You worked in factories, you got trained by other people who were good at it. Then you got paid more, and you became good at it, and you trained other people behind you.
Well, then, companies should be training us to do similar things, and retraining us. My father’s generation, you had one profession, one job. Our generation, you have perhaps one or two professions, multiple jobs. Our children’s generation is going to have multiple professions, multiple jobs. That’s the reality. You know, we can keep fighting that reality, but it’s here to stay.
Adena Friedman
President and chief executive, Nasdaq
We always hired people into skilled positions that they had mastery in, whether it was market structure, or computer science, or whatever it was. And we didn’t really have a whole program of developing talent from within the organization.
Today, that’s a huge part of what we’re doing. So, we have a lot of university programs. We have a whole internship program. We do all sorts of really fun projects over the summer where people have tangible work product at the end of it that we actually then deploy inside our shop.
And it’s a very different way of bringing in, I would say, deep intelligence, and intellectual curiosity, and energy, and a new way of thinking and working that you can then kind of integrate into the organization. And they develop their mastery by being a part of who we are. And we teach them market structure. I mean, a smart person can learn market structure. But market structure is something they can then take through their entire career.
Leading in a Reshaped World
Dov Seidman
Founder and chief executive, LRN
The world has gone from interconnected to interdependent. We’ve never risen and fallen together the way we do today. The behaviors, plights, hopes, aspirations and actions of any one person are experienced viscerally and visually and directly in tiny screens in our hands.
It’s not just migration patterns: With one swipe, strangers are coming into intimate proximity on an accelerated basis, and into our homes, and into our cars. We are not just X-raying institutions; we are looking deep into their innermost workings and into the mind-sets and character of those in charge with M.R.I. vision. And we don’t always like what we see, and we’re amplifying it on social media for everybody else to spread, for recrimination.

And then A.I. These unprecedented forces are asking us not just how products are designed and how they’re being used; they’re asking us to confront the most fundamental moral question of our time: What does it mean to be human in the age of intelligent machines, when we no longer share the planet only with mammals inferior to us, when we no longer have a monopoly on intelligence? What makes us special and unique?
That is a moral question. I submit to you that the business of business is no longer just business. The business of business is now society. Taking responsibility in this context is inescapable, and how you take responsibility is now a new source of competitive advantage. I believe that we are in a profound moral moment.
Making the Grade
Cindy Mi
Founder and chief executive, VIPKid
In 2013, I founded VIPKid with the simple idea that every child deserves a global classroom, with global teachers and global content, delivered through personalized online learning. So every day, we’re not literally shuttling our teachers — we’re teleporting them to our global classrooms, to their students in China and beyond.
VIPKid today has provided new work opportunities for over 60,000 North American teachers, roughly the capacity of a football stadium. This is really important, because our teachers come from all 50 states in the United States and from all provinces and territories in Canada.
From the Valley to the Beltway
Ashton B. Carter
Director, Belfer Center for Science and International Affairs, Harvard Kennedy School; 25th secretary of defense
We have seen cases where technology is leading us to a dark place. And even as that can happen in the Department of Defense, it can happen elsewhere. And in fact, it’s happening, in my judgment, much more prevalently out there when it has to do with the rest of people’s lives: their quality of life, their privacy, and the way they are led to behave with respect to their fellow citizens. That’s sadly rampant.

That’s one of the themes of this conference. I feel, in my new life, that is my crusade. Technology and public purpose is the issue of our time. Whether it’s digital, or biotech, or jobs and training, we can’t have a good and cohesive society if we don’t get on top of that. So, of course I worry about it.
Alexa, What’s in Your Future?
Dave Limp
Senior vice president, Amazon Devices and Services
I grew up in a world where I always had television and my parents didn’t. So, television’s always been my new normal; it was learned behavior for them.
I have children, and you can imagine our house uses Alexa in a lot of ways. And when they go to their friend’s house for a sleepover or for a visit and they say, “Alexa, turn on the lights,” and it doesn’t work, they’re stunned. Because it’s their new normal; they expect that to work. Works in every room of our house.
But it’s not one demographic that it’s taking off with. You know, certainly kids love it, but I would say that at least once a week, often every day, I get an email from somebody saying that Alexa and Echo have changed their lives. Because they might not have all their mobility. They might be a little elderly.
And they find companionship with Alexa. They find the ability to interact in ways they couldn’t before.
Controlled Chaos
Reid Hoffman
Co-founder and executive chairman, LinkedIn; partner, Greylock Partners
I do think that over time, we will see a kind of transformation of industry — an enormous change in where the bulk of people find jobs and employment. The first worry is, what will that transition look like?
Even though the move from the agricultural to the industrial age unlocked the productivity that created the middle class, that created all this prosperity, education, learning and science — and I think we have similar potential now — that intervening transition is super painful. And we don’t want as much suffering as we saw in the transition to the industrial age.
The Macro Impact of A.I.
Lawrence H. Summers
Charles W. Eliot University professor and president emeritus, Harvard University; 71st secretary of the Treasury
Automation and technology have been affecting our economy forever. People talk about this like it’s something that’s coming in the future. In 1969, between 4 and 5 percent of American men between the ages of 25 and 54 were not working. Today, about 14 percent of American men between the ages of 25 and 54 are not working — almost three times as many, even though we’re at a business cycle peak.
There are many reasons for that, but an important part of it is that technology has been able to substitute for the kinds of work that many of them do, making the rewards to them for working much less, and causing many more people to choose not to work.
Facing the Future
Brad Smith
President, Microsoft
One thing we have said is that we will provide our technology to democratically elected governments that broadly focus on the protection of human rights.
We will do that because we want these governments to have access to technology. We believe they need to have access to technology. But we also recognize that there are important ethical issues that will arise, and we will use our voice as a corporate citizen to not just actively — but proactively — engage on the issues of the day.
But while we don’t think that technology companies are rightly in the business of regulating democratically elected governments, we do think democratically elected governments should be in the business of regulating technology companies — not the other way around.