Campus Conversation: Ryan Cunningham



Computer scientist RYAN CUNNINGHAM has worked for the National Security Agency (as a hacker), for Dow AgroSciences (applying artificial intelligence to genetics research) and as a consultant for defense attorneys (explaining digital evidence to their clients).

Cunningham now teaches ethical decision-making at the University of Illinois. He sat down with staff writer Julie Wurth to talk about the ethics behind the tech revolution, free speech on social media, whether privacy is still possible and the risks of teaching computers to make decisions for us.

Here's a sampling:

Do the people who design software programs and new devices consider the human impact and potential pitfalls?

I think largely, until somewhat recently, our field has sort of neglected this. ... As a whole, we have sort of been, I would say, too excited about the flashy gizmos and viewed people who were asking about these sorts of privacy concerns as people who were being, you know, "That'll sort itself out, don't worry about that, you're just being old-fashioned, you're behind the times." ...

Science and technology people tend to sort of frown on humanities, things that are a matter of opinion. They don't like to consider the fact that sometimes logic isn't everything. Emotions, feelings matter, and the impact that you're going to have on other people is very important, and you need to put some thought into that.

What did you think of Facebook's decision to ban white nationalism and white separatism after the mass shooting at two mosques in New Zealand?

I haven't seen exactly what their policies are going to be. That's really where the rubber meets the road. I think it's probably a good thing. We don't do things like allow ISIS members on social media who are celebrating in the wake of a terrorist attack. I think we should probably have that same standard regardless of the politics of who's speaking. ...

By the same token, that is my opinion. I view it as my mission in the course not to teach students this is right and this is wrong. I want to teach them ethical decision-making.

I think we're starting to as an industry sort of wake up here and realize, "Oh, hey, we have some responsibility here and we were being sort of dismissive." So I will give Facebook, Google, Apple, all of the tech giants some credit here that they are recognizing these controversies. ...

Oddly, the biggest problem in privacy is not necessarily the tech giants. I think they're kind of a symptom of the problem. I think most people are not aware that the sort of buckets all this data is going into are these big data-broker companies, like Experian and Equifax, but there are also other companies like them, such as Acxiom, that are hoovering up a lot of this data about us and selling it. ...

Silicon Valley is the place where it's present in our minds that our data is being collected. But there is a bigger industry on the back end profiting off that data and giving it value, and I think we need to do some more thinking about regulating those companies.

We need to listen to the public. ... If we're going to find out what privacy means in the digital age, the thing that we need to do is have a conversation about it, and that conversation cannot be Silicon Valley dictating what privacy means now. It needs to be that we in the computing field listen to the public, and what the public wants and expects, and we try and meet that with the technology that we have. That's the future we need to get to.

Reporter/Columnist

Julie Wurth is a reporter covering the University of Illinois at The News-Gazette. Her email is jwurth@news-gazette.com, and you can follow her on Twitter (@jawurth).