Editor’s note: This is the latest in an exclusive UpTech series about Artificial Intelligence, Machine Learning and much more as part of a partnership between YourLocalStudio.com and WRAL TechWire. Previous posts can be found by searching “UpTech” at WRALTechWire.com. Interviews are conducted by Alexander Ferguson, CEO of YourLocalStudio.com.

Did you know that Artificial Intelligence can discover and exploit personal and private information about you from playing a video game?

Dr. Chris Hazard, CEO of Hazardous Software, earned his PhD in computer science from NC State, focusing on artificial intelligence for trust and reputation. He has worked, and been published, in a variety of fields: wireless network infrastructure as a software architect at Motorola, psychology as a post-doc at NCSU, hypnosis with the National Guild of Hypnotists, robotics at Kiva Systems, and privacy law with the Future of Privacy Forum.


In the second part of our UpTech interview, he describes some shocking ways that very personal and private information could be discovered and exploited—all from just playing a video game.

The interview

Consider undervaluing or overvaluing positive utility events. What that means is that when there’s a positive outcome in a game, you react to it: “Oh, I got this reward. I got this treasure. That’s really awesome, I really value that.” Or, “If I lose this thing, or if I only gain this one coin, it’s not that much.”

It turns out, according to a two-month study, that depressed people value positive utility events more accurately than non-depressed people. So, think about that for a second.
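
To make that idea concrete, here is a minimal sketch of how a game might quantify how accurately a player values positive utility events. Everything here is hypothetical: the telemetry fields, the numbers, and the idea of using a simple correlation between a reward’s in-game value and the effort a player spends chasing it are all invented for illustration, not drawn from any real game or from the study Dr. Hazard mentions.

```python
# Hypothetical sketch: estimating how a player values positive utility
# events from game telemetry. All event names, fields, and numbers are
# invented for illustration; no real game or study data is used.

from statistics import correlation  # Python 3.10+

# Each record: (objective_reward, observed_effort), where objective_reward
# is the in-game value of a prize and observed_effort is how hard the
# player worked to get it (time spent, detours taken, etc.).
telemetry = [
    (1, 0.2), (5, 0.9), (2, 0.5), (10, 2.0), (3, 0.4), (8, 1.6),
]

rewards = [reward for reward, _ in telemetry]
efforts = [effort for _, effort in telemetry]

# A player whose effort closely tracks reward is valuing positive utility
# events "accurately"; a weak correlation suggests over- or under-valuing.
accuracy_score = correlation(rewards, efforts)
print(f"reward-valuation accuracy: {accuracy_score:.2f}")
```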

Imagine that you wrote a game and installed it on a bunch of people’s phones. They played it a fair amount, but it wasn’t that successful. Maybe a couple tens of thousands of people played the game, and you didn’t make that much money.

Then all of a sudden a company that’s slurping up all this game data says, “Hey, I’ll buy your game for $10,000.” And you say, “Okay, sure, that’s fine.” Now this company applies a whole bunch of machine learning techniques and extracts that information.

It can now determine sensitive information: whether you’re depressed, whether you’re dieting, all these sorts of things that you didn’t think were exposed in your data.
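
As a rough illustration of that extraction step, here is a minimal sketch of pointing an off-the-shelf classifier at per-player behavioral features to predict a sensitive label. The data is entirely synthetic and the feature names are invented; this is a sketch of the general technique, not Dr. Hazard’s or any real company’s pipeline.

```python
# Hypothetical sketch: once per-player behavioral features (like the
# valuation score above) are aggregated, ordinary machine learning can
# be pointed at a sensitive label. All data here is synthetic.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_players = 1000

# Invented features: [valuation_accuracy, session_length, risk_taking]
X = rng.normal(size=(n_players, 3))

# Synthetic sensitive label loosely correlated with the first feature,
# mimicking the kind of behavioral signal the interview describes.
y = (X[:, 0] + 0.5 * rng.normal(size=n_players) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```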

But when the data is aggregated in just the right way, even if you apply differential privacy or other privacy techniques to some parts of it, there’s always this sort of information leak.

And good models—good A.I.—can tease that out.
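
Here is a minimal sketch of that last point about partial protection. It uses the Laplace mechanism, a standard differential-privacy technique, to privatize one released aggregate, while a correlated field that was left unprotected still carries the sensitive signal. The numbers and the “proxy” field are synthetic assumptions made for illustration.

```python
# Hypothetical sketch: differential privacy applied to only part of the
# data. The released count is privatized, but an unprotected correlated
# field still leaks the secret. All numbers are synthetic.

import numpy as np

rng = np.random.default_rng(1)
secret = rng.integers(0, 2, size=1000)           # sensitive bit per player
proxy = secret + rng.normal(0, 0.3, size=1000)   # correlated, unprotected field

# Privatize the released count of secret=1 players with the Laplace
# mechanism (sensitivity 1, epsilon = 0.5).
epsilon = 0.5
noisy_count = secret.sum() + rng.laplace(scale=1 / epsilon)
print(f"privatized count: {noisy_count:.1f}")

# But the unprotected proxy still predicts the secret almost perfectly.
guess = (proxy > 0.5).astype(int)
print(f"recovery rate from proxy: {(guess == secret).mean():.2%}")
```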