Film and technology have always been closely linked, but these days they’re beginning to influence each other in ways we can’t even foresee. Take Kristen Stewart’s first directorial effort, Come Swim, for instance, which will premiere at the Sundance Film Festival: Stewart, her producer, and an engineer teamed up to use her movie as the basis for a research paper about artificial intelligence. Why not?

The paper, which you can read here (it’s short!), describes the process of “Neural Style Transfer,” which means using a machine’s learning capabilities to translate one image into a different artistic or visual style. Stewart based her movie on a painting she made herself, and she, Adobe engineer Bhautik J. Joshi, and Starlight Studios producer David Shapiro fed scenes from Come Swim into the program and told it to re-render them in the impressionistic style of the original painting.
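For the curious: the usual trick in neural style transfer is a “style loss” that compares Gram matrices of a network’s feature maps, which capture which visual features tend to appear together in an image. Here’s a minimal NumPy sketch of that idea (the function names and the random stand-in feature maps are our illustration, not code from the paper):

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, height, width) activations from a conv layer
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    # The Gram matrix records how strongly channels co-activate -- the "style"
    return flat @ flat.T / (c * h * w)

def style_loss(features_a, features_b):
    # Mean squared difference between the two images' Gram matrices
    ga = gram_matrix(features_a)
    gb = gram_matrix(features_b)
    return float(np.mean((ga - gb) ** 2))

rng = np.random.default_rng(0)
frame = rng.standard_normal((8, 16, 16))     # stand-in for a movie frame's features
painting = rng.standard_normal((8, 16, 16))  # stand-in for the painting's features

print(style_loss(frame, frame))     # identical images -> zero style loss
print(style_loss(frame, painting))  # different images -> positive loss
```

In a full system, an optimizer nudges the movie frame’s pixels to shrink this loss against the painting, which is roughly how the frame ends up “repainted” in the painting’s style.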

This kind of neural network experiment has been done before, and I imagine it’s a fun way for scientists to play around with computers’ learning capabilities. We’ve seen the nightmarish visions of many-legged dogs that Google’s Deep Dream AI came up with when it was taught to find specific images, and we also saw an AI recreate Blade Runner (naturally) from its memories and impressions. If you’re interested, there are apps out there you can download and feed images into, recreating a photograph in the style of a specific painting.

It’s a totally weird and cool area of research, and it gives scientists an impression of what artificial intelligences can “see” and focus on, so that they can build better robots when the time comes for the Singularity to end society as we know it. Hopefully Kristen Stewart and her army of androids will be there to help us along the way.
