Twitter and Periscope are working on a real-time livestream scanning algorithm

Periscope may soon be able to identify what’s happening in live broadcasts with the help of Twitter’s Cortex.

Cortex describes itself as “a team of engineers, data scientists, and machine learning researchers dedicated to building a unifying representation of all of the users and content on Twitter, to help build a product in which people can easily find new experiences to share and participate in.”

The team first showed off its livestream scanning system to MIT Technology Review, where it scanned and categorized two dozen streams at once. To achieve this, Twitter built a custom computer packed with GPUs, which feeds live video into a deep learning algorithm.

The data gleaned by this scanning algorithm is meant to power in-app search. For instance, someone might tag a large building fire as ‘omg wtf,’ but you’d be searching for something like ‘building fire in Los Angeles.’

If Periscope and Cortex succeed, the scanning system could identify the fire in the stream, and location data would place the broadcaster in the Los Angeles area.
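To make that concrete, here is a minimal sketch of the idea: a stream carries both a free-form user tag and a scene label inferred by the scanning model, and search matches on the inferred label plus location rather than on the tag. The `Stream` class and `search_streams` function are illustrative assumptions, not Periscope’s actual API.

```python
# Hypothetical sketch: pairing an inferred scene label with broadcaster
# location to answer a search query. Names are illustrative only.
from dataclasses import dataclass


@dataclass
class Stream:
    stream_id: str
    user_tag: str     # free-form tag from the broadcaster, e.g. 'omg wtf'
    scene_label: str  # label inferred by the scanning model
    city: str         # derived from the broadcast's location data


def search_streams(streams, query_label, query_city):
    """Match on the inferred scene label and location, ignoring user tags."""
    return [s for s in streams
            if s.scene_label == query_label and s.city == query_city]


streams = [
    Stream("a1", "omg wtf", "building fire", "Los Angeles"),
    Stream("b2", "my lunch", "food", "Los Angeles"),
]
results = search_streams(streams, "building fire", "Los Angeles")
print([s.stream_id for s in results])  # → ['a1']
```

The point of the sketch is that the query never touches `user_tag` at all: the ‘omg wtf’ stream is still findable by what the model saw and where the phone was.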

In a statement, a Twitter spokesperson had the following to say:

Periscope has been working with Twitter’s Cortex team to experiment with ways to categorize and identify content in live broadcasts. The team is focused on pairing that advanced technology with an editorial approach to provide a seamless discovery experience on Periscope.

Discovery is the important aspect here, but the flip side would be filtering content that violates Periscope’s guidelines. If you were streaming a pay-per-view event or displaying NSFW content, scanning could help Periscope shut the broadcast down automatically.
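The enforcement side could amount to a simple rule on top of the classifier’s output: if a banned category is predicted with high enough confidence, flag the stream. The label names, threshold, and function below are assumptions for illustration, not anything Twitter has described.

```python
# Hypothetical sketch: auto-flagging streams whose inferred content
# violates guidelines. Labels and threshold are assumed, not Periscope's.
BANNED_LABELS = {"copyrighted broadcast", "nsfw"}
CONFIDENCE_THRESHOLD = 0.9  # only act on high-confidence predictions


def should_shut_down(predictions):
    """predictions: dict mapping label -> model confidence for one stream."""
    return any(
        label in BANNED_LABELS and score >= CONFIDENCE_THRESHOLD
        for label, score in predictions.items()
    )


print(should_shut_down({"concert": 0.3, "copyrighted broadcast": 0.95}))  # True
print(should_shut_down({"nsfw": 0.5, "talk": 0.6}))                       # False
```

The high threshold reflects the obvious design tension: false positives kill legitimate broadcasts, so a real system would likely route borderline scores to human review rather than shut streams down outright.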

Recently, ‘dark Periscope’ has been in the news, with broadcasts of rape and suicide taking center stage. With a robust scanning algorithm, Periscope could sideline those streams before a large audience saw them.

Most Periscope users adhere to the rules, which is likely why search seems to be the main angle. The project is still very much an internal experiment, but it has pretty big implications. As we edge away from hosted video and into a livestreamed world, discovery remains elusive. Cortex may help get us off square one.
