
How Video’s Dark Data is Coming into the Light

Highlight videos are a perfect fit for our increasingly short attention spans. With new technology, we can now use dark data from longer videos to quickly create highlight clips.

These clips show up everywhere, from our social media feeds to late-night TV. But nowhere are highlight reels more exciting than in sports—and not just because of the action that they capture.

What is Dark Data?

In the past, sports broadcasters were responsible for pulling together these post-action clips, but now they’re getting some digital assistance from dark data. Computers “watch” videos, analyzing unstructured data—data that isn’t easy to organize and process, also called “dark data”—looking for telltale signs of an exciting moment, like the crowd cheering or a player’s victorious fist-pump.
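As a rough sketch of how that "watching" might work, the snippet below scans a per-second crowd-loudness signal for spikes and cuts a clip window around each one. The signal values, threshold, and padding are invented for illustration; this is not how Watson itself is implemented.

```python
# Minimal sketch: flag "exciting" moments by looking for crowd-noise spikes
# in a per-second loudness track, then cut a clip window around each spike.
# The loudness values, threshold, and padding below are illustrative only.

def find_highlight_clips(loudness, threshold=0.8, pad_seconds=5):
    """Return (start, end) second ranges around loudness spikes."""
    clips = []
    for second, level in enumerate(loudness):
        if level >= threshold:
            start = max(0, second - pad_seconds)
            end = min(len(loudness), second + pad_seconds)
            # Merge with the previous clip if the windows overlap.
            if clips and start <= clips[-1][1]:
                clips[-1] = (clips[-1][0], end)
            else:
                clips.append((start, end))
    return clips

# Example: a quiet stretch of play with two bursts of crowd noise.
loudness_track = [0.2] * 30 + [0.9, 0.95, 0.7] + [0.3] * 20 + [0.85] + [0.2] * 10
print(find_highlight_clips(loudness_track))
```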

IBM Watson is one example: the tech has been in the press box at many tennis and golf events, keeping an eye out for exciting content to package into short recap videos. This dark data-driven process is called “highlight clipping.” Through dark data analysis, the tech has the potential to spot things that a human might overlook—like identifying an up-and-coming player—and can also produce these videos up to ten hours faster than humans.

Because of video’s linear format, there was previously no cost- or time-effective way to analyze huge amounts of it. But the rising class of video data-mining solutions offers both fast processing speeds and quick access to the dark data within videos, all of which will make video content easier to organize, process and leverage in new ways.

Unearthing the Dark Data Buried in Video

Videos are cataloged using metadata, or what amounts to basic tags.

In the past, says David Clevinger, senior director of product and strategy at IBM Cloud Video, “you might have the producer’s name, you might have a short analysis of what the video is about, who’s in it, and that’s it.”

Now, rather than a human assigning these tags, computers can process videos frame by frame, analyzing gestures, transcribing spoken language and using facial recognition to shine a light on this otherwise “dark” data. This generates exponentially more metadata, or “tags,” allowing for more precise organization—and therefore searchability—of the videos.
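To make the shift from a handful of hand-written tags to machine-generated metadata concrete, here is a minimal, hypothetical sketch of how per-frame detections (recognized faces, gestures, transcript snippets) could be rolled up into searchable tags on a video record. The detections are hard-coded stand-ins for what vision and speech models would emit, and none of the names reflect IBM’s actual tooling.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class FrameDetection:
    """One frame's worth of machine-generated observations (illustrative)."""
    timestamp: float     # seconds into the video
    faces: list          # names returned by a face-recognition model
    gestures: list       # e.g. "fist_pump", detected by a vision model
    transcript: str = "" # words spoken around this frame

@dataclass
class VideoRecord:
    title: str
    producer: str
    tags: Counter = field(default_factory=Counter)

def index_video(record, detections):
    """Roll per-frame detections up into coarse, searchable metadata tags."""
    for d in detections:
        record.tags.update(d.faces)
        record.tags.update(d.gestures)
        record.tags.update(word.lower() for word in d.transcript.split())
    return record

# Illustrative frame detections from a tennis broadcast.
frames = [
    FrameDetection(12.0, ["Player A"], ["fist_pump"], "what a rally"),
    FrameDetection(13.0, ["Player A"], [], "the crowd is on its feet"),
]
video = index_video(VideoRecord("Semifinal, Court 1", "Jane Doe"), frames)
print(video.tags.most_common(5))
```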

Clevinger estimates that, over the last five years, the metadata available for long-form videos has increased by 5,000 percent with the help of emerging technologies like Watson.

Putting Video Dark Data to Work for Fans

One media company uses its dark data video algorithm to offer personalized highlights to sports fans. Through its network of APIs, the app gathers video data and sports analytics about what’s going on in each game. The heightened ability to gather video’s dark data allows it to process the pace of a game, the momentum of one team over another, and even the novelty of what’s happening in the game.

The algorithm then serves up a curated package of highlights to users based on their preferences. For example, if a user follows a particular player or group of players, the app will curate every clip worth watching—even if they weren’t featured as part of the broadcast or during post-broadcast commentary.
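A simplified, hypothetical version of that curation step might score each candidate clip on game context (pace, momentum swing, novelty) and then keep the clips featuring players a given fan follows. The weights and field names below are invented for illustration and are not drawn from any vendor’s actual algorithm.

```python
def score_clip(clip, weights=(0.4, 0.3, 0.3)):
    """Combine game-context signals into one excitement score (illustrative weights)."""
    w_pace, w_momentum, w_novelty = weights
    return (w_pace * clip["pace"]
            + w_momentum * clip["momentum_swing"]
            + w_novelty * clip["novelty"])

def curate_highlights(clips, followed_players, min_score=0.5):
    """Return clips featuring followed players, best-scoring first."""
    personal = [c for c in clips if set(c["players"]) & followed_players]
    personal = [c for c in personal if score_clip(c) >= min_score]
    return sorted(personal, key=score_clip, reverse=True)

clips = [
    {"players": ["Player A"], "pace": 0.9, "momentum_swing": 0.8, "novelty": 0.6},
    {"players": ["Player B"], "pace": 0.4, "momentum_swing": 0.2, "novelty": 0.1},
]
print(curate_highlights(clips, followed_players={"Player A"}))
```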

Brad Adgate, a media consultant, says that these automated highlight-clipping solutions could fill a void in sports programming. Between plays or during timeouts, for example, the tech can quickly pull clips or sports analytics in real-time.

“I think people would rather see those exciting moments than watch some talking heads analyzing what’s happening,” says Adgate.


Driving Relevant Advertising with Dark Data

The “lulls” that Adgate refers to are typically filled by ads (if not the “talking heads”). But the ads, too, are due for an upgrade with the help of dark data-mining technology.

One question being asked: are the right advertisements always served at the right moment? While US advertisers spent over $30 billion on programmatic advertising in 2017, survey data suggests a majority of brand advertisers still find it challenging due to a lack of transparency and poor data quality. With digital video ad spending expected to grow by over 15 percent to $13.2 billion, the pressure to deliver on these data promises will only increase.

These video dark data crawling and clipping tools offer a solution. In addition to processing video footage to include in a highlight reel, AI can be trained to understand the right time in a video to trigger an ad.
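One hedged way to picture “the right time to trigger an ad” is to scan the same kind of excitement signal for sustained lulls and propose those windows as ad slots, as in the sketch below. The lull threshold and minimum duration are placeholders, not parameters from any real system.

```python
def find_ad_slots(excitement, lull_threshold=0.3, min_lull_seconds=20):
    """Propose (start, end) windows where on-screen action stays below a lull threshold."""
    slots, start = [], None
    for second, level in enumerate(excitement):
        if level < lull_threshold:
            if start is None:
                start = second
        else:
            if start is not None and second - start >= min_lull_seconds:
                slots.append((start, second))
            start = None
    # Close out a lull that runs to the end of the signal.
    if start is not None and len(excitement) - start >= min_lull_seconds:
        slots.append((start, len(excitement)))
    return slots

# Example: an exciting stretch, a timeout-length lull, then more action.
signal = [0.8] * 10 + [0.1] * 25 + [0.7] * 10
print(find_ad_slots(signal))  # one proposed slot covering the 25-second lull
```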

The industry at large is already moving toward this model, known as programmatic advertising. Data and machine learning are used to more accurately target ads to consumers to get the most out of ad dollars—while protecting brand image.

James G. Brooks, CEO of video advertising firm GlassView, says the excitement of seeing a team score a goal, “provides quantifiable brand uplift, such as brand recall, brand favorability, and even purchase intent.”


Dark Data Integrates for a Better Fan Experience

In the future, Clevinger says, there’s even a market for retroactively inserting ads into highlight reels. For example, if the center fielder makes a winning catch, advertisers might want to place their logo on the outfield wall.

“That moment in time, for that logo, is worth a lot more, because it’s going to be seen everywhere,” adds Clevinger. “Such analysis could result in replacing logos in real time to the highest bidder.”
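Clevinger’s highest-bidder scenario could be sketched as a tiny auction step: for each highlight clip, the logo slot goes to whichever advertiser bid the most for that placement. The slot names and bid figures below are entirely hypothetical.

```python
def assign_logo_slots(clips, bids):
    """Map each highlight clip's logo slot to its highest bidder (illustrative auction)."""
    winners = {}
    for clip in clips:
        slot_bids = bids.get(clip["slot"], {})
        if slot_bids:
            winners[clip["id"]] = max(slot_bids, key=slot_bids.get)
    return winners

clips = [{"id": "catch_outfield", "slot": "outfield_wall"}]
bids = {"outfield_wall": {"Brand X": 12_000, "Brand Y": 18_500}}
print(assign_logo_slots(clips, bids))  # {'catch_outfield': 'Brand Y'}
```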

These artificial intelligence, dark data and automation tools are proving critical for sports broadcasting, which faces mounting challenges thanks to the rapidly evolving preferences of its audience. Industry leaders express a myriad of concerns, covering everything from keeping fans engaged to attracting younger audiences.

By using dark data and AI tools to better understand and leverage their video assets, broadcasters will be able to serve relevant content more quickly and accurately—from highlight reels to sports analytics to ads. This will help keep viewers engaged—and help advertisers rest assured that they’re getting the most bang for their (millions of) bucks.


This content is produced by WIRED Brand Lab in collaboration with Western Digital Corporation.
