Mining User-Generated Videos

With the popularity of mobile cameras, a huge amount of user-generated video is produced every day. This big data can be used for various purposes, e.g., surveillance, news, disaster response, and augmented reality. However, one of the biggest issues with user-generated videos is that their content is very often uninteresting or useless, e.g., when a user randomly records video around his/her location. Therefore, there is a need to quickly search for interesting/significant videos for a particular application.

In augmented reality (AR), we are interested in video content that is appealing to AR users. Leveraging the rich spatial metadata of user-generated videos (e.g., camera location, camera direction), these videos can be precisely registered in AR browsers, e.g., Layar, Wikitude, and Junaio. Using such metadata, we can search for sequences of video segments that follow a particular camera shooting pattern, e.g., zooming, tracking, arching, and panning. These shooting patterns are a strong indicator of an interesting video (i.e., the video was recorded with a specific purpose rather than randomly).


We developed efficient algorithms to search for these camera shooting patterns in the spatial metadata of the videos. We tested the algorithms on a user-generated video dataset [1] and found a subset of video segments that are interesting. We verified their interestingness/significance by watching them. We also found that tracking is the most popular way of capturing mobile videos, while arching is the least popular.
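To give a flavor of how shooting patterns can be inferred from spatial metadata, the sketch below classifies a video segment from a sequence of (position, camera heading) samples. This is a minimal illustration, not the algorithms from the paper: the function name, thresholds, and classification rules are all hypothetical assumptions. The intuition it encodes is simple: panning means the camera rotates while staying in place, tracking means it moves while keeping a steady direction, and arching combines movement with rotation (orbiting a target).

```python
import math

def circular_diff(a, b):
    """Smallest signed difference between two headings, in degrees."""
    return (b - a + 180) % 360 - 180

def classify_segment(samples, move_thresh_m=2.0, turn_thresh_deg=20.0):
    """Heuristic shooting-pattern classifier for one video segment.

    Each sample is (x_m, y_m, heading_deg), with the camera position
    already projected to meters. Thresholds are hypothetical values
    chosen for illustration only.
    """
    (x0, y0, _), (x1, y1, _) = samples[0], samples[-1]
    displacement = math.hypot(x1 - x0, y1 - y0)
    # Accumulated camera rotation over the segment.
    total_turn = sum(abs(circular_diff(a[2], b[2]))
                     for a, b in zip(samples, samples[1:]))
    if displacement < move_thresh_m and total_turn >= turn_thresh_deg:
        return "panning"    # camera rotates in place
    if displacement >= move_thresh_m and total_turn < turn_thresh_deg:
        return "tracking"   # camera moves, direction stays fixed
    if displacement >= move_thresh_m and total_turn >= turn_thresh_deg:
        return "arching"    # camera moves while rotating (orbit-like)
    return "static"         # neither moving nor rotating

# Example: a camera spinning in place is classified as panning,
# while one moving straight ahead is classified as tracking.
print(classify_segment([(0, 0, 0), (0, 0, 30), (0, 0, 60), (0, 0, 90)]))
print(classify_segment([(0, 0, 90), (5, 0, 90), (10, 0, 90)]))
```

A real detector would additionally segment the stream before classifying, smooth noisy GPS/compass readings, and handle zooming, which requires visual or focal-length cues rather than position and heading alone.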

This study [2] is a first step toward understanding user-generated videos through their geospatial metadata.

[1] Ying Lu, Hien To, Abdullah Alfarrarjeh, Seon Ho Kim, Yifang Yin, Roger Zimmermann, and Cyrus Shahabi, GeoUGV: User-Generated Mobile Video Dataset with Fine Granularity Spatial Metadata, In the 7th ACM Multimedia Systems Conference (MMSys), Klagenfurt am Worthersee, Austria, May 10-13, 2016

[2] Hien To, Hyerim Park, Seon Ho Kim, and Cyrus Shahabi, Incorporating Geo-Tagged Mobile Videos Into Context-Aware Augmented Reality Applications, The Second IEEE International Conference on Multimedia Big Data (IEEE BigMM 2016), Taipei, Taiwan, April 20-22, 2016

