vines are 5.5 seconds too long


There’s a problem that’s been bothering me for a while: how do we let users create consumable home video? If you’ve ever tried to use a video editing package, you’ll probably have:
  • given up because the editor was hard to use (have you seen the edits people choose with Vine?…), or
  • given up because the editor couldn’t import your footage, or
  • given up because the editor kept crashing, or
  • spent 10x the length of your final video, getting the clips lined up just right, or
  • had the result be entirely unwatchable to people who don’t know your friend, Fred.

I went to Thailand and, being a geek, took 3 cameras and came back with way too much mediocre footage: ~30 GB, or about 3 hours. Editing it all into anything my friends would actually want to watch (or, fantastically, recommend to someone else) would have taken a long time, and was probably beyond both my skill level and my hardware. My solution was to pick a 0.5-second-long section from each video. I was really quite pleased with the result:

This is actually quite an interesting 4-minute video, as far as holiday videos go. Possibly about 3 minutes too long, but pretty succinct.

The thing that really struck me was that the process of selecting the 0.5-second section from the (sometimes quite long) clip was almost trivial, to the point where an algorithm could make pretty good guesses (there’s a rough sketch of one after this list):

  • the first 0.5 seconds of a clip is a good default
  • ignore long runs of near-identical frames (e.g. when I leave the lens cap on)
  • when not much moves, and then something moves, that’s what’s interesting
  • however, if the view wobbles for 1-2 seconds at the start before going steady, that’s me positioning the camera, and you want the bit after
  • changes in volume can be interesting
  • blurry things generally aren’t interesting
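Here’s a rough sketch of how the video-only heuristics might be scored, assuming OpenCV is available. The function name, thresholds, and weights are mine and entirely untuned, and the volume heuristic is ignored; it just slides a 0.5-second window over the clip and keeps the best-scoring start time, falling back to the start of the clip.

```python
# A rough scorer for picking a 0.5 s "bee" out of a longer clip, using OpenCV.
# Thresholds and weights below are guesses, not tuned values.
import cv2
import numpy as np

def best_half_second(path, window_s=0.5):
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    motion, sharpness = [], []
    prev = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Blur heuristic: variance of the Laplacian (low variance = blurry).
        sharpness.append(cv2.Laplacian(gray, cv2.CV_64F).var())
        # Motion heuristic: mean absolute difference from the previous frame.
        motion.append(0.0 if prev is None else cv2.absdiff(gray, prev).mean())
        prev = gray
    cap.release()

    n = int(round(window_s * fps))
    if len(motion) <= n:
        return 0.0  # clip shorter than the window: default to the start

    best_t, best_score = 0.0, -np.inf
    for i in range(len(motion) - n):
        m = np.mean(motion[i:i + n])
        s = np.mean(sharpness[i:i + n])
        pre = np.mean(motion[max(0, i - n):i]) if i else m
        if m < 0.5:    # nothing happening ("lens cap on"): skip
            continue
        if m > 20.0:   # wild wobble: probably me positioning the camera
            continue
        score = m                    # something is moving
        score += 0.01 * s            # and it isn't blurry
        score += max(0.0, m - pre)   # still-then-movement is the interesting bit
        if score > best_score:
            best_t, best_score = i / fps, score
    # If every window got skipped, best_t is still 0.0: the "first 0.5 s" default.
    return best_t  # seconds into the clip where the bee should start
```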
So then I started considering whether this would be a viable company. Let’s call something like the above video a strobee (despite the fact that the URL was taken long ago), and each individual clip a bee.

So now we imagine a world in which everyone is uploading bees, from their phones, shiny new pairs of Google Glasses (or do you wear a Google Glass?), cameras, etc… We pretty quickly come to the conclusion that strobees can, and should, be assembled on the fly from a large database. That is, we could stream a strobee (an endless video stream) to a user based on (there’s a toy sketch of this after the list):

  • users (user channels)
  • your current location (a stream that changes as you drive down the street!)
  • a particular location
  • most recent in time
  • a certain hashtag (#bobs_wedding, #election2015)
  • popularity (how do we judge popularity – votes? interaction with a strobee?)
  • colours (show me videos that are mostly red)
…there are enough possible use cases to warrant an expansive API.
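As a toy sketch of what “assembled on the fly” might look like, here is one possible shape for a bee record and a filtered, newest-first stream. The Bee fields, the stream function, and the crude distance check are illustrative assumptions, not a design.

```python
# A toy sketch of assembling a strobee on the fly from a pool of bees.
from dataclasses import dataclass, field
from typing import Iterator
import math

@dataclass
class Bee:
    url: str                   # where the 0.5 s clip lives
    uploader: str
    lat: float
    lon: float
    timestamp: float           # unix time
    hashtags: set = field(default_factory=set)
    dominant_colour: str = ""  # e.g. "red"
    popularity: float = 0.0    # votes, plays, whatever we settle on

def stream(bees, uploader=None, near=None, radius_km=None,
           hashtag=None, colour=None) -> Iterator[Bee]:
    """Yield bees newest-first that match every supplied filter."""
    for bee in sorted(bees, key=lambda b: b.timestamp, reverse=True):
        if uploader and bee.uploader != uploader:
            continue
        if hashtag and hashtag not in bee.hashtags:
            continue
        if colour and bee.dominant_colour != colour:
            continue
        if near and radius_km:
            # crude flat-earth distance, good enough for "down the street"
            dlat, dlon = bee.lat - near[0], bee.lon - near[1]
            if math.hypot(dlat, dlon) * 111.0 > radius_km:
                continue
        yield bee

# e.g. everything tagged #bobs_wedding within ~1 km of the venue:
# for bee in stream(all_bees, hashtag="bobs_wedding", near=(51.5, -0.1), radius_km=1):
#     play(bee.url)
```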

There must be a wide range of algorithms we can apply to public video feeds (nicely complying with the “substantiality of the portion” factor in American copyright law) to extract interesting bees. Strobee libraries might come from:
  • webcams
  • movies (imagine a 3rd party service that delivered a bee containing your chosen word from a random movie)
  • old fashioned TV streams
  • satellite images
Revenue seems to be a much simpler sell than for Twitter or Facebook. We can limit ourselves to showing only 0.5 seconds of an advert every so often, targeted to the user, the search, or the location. Given that people will put up with anything for 0.5 seconds, adbees shouldn’t be too much of a disincentive. There’s always the option of paying to remove adverts. Since the adverts are part of the stream, we could let people embed strobees into other websites, or request a stream via an API, without issue.
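Since the adverts are literally part of the stream, splicing them in is nearly a one-liner. A sketch, where the every-n frequency and the pick_advert hook are placeholders:

```python
# Splice adbees into a bee stream: one 0.5 s advert after every n ordinary bees.
def with_adverts(bee_stream, pick_advert, n=20):
    """Yield bees from bee_stream, inserting an advert bee after every n of them."""
    for count, bee in enumerate(bee_stream, start=1):
        yield bee
        if count % n == 0:
            # target the advert off what was just shown: user, search, location...
            yield pick_advert(bee)
```

Because the adverts travel in-band like this, an embedded strobee or an API consumer gets them for free.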
Problems:
  • This blog post basically describes the Vine ecosystem, but with a lower maximum clip length. It would be trivial for Vine to compete. Then again, Twitter competes successfully with email.
  • There’s some deep technical work to do on compressing such short clips: the placement of I-frames in clips this short is problematic (one possible workaround is sketched after this list).
  • Can we compose a stream from such a giant database in real time?
  • How do people give feedback on blink-and-you-miss-it bees?
  • We would want to disseminate everyone’s clips, and show adverts alongside them. Perhaps we don’t show adverts to people who create popular bees? Should we ask for (or default to) a Creative Commons license for all bees?
  • If it were ever popular, people would use it for p0rn. How do we filter such short content? The ever-racist 70%-pink-per-frame criterion?
  • How does someone wearing Google Glass upload a strobee? We could take the 0.5 seconds before someone says “strobee”? (Until it became popular enough that you’d have people shouting “strobee” at you if you wore your Google Glass into town.)
  • A host of privacy/missing context lawsuits are likely…
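On the I-frame problem above, one plausible (and untested) workaround is to re-encode uploads so that every 0.5-second boundary starts on a keyframe, making each bee independently decodable. The ffmpeg options are real, but the paths and encoder settings are just illustrative:

```python
# Re-encode an upload so every 0.5 s boundary starts on a keyframe, so each bee
# can be cut out and decoded on its own. Paths and settings are illustrative.
import subprocess

def reencode_for_bees(src, dst, bee_len=0.5):
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "libx264",
        # force a keyframe whenever t crosses a multiple of bee_len
        "-force_key_frames", f"expr:gte(t,n_forced*{bee_len})",
        "-c:a", "aac",
        dst,
    ], check=True)
```

The obvious cost is bitrate: a keyframe every 0.5 seconds is a lot of I-frames, which is presumably where the “deep technical work” comes in.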