I'm excited to announce that the upcoming 2.5 release of the Silverlight Media Framework (also known as the Media Platform Player Framework) will include support for playback of stereoscopic 3D video content. This powerful new functionality will allow developers to easily integrate 3D video content into their player experiences, and will allow 3D mode to be enabled, disabled, and switched dynamically based on an end-user's preferences.
Content providers can now provide a single encode of a stream, which the player can then dynamically display in a range of different 3D modes or standard 2D mode based on the user's preferences (and hardware/3D glasses availability). This dynamism provides a powerful tool for online broadcast of 3D content in that it does not leave any users out in the cold, unable to consume the content. You can now reach your entire user base, with 3D as an optional feature, with only a single video source!
Here's a sneak preview of what will be included, as well as some background on how it all works:
- Anaglyph Stereoscopic 3D video plugin, packaged with the SMF 2.5 source code
  - Support for Anaglyph 3D, Grayscale 3D, and Left eye only modes
- Active Shutter Stereoscopic 3D plugin available for separate download
- Support for 3rd party Stereoscopic 3D plugins
  - Tools to build your own Stereoscopic 3D implementation
- Simple implementation of 3D support for consuming applications:
  - A new "S3DProperties" property has been added to the PlaylistItem class
  - Simply set up the appropriate S3DProperties, add a reference to the S3D plugin you'd like to use, and 3D will work automatically!
- Stereoscopic 3D supported for:
  - Progressive Download (WMV, MP4)
  - Smooth Streaming
- Four new sample applications will be included in the Samples project to demonstrate how to use the Anaglyph 3D plugin:
  - Simple Anaglyph 3D example
  - Multi-mode Anaglyph 3D example (toggling 3D on and off, changing modes)
  - 3D support via HTML playlists
- Full documentation will be available on http://smf.codeplex.com when SMF 2.5 is released
Building Your Own S3D SMF-based Video Player
The documentation to be released with SMF 2.5 will include full instructions on how to include 3D video content in your player, complete with sample applications. For a peek at how easy the setup will be, here's a quick how-to video: http://ecn.channel9.msdn.com/o9/content/smf/howto/v2/videos/smf3d.wmv
The code generated during this video will also be available on Codeplex on the release date.
Writing your own S3D Plugin
The SMF 2.5 documentation will also include details on how to build a Stereoscopic 3D plugin of your own that will integrate with the new I3DPlugin interface.
How Does Stereoscopic 3D Work?
Three of the most widely used 3D video display technologies are passive complementary color anaglyph (just referred to as "anaglyph" in this post), passive polarized, and active liquid crystal shutter, each with pros and cons. Let's take a quick look at all three, and discuss how they work in the Silverlight Media Framework:
The anaglyph approach to 3D uses two complementary colors as a filter for each eye. Most often, red and cyan are the colors used, and these are the colors currently supported by the SMF's Anaglyph S3D plugin. An anaglyph image (below) is displayed such that the left eye will view only red colors in the spectrum, and the right eye will view only blue and green colors (which combine to make cyan) to create the 3D effect.
The SMF's anaglyph plugin creates an anaglyph image by taking a base video source with the left eye and right eye images side by side or top and bottom, and uses Silverlight pixel shaders to apply color filters to the images and superimpose them into a single image. We based the red-cyan anaglyph formula on Peter Wimmer's anaglyph formula, using the ITU BT.709 color matrix coefficients instead of the ITU BT.601 coefficients used by Peter.
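As a rough illustration of the idea (not the actual SMF shader code, which runs on the GPU and applies Wimmer's full color matrices), here is how a single red-cyan anaglyph pixel can be composed from a left-eye/right-eye pixel pair, with the grayscale variant using the ITU BT.709 luma coefficients mentioned above:

```python
# Illustrative sketch only: compose one red-cyan anaglyph pixel from a
# left/right RGB pixel pair. Channel values are floats in [0, 1].

BT709 = (0.2126, 0.7152, 0.0722)  # ITU BT.709 R, G, B luma weights

def luma(px, coeffs=BT709):
    """Convert an (r, g, b) pixel to a single luma value."""
    r, g, b = px
    return coeffs[0] * r + coeffs[1] * g + coeffs[2] * b

def anaglyph(left, right, grayscale=False):
    """Red channel from the left eye; green/blue (cyan) from the right.

    In grayscale mode, each eye is reduced to luma first, so no color
    information is lost to the filters (a colorless image has none).
    """
    if grayscale:
        yl, yr = luma(left), luma(right)
        return (yl, yr, yr)
    # Simple color anaglyph: keep the left eye's red, the right eye's
    # green and blue.
    return (left[0], right[1], right[2])
```

Viewed through red-cyan glasses, the left eye sees only the red channel (the left image) and the right eye sees only the green/blue channels (the right image).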
The major advantages of anaglyph display:
- No special hardware is required to display the image
- The glasses are very cheap (paper glasses can be purchased for less than 50 cents online)
- The technology has very wide reach, which makes it the most common 3D display technology for images on the internet.
The major disadvantages to anaglyph display are around the quality of the image being viewed:
- The left and right images are displayed superimposed on top of each other, so imperfect color filtering produces visible "ghosting" of each eye's image.
- Due to the color filters, color resolution is lost. One technique to counter the color loss is to display a grayscale or monochrome anaglyph image: a colorless image suffers no color loss. The anaglyph plugin in the SMF supports a grayscale option – the pixel shaders have a modified algorithm that will turn a video stream with color images into a monochrome image before applying the red-cyan color filter.
You probably experienced polarized 3D the last time you saw a 3D movie in a theater. Linear polarization (as used by IMAX) and circular polarization (as used by RealD) work in a similar manner: the left and right eye images are superimposed through polarizing filters. In linear polarization, the polarized lenses in the glasses are oriented 90 degrees offset from each other (one vertically, one horizontally); in circular polarization, one of the lenses is mounted in reverse (the advantage being that you can tilt your head and still maintain left/right separation).
The Silverlight Media Framework does not currently have support for polarized 3D, since most users don't have interlaced polarized displays (a special type of LCD computer monitor). However, 3rd party developers are free to create a 3D plugin for interlaced polarized displays for the SMF. The approach would involve using pixel shaders to convert the frame-compatible images into interlaced images.
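As a hypothetical sketch of such a plugin's core step (the real work would happen on the GPU in a pixel shader, not on the CPU), row interlacing of a left/right image pair could look like this:

```python
# Hypothetical sketch of the interlacing step a polarized-display plugin
# would perform: even rows come from the left-eye image, odd rows from the
# right-eye image, matching the alternating polarization of display rows.

def interlace(left_rows, right_rows):
    """Interleave two equal-sized images, given as lists of pixel rows."""
    assert len(left_rows) == len(right_rows), "images must be the same height"
    return [left_rows[i] if i % 2 == 0 else right_rows[i]
            for i in range(len(left_rows))]
```

Each eye's polarized lens then passes only its own set of rows, at the cost of half the vertical resolution per eye.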
The advantages of polarized display:
- The color loss seen with anaglyph is eliminated, and the technology works well on the "silver screen".
- Image quality is superior to anaglyph, though there is still resolution loss.
The disadvantages:
- The glasses are more expensive than anaglyph glasses, and special projection hardware is required to make this work.
- This limits the versatility and reach of the technology, particularly with online video!
- Polarization only works with silver screen projection (such as in a theater) or interlaced polarized digital displays. Silver screens are inconvenient for home use, and interlaced polarized displays are very rare.
The anaglyph and polarized approaches are considered "passive" because the glasses themselves don't have any active parts. In contrast, active LCD shutter technology uses powered glasses that filter each eye differently at different times. NVidia's 3D Vision solution uses a 120Hz monitor to present the left and right eye images on alternating frames, while still maintaining a full 60Hz signal per eye. The glasses alternately block each eye every 1/120th of a second, syncing with the image on the monitor via an IR transmitter.
NVidia worked with Vertigo to create an active shutter 3D Vision plugin for the Silverlight Media Framework, which will be available on NVidia's site.
- The main disadvantage is that powered glasses are more expensive than polarized or anaglyph glasses
Personally, I have to say that the NVidia Active Shutter technology is the most stunning 3D experience I've viewed so far – 3D video looks so much better in full resolution HD!
Sample 3D Video Content
Finally, Microsoft and NVidia have made the following sample video content available for developers:
Smooth Streaming Manifests
Left Eye First, Side by Side: http://devplatem.vo.msecnd.net/3D/NVIDIA_3DV_PC_1080p30_SxS_LeftFirst.ism/manifest
Right Eye First, Side by Side: http://devplatem.vo.msecnd.net/3D/NVIDIA_3DV_PC_1080p30_SxS_RightFirst.ism/manifest
Left Eye First, Top and Bottom: http://devplatem.vo.msecnd.net/3D/NVIDIA_3DV_PC_1080p30_TxB_LeftFirst.ism/manifest
Right Eye First, Top and Bottom: http://devplatem.vo.msecnd.net/3D/NVIDIA_3DV_PC_1080p30_TxB_RightFirst.ism/manifest
Progressive Download WMVs
Left Eye First, Side by Side: http://devplatem.vo.msecnd.net/3D/NVIDIA_3DV_PC_720p30_SxS_LeftFirst.wmv
Right Eye First, Side by Side: http://devplatem.vo.msecnd.net/3D/NVIDIA_3DV_PC_720p30_SxS_RightFirst.wmv
Left Eye First, Top and Bottom: http://devplatem.vo.msecnd.net/3D/NVIDIA_3DV_PC_720p30_TxB_LeftFirst.wmv
Right Eye First, Top and Bottom: http://devplatem.vo.msecnd.net/3D/NVIDIA_3DV_PC_720p30_TxB_RightFirst.wmv
3D is exploding onto consumer devices like the Nintendo 3DS, and I expect to see a big jump in 3D video online over the upcoming years. I'm very much looking forward to seeing the applications consumers of the Silverlight Media Framework create using this new addition to the SMF!
Here at Vertigo, we often come across "Social Integration" as a feature request for an application, particularly in our premium online video experiences. All too often, we see apps on the web that treat social integration as a checkbox on a feature list: Did I provide a link out to Facebook and Twitter? Check!
What these basic social implementations often lack is a broader understanding of the client's overall social media and social marketing strategy. Companies with a clearly defined social marketing strategy understand that there's more to social media than just allowing your users to link to your application.
We've been fortunate to work with many clients who understand that the true power of social media lies in building tools that empower your fans to participate in the creation of your content, creating a two-way conversation that doesn't exist in the world of traditional broadcast media. They also understand that this level of participation and engagement has direct and measurable marketing benefits and impact on the monetization of their content.
An article in Fortune describes how Conan O'Brien discovered the power of social media in reviving his career. His show on TBS is now considered to be at the forefront of online social savvy: clips from the show are released on YouTube to generate interest (a move most players in the broadcast world oppose, pulling licensed shows off of YouTube), and each show gets its own Twitter hash tag. The TV audience and the live audience at the taping participate in a real-time conversation on Twitter and actually get to drive the content:
"Between his bits, O'Brien would come backstage and ask, "How's the tweets? How's the audience?" By reading the hashtag stream, Bleyaert recalls, O'Brien and his team could see, for example, that "some guy in the fifth row was using Twitter to try and pick up a 'girl in the white hat, three rows in front of the stage,' " and O'Brien would instantly incorporate that into his next bit."
Conan's advertisers and guests promoting their books/shows/etc. also reap the benefits of this social engagement – when you make a guest appearance on the show, you gain access not just to a TV audience, but to a massive online social audience as well.
Bringing participation to premium online video experiences
Vertigo has recently had the opportunity to build premium online media experiences that take a similar approach:
- Focus on engagement
- Give your fans a megaphone to tell the world what they think
- Allow 2-way participation in the content being viewed
- Provide tools that make the content more relevant to specific users
- Allow rapid creation of clips to provide shareable content while it's still relevant
Some examples:
NBC's Sunday Night Football Extra 2010 and NHL Extra 2011:
NBC Sports has done a great job driving forward a modern social media strategy via their online companions to sports broadcasts. Here are some of the features we worked to build with NBC to drive greater engagement from their fan base:
- Allow users to have a (moderated) conversation with each other via Twitter within the application
- Allow users to participate in an ongoing conversation throughout the game by asking questions of Mike Florio from Pro Football Talk
- Provide a way for fans to wear their colors online by voting for their team in a Twitter Battle, driven by Twitter hash tags.
- Post highlight clips of key plays within a minute or two of their happening live (via the Silverlight Rough Cut Editor). This allows fans to very quickly share out that bone-crunching tackle while the content is still hot and relevant.
- Provide a preview window that allows fans to share specific plays of their choosing with their friends. This empowers users to find the content they care about and drive what they discuss online, rather than restricting them to editorially-selected clips. This content is relevant to each specific user, and provides a tool for users to create their own take on the content, allowing another avenue for participation in the event.
Microsoft's PDC 2010:
The 2010 PDC event moved a large physical conference almost completely online. Microsoft recognized from the outset that to enable the level of networking and personal connections that happen at a physical conference, an aggressive social media strategy would need to be put in place. To help Microsoft achieve that goal, we built the following features in the PDC 2010 experience:
- Added a Twitter hash tag per session to allow users to discuss the content live in real-time
- Allowed real-time Q&A and Polling to allow the online audience to interact with the presenter and affect the content being presented.
- Quickly created clips and on-demand content with the Rough Cut Editor.
- Allowed bookmarking of specific locations in each session to allow users to share and discuss content relevant to them.
More engagement = more value
At the end of the day, what is the intrinsic value of all this additional participation and engagement? The answer is that it creates far more involved users. Those users will watch more video and generate more revenue. They will also become evangelists of your content, drawing in a much larger overall audience (and, again, more revenue). Moreover, those users will transcend the concept of a user and become fans. Those fans become a built-in marketing base (and marketing mechanism, via the megaphone you gave them) for any future projects that follow similar participatory models. Finally, just as Conan's guests gain access to a huge social following by appearing on his show, the content and advertisers in your video experience will gain exposure to your social audience, which has a great deal of inherent value. An example here is NBC's Mike Florio, who now has a daily online broadcast, Pro Football Talk Live, which complements his appearances on the weekly Sunday Night Football online broadcasts.
You're not in control
One of the hesitations I often hear expressed about opening up your content to your users and fan base is the loss of control. What if my fans want the content to go in a different direction than I do? In my opinion, you're better off at least knowing that this is what your fans want, and having a two-way channel available to discuss it with them. If you want to carve out a bold vision and move in a direction you think your fans don't want now but will like eventually, you have the tools to engage with them and explain your thinking. And maybe your fans have a good idea you haven't thought of: Betty White may not come to mind as a Saturday Night Live host, but listening to your online fans can net you a big success, as NBC found with the Betty White Facebook campaign.
Another concern is that if you open up the discussion to the world, people who don't like you can tarnish your image. While there are tools that can help mitigate blatantly offensive content appearing on your site, you'll probably have to accept a bit of risk that negative feedback will be made public. My opinion here is "that's OK". You're never going to please everybody, and the value of having a public discussion about your content most likely outweighs any negativity out in the ether. Furthermore, this kind of feedback can serve as a type of analytics system. If your fan base passionately speaks out about a buggy feature or a content decision they dislike, at least you know it! If you're committed to the path of building a great experience your users will love, you should have faith that your efforts will pay off and the love will drown out any negative feelings in the public sphere.
After giving a presentation on building premium media experiences up in Seattle earlier today, I'm sitting in SeaTac airport right now on a lengthy delay due to bad weather in San Francisco. I'm flying Virgin America, and the folks at the desk let us know that our 7:00 flight will likely be delayed until 10:20 due to weather. However, since the weather could change at any moment, the flight may take off at any time, and we should check in every 15 minutes to make sure we don't miss the flight. So, that means that not only is my flight delayed for 3 hours, but I'm stuck in the boarding area.
The agent (who I later learned is named Angie), also announced "Please check in every 15 minutes – we'll do everything we can to notify you if the departure time changes." I rolled my eyes at this statement – how many times have I heard airlines promise to do "everything they can" for me, when they actually mean they'll do absolutely nothing for me? Feeling a bit snarky, I walked up to Angie and told her that I'd like to go get some dinner at a restaurant in the airport. I asked her whether she would send me a text message if the departure time changed. She replied, "I can't text you, but I'd be happy to call your cell phone!"
I was floored. Perhaps I'm too jaded an air traveler, but it's so rare to hear a high-level promise like that from an airline and actually see people on the ground follow through on it with such personalized service. I happily went on my way to a nice relaxed dinner, and despite the nasty delay that will get me into SFO after the BART stops running, I feel like a very happy customer right now.
One of the greatest advantages that small companies in insurgent market positions have over their larger competitors is the ability to discard far-reaching policies and bureaucratic edicts. Instead, these smaller, more agile organizations can empower their employees on the ground to adapt to the situation at hand and do whatever it takes to delight each customer they encounter. These types of organizations understand that the best marketing mechanism they possess is the daily interaction they have with each of their customers. To quote Cake, these organizations "use a machete to cut through red tape".
I like to think that when we at Vertigo interact with our customers, we take a very similar attitude towards the way we work. Not being bound by the restrictive rules typical of a larger organization gives us the freedom to experiment with new approaches that make sense for each client, whether that be a new architecture, unusual support hours, or a design process that eschews the usual conventions. As an organization constantly on the leading/bleeding edge of new technologies, we're always trying to strike a good balance between breaking new ground and finding an efficient way to put our best practices into place on a larger scale. To me, the key is to preserve the ability of each project team to adapt to the circumstances of their particular client's needs, while still making use of established patterns wherever possible to save time and add polish to existing features.
Hats off to Angie and to Virgin America for continuing to serve as an example for nimble, small organizations who win by delighting their customers!
My wife is a big fan of Lush soaps, and I've noticed that the cans of Lush products in our shower have stickers with people's faces on them.
This is what Lush's website has to say about these stickers:
"What has held us together for over 30 years is the pleasure we take from our work. Pouring a soap, mixing a cream, creating a fragrance, or rhythmically pressing 1,000 shampoo bars by hand give our lives meaning. This makes us proud of what we produce and so we like to put our own individual 'mark' on each product before we sell it. The face sticker smiling out at you from every LUSH product tells you when it was made, the 'best before' date, and who was proud to make the product, by hand."
This is a great way to personalize a product and engage with customers. I wasn't surprised to read that this attitude extends to Lush's social media strategy, which uses tools such as Facebook and Twitter to engage on a personal level with their customers. The linked article describes how this approach is handily beating the "Post corporate messaging on Facebook" approach being used by Lush's closest competitor, The Body Shop.
This got me thinking about software, and how we organize teams of people to build it. One of the most important factors in producing high-quality software is ensuring that each member of the team building it is intrinsically motivated and takes personal pride in the entire product they're producing. In other words, each member of the team should be willing to put their face on a sticker and proudly slap it on their work.
When building complex software systems with large teams, potentially across several different partner companies, one of the keys to bringing together a successful project on time and on budget is the cohesion of all the team members. The biggest pitfall in these large projects is often that each team member or partner company concentrates strictly on the piece of the system they are building, and disconnected silos of functionality end up forming. The left hand doesn't know what the right is doing, and when it's time for all the pieces to come together, they don't!
The way Lush operates ensures that a specific person is responsible for (and proud of) an entire product that they're delivering. This is in contrast to other companies that may see each employee as a siloed station along an assembly line.
Similarly, successful software projects have team members who take ownership of the project as a whole, and not just their silo. People in the mindset of putting a sticker with their face on a product will keep track of dependencies between the different silos, perform testing across multiple partner companies to ensure that the bridge between each component is reliable and proven, and ensure that they're proud of the entire product their team is building, rather than just their specific functional area.
This is one reason that I believe in assigning vertical areas of ownership to each member on a team. If a single person is responsible for an entire feature, from the UI to the middle tier logic to the database, they will know exactly how it is supposed to work, which design makes sense, and will drive the feature to completion and testing. Most importantly, it guarantees that the feature works end-to-end, and drives features to become real and working as fast as possible. Of course, even in the vertical approach, designers will still work primarily on the UI, and a SQL expert may do the bulk of the DB work. The key is that each vertical feature has an owner looking after it, regardless of the owner's particular skill set.
In horizontal areas of ownership, where one person owns the DB, another owns the cloud services, and another owns the UI, a siloed attitude of "that's not in my purview" can arise. There's also no driving force to making a feature work end-to-end, as people can see their part as "complete" even when it isn't wired up and working.
As a project lead, I want to see Jim's face on a sticker slapped on to the "Login" feature, and Mary's face on a sticker slapped on the "Post to Facebook" feature. That lets me know that those features have dedicated owners who take pride in the entire user experience of each part of the application, and who focus on delivering a real, working feature, rather than a disconnected component.
What does your team do to put your faces on the products that you build by hand every day?
When Microsoft announced that this year's Professional Developers Conference would be limited to 1,000 attendees and held on the Microsoft Campus in Redmond, some were surprised at the departure from the typical conference site at the LA Convention Center and the smaller than usual limit on the number of attendees. In fact, the 2010 PDC turned out to be the largest PDC that Microsoft has ever done. By making use of Silverlight and IIS Smooth Streaming technology, Microsoft was able to extend the traditional reach of the PDC to a much larger global audience than had ever previously been possible.
Vertigo is proud to have participated in this groundbreaking event by building an interactive Silverlight Media Framework-based video player and PDC conference application that allowed the widest level of participation ever for a PDC. The online consumption numbers that Steve Ballmer mentioned in his PDC Summary are pretty staggering, considering that only 1,000 people physically attended the conference:
"In addition to more than 30,000 developers at 250 PDC events worldwide, another 100,000 developers viewed the event online using Silverlight, with 10% of the online audience taking advantage of simultaneous translation into Japanese, Spanish, French and Chinese. This is incredible reach."
Beyond this amazing breadth of reach is the depth of participation that the online PDC experience enabled. By integrating real-time Q&A, polls, and Twitter conversations, users watching remotely don't just watch remotely – they actually get to participate in the sessions as if they were right there in the room. Additionally, international users enjoyed simultaneous translation into four languages, allowing a level of understanding of the content that may not have been possible outside the online experience.
Let's take a look at some of the features of the application:
Live HD Streaming of 2 Video Feeds per Session with DVR Capability
The PDC player shows two synchronized video feeds simultaneously: the "slide deck/code view" feed, which shows what the presenter is actually doing on the presentation computer, and a smaller feed showing the presenter. These feeds can be swapped so that the more interesting one occupies the larger video window at any time. Users enter live presentations at the live point, but can DVR back to any point in the presentation via the timeline. Users can also use the DVR controls to jump back 15 seconds if they missed something, or to rewind and fast-forward while watching the video content.
Live Multi-Language Audio
In order to extend the reach of the PDC internationally, Microsoft Studios organized a phalanx of translators in their facility during the conference to provide live audio translation in 4 languages (Chinese, Japanese, French, and Spanish) for each of the 4 sessions – that's 16 different translations happening simultaneously and live. The player allows users to switch seamlessly to an alternate audio track to listen to these translations. This is a huge feature, as it allowed international viewers to understand the content being discussed in a way they may never have been able to except in this unique online experience.
Live Closed Captioning
As the complement to live audio translation, live closed captions for each of the 4 simultaneous sessions (again provided by the folks at MS Studios) gave users with hearing disabilities, or users unable to play audio in their environment, a way to consume the content.
Sharing and Deep Linking
Each of the PDC's sessions can be shared out via Twitter. In addition, a specific location within a video can be deep-linked using the bookmark feature, allowing users to direct their colleagues to specific segments of a video covering a topic of interest. For example, here's a deep link to Eric Schmidt discussing the PDC online experience on Channel 9 Live (a great discussion of what it took for all the partners involved to make the PDC online event come together). The Channel 9 feed is a seven-and-a-half-hour video, but the bookmark/deep linking allows me to single out the topic within the Channel 9 show that I'm interested in.
Multiple Content Delivery Networks (CDNs)
Because the PDC's online video content is distributed internationally, the team set up a system to identify which CDN would provide the best video delivery experience to the user and have the player select that network as the video delivery endpoint. For instance, users in China were directed to stream video cached by ChinaCache. Users in the United States currently pull video content from Windows Azure's CDN, and live feeds were delivered by Akamai.
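A hedged sketch of the kind of geography-based CDN selection described above (the region lookup, CDN names, and defaulting behavior here are illustrative assumptions, not the actual PDC routing logic):

```python
# Illustrative sketch: route each viewer to the CDN endpoint that will
# give the best delivery experience, as described in the PDC setup:
# ChinaCache for users in China, Windows Azure CDN for on-demand video
# in the United States, and Akamai for live feeds.

LIVE_CDN = "akamai"

ON_DEMAND_CDN_BY_REGION = {
    "CN": "chinacache",  # users in China stream from ChinaCache
    "US": "azure",       # US users pull on-demand video from Azure CDN
}

def pick_cdn(region, live=False):
    """Return the CDN a player in `region` should use for delivery."""
    if live:
        return LIVE_CDN
    # Assumption for this sketch: unknown regions fall back to Azure CDN.
    return ON_DEMAND_CDN_BY_REGION.get(region, "azure")
```

The player would then use the returned endpoint when building its stream URLs.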
Real-Time Polling and Q&A
A key goal of the PDC online experience was to allow users to engage with the live sessions and participate in the same way that a user in the same physical room as the session would. To this end, the player allowed users to ask questions and receive back answers, and to participate in polls initiated by the presenter of the session. These were driven by services running on Windows Azure to ensure scalability for a large number of users.
Twitter Conversations
The final piece of the puzzle in bringing an online conference into social parity with physically attending is allowing attendees to engage with one another, fostering discussions among the audience about the material. Each session at the PDC was given an identifying hash tag, and users could converse with each other about the material in real-time, as it was being presented, via Twitter from within the PDC application. An advantage of the online approach is that these discussions can happen in parallel with the session itself without causing a distraction or interruption (and anything a user might miss while conversing can simply be re-watched by DVR-ing backwards).
Ratings and Metrics
Another key goal of the PDC online experience was to track metrics data about both the health of the video ecosystem (number of video failures, number of video starts, etc.), and the value of each session's content to the users. At the conclusion of each session, users are asked to rate and comment on the session. Here's the cool part: that feedback is then consumed in real-time by the PDC application and used to guide other users to what the community considers to be the most important content. More on this in the next section:
The PDC Now page provided a central hub and landing page for the application, guiding users to the most valuable content. Live content was always featured front-and-center, but the deep value of this page lies in its display of the ratings and metrics data gathered from all of the active players on the internet. As you can see above, users can browse content based on data such as the most popular or most watched videos, or view sessions based on which were the most highly rated by viewers. The player also displays the number of users currently viewing each session (see the yellow arrow above).
The guide section (of which both a Silverlight and HTML version were built) provided additional ways for users to explore content – by the PDC schedule, by session, or by speaker. While browsing for content on the Guide or PDC Now pages, the active video simply minimizes to the bottom-right corner and continues to play with audio. This allows users to explore content without losing track of the information being presented in their active session.
Each live session begins with a few minutes of video slate while the encoders spin up before the session starts. However, we don't want users to have to sit through that slate once the session becomes video-on-demand content. We also don't want to spend time editing out the slate, so we devised a solution that lets the player use a "Skip Into" time to begin the video at the exact moment the session actually starts.
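Conceptually, the player just clamps the session's "Skip Into" offset and seeks there when playback begins. A minimal sketch of that idea (the function and parameter names are illustrative, not the SMF API):

```python
# Minimal sketch of the "Skip Into" idea: when a live session becomes
# on-demand content, start playback at the recorded session-start offset
# instead of 0:00, so viewers never see the pre-roll slate.

def playback_start(skip_into_seconds, duration_seconds):
    """Return the seek position for on-demand playback, clamped so a bad
    metadata value can never seek before the start or past the end."""
    return max(0, min(skip_into_seconds, duration_seconds))
```

The player would seek to this position once the on-demand manifest loads, leaving the slate in the encoded asset but invisible to users.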
Users can access all the relevant PowerPoint decks as well as downloadable WMV copies of the session videos in the Download Materials section.
The RSS-based news feed allows users to keep in touch with the latest news from the conference.
Windows Phone 7 App
Vertigo also built a Windows Phone 7 app that allows smooth streaming video content to be viewed using the Silverlight Media Framework for Windows Phone. See Ben Riga's blog post for more details.
Pulling off an event as large and broad-reaching as the PDC online experience is a huge effort involving many partners. For a look at all of the partners involved in this effort and their roles:
- Encoding and delivery of the live and on-demand video.
- Built and operated the PDC online player application, schedules, and Windows Phone 7 app.
- Creation and operation of the Content Management System that allowed all of the complex data about the event to be managed in real-time.
- Manufacturer of the Spinnaker encoders used for the live video content.
- Content delivery for the live streams.
- Natively delivering content into China via their CDN.
- Delivering all on-demand video via the Azure CDN solution.
- MS Studios: Provided closed captioning, audio translation, and origination of the video streams in Redmond.
- MS DPE Team: Organized all partners and deliverables and oversaw the PDC online event operation.
- Provided support for the Smooth Streaming technology.
- Provided support for the Silverlight technology used by the player.
- Manufacturer of the Closed Captioning encoders.
I want to send a big thank you both to the talented designers and developers at Vertigo who made the PDC application a reality, and to all of our partners on this project who worked very hard to deliver the live streaming video experience itself.
Let's end it with a screenshot of what the player looks like in full screen, where the secondary video transitions to a Picture-in-Picture mode:
I'm proud to announce the launch of Vertigo's latest live HD video experience: NBC's Sunday Night Football Extra 2010! You can check out the live action every Sunday Night this season.
We first worked with NBC last year, building the Sports Emmy®-nominated SNF Extra 2009 application. We've since partnered with NBC to build the 2010 Vancouver Winter Olympics video experience, as well as this year's US Open (golf) and Wimbledon live HD video players. Our team has returned to Sunday Night Football this fall with the ambition of designing the most engaging live sports experience ever created for the web.
A great way to get a feel for the application in only a couple of minutes is to watch this video: http://bit.ly/snfextrapromo, which provides a quick, high-level rundown of all the features in the application. For more depth, let's crack open the application, take a look at some of the goals we set out with, and discuss how we solved some tricky design challenges to create an industry-leading live video experience for the web!
Great looking, High Definition video
Using Microsoft's IIS Smooth Streaming technology and working with our partners at iStreamPlanet and Akamai, the video you see in this player will once again serve as the standard for clean, smooth, multi-bitrate, adaptive HD video on the web. The core video player is built using the open source Silverlight Media Framework, which Vertigo developed in cooperation with Microsoft.
One of the major goals for this project was to make the entire application more engaging to users. What better place to start driving engagement than in the actual video being consumed? The Picture-in-Picture (PIP) control provides the following:
- The PIP control allows viewers to see the action from an alternate camera angle while watching the simulcast feed in the main window.
- The menu on the PIP provides a streamlined way to switch to another camera angle, or to swap the main video into the PIP and start watching an alternate angle in the main window.
- Great viewing experience: This year, we decided to make the PIP a first-class video player in its own right.
- Full Screen Scaling: To provide a great experience on televisions and large monitors, the PIP will scale up in size when users move into full screen mode. This keeps the PIP relevant, even from across the room.
- Draggable: Don't like where the PIP is positioned? Drag it anywhere you'd like!
- Collapsible: If you'd like a totally clean video experience, you can use the Minus button to collapse the PIP and remove it from your field of view.
The Ultimate Timeline
One reason that Sunday Night Football is such a fun event to work on is all of the interesting data we have to work with. One of the biggest challenges we faced was how to visualize all of that data without creating a noisy or distracting user interface. We also had to envision how to allow users to socially interact with all of this data, and how to create data of their own that might be interesting to their own social circles. Here's how we addressed these challenges:
- Positional Thumbnail Preview: When mousing over the timeline, the player displays a preview image of what's happening in the video at the point you're hovering. This allows users to quickly browse an entire game and look for interesting action.
- Play by Play Data: We use team-color-coded markers on the timeline to indicate when key plays occur. When mousing over one of the key play timeline markers, data from the NFL displays inside the timeline preview image indicating the details of the play that happened.
- Deep Linking to Plays: Users can select any point on the timeline to share a deep link into that location within the video with their social network. This feature allows a user to brag to her friends about a big sack on a 3rd down on Facebook, or to tweet the location of a close-up shot of a player who attended her high school a few years ago on Twitter.
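As an illustration of how the positional thumbnail preview above might work (a hypothetical sketch, not the player's actual Silverlight implementation; all function names here are made up), the hover position maps linearly to a playback time, which then selects the closest pre-generated preview thumbnail:

```python
def timeline_hover_time(mouse_x, timeline_width, video_duration):
    """Map a hover position on the timeline to a playback time in seconds."""
    fraction = min(max(mouse_x / timeline_width, 0.0), 1.0)  # clamp to [0, 1]
    return fraction * video_duration

def nearest_thumbnail(hover_time, thumbnail_interval=10.0):
    """Pick the index of the closest pre-generated preview thumbnail,
    assuming one thumbnail is captured every `thumbnail_interval` seconds."""
    return round(hover_time / thumbnail_interval)
```

Hovering at the midpoint of the timeline for a one-hour game, for example, would resolve to the 30-minute mark and the thumbnail captured nearest to it.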
Flexible Video Modes
We have so much great functionality this year that we faced a major design challenge in deciding where to put it all! Job #1 of a good online video player is to provide big, beautiful video with minimal distractions and visual noise, but our audience of football fans also loves statistics, social interaction, highlight clips and updates from the sidelines. To have our cake and eat it too, we decided to create 2 video modes: "Big Video", and "Engage".
Big Video Mode:
The application starts in Big Video Mode, where you get a large space to watch the action with optimal video size and quality.
Engage Mode:
When you want to dive into all of the additional content and fun features offered by the player, clicking the reveal button transitions to a (still very respectably sized) video window, exposing all the additional functionality. This approach allows interaction with the statistics and social features while still watching the video.
Multi-speed DVR Controls
To give our users full control over the action, we provide instant replay, slow motion, and 4x/8x fast forward and rewind DVR controls. Direct every play, see that last catch over and over, and even decide on that last penalty call for yourself!
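A common way to wire up multi-speed controls like these (a hypothetical sketch, not the player's actual implementation) is to have repeated presses of fast-forward step the playback rate through the available speeds:

```python
# Available fast-forward speeds; rewind typically mirrors these as negative rates.
FF_RATES = [1.0, 4.0, 8.0]

def next_rate(current, rates=FF_RATES):
    """Advance to the next faster playback rate, staying at top speed once reached."""
    for rate in rates:
        if rate > current:
            return rate
    return rates[-1]
```

So pressing fast-forward once from normal speed jumps to 4x, pressing again reaches 8x, and further presses stay at 8x.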
When it comes down to it, football fans tune in to see a battle on the gridiron every Sunday. To let fans be a part of that battle, we included the Twitter Battle as a leading social feature.
- Users are encouraged to Vote/Tweet for their favorite team.
- "Tug of war" results are shown within the app
- The Tweets contain links back to the player to garner larger audiences and add fuel the battle
Twitter Integration, Ask Mike Florio
Some of the most popular features in the 2009 SNF application were the interactive chat with Mike Florio, which allowed fans to see their questions answered in real time, and Andrea Kremer's tweets from the sidelines. However, those features were scattered across the application. This year, we unified and aggregated all of the Twitter streams, including Q&A with Mike Florio, Andrea's tweets from the sideline, and even stand-out tweets from the fans in the Twitter Battle!
While SNF Extra 2010 gives users greater control than ever over their game experience, sometimes you just want a nice big thumbnail to take you to that last touchdown. NBC's editors can also use the Silverlight Rough Cut Editor to create drive cut-downs and other aggregated highlight content on the fly.
For those fans who love to keep up to date on all the details, the Statistics section provides in-game updates every minute.
I'd like to thank the entire team at Vertigo for all of your dedication, enthusiasm, and creativity in creating a beautiful, innovative piece of software. I'd also like to thank our partners at NBC Sports, iStreamPlanet, Microsoft, Akamai, Ascender, Inlet, and TweetRiver for your commitment and expertise that make the Sunday Night Football experience possible!
Vertigo is in attendance at NAB, where we're talking about our recent work delivering High Definition media content over the web. You can swing by the Microsoft booth to talk with me or Mike Moser, where we'll be showing off some cool demos. On Tuesday, we'll run a quick tutorial every hour on the hour demonstrating how to build an SMF 2.0-based video player in just a few minutes.
Also – be sure to check out our video players running in Inlet and Omneon's booths demonstrating live ad insertion.
I just got back into the office after spending a week in Las Vegas for the annual Microsoft MIX Conference. Vertigo had a huge presence at MIX this year, where the media work we've focused on for the last year was on display. I personally had a great time, and I want to thank everyone who worked hard to make this conference a success!
Here's a summary of the videos on the MIX site that relate to Vertigo's work.
Vertigo was once again featured in the main MIX keynote. A ~5 minute video about our Olympics player was shown about 6 minutes into it! You can see the keynote here:
http://live.visitmix.com/MIX10/Sessions/KEY01 (jump to 6:30 into the video)
Scott Stanfield, our CEO, later went onstage and did a live demo of an app we built for Netflix for Windows Phone 7.
Vertigo also built the live Smooth Streaming player that allowed users to watch the keynote live.
Scott did a great session with my colleague Mike Hanley, where they presented a “Do It Yourself” camera-to-screen Smooth Streaming demo – pretty entertaining if you’re into video, smooth streaming, or even cinematography – you can watch it here:
Scott and I presented a session focusing on the work we did for the Olympics, how things worked behind the scenes, and why it matters. You can watch it here:
Jason Suess (the Olympics PM at Microsoft) did a presentation focusing on the possibilities that the new open-source Rough Cut Editor provides, and how we used it during the Olympics:
Silverlight Media Framework:
Eric Schmidt ran a presentation on the Silverlight Media Framework that Vertigo has developed and released on CodePlex:
As we're nearing the opening ceremonies for the 2010 Olympic Winter Games, our team is busy testing failure and recovery scenarios. We want to ensure that even if something goes wrong in our video delivery stream, users will have as seamless an experience as possible, and will have their video quickly and automatically restored. In the event that a catastrophic failure occurs in the video delivery stream, we make several attempts to automatically re-connect to resolve the problem:
Failing that, we provide a means to manually retry the reconnection attempt, or to simply browse to other video content.
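The automatic re-connect behavior described above follows a common retry-with-backoff pattern. Here is a minimal sketch of that pattern in Python (the actual player is Silverlight/C#, and the function names here are hypothetical):

```python
import time

def reconnect(connect, max_attempts=3, base_delay=2.0):
    """Retry connect() with exponential backoff.

    Returns True as soon as a connection attempt succeeds, or False once
    all attempts are exhausted (at which point the player would surface
    a manual "retry" option to the user).
    """
    for attempt in range(max_attempts):
        try:
            connect()
            return True
        except ConnectionError:
            # Wait progressively longer between attempts: 2s, 4s, 8s, ...
            time.sleep(base_delay * (2 ** attempt))
    return False
```

The backoff keeps a brief origin hiccup from turning into a thundering herd of simultaneous retries from every connected player.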
To test the different scenarios that can lead to these kinds of errors, we often use Fiddler as a debugging tool. Its AutoResponder feature lets us return "502 Bad Gateway" responses to requests for valid, working video chunks in order to trigger these error scenarios for testing purposes. The other day, I encountered a situation where I needed to trigger an error in full-screen mode. As you may know, in Silverlight, clicking on another application outside the full-screen Silverlight app will exit full-screen mode. I needed to trigger a video failure via Fiddler while in full-screen mode, but could not interact with Fiddler without exiting full screen and invalidating my test.
The solution to this problem is to run Fiddler on another computer as a remote proxy. This let me do anything I wanted on my main computer while controlling the responses to its network traffic from the other machine. Thanks to Olga Lepekhina and David Woods for pointing me to this solution.
To make it work, open up Fiddler on another computer on your network. Open up Tools -> Fiddler Options, and open the Connections tab. Check the box for "Allow remote computers to connect", and restart Fiddler:
On your main computer, go to Internet Explorer's Internet Options and click the LAN settings button on the Connection tab. Check the "Use a proxy server for your LAN" box. You can then enter the IP address of the other machine (the one running Fiddler), along with port 8888. (You may need to click the Advanced button to enter these settings).
This works in Firefox too, via a similar procedure:
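As a side note, if you ever need to reproduce this kind of chunk-level failure without Fiddler, a tiny stand-in HTTP responder can serve the same purpose. This is a hypothetical sketch, not part of our actual tooling; the `/QualityLevels(` marker matches the pattern Smooth Streaming uses in its chunk request URLs:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading

CHUNK_MARKER = "/QualityLevels("  # present in Smooth Streaming chunk URLs

class FailChunks(BaseHTTPRequestHandler):
    """Return 502 for media-chunk requests, 200 for everything else."""

    def do_GET(self):
        if CHUNK_MARKER in self.path:
            # Simulate a delivery failure for media chunks only
            self.send_error(502)
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # keep console output quiet

def start_server(port=8888):
    """Run the responder on a background thread and return the server object."""
    server = HTTPServer(("127.0.0.1", port), FailChunks)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Point the player (or a proxy rule) at this responder and every chunk request fails with a 502 while manifests and other requests continue to succeed, exercising the same failure/recovery paths described above.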
Last month, our team at Vertigo announced the release of the NBCOlympics.com HD video player and Deep Zoom photo experience powered by Vertigo for NBC's coverage of the 2010 Olympic Winter Games. These players are live today, and you can use them to explore a wide range of content from previous years, as well as interviews with athletes and other previews. See our post here for the full details.
Our team is pleased to announce that an additional set of features was released last week! A dive into those new features appears in the second part of this post.
Part 1: Original Features
The video and photo experiences for NBCOlympics.com released in November featured:
- Slideshow mode allows easy one-by-one viewing of automatically advancing galleries.
- Grid Mode allows free-form and quick navigation of larger galleries.
- Scroll to zoom in and out of photos.
Part 2: Extra Feature Set Released Last Week
This set of improvements consists of a range of goodies that enhance the video experience by allowing you to explore the tremendous amount of content available much more deeply. Let's take a look:
Support for live video
The opening ceremonies on February 12th are almost here, and the player is now ready to handle streaming of live coverage of the Olympic Games!
Video Content Explorer
The video content explorer (accessible from the "Explore More Videos" button) allows users to view the full range of Olympic video content within the full screen mode of the player. You can explore videos recommended to you based on what you're currently watching, the most popular videos across the site, featured videos, or hot clips under "Must See".
The Top Athletes tab allows you to explore video content from a different angle – you can browse through your favorite international athletes and find all of their related video content.
Similarly, the sports tab allows you to explore videos within a specific Olympic sport.
When the Olympic Games start, there will be a lot of activity to keep track of! Found in the "Inside This Video" overlay, the schedule view allows a time/schedule-based approach to exploring video content, and also provides a glimpse of upcoming events.
If you're interested in more information about the athletes participating in the video you're watching, you can find full details in the Related Athletes pane, accessible from the "Inside This Video" overlay. If you want to explore even further, you can flip over each athlete's "trading card" to find a list of all videos in which they appear.
Keep tabs on the latest community buzz about the event you're watching with the built in Twitter client.
I'd like to send a big thank you out to everyone on Vertigo's Olympics team as well as our partner organizations for their dedication and hard work in bringing together this unique experience. We're looking forward to seeing it in action during the winter games in February!