The Verge Livefeed
At Vox Media, we redesigned The Verge’s Livefeed to streamline how people experience the site’s event coverage.
The Verge is a large editorial site covering all things technology, culture, and everything in between. The live feed is an editorial component that updates rapidly as editors post to it during big events and keynotes, like E3 and Apple’s WWDC. There are two components of the livefeed app – the editorial dashboard where people post updates and the user-facing feed where content appears in real time.
Working with our UX researcher, we wrote and refined a series of editor- and reader-facing interview questions.
Through two sets of interviews, we assessed the shortcomings of the previous live blog and prioritized what we could accomplish in our timeframe. The major underlying goals were:
- Account for the wide range of live blog states (live, archived, etc.) and content types (images, text, embeds, etc.)
- Make post content accessible and valuable from multiple entry points and times, even after the event is done.
- Preserve important parts of editors’ workflows.
- Take advantage of a new brand design language and provide the best live experience for users.
A number of options were considered through early mocks – including progressive web app techniques, AMP, and different layouts and styles. Time permitting, it would have been ideal to take a bigger step back and approach more of the design from a user-needs perspective.
Instead of a long, stepped user flow, the Livefeed is a couple of views with a wide range of dynamic states. Design moved quickly – we used Sketch mockups to explore overall look and feel while accounting for specific UI views and breakpoints. Looking back, it may have been worth exploring more design variations and introducing functionality more incrementally. We introduced a customizable background image to let the personality of individual events come through – both during and after the events. The feed itself remains fairly information dense to give people context between posts.
Posts can also be linked to at any point in an event’s lifecycle (live or archived) and open in a modal. We redesigned the modal to stand out more clearly from the rest of the posts.
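Deep-linking like this can be sketched as a small helper that resolves a permalink fragment to a post ID, which the app would then open in the modal. The `#post-<id>` scheme and function name here are illustrative assumptions, not The Verge’s actual implementation:

```typescript
// Resolve a URL hash like "#post-abc123" to a post ID, or null if the
// hash isn't a post permalink. (Hypothetical scheme, for illustration.)
function postIdFromHash(hash: string): string | null {
  const match = /^#post-([\w-]+)$/.exec(hash);
  return match ? match[1] : null;
}
```

On page load, the app would check `location.hash` with a helper like this and, if it resolves, open that post in the modal instead of scrolling the feed.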
Prototyping & Motion
Because of the dynamic nature of the app, every interaction and UI state had to be considered. Posts could be added, removed and pinned to the sidebar by publishers at any point in time, and users immediately see the changes.
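As a rough sketch, the client-side state transitions behind these interactions can be modeled as pure functions over a post list. The post shape and function names are assumptions for illustration, not the actual Livefeed code:

```typescript
// Minimal post shape assumed for this sketch.
interface Post { id: string; body: string; pinned: boolean; }

// New posts are prepended so the latest update appears at the top of the stream.
function addPost(feed: Post[], post: Post): Post[] {
  return [post, ...feed];
}

// Publishers can retract a post at any time; readers see it disappear.
function removePost(feed: Post[], id: string): Post[] {
  return feed.filter(p => p.id !== id);
}

// Pinning flags a post so the UI can surface it in the sidebar.
function pinPost(feed: Post[], id: string): Post[] {
  return feed.map(p => (p.id === id ? { ...p, pinned: true } : p));
}
```

Keeping these transitions pure makes it easier to re-render every UI state (live, archived, pinned) from the same feed data.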
The primary motion piece was the addition of a new post into the stream. Because the design wasn’t dependent on user interaction, After Effects was the most appropriate tool for prototyping high-fidelity animation. Because posts could be loaded concurrently, I chose a staggered downward motion that best conveyed the flow of information without being too distracting to the user.
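The stagger itself can be sketched as a simple delay schedule: each concurrently loaded post waits one step longer than the previous one before animating in. The 120 ms step is an illustrative value, not the production timing:

```typescript
// Compute per-post animation delays so a batch of concurrently loaded
// posts enters the stream in a downward stagger rather than all at once.
function staggerDelays(count: number, stepMs: number = 120): number[] {
  return Array.from({ length: count }, (_, i) => i * stepMs);
}

// e.g. staggerDelays(3) → [0, 120, 240]
```

Each delay would then feed the corresponding post’s entrance animation (for example, as the `delay` timing option of a Web Animations call).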
Working with an engineer, we implemented the design across the stack. I focused on the front-end code, which revealed more edge cases to account for (in hindsight, more of this at the very beginning would have been helpful). A good amount of time was spent making sure the site and animations were performant across breakpoints and UI states. The Livefeed shipped in several weeks and is a minimal but exciting way to experience live events.
The new Livefeed is live on The Verge.