Dashboard 2.0

Rewatch

April 2024

Launched shortly after Rewatch’s inception and prior to my arrival on the team, the first version of the analytics dashboard was quite rudimentary. It offered basic stats like the number of videos added, plays, and searches—standard usage metrics that appear on many dashboards.

Original dashboard with basic usage metrics

While these metrics are certainly helpful for gauging product utilization, they weren’t telling a particularly compelling story.

So we set out to answer the question: How could we transform our dashboard to effectively communicate the value our customers were gaining from Rewatch?

Research and Discovery

To tackle this challenge, we began by gathering insights from our Sales team, Marketing team, and customers.

Sales Team Insights

We held a meeting with the Sales and Marketing teams to figure out what they were envisioning for the new analytics dashboard.

The Sales team envisioned a dashboard that monitored usage at the individual level. They wanted to show how much each person was recording their meetings, enabling managers to coach their team members and ultimately improve sales performance. This feature would be particularly appealing when selling to, well… sales teams. But it also led to some great ideas around how the product could be used to encourage feature adoption.

Marketing Team Insights

Marketing, on the other hand, sought a tool—an ROI calculator—to illustrate potential savings to prospective customers at the top of the funnel. This spun out a different project, which I’ll touch on in a separate post.

Customer Discovery

Leveraging Rewatch’s own capabilities, I accessed dozens of customer interviews recorded by our account managers and sales team. A common trend emerged: Rewatch account owners were interested in metrics that justified their investment in the platform, especially around the time their yearly contracts were up for renewal. They all wanted to know the same thing:

Does my team's usage of the product justify the cost?

Merely showing usage in a bar chart over time doesn’t paint a picture of the value a company gains from each of those uploads, searches, or engagements. We needed to figure out how much time was being saved; once we had that, we could calculate the monetary value.

Benjamin Franklin with a gold clock necklace

Shaping the Solution

With these insights, we moved into the Shaping phase of the project—a process championed by our Head of Design, Conor Muirhead. The shaping process involves defining the problem, setting boundaries, and outlining the solution’s core elements before diving into detailed design and development.

We outlined three key principles to guide the redesign:

  1. Demonstrate Clear Value:
    Show how much time (and money) companies were reclaiming by using Rewatch.
  2. Ensure Transparency:
    Stand firmly behind our metric methodology. No smoke and mirrors—just real, explainable numbers. Authenticity and accuracy are a must to build trust and brand credibility.
  3. Coach and Inform:
    Educate users on the benefits of utilizing certain product features, such as enabling automations, which eliminate the manual work of recording meetings.

Designing the Solution

With these principles in place, we identified a new set of metrics that would help us showcase value:

  • Watch time saved
    How much time users saved by reading the AI summary or by watching the video at greater than 1x playback speed.
  • Money saved (ROI)
    Convert time savings into monetary savings (given an average salary) to demonstrate financial impact; a rough version of this calculation is sketched after this list.
  • Meeting minutes summarized
    Show how many video minutes were summarized using Rewatch AI.
  • Auto-recording adoption
    Display how many users had enabled automations, to encourage use of this feature to save time before and after meetings.
  • Async uploads
    Track videos uploaded manually (from the Rewatch screen recorder). This would show how many demos or project updates were being uploaded asynchronously.
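
To make the arithmetic behind the first two metrics concrete, here’s a minimal sketch of how per-view time savings could roll up into a dollar figure. The data shapes, function names, and flat hourly-rate conversion are illustrative assumptions for this write-up, not Rewatch’s production implementation.

```ts
// Illustrative only: names, shapes, and rates are assumptions, not Rewatch's production code.

interface VideoView {
  videoDurationMins: number; // full length of the recorded meeting
  minutesSpent: number;      // wall-clock minutes the viewer actually spent on the video
  readSummaryOnly: boolean;  // viewer skimmed the AI summary instead of watching
}

// Watch time saved: the gap between the full video length and the time actually
// spent, whether the viewer read the summary or watched at faster than 1x.
function watchTimeSavedMins(view: VideoView): number {
  const spent = view.readSummaryOnly ? 0 : view.minutesSpent;
  return Math.max(view.videoDurationMins - spent, 0);
}

// Money saved (ROI): convert total time savings into dollars using an assumed
// average fully loaded hourly rate for the team.
function moneySaved(views: VideoView[], avgHourlyRate: number): number {
  const totalMinsSaved = views.reduce((sum, v) => sum + watchTimeSavedMins(v), 0);
  return (totalMinsSaved / 60) * avgHourlyRate;
}
```

The average-salary input is exactly the kind of explainable assumption the transparency principle above calls for: a number we could show and defend rather than hide.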

We also decided to keep the following usage metrics, with tweaks to make them more compelling.

  • Conversations
    Reframed from “number of comments” to emphasize user engagement when multiple comments were exchanged on a video.
  • Top searches
    Showing search trends can help identify potential knowledge gaps.
  • Top videos
    Highlighting popular videos shows what content resonates with your team. I added segmentation so the totals can be broken down by views, plays, and engagements.

Design Explorations

While the engineering team began implementing new data capture methods, I shifted to the design front, creating modular components that could accommodate these new data points.

Watch time saved
Design exploration for the Watch Time Saved component.

ROI calculator
Design exploration for the ROI calculator component

I started exploring connecting the ROI calculator to the Watch Time Saved metric, since the ROI value was calculated from the amount of time saved.

Auto-recording adoption graph
Design exploration for the auto recording graph component

This felt okay as a direction, but it wasn't apparent at a glance that it was tracking individual user adoption.

Draft for the auto recording graph that shows avatars

Another exploration that looks nice, but doesn't scale well (and half of these would be default avatars anyway).

Draft for the auto recording graph that shows avatars

I eventually settled on this version. The avatar icons convey individual users while also serving as a repeating pattern that can be filled proportionally for larger teams.

Meeting minutes summarized
Artboard with lots of iterations of the minutes summarized graph

This one was a fun challenge. There were lots of iterations as I tried to figure out how to represent time being compressed.

A bar chart for meeting minutes summarized

I built this one out before I scrapped it. The bar chart made it feel like a comparison instead of a time compression or proportional visualization.

An area graph representing the meeting minutes summarized

The final version, designed as a proportional area chart.

Prototype and Testing

Once the new data capture methods were up and running, I began putting the pieces together and assembling a complete view of the dashboard in both Figma and the browser.

Mockup of an iteration of the dashboard
Mockup of an iteration of the dashboard

Another version; text wrapping here caused awkward spacing at various browser breakpoints.

As we continued to build out the front end, we validated our calculations by funneling the new data into a spreadsheet. This approach allowed us to simulate what our customers would see and refine our metrics before finalizing the dashboard components.

Challenges and Iterations

We iterated quite a bit to refine the metrics. Here are a few of the issues we faced:

  • Accounting for repeat visits
    As we analyzed the data, we noticed the numbers were skewed higher than expected because we hadn’t accounted for repeat visits to a video page by the same user. For example, when a user revisited the page to comment or read the AI summary, we were mistakenly counting the full video duration again as watch time saved. We solved this by tracking cumulative visits, which properly accounted for diminishing returns and gave us more accurate results (a simplified sketch of this weighting follows below).

  • Low ROI Metrics for Certain Accounts
    For a few accounts, we noticed a large amount of cumulative watch time and low counts for watch time saved, which meant a low ROI number. A few of these were bigger accounts, so I met with our Customer Success Lead to discuss. After some digging, we realized the companies with low ROI numbers were using our platform primarily to host training or onboarding videos; their employees were required to watch the full video, which led to the low Watch Time Saved metric. To address this, we introduced a new metric:

    • Total watch time
      Cumulative hours of video that users have actively watched.

Metric component showing total time watched and average playback speed

Showing total watch time reinforces the platform’s value even when the time savings aren't significant.

This helped emphasize the value of our platform for companies with a low Watch Time Saved, as their cumulative hours watched were often substantial.
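
To illustrate the repeat-visit fix described above, here’s a simplified sketch of how repeat views by the same user could be discounted rather than re-credited in full. The decay factor and data shape are assumptions chosen for clarity, not the exact production weighting.

```ts
// Simplified sketch of diminishing returns for repeat visits.
// The 0.5 decay factor is an illustrative assumption.

interface Visit {
  userId: string;
  videoId: string;
  minutesSavedThisVisit: number; // savings computed for this visit in isolation
}

function dedupedMinutesSaved(visits: Visit[], decay = 0.5): number {
  const priorVisits = new Map<string, number>();
  let total = 0;
  for (const visit of visits) {
    const key = `${visit.userId}:${visit.videoId}`;
    const seen = priorVisits.get(key) ?? 0;
    // Credit the first visit in full, then discount each repeat visit
    // instead of counting the full video duration again.
    total += visit.minutesSavedThisVisit * Math.pow(decay, seen);
    priorVisits.set(key, seen + 1);
  }
  return total;
}
```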

We also considered scenarios where the “money saved” metric might be lower than a company’s annual spend on Rewatch. While rare, we identified these accounts and planned proactive outreach to provide additional support and education. Additionally, we added an insights component with supportive messaging for metrics below certain thresholds.

Lessons Learned

Designing a dashboard that delivers meaningful analytics was an ambitious task. While we faced challenges, the insights we gathered helped us form a clear understanding of what our customers wanted to see and allowed us to answer the questions they had been trying to piece together from basic usage data. By focusing on metrics like time saved and ROI, and ensuring transparency in how those metrics are built, we can better demonstrate the tangible value provided to customers and build trust along the way.

TLDR
  • Find solutions that serve your customer while reinforcing the value of the business
  • Create transparent metrics to build credibility and trust
  • Sometimes a spreadsheet is the best prototyping tool

That wraps it up. Feel free to reach out with any questions or comments. Thanks for reading!

Note: I’m planning to do a project write-up on the marketing site ROI calculator, which came with its own unique set of challenges, so stay tuned.