Shipped in Three Weeks
The Behind-the-Scenes Story of Building Verkada’s Viewing Station
by Martin Hunt, Alex McLeod, and Nathan Wallace
For most Verkada customers, part of the appeal of our cameras is not having to watch them. Video is stored on-camera and available at a moment’s notice from anywhere, ready for review when and if they need it. But some customers need to monitor their streams live. Security guards and receptionists, for example, rely on video security systems to provide real-time awareness of what’s happening in their physical space. Until recently, we didn’t have a good solution for them.
Web browsers aren’t optimized for decoding multiple video streams, so watching feeds via our Command platform was limited to a maximum of nine simultaneous streams — too few to meet the needs of our larger customers.
Last summer, we decided it was time to find a fix for these users. We needed to move quickly, as key customers and prospects were depending on this functionality. Our goal: develop a new product that would allow customers to stream 16 live video feeds simultaneously.
Early Experiments
The prototyping phase started with Damien Dunn, who began evaluating several ARM (Advanced RISC Machine) platforms, which are generally used for mobile devices like phones and tablets. Over the course of a month, Damien implemented Verkada’s code on each development board and benchmarked performance. He iterated through many combinations of hardware and software to maximize the number of simultaneous video streams that could play without interruption. Using ARM, we were able to improve upon browser performance, but not by much. Getting more than 12 simultaneous streams was still a challenge; our target of 16 seemed out of reach.
Thinking outside the box, we decided to evaluate the Mac mini® as a platform for our live-streaming product. Because Macs are widely used for video editing, Intel’s chipsets have made great strides in hardware video decoding, and they allowed us to quadruple our previous capacity to 36 simultaneous streams. The mini® offered another advantage: we’d be able to reuse our existing decoding software written for iOS. Given our time constraint, the path forward was clear — we would build our live-streaming solution on the Mac mini® platform.
Customers needed a solution immediately, so we didn’t have time for exhaustive in-house testing. We had to get this in the field and test with real customers. So with a lean team of just three engineers — Jeff Zhan, Alex McLeod, and Nathan Wallace — we embarked on a whirlwind sprint. We ported over our video code, rebuilt the UI for macOS, and added custom layout features so an administrator could configure and arrange how video feeds are displayed. Within three weeks, the first units of what was now dubbed the “Viewing Station” were on their way to customers.
Testing in the Field
Our first beta customer was a hospital system needing to monitor patients at risk of self-harm. It’s hard to imagine a more critical use case — if our product didn’t work, even for a moment, a staff member would sprint to each room immediately to check on the patients.
As is always the case with new products, there were issues. Video frames dropped, and the streams often stuttered or froze completely. Our customer grew frustrated.
Testing with real customers in critical environments allowed us to discover bugs much more quickly, but it also presented a major challenge: how do we diagnose and fix issues on a system we can’t interact with directly? We needed reliable remote access while staying mindful of security risks.
To solve this problem, we built our own SSH-like agent on the Viewing Station that enables interactive terminal access. Built with the Go programming language, it functions entirely over HTTPS and limits access to authorized Verkada engineers. This “support agent” runs as a separate process and works by continually making requests to a Verkada proxying service for new commands. If any commands are found, they are executed in a separate pseudo-terminal (PTY) by the support agent. Then, the output of the PTY process is sent back to the proxying service continuously via HTTPS. This way, we can use interactive shell programs like “vim” and “log” to solve almost all technical issues remotely — all without needing to open ports on the device or requiring our customers to return their test devices.
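The agent’s poll-execute-report cycle can be sketched in Go, the language the real agent is written in. Everything here is illustrative: the real agent attaches a pseudo-terminal so interactive programs like vim work, while this sketch substitutes one-shot command execution, and the fetch/report callbacks stand in for authenticated HTTPS calls to the proxying service.

```go
package main

import (
	"bytes"
	"fmt"
	"os/exec"
	"time"
)

// runCommand executes one command received from the proxy and returns its
// combined output. The production agent would run this inside a PTY so
// interactive programs behave correctly; this sketch runs it directly.
func runCommand(command string) (string, error) {
	cmd := exec.Command("sh", "-c", command)
	var out bytes.Buffer
	cmd.Stdout = &out
	cmd.Stderr = &out
	err := cmd.Run()
	return out.String(), err
}

// pollLoop sketches the agent's main loop: repeatedly ask the proxying
// service for a pending command, run it, and report the output back.
// fetch and report are placeholders for HTTPS requests.
func pollLoop(fetch func() string, report func(string), interval time.Duration) {
	for {
		if command := fetch(); command != "" {
			out, _ := runCommand(command)
			report(out)
		}
		time.Sleep(interval)
	}
}

func main() {
	out, _ := runCommand("echo viewing-station")
	fmt.Print(out)
}
```

Because the agent only ever makes outbound HTTPS requests, no inbound ports need to be opened on the device — the property the article describes.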
Remote debugging proved particularly useful for fixing issues with video streaming. For example, the Viewing Station might be streaming video from the cloud even though the customer would prefer to save bandwidth by using the local network only. Alternatively, videos might stutter or freeze after streaming for some time. By tunneling into the Viewing Station, we could check whether cameras were actually accessible on the local network and reliably sending video to the device.
Building this tooling helped us iron out edge cases and ensure data reliably took the right routes, from the camera to the Viewing Station and into the video-decoding pipeline. In the end, we were able to remotely debug the Viewing Stations and clear up the stuttering and dropped frames within a week of the initial deployment.
Fine-Tuning and Tweaks
From there, we focused on additional tweaks to improve the user experience — like making it obvious when and if a stream did freeze, a critical notification for users monitoring feeds with little motion. We were also able to overcome some of the challenges of working with a product that, out of the box, is not designed to be a managed system. By default, a Mac mini® behaves like a typical desktop computer, popping up dialog boxes and update alerts, running screen savers, and loading a home screen after it reboots. By contrast, we wanted the Viewing Station experience to be “headless” — no mice, no keyboards, just a device powering a screen that provides situational awareness to building staff.
Iterating the User Experience
Shipping to real customers with real use cases early meant we could incorporate feedback into our process from day one. Here are just a few of the improvements driven by that feedback:
- Small adjustments to fonts, borders, layouts, animations, etc. took the user experience from good to great.
- We built a system to intelligently position videos on a grid, optimizing layouts and size based on number of cameras, camera aspect ratios, and display dimensions.
- We built a key security feature to notify admins if a Viewing Station goes offline.
- We developed “Motion Tiles” that surface the video feeds with the most activity so users can focus on what’s most important.
Ready to Launch
Once major bugs were addressed and key features implemented, the Verkada team worked together to get Viewing Stations on the market. We stocked the warehouse, trained our sales engineering team, and made a formal product announcement in November 2019.
The Viewing Station VX51 is now available for purchase on Verkada’s website. Dozens of customers rely on the Viewing Station to secure their physical space and provide situational awareness to staff members, and we’re excited to see its use in the field expand.
What’s most exciting is what this project says about the Verkada team — the way we prioritize user experience, the creativity with which we solved this problem, and the speed with which we were able to get a solution out the door.
Interested in joining the Verkada team?
Check out open roles or email questions to recruiting@verkada.com