I know there was a divided discussion recently on analytics, mostly surrounding user behavior. I wanted to offer a bit more insight on what will be available through the customer support workstream in the future and find out how you want the data we will be collecting and processing delivered.
Don’t get too scared when I say “data”. Customer emails must be collected so we can work with customers via Zendesk, but we use a redaction app for when customers slip and send us too much information (PII, private keys (yes, this happens), etc.).
We will, however, be collecting data to assist with the product roadmap, such as:
- bugs reported, and how many users reported each
- topics that require support, and how many tickets are issued per topic (topics will be broken down by product and feature)
- unhappy users and their feedback
- retention attempts
- customer satisfaction surveys
and then reports on our agents’ performance to help shed light on the job we are doing. We have always run a customized QA program with our agents, and I understand that, given how the DAO works now, a third-party QA system would be the MOST transparent option. But if we can continue doing the QA ourselves with our agents, I can promise we’ll deliver weekly reports on their performance and any standout interactions with customers. All of the data we collect on agent performance will also be available to the public.
…which brings me to my next point. How do you want the customer data delivered? And how do you want the agent performance data delivered? What kind of cadence? Is there any other information you’d like us to share weekly? Monthly?
I’d like to propose the following to get the conversation started:
Weekly: Via #general and at the DAO Sprint on Mondays, we will share general performance metrics: total support volume (tickets), response times, and outstanding issues/bugs.
Monthly: Via #general, a report to review, and perhaps a Support Community Call where, for those curious, we can go over the report in more detail: why customers are reaching out, where the friction points are, what is making people happy, and a deeper look at our agents’ performance.
I’d love to hear about any other KPIs you’d be interested in seeing. Since we’ll be the touch point for almost all users who need assistance, there is a wealth of information we want to make sure is passed on to the creators and builders.