Reducing time to insight for investors through panel calls
Role
Design lead, Partial PM
Team
Design Intern, Engineering Manager, Director of Product
Timeline
6 Months, Q2/Q3 2023
Tegus
CONTEXT
Tegus is a research platform for investors that houses dense qualitative and quantitative content.
This project aimed to improve the experience for a new type of qualitative content called panel calls. Panel calls are conducted by an investor with a panel of experts in a given domain. Our team was tasked with taking the transcripts of these calls and adding them to our vast database of investor-led research calls.
What was wrong with panel calls?
Panel calls, in a beta version accessible to 350+ users, were developed based on the existing 1-on-1 call format and process. However, differences between panel calls and 1-on-1 calls surfaced various challenges, leading beta users to avoid panel calls altogether.
The operations team was working tirelessly to create a content flywheel.
Problem:
Beta users were avoiding panel calls due to two primary issues: one related to product strategy and the other to design.
Investors were unsure how to effectively use panel calls in their research
Panel call transcripts were hard to follow and not easily digestible (note below)
THE GOAL
How might we improve panel call engagement?
Identifying opportunities
TURNING OPPORTUNITIES INTO SOLUTIONS
Using an opportunity tree, we identified potential solutions and designed experiments to test them. We then tracked those solutions and how we tested each one.
TESTING
We conducted user feedback sessions and ran a pop-up Hotjar survey on Tegus with the goal of understanding what resonated with users.
12 total user interviews
350+ beta users surveyed
Here is the list of experiments from our opportunity tree above, along with notes on how each one was carried out.
TESTING OUTCOME
Folks loved the buzz around upcoming panels, but adding context around multiple speakers and making long-form content easier to consume resonated most with users.
documenting our findings
biggest pain points identified
Difficult to keep track of who’s speaking
Forgetting expert details while reading
Lengthy, dense paragraphs made it challenging for users to consistently keep track of who was speaking. And while an expert's background and perspective are crucial for interpreting a panel call transcript, they're easy to forget mid-read.
using pain points to create design goals
Ensure users remember the expert's background
Make it easy to track who is speaking
ideation
Armed with our research and design goals, our amazing intern created these iterations.
implementation
After our intern had left, we were ready to hand our designs to development. We had two considerations when thinking about how to streamline them.
Engineering constraints: The way the backend was set up for documents added a lot of time to implementing the side panels and made responsive views more complicated
Beta expansion: To get the best feedback before a GA release, we wanted to expand to an unbiased pool of users. This also gave us quick wins, like tuning the hover timing to 1.5 milliseconds
OUTCOME
In the original beta, 15% of users were reading panel call transcripts MoM; by the end, 40% of users were reading panel calls.
We released to GA and sent a feedback survey to all users. One response sums up our work well:
“This is so easy to read - you should do this to all your transcripts!” - Anji Ran, Hedge Fund Analyst