Building on the resume optimization feature at Winterview
Winterview | 6 minute read
Making it Easier for Students to Secure Employment
Winterview is a dynamic SaaS startup offering B2B services to universities and higher education institutions, while also serving B2C users, notably students aiming for their dream careers. The Winterview platform provides AI-driven resume enhancements, networking assistance, and interview preparation, equipping students for success in the job market. Winterview is currently in a closed alpha.
The information in this case study is my own and does not necessarily reflect the views of Winterview.
Filling the Gaps
As a Product Designer, I identified and addressed a critical gap in our resume optimization feature, specifically targeting users whose resumes were missing critical information, or who had no resume at all. Over a five-month period, I experimented with different approaches for a resume builder, from thought-provoking fill-in-the-blanks to AI summarization with clarifying questions. The final product delivered a 60% increase in Customer Satisfaction (CSAT) scores and a 33% rise in adoption of the resume optimization and creation feature.
30% of surveyed users did not have a resume compatible with the resume optimization feature.
Imagine you are a third-year student at the University of British Columbia. You hear about Winterview through a friend and eagerly upload your resume to their optimization feature. However, you hardly notice any improvements. "It's not my fault; I did everything I should have," you think.

In reality, your resume isn't even good enough to be improved. This was the case for 30% of our users, whose resumes lacked key elements or did not follow best practices like the STAR format (situation, task, action, result). This issue could reduce user engagement and negatively impact crucial B2B KPIs, such as student employment statistics.
For some students, this was their experience with Winterview's resume optimization feature.
The STAR format is not common knowledge.
We introduced a feature enabling users to build their resume from scratch using a standard template, integrating it with our existing improvement tool for cases where resumes were irredeemable or nonexistent. Initially, we prompted users with questions to describe their experiences in STAR format, collecting responses in four fields and using AI to paraphrase and summarize them into bullet points. While some users appreciated this, others struggled with the STAR prompts, resulting in less effective resumes.
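To make that flow concrete, below is a minimal sketch of how the four STAR fields might be combined into a single prompt for the summarization model. This is my own illustration for the write-up, not Winterview's production code; the field names and the summarizeWithAI helper are assumptions.

```typescript
// Illustrative sketch only: the interface shape and summarizeWithAI() are
// assumptions for this case study, not Winterview's actual implementation.
interface StarResponse {
  situation: string;
  task: string;
  action: string;
  result: string;
}

// Combine the four STAR fields into one prompt so the model can paraphrase
// and summarize them into a single, achievement-oriented bullet point.
function buildBulletPrompt(star: StarResponse): string {
  return [
    "Rewrite the following experience as one concise resume bullet point.",
    "Lead with a strong action verb and keep any metrics the user provided.",
    `Situation: ${star.situation}`,
    `Task: ${star.task}`,
    `Action: ${star.action}`,
    `Result: ${star.result}`,
  ].join("\n");
}

// Stand-in for the platform's LLM call; it simply echoes the prompt so the
// sketch runs end to end without any external dependency.
async function summarizeWithAI(prompt: string): Promise<string> {
  return `[model output for]\n${prompt}`;
}

async function generateBullet(star: StarResponse): Promise<string> {
  return summarizeWithAI(buildBulletPrompt(star));
}
```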
Our initial solution was headed in the right direction, but unfortunately, many users did not fill out the STAR fields "properly": their responses repeated information and lacked KPIs. We tried adding example responses and more detailed prompts, but this was still unsuccessful, leading to bullet point outputs comparable to the one on the right screen.
More detailed instructions won't fix this; high cognitive load is the real culprit.
After conducting usability testing, I realized that the issue was cognitive overload.
During usability testing, completing a task didn't necessarily equate to success. To ensure features were easy to learn, I established time limits as part of the testing criteria, determining how quickly a task should be completed to be deemed user-friendly.
From the post-testing debrief, I found there were 3 things that users had to focus on while filling in the STAR fields:
Given their divided attention, users often overlooked or misunderstood field instructions, causing them to leave out crucial information needed for bullet point creation. Usability testing made me realize that longer, more detailed instructions would exacerbate the issue, not resolve it.
Design Principle: Make users think less and find surefire ways to collect all necessary information.
As a product team, our aim is to decrease the complexity of completing the STAR fields used to create resume bullet points, with the ultimate goal of raising adoption rates and improving employment statistics. If needed, we are prepared to compromise on the quality of user responses to decrease cognitive load and boost usability.
Clarifying Questions: Filling in Our Blanks
I recognized the challenge of reducing cognitive load and collecting all required information without overcomplicated instructions. Our solution was to introduce a new section of clarifying questions to the feature. While these questions don't directly lessen cognitive load, they address issues arising from cognitive overload, such as missed instructions and the stress of precisely filling out fields.
Opting for a modular strategy, I employed clarifying questions in a subsequent step to directly inquire about specific details, thereby reducing cognitive load. For instance, if a user failed to include a metric in the result field, our AI would prompt with a targeted question such as, "By what percentage did you increase Click Through Rate (CTR)?" This approach elicits a straightforward percentage (X%) response, supplying the data necessary to create a compelling bullet point. Direct queries like this encourage users to quickly recall specific pieces of information, in contrast to broad questions that demand more time and cognitive effort to answer.
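As a rough illustration of the idea, the clarifying step can be thought of as a simple check on the Result field followed by one direct question. This is my own sketch: the detection rule and question wording are assumptions, whereas in the real feature the AI decides what to ask.

```typescript
// Illustrative sketch only: the detection rule and question template are
// assumptions; in the actual feature the AI generates the clarifying question.

// Rough stand-in for metric detection: does the Result field contain any
// digit at all?
function hasMetric(resultText: string): boolean {
  return /\d/.test(resultText);
}

// If the metric is missing, ask one direct, easy-to-answer question instead
// of sending the user back to re-read long field instructions.
function clarifyingQuestion(resultText: string, outcome: string): string | null {
  if (hasMetric(resultText)) {
    return null; // the field already has a number; nothing to clarify
  }
  return `By what percentage did you increase ${outcome}?`;
}

// Example: the user described the outcome but left out the number.
console.log(clarifyingQuestion("Increased click-through rate", "Click Through Rate (CTR)"));
// -> "By what percentage did you increase Click Through Rate (CTR)?"
```

Keeping each question this narrow is what lets users answer from memory in a few seconds instead of re-composing the whole field.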
Overall, surveyed users were much more satisfied with this solution; however, there was still room for improvement, so we conducted user testing to uncover any other issues we may have overlooked.
Adding Some Key Supporting Functionality
Following the implementation of clarifying questions, we allowed participants from our usability testing group to experience the entire resume builder, not just the STAR fields. Through observation and user feedback, it became clear that several supporting features were necessary to enhance the overall functionality.
Our areas of improvement were fairly straightforward. After implementing the necessary changes, we spoke once again with the previous set of users to ensure our changes aligned with their expectations.
The idea of the supporting features was to make it easier for users to indicate common options; hence, we added checkboxes to streamline the process.
Changes are circled in red.
What I Shipped: The Final Solution
A complete walkthrough of the redesign.
Increasing CSAT by 60% and adoption by 33%.
The integration of a resume builder into our resume improvement feature proved to be a huge success. User feedback also gave us the idea to let users edit bullet points and the resume itself directly on our platform, instead of relying on an external text editor. To avoid overwhelming users, we chose to let them edit the entire resume at the download stage rather than editing individual bullet points. This addition is currently under design.
Things to Try Next Time
While the project was a huge success, here are some things that I would like to apply next time:
1. Ask Users Directly: It's clear that lengthy instructions often go unread or misunderstood, as seen in our usability tests. If you need specific information, it's best to ask for it in a straightforward and separate way. This approach helps prevent it from being overlooked in the midst of other details.
2. If Possible, Offer Multiple Opportunities: Similar to the flexibility I incorporated in Dyne's meetup feature redesign, the resume builder required all essential steps within its flow. In future projects, along with flexibility, I'd focus on providing users with multiple opportunities to submit key information. For instance, if users initially omit crucial details in the "Action" field, we can't craft an impactful bullet point. However, with our clarifying questions, users get a second chance to provide this information if they forgot to earlier.