Vettd began with a key AI capability. Our challenge was to find the minimum set of features that allowed users to experience this innovation.
Vettd is a startup founded in 2014 around a key AI capability. With a novel semantic approach to Natural Language Processing (NLP), we could digest a job description and tell you which candidates were the best fits for the position. (Since then, Vettd has evolved into a full-fledged AI platform capable of much more, but I digress.)
Preliminary discovery had already been done before this project kicked off. We were familiar with recruiters, the problems they face, and what they were looking for in solutions. These insights are what inspired our data scientists to deliver the AI magic in the first place.
Vettd wanted to find the minimum set of features that allowed users to quickly experience our AI.
The project was needed to capitalize on Vettd's early discoveries in AI and NLP. Our AI needed to be wrapped in a compelling experience that solved key pain points for our target users.
By the end of the project, we wanted an end-to-end prototype we could put in front of real recruiters for testing and demos.
This was a collaborative effort with folks from all areas of the business. I worked with our founder (a former recruiter) who had done the early discovery work. He and I brought the wireframes together, then worked on testing with users once our prototype was built.
I also worked with our developers and data scientists to implement the designs and build the prototype.
Limited company resources forced us to keep our initial focus narrow. The MVP needed to be lightweight and have just enough features to get people to see what our AI could do.
In terms of visual design, we just wanted something that “looked good enough.” It needed to appear trustworthy enough to handle sensitive user data, but we couldn’t afford to chase pixel perfection. We used the Bootstrap framework with some custom styling.
With early customer discovery done and an AI capability ready for action, we started by mapping out a user journey. From there, we looked for the step in that journey that represented the biggest hurdle standing between our users and our AI superpowers. We decided to focus on the steps of inputting the job description and uploading candidate resumes.
We decided to address this part of the journey with a trusty ol' UI pattern - The Wizard. 🧙
Without files to process, our app would be entirely useless. Our AI required multiple resume files and a single 'job criteria' file in order to function properly.
We selected a wizard UI pattern for this job creation flow for a few key reasons:

- It kept all of the required user inputs (position name, job description file upload, and multiple resume file upload) off a single-page form, which would have made the user flow appear more complicated than it really was.
- Its steps could be ordered according to how recruiters typically began working on a role. Early discovery showed they first needed to understand the role; only then could they begin to analyze candidates.
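To make the step-gating idea concrete, here is a minimal sketch of per-step validation for a wizard like ours. This is an illustration, not our actual implementation; the step names, field names, and the two-resume minimum are all assumptions.

```python
# Hypothetical sketch of a wizard's per-step validation.
# Step order mirrors how recruiters worked: define the role first,
# then add candidates.

def validate_step(step: str, data: dict) -> list[str]:
    """Return problems blocking the 'Next' button; an empty list means proceed."""
    problems = []
    if step == "position_name":
        if not data.get("position_name", "").strip():
            problems.append("Position name is required.")
    elif step == "job_description":
        if data.get("job_description_file") is None:
            problems.append("Upload one job description file.")
    elif step == "resumes":
        # The AI needed multiple resumes alongside the single job criteria file;
        # a minimum of two is an assumed threshold for this sketch.
        if len(data.get("resume_files", [])) < 2:
            problems.append("Upload at least two resume files.")
    return problems

# Example: the user has named the role and uploaded a job description,
# but no resumes yet, so the final step stays blocked.
data = {"position_name": "Data Analyst",
        "job_description_file": "jd.pdf",
        "resume_files": []}
print(validate_step("resumes", data))
```

Gating each step this way is what let the wizard feel simpler than an equivalent single-page form: the user only ever saw one small, clearly-scoped ask at a time.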
Once all steps of the wizard were complete, the user was rewarded with seeing their candidates auto-magically ranked with AI. Candidates, prioritized by a letter grade, were presented in a table format because early research suggested that recruiters were accustomed to working with tables.
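For illustration only, grading and ranking for a table like ours could look something like the sketch below. The score scale, grade cutoffs, and names are invented for this example; they are not Vettd's actual model output.

```python
# Hypothetical: convert an AI fit score (0.0-1.0) to a letter grade,
# then sort candidates for display in the results table.

def letter_grade(score: float) -> str:
    for cutoff, grade in [(0.9, "A"), (0.8, "B"), (0.7, "C"), (0.6, "D")]:
        if score >= cutoff:
            return grade
    return "F"

candidates = [("Avery", 0.93), ("Blake", 0.71), ("Casey", 0.55)]
ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
for name, score in ranked:
    print(f"{letter_grade(score)}  {name}")
# A  Avery
# C  Blake
# F  Casey
```

A letter grade compresses an opaque model score into something recruiters could scan at a glance, which matched their table-centric working habits.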
I worked with our developers to create our prototype web app. We used the Bootstrap framework to help bring it together quickly.
There were three main sub-tasks the user had to complete in order to Create a Position (and thus see our AI in action). We conveyed those steps on the right side of the interface with a "Summary" section.
We used UserTesting to do an unmoderated remote usability test. With the help of UserTesting, we were able to enlist participants who currently worked as recruiters (our target user).
They completed a series of realistic recruiting tasks in the prototype and shared their thoughts aloud as they worked, prompted with follow-up questions along the way.
We created an end-to-end experience to use in testing and demoing. The experience highlighted our AI and helped push the business forward.
The wizard UI worked well in user testing. People had no trouble understanding what was required at each step and typically stayed motivated through to the end of the process, which culminated in a ranked list of candidates. 8 of 10 users completed the flow with minimal issues.
Across a few cohorts of user testing, we were able to clean up the UX and get the validation we were looking for.
We heard quotes like "I would sell this to my executive team immediately. This could be huge." Users also cited time savings, boosted confidence, and improved decision-making as key benefits. We knew we had something worth continuing to pursue.
We looked at the issues people had when interacting with our prototype and used these to map out our next move.
We discovered that recruiters typically didn’t have resume/job description files readily available. 2 of 10 didn't have access to raw resume files. 3 of 10 needed to take time to manually download resume files from another system. This insight would later lead to job board and ATS integration feature additions.
This was my first time working with really experienced devs and data scientists, and it became a crash course in how design and development meet and compromise. There were several instances where the UX suffered because things like file upload and AI processing took longer than anticipated. With a well-defined focus and clear objectives, our decisions on which compromises to make were fairly clear.
We also learned to never underestimate the value of seeing users interact with our software. We ended up going back to use UserTesting for several more projects.
We were able to secure enough funding to continue product development efforts and AI research.
We continued experimentation like this as we iterated and expanded the feature set further. Eventually, we brought a full-fledged, AI-powered Applicant Tracking System (ATS) to market.