Whether it’s project management, accounting and billing, or learning, software is integral to the productivity of any worker and the success of any workplace. While software is often a genuine blessing in our professional and personal lives, when it’s time to make a change or dip your toes into a different technology pool, where do you begin? How do you navigate software evaluation without feeling stressed and overwhelmed by the sea of options?
When it comes to the available learning software choices out there, the sky really is the limit. But pause. Take a deep breath (or three). Because we’ve developed and tested an eight-step software evaluation process that will help you make the right choice for your company and your learners.
Looking for a little more zen while mulling that next upgrade or new solution? Keep reading!
Let’s take software evaluation in strides
Recently, we decided our project management (PM) software needed more than a minor tune-up. The technology had matured a lot over the years, and our current software was neither intuitive nor meeting the needs of the team beyond our project managers.
Because we believe in going the extra mile to understand what resonates with our end users, we started our evaluation journey at a critical launching point: discovery.
Step 1: Discover what your learners (aka end users) need
First and foremost, be sure to pick the brains of your users—i.e., your learners—so you can understand what they truly need out of the software experience. After all, when your users feel included and involved in the decision-making, they’re more likely to embrace your final decision, no matter where that ultimately lands. (Groans aside, teamwork really does make the dream work.)
During this step, you should come away with an answer to this key question: What are the biggest user pain points or challenges you’re hoping this new software will solve?
Step 2: Pinpoint your must-haves
Translate the problems and needs you uncovered in the discovery phase into tangible requirements. Ask yourself, what does your LMS or LXP have to do to solve those problems or address learner needs? For our PM software evaluation, we developed a comprehensive spreadsheet that included:
- Giving each of our requirements a brief description that explained what we wanted the software to do. (Feel free to also note if and where there’s any flexibility in your particular requirements.)
- Recording whether requirements were really must-haves or simply “nice-to-haves.”
- Weighting every requirement between one and five (five equals showstopper and one equals skippable).
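The spreadsheet described above can be sketched as a simple data structure. This is a minimal illustration, and every requirement row below is a hypothetical example, not one from our actual evaluation:

```python
# A sketch of a requirements spreadsheet: each row holds a description,
# a must-have flag, and a weight from 1 (skippable) to 5 (showstopper).
# All requirement rows are hypothetical examples.

requirements = [
    # (description,                        must_have, weight)
    ("Kanban boards for visual planning",  True,      5),
    ("Time tracking per task",             True,      4),
    ("Slack integration",                  False,     2),
]

# Sort so the showstoppers surface first when comparing vendors.
by_priority = sorted(requirements, key=lambda r: r[2], reverse=True)
for desc, must, weight in by_priority:
    label = "must-have" if must else "nice-to-have"
    print(f"[{weight}] {label}: {desc}")
```

Keeping the weights in one place like this also pays off later, when testers score how well each candidate tool meets each requirement.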
Step 3: Curate your list of possibilities
To build that list, see what other people have said about the options by consulting online business software review platforms like G2 Crowd and Capterra. If your budget permits and there's one available for the software you seek, you could also consult the Gartner Magic Quadrant.
And believe it or not, you can and should go ahead and Google it.
As simple as it sounds, a Google search can give you valuable insight into what software solutions are out there, as well as what technology may be up and coming.
Step 4: Go window shopping
Window shopping is a great way to soak up information about a lot more than what’s trending in men’s or women’s wear. As we mentioned earlier, your list might become pretty epic, so window shopping websites is the first step toward culling it down. Feel free to linger at your own pace, but don’t feel pressured to take more than 10 to 15 minutes to complete this step. Here are some additional tips:
- In our opinion, a software product’s website often looks as good as, or even better than, the actual software. If the website doesn’t feel intuitive or look impressive, the product probably isn’t going to deliver what you’d like it to either.
- Check out the software product’s features while you peruse their website. A couple of questions and points to consider:
- At a glance, do the features generally meet your needs?
- You can get a good sense of a product’s target audience and point of view by examining the software features they’re highlighting. Do those features feel like they meet your macro-level needs?
- If the company has made their roadmap or release notes public, this will provide insight into the features they’re working to improve or will soon debut, and where their software is headed in the future.
- During window shopping, aim to shorten your list by 50%.
Step 5: Take the software on a trial run
Take the software out for a spin—and yes, choose the scenic route. After all, if you don’t make time to try it, you’ll never really know if it’s going to get the job done for you. Sign up for a free trial or reach out to a sales rep to receive a demo and trial account. In our trial phase, we spent two to four hours with each product to judge if it checked the box for our five most important must-haves. In trialing, you should trim your list by an additional 25-50%.
Step 6: Designate your testers (and put them to the test)
Now that you’ve narrowed down your options, it’s time to truly swim in them. Recruit a small group (six or seven people) to get up close and personal with your top software contenders. This test period will ultimately determine whether the product performs and streamlines your workflow.
There are a variety of ways you can put your test group to the test. We created a prototype stage where we gave a cross-functional team of individuals a scoring guide (rubric of use cases) for rating each product. To aid in the scoring process, we also provided the team with several prompts. (You might want to designate a test group leader who can take an even deeper dive, too.)
Combined with providing their overall perspective on what the PM software did and didn’t do well, Maestro’s test group scored each tool on a scale of 1 to 5 for each use case. For our prototype stage, we used weighted scoring. (The nitty-gritty: each requirement carried a weight between 1 and 5, with 1 equaling not important and 5 equaling most important. Participants then used the same 1-to-5 scale to rate how well each tool implemented the given requirement, and that rating was multiplied by the weight to produce a final score.)
Once your group has “run the software gauntlet,” sum up the scores to determine a final number for each platform.
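The weighted-scoring math above is straightforward to sketch: multiply each rating by its requirement's weight, then sum the products. The requirement weights and ratings below are hypothetical examples, not our actual evaluation data:

```python
# A minimal sketch of weighted scoring: each tool's final score is the
# sum of (requirement weight x tester rating), both on a 1-5 scale.

def weighted_score(weights, ratings):
    """Multiply each tool rating (1-5) by the matching requirement
    weight (1-5) and sum the products into one final score."""
    return sum(w * r for w, r in zip(weights, ratings))

# Weights set during Step 2: 5 = showstopper, 1 = skippable.
weights = [5, 4, 2]   # e.g., Kanban boards, time tracking, Slack integration
tool_a  = [4, 5, 3]   # one tester's ratings for Tool A
tool_b  = [2, 4, 5]   # ...and for Tool B

print(weighted_score(weights, tool_a))  # 46
print(weighted_score(weights, tool_b))  # 36
```

Note how weighting changes the outcome: Tool B scores highest on the lightly weighted requirement, so Tool A wins overall despite a lower raw rating there.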
Step 7: When it’s close to liftoff, schedule a pilot period
At this point, you should have two (or maybe three) possibilities to pilot. A short pilot period gives you the opportunity to run a real-world situation through the software.
- For your pilot, you could run a workflow, project, or in the case of learning, a corporate training module.
- Make sure you’re engaging the individuals who will actively use the tool on a day-to-day basis. Remember, their level of engagement will affect their confidence and buy-in on the decided winner. In person or via Zoom, go through your task or process in “simulated time,” speeding up the normal workflow so all steps can be completed in an hour or two.
- Depending on your aspirations, business criticality of the software, and/or budget, you could opt to pay for one or two pilots through your top vendors. Then select a few members from your test group to see how the software performs over a short time period (e.g., three months).
- Structure your pilots so they capture constructive quantitative and qualitative feedback from all of your players. Again, provide this group with a rubric for scoring specific functions and features, and gather their broader impressions and thoughts in a live, interview-style discussion.
Step 8: Reach your verdict—and the winner is?
If you’ve found a clear winner, congratulations! Your software evaluation work is done (when it comes to software selection anyway). If you haven’t reached a verdict, fear not. Gather and review all of the quantitative data you collected through your rubrics and score sheets to see if there’s an obvious favorite.
Still unsure? Talk to your test group and anyone else you asked to participate in your evaluation process. Have these MVPs rank your remaining options. Try to investigate beyond the average rank. This is where weighted scoring can be especially valuable in driving the conversation. (For example, sometimes a single person’s not-so-rosy ranking can drag down the overall average, so seeing that a tool was in the 1 or 2 spot on all ballots but one can provide greater perspective.)
Last but definitely not least, listen to your gut. There’s a reason people tell you to rely on it for more than hunger cues.
Looking for more about how to organize your learning for success?
We have more project management content just for you. Discover more.