Ecosystem 2014 Resources
Next week is the eLearning Guild’s first ECOSYSTEM conference. I am very happy to be speaking.
A good story and a nice presentation are fine, but the sessions that have always given me the most value are those with concrete takeaways I could bring back and apply to my work. With that in mind, I’ve committed to giving my session attendees (and anyone else interested) a tool they can use.
My tool is an Excel spreadsheet to help evaluate systems. It offers a structured format to define and prioritize requirements. You then send those requirements to vendors and ask them to demonstrate how their systems address your needs. Don’t settle for claims; make them show you (I suggest short Jing or Screenr videos). On each vendor tab, you enter your rating of the vendor’s ability to address each requirement. You also rate usability from multiple perspectives (end users, administrators, managers, etc.).
When you have completed your evaluation, the spreadsheet highlights the top five performing systems for each category and overall. Another tab of charts maps functionality and usability ratings onto a matrix so you can focus on the vendors in the “magic quadrant”.
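To make the quadrant idea concrete, here is a minimal Python sketch of the same mapping. This is not the spreadsheet’s actual formula logic; the vendor names, ratings, and the 5-point scale with its midpoint are all invented for illustration.

```python
# Sketch of the "magic quadrant" mapping: each vendor gets an average
# functionality rating and an average usability rating, and lands in a
# quadrant relative to the midpoint of the rating scale.
def quadrant(functionality, usability, midpoint=2.5):
    """Classify a vendor's position on an assumed 0-5 rating scale."""
    high_f = functionality >= midpoint
    high_u = usability >= midpoint
    if high_f and high_u:
        return "leader"                       # the quadrant to focus on
    if high_f:
        return "functional but hard to use"
    if high_u:
        return "usable but limited"
    return "weak on both"

# Invented example data: (avg functionality, avg usability) per vendor.
vendors = {
    "Vendor A": (4.2, 4.5),
    "Vendor B": (4.0, 1.8),
    "Vendor C": (2.0, 2.1),
}
for name, (f, u) in sorted(vendors.items()):
    print(name, "->", quadrant(f, u))
```

The spreadsheet does this visually with a chart; the point is simply that two aggregate scores per vendor are enough to separate the strong all-round candidates from the rest.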
Here is an example of a completed evaluation: sampledata.
The data in the spreadsheet is from actual evaluations. Vendor names have been changed, and all sample files and comments were removed (with the exception of one vendor, whose ratings are not included in the spreadsheet). Beyond protecting identities, though, all the data is valid. You can apply filters to the charts to get different views, and change a vendor’s ratings to see how their results would shift. You can even change the priority weighting for criteria, which can significantly influence the overall outcomes. This lets you see how the priorities a client sets determine which systems are a fit.
And that’s the point: if you get your criteria and prioritization right, they should lead you to the “best fit” systems (I find most system evaluations far too light on defining criteria, let alone prioritizing them).
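The weighting logic behind this is simple enough to sketch in a few lines of Python. The 1–3 priority scale comes from the tool; the criteria names, rating scale, and vendor numbers below are invented to show how re-prioritizing can flip a ranking.

```python
# Weighted score: each vendor rating is multiplied by the criterion's
# priority (0-3), so low-priority criteria barely move the total and
# zeroed-out criteria are ignored entirely. All data below is invented.
def weighted_score(priorities, ratings):
    return sum(p * r for p, r in zip(priorities, ratings))

priorities = [3, 1, 2]    # e.g. SCORM support, gamification, reporting
vendor_a   = [5, 1, 3]    # strong on the high-priority criterion
vendor_b   = [2, 5, 4]    # strong where this client doesn't care

print(weighted_score(priorities, vendor_a))  # 3*5 + 1*1 + 2*3 = 22
print(weighted_score(priorities, vendor_b))  # 3*2 + 1*5 + 2*4 = 19

# Swap the first two priorities and the ranking flips with them:
flipped = [1, 3, 2]
print(weighted_score(flipped, vendor_a))     # 1*5 + 3*1 + 2*3 = 14
print(weighted_score(flipped, vendor_b))     # 1*2 + 3*5 + 2*4 = 25
```

Same vendors, same ratings, opposite winner; that is why getting the prioritization right matters more than the raw feature checklist.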
Here is the blank Template of the EvaluationTool.
Don’t like my criteria? Change them! The sheet is intended to capture YOUR needs. Although I offer over 70 unique criteria, there are obvious biases; some criteria simply haven’t been a consideration in the evaluations I have performed (not much on social, nothing on gaming or badges, little on Talent Management or content management). Whatever doesn’t work for you, change it on Tab 1 – Evaluation Sheet. All the other tabs read from Tab 1, so changing criteria on the Evaluation Sheet updates them across all other worksheets. However, if you remove a row, you will have to do the same on all worksheets. Alternatively, you can just zero out the priority for any criteria your organization won’t value.
The Excel document is completely open and unlocked. Adapt it in any way to fit your organization’s needs. If you make an adaptation to the spreadsheet that you think I should consider including in the template, please don’t hesitate to let me know.
The process to use the spreadsheet to perform an evaluation:
- Define your criteria in Tab 1 – Evaluation Sheet (put each one in terms of what you need the system to DO for your business), and think from multiple perspectives/functions.
- Remember- if you don’t like what is there, change it. Make this yours.
- Assign a priority of 1 (low) to 3 (high) to each criterion (or zero it out if you don’t want it included in the evaluation).
- Once your list is complete, copy the CRITERIA ONLY (don’t share the priority weighting; it influences vendors’ responses) and send it to the vendors, requesting a quick demonstration (5 minutes or less) of how their system addresses each need.
- Tools like Jing or Screenr that limit recordings to 5 mins are ideal. Show me. Fast.
- Provide resources as necessary. If you use Articulate and publish to SCORM 2004, provide sample files. I highly recommend generic examples (e.g., “ACME, Inc.”), since this is labor-intensive for vendors. Using generic files and specifying that your organization’s name is not to be used in the demonstration allows the vendor to post the recording publicly for marketing or support purposes.
- If a vendor balks at providing quick recordings, consider how serious they are about winning your business. Many will instead push to schedule hour-long demos with the full evaluation team. Since most stakeholders truly only need to evaluate a small portion of the criteria, that is not a wise use of anyone’s time.
- When vendors reply, select an empty vendor tab in the evaluation spreadsheet, put the vendor’s name in cell A1 (it feeds back into Tab 1 – Evaluation Sheet and Tab 2 – Charts), and copy their responses into the second column. ONLY paste into the response column; don’t overwrite the criteria in column A, which read in from Tab 1 – Evaluation Sheet.
- Before going too far with ratings, do a QA pass to ensure you’ve put the correct responses in the correct rows, or you may be rating against the wrong criteria.
- Rate how their system fulfills the requirement. Grade critically.
- For any criterion where the vendor’s reply is confusing or absent, follow up as necessary to give them the opportunity to properly represent their system’s capability.
- In Tab 1 – Evaluation Sheet, provide a Usability Rating in the vendor’s column for each area (End User, Manager, Reporting, Admin…).
Repeat this process for each vendor.
On Tab 1 (Evaluation Sheet) and Tab 2 (Charts), you will see the weighted scoring, charts, and highlighting of top-performing vendors to assist (not perform) your decision-making.
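As a rough sketch of what that per-category highlighting computes, here is the same idea in Python: weighted totals accumulated per category, then vendors ranked within each. The category names, criteria, and ratings are all invented; the spreadsheet does this with formulas and conditional formatting rather than code.

```python
# Sketch of "top 5 per category": accumulate priority-weighted scores
# per category, then sort vendors within each. All data is invented.
from collections import defaultdict

# (category, priority) for each criterion, in row order.
criteria = [("Reporting", 3), ("Reporting", 2), ("Admin", 1)]
# Each vendor's ratings, in the same row order as the criteria.
ratings = {
    "Vendor A": [4, 3, 5],
    "Vendor B": [5, 5, 1],
}

totals = defaultdict(dict)  # category -> {vendor: weighted score}
for vendor, scores in ratings.items():
    for (category, priority), rating in zip(criteria, scores):
        totals[category][vendor] = (
            totals[category].get(vendor, 0) + priority * rating
        )

for category, by_vendor in totals.items():
    top5 = sorted(by_vendor, key=by_vendor.get, reverse=True)[:5]
    print(category, top5)
```

Notice a vendor can lead one category and trail another (here Vendor B wins Reporting but loses Admin), which is exactly why the sheet highlights top performers per category rather than only overall.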
I could write a ton more about the tool, how to use it, and the process, but the best way to learn? Download. Tinker. Have fun.
And if you have any questions, comment below. If you are attending LSCon or Ecosystem 2014, I will be in Orlando between the 19th and 21st. Track me down; I’m happy to talk you through it.
Although this is the big takeaway from our Ecosystem 2014 presentation, I assure you there are still more great tips and learnings to be shared.