Questionmark provides enterprise-level assessment and evaluation software (Perception) and a powerful analytics platform.
Soon, I start the next phase of my career with a new organization after working with the same company for more than a decade. It promises to be an amazing experience. Key connections and friends from my prior role have said their goodbyes or made arrangements to stay connected. However, there is one group in particular that truly helped me in my career to whom I must bid a fond farewell: a vendor, Questionmark.
Within my first two years using the tool, I deployed over 10,000 questions across 3 languages for over 300 courses in a highly regulated environment. In fact, it was such an effective solution that our department eventually deployed virtually ALL of our LMS content through Questionmark, even hacking Questionmark to embed content from other elearning tools (as well as PDFs or Office documents) from a central content repository. This gave me far more flexibility in updating content than constantly uploading SCORM packages to the LMS and encountering manifest errors. Our group also essentially abandoned LMS reporting, because the breadth and depth of information from Questionmark analytics was vastly superior to the data we could tease out of our LMS.
[blockquote type=”blockquote_line” align=”right”] If you are involved in assessment design or development, you owe it to your users and yourself to leverage these resources to improve your craft.[/blockquote]
The resources Questionmark offers on their company website for writing valid assessment items and correctly interpreting question data are among the best in the industry. It is no secret that I am an Articulate fanboy and that I believe anyone in elearning design or development should engage in the Articulate community. However, if you create assessment and survey assets for your users, you should explore Questionmark’s site. From starter topics such as different ways to use assessments, to complex subjects like understanding P-value or how to set an appropriate score threshold, they share a wealth of high-quality, well-researched resources.
My hacks with their platform expanded into solving problems our team hadn’t seen before and had struggled to deliver workable solutions for. I developed observational assessments for salesfloor activity and a question-driven dynamic catalog for employees to filter and access assets scattered across various platforms on our company’s intranet. I could provide data on which exact step in a software simulation posed the most challenge to our employees, and also identify that employees were not adapting their sales approach appropriately for our second most common customer type. I was using the tool in unique ways to answer the big questions managers were desperately seeking answers for.
It may seem strange to bid a vendor farewell, but I’ve benefited greatly from my 10+ year relationship with the product and company. They provided a great solution, and my experiences with their support team have always been an example of what support should be.
It is not likely that my new role will require me to perform significant assessment development. But throughout my decade-plus run in a world of compliance and certification with heavy regulatory oversight and audits, Questionmark was an integral partner for our learning solutions. I am very thankful for their partnership, expertise, and support while in my prior role and bid them a very fond farewell (as a customer; I will always be a fan and a friend).
Last week, I had the privilege of presenting as part of the eLearning Guild’s Online Forum for Assessments and Evaluation. I was able to attend several sessions from fellow presenters and was astounded by their offerings. It was truly an honor to present among such insightful, committed, and respected professionals. [blockquote type=”blockquote_quotes” align=”right”] Significant work and support is provided by the Guild to create high-quality offerings from presenters each and every time.[/blockquote]The high quality of Guild events is no accident. It is designed into the process. I wanted to share my experience as an online forum presenter. Significant work and support is provided by the Guild to create high-quality offerings from presenters each and every time. The quality is all about the support, and the Guild provides a tremendous amount of it.

Christopher Benz, the Director of Online Events for the Guild, approached me about presenting as part of this forum after my live presentation at Learning Solutions 2011. Chris manages an amazing amount of detail for the online forums, from covering the initial expectations and timeline with presenters, right through to performing the final sound check just moments before presenters go live (and everything in between, which I won’t attempt to cover in this blog). Chris makes the process manageable for presenters and handles each question and issue with grace and ease. He is a true master of his craft.

Every presenter is also assigned a mentor. Mentors orient presenters to Adobe Connect, keep them on track with tasks leading up to and during the event, and participate in practice sessions to offer advice about enhancements. A minimum of three mentoring sessions is held prior to the live event. I am very thankful for all the guidance and support provided by my mentor, Melissa Chambers. Her contributions and insights as my mentor resulted in key improvements to my session.
She also partnered with me during the presentation to take on the task of managing the interactive polls, so I would not need to multitask. If you are ever extended the opportunity to speak on an eLearning Guild Online Forum, take it. I cannot speak highly enough about all the Guild does to support presenters in moving confidently through the process, and the expert advice and insight that is shared during this process. It not only improves the final deliverable, but also develops the presenter’s skills and insights about what it takes to construct and deliver a high-quality online forum.
I have been invited to speak as part of the eLearning Guild’s Online Forum: eLearning Assessment and Evaluation Approaches, August 18 and 19. The event has an amazing lineup of professionals, and I am honored to be included. I will cover: [list style=”arrow” color=”blue”]
- techniques to provide valuable data for ALL levels of evaluation
- different content-embedding strategies to optimize flexibility and workflow without risking critical data or creating common LMS “hiccups”
- techniques to collect multiple measures to validate metrics and provide opportunity to aggregate or parse data collected
- how to structure assessment assets at many levels for reuse in multiple contexts
- a simple trick anyone can use to expand one good question into deep question banks
Click the image above to view the one minute video about Internet Security Awareness training and how your organization can get: [list style=”arrow” color=”blue”]
- A free security audit to measure how Phish-prone™ your organization is
- First2Know™ Internet Security Awareness Training to make your employees less susceptible to phishing attacks (25 minute, high-quality training that is updated to keep pace with industry changes)
- Clear reporting and measures of the training’s effectiveness through training reports and scheduled phishing security tests
- ThreatApp™ daily smartphone updates with relevant threat intelligence
- Clearly understand your problem Stakeholders value training when they clearly understand the problem it will solve for their business. The free security audit measures how Phish-prone™ employees are. This is an actual test of your company’s employees, not an abstracted, generalized conclusion of risk exposure based on industry averages. The tests are real-world examples using the same tactics as cybercriminals. It is a risk-free way for stakeholders to gain a clear and accurate measure of their organization’s true risk exposure.
- Design a system, not a course Too often, information security training is designed like a marketing campaign. A lot of information is blasted at users as one course to complete. Get the check in the box to “mark compliance”, and be on your way. This won’t produce meaningful or lasting behavioral change. Many courses don’t provide any type of experience in how to react to threats, opting instead to test recall of facts about cybercrime. Even courses with skills-application testing have a critical design flaw: users know that they are in training and that their actions are being measured. This heightened awareness that their behavior is monitored in the training environment can influence users to act differently than they normally would in the work environment. In contrast, KnowBe4 uses the security audit to measure actual on-the-job results and establish a baseline. The provided training presents scenarios to educate you on how to react to potential threats, and what to do if you suspect your system is compromised. The training measures capture what you learned. What is more important is what transfers to the workplace, when you are not in training and cybercrime is not top-of-mind as a core subject matter. Are learned skills being applied? Ongoing scheduled phishing security tests enable stakeholders to see how skills transfer to work, and enable the organization to take corrective action when necessary. Support tools such as Threat App™ complement the training. To make cybercrime prevention effective in a business, a course alone simply won’t do. It requires skills application in the real-world workflow, not a separate test experience as part of a learning event where learners realize they are being monitored.
- Repetition, reinforcement = results [blockquote type=”blockquote_quotes” align=”right”]Clients in a test campaign realized an immediate overall 74.55% reduction in phishing susceptibility after the first training session. But continued phishing tests and supplemental training reduced the Phish-prone™ rate to 0% for all these clients by the 5th cycle.[/blockquote] The results are transparent and undeniable. The security audit illustrates the starting point, the training reports clearly indicate what skills are developed, and the ongoing phishing tests measure how these skills are translating to applied results in the workplace. Dr. John Medina stated that most of learning is controlled forgetting and reminds us of the importance of reinforcement. This training design applies those principles. Let’s talk about results: Clients in a test campaign realized an immediate overall 74.55% reduction in phishing susceptibility after the first training session. But continued phishing tests and supplemental training reduced the Phish-prone™ rate to 0% for all these clients by the 5th cycle.
- It needs to be real The scheduled phish tests are as real as a true attack. The mechanics of performing a fake phish are complex and were designed by white-hat hackers to exactly replicate all the components of a real cyberattack (just without the malware part). This is critical. A poorly designed fake phish might be easily identified by users (i.e., if everyone in the organization got the same phishing email at the same time). A fake phish also has to bypass all the safeguards put in place by an IT team, reaching users rather than being blacklisted and blocked by network safeguards before it can measure their behavior. By essentially replicating every move cybercriminals use to get to employees in the organization, stakeholders can trust that the results reported from phish tests are valid, reliable, and most importantly, specific to their organization’s security weaknesses.
- It needs to be relevant Relevance is a critical challenge to address in security training. Cybercriminals constantly change tactics and attack vectors at a blinding pace. The training must keep in lockstep with these changes to be relevant. This is no small task. It required the development of a proprietary Dynamic Content Updates (DCU)™ technology to enable the training to update with industry changes without disrupting user registration or completion data.
- It needs to work… …and that means work on many levels. It works to drive behavioral change. As for a training design that works, Stu envisioned something quick and focused despite the complexity of the topic (only 25 minutes to produce the behavioral change needed), high-quality and interesting (let’s be honest, there is a lot of tech training that could be sold as a sleep agent, and we needed to avoid that), and easy to navigate (we didn’t want to build a mini-course within the course to explain how to use the course… erm, that sentence was as painful to type as it is to experience one of these designs). Finally, we had to determine how this would work for deployment. Looking at potential clients, we realized that some would have an LMS and others would not. We needed a solution that allowed clients with an LMS to use their own system, while also providing access, tracking, and reporting services to clients without one. All of this needed to be done so KnowBe4 had centralized control to perform the content updates and manage access for the subscription-based service while keeping costs and administrative overhead for clients extremely low.
- First, they went to the archive for prior versions of Articulate that were no longer sold so we could perform the experiment. It was a very strange request, and they had every right to refuse. We were essentially asking permission to explore “retired” versions of the product to crack open and mash up into a new solution for a very unique business purpose. They graciously honored the request.
- Second, and more amazing: when we put the pieces together to get the DCU™ component operating but then saw unexpected consequences in another part of the asset, they provided support. Again, they had every right to refuse. Not only was this a retired product, but it was a mashup of components from their organization that weren’t designed to be put together. But they stuck by our side and guided us to the results we needed to build something truly unique.
- No worker today works independently; we work in teams, and individuals must be evaluated on how they perform in a team context
- The pace of change is accelerating; objectives shift over time; evaluation mechanics must adjust to fairly assess performance against these changes
- We change our individual actions based on all factors of a scenario, including the actions of our other team members