Soon, I start the next phase of my career with a new organization after working with the same company for more than a decade. It promises to be an amazing experience. Key connections and friends from my prior role have said their goodbyes or made arrangements to stay connected. However, there is one group in particular that truly helped me in my career to whom I must bid a fond farewell: a vendor, Questionmark.
Questionmark provides enterprise-level assessment and evaluation software (Perception) and a powerful analytics platform.
Within my first two years using the tool, I deployed over 10,000 questions across 3 languages for over 300 courses in a highly regulated environment. In fact, it was such an effective solution that our department eventually deployed virtually ALL of our LMS content through Questionmark, even hacking Questionmark to embed content from other elearning tools (as well as PDFs or Office documents) from a central content repository. This gave me far more flexibility in updating content than constantly uploading SCORM packages to the LMS and encountering manifest errors. Our group also essentially abandoned LMS reporting, because the breadth and depth of information from Questionmark analytics was vastly superior to the data we could tease out of our LMS.

[blockquote type=”blockquote_line” align=”right”] If you are involved in assessment design or development, you owe it to your users and yourself to leverage these resources to improve your craft.[/blockquote] The resources offered by Questionmark on their company website for writing valid assessment items or correctly interpreting question data are the best in the industry. It is no secret that I am an Articulate fanboy and that I believe anyone in elearning design or development should engage in the Articulate community. However, if you create assessment and survey assets for your users, you should explore Questionmark’s site. From starter topics like different ways to use assessments, to complex topics like understanding P-value or how to set an appropriate score threshold, they share a wealth of high-quality, well-researched resources.

My hacks with their platform expanded into solutions our team hadn’t seen before and had struggled to deliver. I developed observational assessments for salesfloor activity and a question-driven dynamic catalog for employees to filter and access assets scattered across various platforms on our company’s intranet.
I could pinpoint which exact step in a software simulation posed the most challenge to our employees, and also identify that employees were not adapting their sales approach appropriately for our second most common customer type. I was using the tool in unique ways to answer the big questions managers were desperately seeking answers for.

It may seem strange to bid a vendor farewell, but I’ve benefited greatly from my 10+ year relationship with the product and company. They provided a great solution, and my experiences with their support team have always been an example of what support should be. It is not likely that my new role will require me to perform significant assessment development. But through my decade-plus run in a world of compliance and certification, with heavy regulatory oversight and audits, Questionmark was an integral partner for our learning solutions. I am very thankful for their partnership, expertise, and support while in my prior role and bid them a very fond farewell (as a customer; I will always be a fan and a friend).
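The question-driven dynamic catalog mentioned above boils down to progressive filtering: each answered question narrows the list of assets until the employee is pointed at what they need. A toy sketch of that mechanic (the asset records and question fields here are invented for illustration, not the actual implementation):

```python
# Toy sketch of a question-driven catalog. Each answered question
# becomes a filter; answering more questions narrows the results.
CATALOG = [
    {"title": "POS Refund Job Aid", "topic": "sales", "format": "pdf"},
    {"title": "New Hire Sales Sim", "topic": "sales", "format": "course"},
    {"title": "Inventory Audit Checklist", "topic": "operations", "format": "pdf"},
]

def filter_catalog(assets, **answers):
    """Keep only assets that match every answered question."""
    return [a for a in assets
            if all(a.get(field) == value for field, value in answers.items())]

# Employee answers: "What topic?" -> sales, "What format?" -> pdf
results = filter_catalog(CATALOG, topic="sales", format="pdf")
print([a["title"] for a in results])  # ['POS Refund Job Aid']
```

The real catalog lived behind Questionmark questions rather than a Python function, but the filtering logic is the same idea.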

Workplace assessments are critical, for both the employee and the organization. However, common practices are so poor that Learning & Development professionals are squandering their best opportunity to have meaningful impact. Assessments are almost invariably placed at the end of a body of content as a completion trigger. Most are still multiple-choice recall tests due to a deeply ingrained content fetish. Tons of time is spent gathering information and agonizing over presentation treatments to make it “just right” for the target users. Assessments, however, are generally an afterthought, slapped on at the end of the development process and driven by the content matter, not the expected performance outcomes.

Even in cases where an assessment does focus on performance (such as a good simulation or application scenario), it is “a little too little, a little too late”. The assessment itself may be adequate to evaluate whether or not an employee can apply skills, but this design has all the wisdom of having a baseball player read a “how to swing a bat” manual, sending them to the batter’s box, and then coming to a conclusion on batting ability. A wee bit late for feedback to really improve performance, no? Even if the feedback does note what the issues are, which most often it doesn’t.

The biggest problem with assessment is actually a common perspective on its purpose: it is almost always cast into the role of judge at “the end” of a learning process. Far too seldom are assessments considered as a key tool for coaching and feedback embedded within a learning process. With more assessments designed as experience/feedback loops (practice with solid diagnosis and feedback), Learning & Development has the greatest opportunity to actually help a user improve their performance. A batter isn’t judged at their first “at bat”; coaches have them practice repeatedly and will tailor feedback to help the batter improve.
Sports teams and individual players are surrounded by assessments: scoreboards, team and player ranks, individual player statistics. These are ultimately the measure of their performance, the judgement of skill. Employees in a business are no different. They have many metrics to which they must answer. Learning & Development doesn’t need to place themselves in the role of judge. There are enough scoreboards across the business. Except in cases where the employee must be evaluated as safe or compliant with a critical process, we should step away from the role of judging. Even in the cases where judgement may be needed, it shouldn’t be the first (and often the only) feedback we have provided.

Many have expressed concerns over our increasing abilities to collect and analyze more data. Perhaps it is our own doing, since typical designs have Learning & Development talking at users by providing information to them, and only asking for enough dialog back from the employees to judge them in some manner. That’s not a partner in development. That’s not coaching. It’s judgement. Workers deal with enough judgement and measures. They need partners and coaches to help them perform more successfully. Let’s have the critical and constant dialog needed so that our “final” feedback to users isn’t the crack of a gavel noting “you pass/fail”, but an encouraging slap on the fanny as they step into the stadium: “You’ve got this. Go for it.”


Last week, I had the privilege of presenting as part of the eLearning Guild’s Online Forum for Assessments and Evaluation. I was able to attend several sessions from fellow presenters and was astounded by their offerings. It was truly an honor to present among such insightful, committed, and respected professionals. [blockquote type=”blockquote_quotes” align=”right”] Significant work and support is provided by the Guild to create high-quality offerings from presenters each and every time.[/blockquote]The high quality of Guild events is no accident. It is designed into the process. I wanted to share my experience as an online forum presenter. Significant work and support is provided by the Guild to create high-quality offerings from presenters each and every time. The quality is all about the support, and the Guild provides a tremendous amount of it.

Christopher Benz, the Director of Online Events for the Guild, approached me about presenting as part of this forum after my live presentation at Learning Solutions 2011. Chris manages an amazing amount of detail for the online forums, from covering the initial expectations and timeline with presenters right through to performing the final sound check just moments before presenters go live (and everything in between, which I won’t attempt to cover in this blog). Chris makes the process manageable for presenters and handles each question and issue with grace and ease. He is a true master of his craft.

Every presenter is also assigned a mentor. Mentors orient presenters to Adobe Connect, keep them on track with tasks leading up to and during the event, and participate in practice sessions to offer advice about enhancements. A minimum of three mentoring sessions are held prior to the live event. I am very thankful for all the guidance and support provided by my mentor, Melissa Chambers. Her contributions and insights as my mentor resulted in key improvements to my session.
She also partnered with me during the presentation to take on the task of managing the interactive polls, so I would not need to multitask. If you are ever extended the opportunity to speak on an eLearning Guild Online Forum, take it. I cannot speak highly enough about all the Guild does to support presenters in moving confidently through the process, and the expert advice and insight that is shared during this process. It not only improves the final deliverable, but also develops the presenter’s skills and insights about what it takes to construct and deliver a high-quality online forum.

I have been invited to speak as part of the eLearning Guild’s Online Forum: eLearning Assessment and Evaluation Approaches, August 18 and 19. The event has an amazing lineup of professionals, and I am honored to be included. I will cover: [list style=”arrow” color=”blue”]
  • techniques to provide valuable data for ALL levels of evaluation
  • different content-embedding strategies to optimize flexibility and workflow without risking critical data or creating common LMS “hiccups”
  • techniques to collect multiple measures to validate metrics and provide opportunity to aggregate or parse data collected
  • how to structure assessment assets at many levels for reuse in multiple contexts
  • a simple trick anyone can use to expand one good question into deep question banks
[/list] Please inform anyone responsible for assessment or evaluation of training programs about this event.
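On that last bullet: I won’t preview the session’s specific trick here, but one common way a single good item grows into a deep bank is parameterization, where the stem, key, and distractors are all generated from variables. A rough sketch with a made-up item and values (this is my illustration, not necessarily the technique the session covers):

```python
import itertools

# One parameterized stem; each (n, price) pair yields a bank variant.
# Distractors are generated as plausible arithmetic slips on the key.
template = "A customer buys {n} items at ${price} each. What is the total?"

def make_variants(ns, prices):
    bank = []
    for n, price in itertools.product(ns, prices):
        key = n * price
        bank.append({
            "stem": template.format(n=n, price=price),
            "key": key,
            "distractors": [key + price, key - price, n + price],
        })
    return bank

bank = make_variants(ns=[3, 4, 5], prices=[2, 10])
print(len(bank))  # 6 generated items from one template
```

The same pattern works for non-numeric items by substituting scenario details (names, products, situations) instead of numbers.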

Click the image above to view the one-minute video about Internet Security Awareness training and how your organization can get: [list style=”arrow” color=”blue”]
  • A free security audit to measure how Phish-prone™ your organization is
  • First2Know™ Internet Security Awareness Training to make your employees less susceptible to phishing attacks (25 minute, high-quality training that is updated to keep pace with industry changes)
  • Clear reporting and measures of the training’s effectiveness through training reports and scheduled phishing security tests
  • ThreatApp™ daily smartphone updates with relevant threat intelligence
[/list] If you didn’t leave my site to begin using this service (I’d forgive you), I would like to share information about the CEO’s design concept and my involvement in the project, which was to help deliver the Dynamic Content Updates (DCU)™ technology and to select a solution that can run on client LMS systems or through a KnowBe4 portal customized for each client.

Cybercrime is a behavioral issue

You’ve probably heard – a lot – about all the hacks that have occurred, from Epsilon (and the long list of impacted companies), to Sony, to the CIA… even the bombing and shootings in Norway had cybercriminals using the incidents to phish for profit within 24 hours. [blockquote type=”blockquote_quotes” align=”left”]Cybercrime thrives because of behavioral issues.[/blockquote]Cybercrime is a big problem; half a billion dollars in 2010 and growing. The amazing thing is that the weakest link in defending against cybercrime is end users. Cybercrime thrives because of behavioral issues. Yes, there are inherent weaknesses in antivirus software, which only protects against a portion of threats (yes, only a portion, and perhaps a smaller one than you expect!). However, finding weaknesses in computer code, hardware, or antivirus software is difficult. It is much simpler for hackers to use social engineering to trick users into inviting them right into the system.

The cost of a single breach can be crippling to a business. And it may only take one successful hack of the right account for an internet thief to hit the jackpot: just one. These criminals are flooding the internet with attempts to produce results. In fact, if the internet pipeline were a faucet in your home, 80% of it would be clogged with this garbage. Of all the tools in a hacker’s arsenal, phishing is the tool of choice. It is the optimal pathway for them to gain a backstage pass to your company’s network. The strategy depends on fooling a user into some action. It could be as simple as clicking a link.
User actions are the keys to success for cybercrime to thrive. This is why Internet Security Awareness Training is truly the best defense against cybercrime. But I didn’t know any of this. I learned it all because I met Stu.

Stu Sjouwerman, founder of KnowBe4 and an amazing training designer

There are many ways to describe Stu Sjouwerman, founder of KnowBe4: Serial Entrepreneur, Author, IT Security Expert, Marketeer, and truly one of the nicest people I have had the pleasure of working for. All of these are accurate descriptions. Many wouldn’t think to describe Stu as a training designer, but he was the IT Security Expert who recognized that cybercrime is primarily a behavioral issue that can be improved through training, and he designed a great solution to address it. I wish I could take credit for the design, but Stu had it all worked out; my involvement was to help deliver on it. Stu’s vision for developing Internet Security Awareness Training illustrates key principles of good training design: [list style=”arrow” color=”blue”]
  • Clearly understand your problem. Stakeholders value training when they clearly understand the problem it will solve for their business. The free security audit measures how Phish-prone™ employees are. This is an actual test of your company’s employees, not an abstracted, generalized conclusion of risk exposure based on industry averages. The tests are real-world examples using the same tactics as cybercriminals. There is no risk for stakeholders to gain a clear and accurate measure of the true risk exposure for their organization.
  • Design a system, not a course. Too often, information security training is designed like a marketing campaign. A lot of information is blasted at users as one course to complete. Get the check in the box to “mark compliance”, and be on your way. This won’t produce meaningful or lasting behavioral change. Many courses don’t provide any type of experience in how to react to threats, opting instead to test recall of facts about cybercrime. Even courses with skills-application testing have a critical design flaw: users know that they are in training and that their actions are being measured. This heightened awareness that their behavior is monitored in the training environment can influence users to act differently than they normally would in the work environment. In contrast, KnowBe4 uses the security audit to measure the actual on-the-job results to establish a baseline. The provided training presents scenarios to educate you on how to react to potential threats, and what to do if you suspect your system is compromised. The training measures capture what you learned. What matters more is what transfers to the workplace, where you are not in training with cybercrime top-of-mind as a core subject matter. Are learned skills being applied? Ongoing scheduled phishing security tests enable stakeholders to see how skills transfer to work, and enable the organization to take corrective action when necessary. Support tools such as ThreatApp™ complement the training. To make cybercrime prevention effective in a business, a course alone simply won’t do. It requires skills application in the real-world workflow, not a separate test experience within a learning event where learners realize they are being monitored.
  • Repetition, reinforcement = results. [blockquote type=”blockquote_quotes” align=”right”]Clients in a test campaign realized an immediate overall 74.55% reduction in phishing susceptibility after the first training session. But continued phishing tests and supplemental training reduced the Phish-prone™ rate to 0% for all these clients by the 5th cycle.[/blockquote] The results are transparent and undeniable. The security audit illustrates the starting point, the training reports clearly indicate what skills are developed, and the ongoing phishing tests measure how these skills are translating to applied results in the workplace. Dr. John Medina stated that most of learning is controlled forgetting, and reminds us of the importance of reinforcement. These principles are applied by this training design. Let’s talk about results: clients in a test campaign realized an immediate overall 74.55% reduction in phishing susceptibility after the first training session. Continued phishing tests and supplemental training reduced the Phish-prone™ rate to 0% for all these clients by the 5th cycle.
  • It needs to be real. The scheduled phish tests are as real as a true attack. The mechanics to perform a fake-phish are complex and were designed by white-hat hackers to exactly replicate all the components of a real cyberattack (just without the malware part). This is critical. A poorly designed fake-phish might be easily identified by users (i.e., if everyone in the organization got the same phish attempt email at the same time). Also, fake-phishing has to bypass all the safeguards put in place by an IT team, so that it reaches users to measure their behavior instead of being blacklisted and blocked by network defenses. By essentially replicating every move cybercriminals use to get to employees in the organization, stakeholders can trust that the results reported from phish tests are valid, reliable, and most importantly, specific to their organization’s security weaknesses.
  • It needs to be relevant. Relevance is a critical challenge to address in security training. Cybercriminals constantly change tactics and attack vectors at a blinding pace. The training must keep in lockstep with these changes to stay relevant. This is no small task. It required the development of a proprietary Dynamic Content Updates (DCU)™ technology to enable the training to update with industry changes without disrupting user registration or completion data.
  • It needs to work… …and that means work on many levels. It works to drive behavioral change. As for a training design that works, Stu envisioned something quick and focused despite the complexity of the topic (only 25 minutes to produce the behavioral change needed), high-quality and interesting (let’s be honest: there is a lot of tech training that could be sold as a sleep agent, and we needed to avoid that), and easy to navigate (we didn’t want to build a mini-course within the course to explain how to use the course… erm, that sentence was as painful to type as it is to experience one of these designs). Finally, we had to determine how this would work for deployment. Looking at potential clients, we realized that some would have LMS systems, and others would not. We needed a solution to allow clients with an LMS to use their system, while also providing access, tracking, and reporting services to clients without an LMS. All of this needed to be done so KnowBe4 had centralized control to perform the content updates and manage access for the subscription-based service while keeping costs and administrative overhead for clients extremely low.
[/list]

Strategies to deliver on the design

This is the design vision Stu shared with me when I walked into his office during our first meeting. My task was to find opportunities to deliver on this vision.

Challenge 1: Deployment with SCORM Cloud

The first key challenge was to determine how to deploy the content to the client base. Some clients have LMS systems; others do not. KnowBe4 needed central control over the content for the critical and frequent updates, and to administer access controls. For our clients without LMS systems, we needed to provide a customized portal for access, tracking, and reporting. After a review of over 50 potential solutions (LMS/LCMS/CMS vendors, assessment systems, portal tools, and other cloud-based services), SCORM Cloud was chosen as our solution. SCORM Cloud allows KnowBe4 to centrally perform the content updates and administer access permissions. This was preferable to providing SCORM packages to clients to upload into their systems, which would require immense levels of effort to coordinate. It was also a far superior alternative to purchasing an LMS system and working to coordinate access with client LMS systems (because when either vendor updates, it often requires reconfigurations to maintain the ties between the systems). The SCORM Cloud pricing model proved much more cost-effective than LMS systems. Another key element of SCORM Cloud is the ability to use its API to create custom portals for access, tracking, and reporting, and to tie SCORM Cloud to other critical business services. The KnowBe4 site is a mashup using SCORM Cloud’s APIs, customized parts of the site coded by the development team, and other backend services like Salesforce™ to provide customers an integrated, seamless experience.

Challenge 2: Creating a Dynamic Content Updates (DCU)™ Engine to Keep Pace with Industry

Stu provided a development script to the team at Prometheus Training.
They produced a great piece of engaging, easy-to-navigate content in Articulate Studio. Now, we had to figure out a way to expose pieces of the SCORM-packaged content so that we could update the content frequently without uploading new packages that risk impacting user registration, progress, or completion data. For those of you who have worked with SCORM, you know this is quite a trick. Without giving up any secrets of our DCU™ “secret sauce”, I can share that this required a mashup between elements of Articulate Studio 9 and elements of prior versions of Articulate. It was one of the white-hat hacker security pros, Brian, who had the vision of building a utility to expose content for updating “from the side” without really breaking open the core SCORM assets. I was very lucky to have Brian look at how the different versions of Articulate package content to expose a pathway for this to be possible. Who better to expose the opportunity than a professional hacker (note: Brian uses his powers for good; this is hacking for a good purpose)? After looking at a few options, we finally found a mashup of tool versions that would work. Brian built a proprietary tool to feed the updates from the DCU™ into the course without cracking open SCORM and risking user data.

We did hit some challenges in pulling off the trick. By substituting portions of different versions of Studio that weren’t designed to go together, we experienced some unexpected side effects. Some of the interactions in other parts of the training (parts we left in the original Studio 9 engine) stopped working as expected.

Support matters most: Kudos for the Articulate Team

I have said often in LinkedIn and ASTD chatrooms that support is the key differentiator for any vendor. I have yet to use a system that I haven’t experienced a problem with. Therefore, it’s your partner’s response to the problem that matters most.
High praise must go to the support team at Articulate for two reasons: [list style=”arrow” color=”blue”]
  • First, they went to the archive for prior versions of Articulate no longer sold for us to perform the experiment. It was a very strange request and they had every right to refuse. We were essentially asking permission to explore “retired” versions of the product to crack open and mashup into a new solution for a very unique business purpose. They graciously honored the request.
  • Second, and more amazing, was when we put the pieces together to get the DCU™ component operating, but then saw unexpected consequences in another part of the asset, they provided support. Again, they had every right to refuse. Not only was this on a retired product, but a mashup between components from their organization that weren’t designed to be put together. But, they stuck by our side and guided us to the results we needed to build something truly unique.
[/list] [blockquote type=”blockquote_quotes” align=”right”]Each individual of your organization can be trained to eliminate the risk exposure to cybercrime for less than I paid for my last two cups of coffee. [/blockquote]The superb customer support and technical insights from the team at Articulate guided me to the options to fix the issues. So, the DCU™ was successfully created, all content was back to operating as designed, and we had a way to update content successfully without uploading new SCORM packages that could risk critical user data. I am immensely proud of being involved in the project and grateful for the support of all involved. I truly believe in the product. KnowBe4 is an extremely valuable service with a great design that delivers real results. Each individual in your organization can be trained to eliminate their risk exposure to cybercrime for less than I paid for my last two cups of coffee. That’s not a marketing claim; that’s a fact you can measure in your business.
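For readers curious what the Challenge 1 portal pattern looks like in practice: the portal holds one set of SCORM Cloud credentials, registers each learner against the centrally managed course, and later polls for results. A rough sketch of that flow (the endpoint path and field names follow SCORM Cloud’s current v2 REST API as an assumption, not the exact calls we used in 2011; all IDs and names are invented):

```python
import base64

# Illustrative sketch of a custom-portal integration with SCORM Cloud.
# The portal owns the credentials; learners never touch SCORM packages.
API_BASE = "https://cloud.scorm.com/api/v2"  # assumed v2 base URL

def auth_header(app_id, secret_key):
    """SCORM Cloud v2 uses HTTP Basic auth with appId:secretKey."""
    token = base64.b64encode(f"{app_id}:{secret_key}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

def registration_payload(course_id, learner_id, first, last):
    """Build the body for creating a registration against the one
    centrally updated course (field names are assumptions)."""
    return {
        "courseId": course_id,
        "learner": {"id": learner_id, "firstName": first, "lastName": last},
        "registrationId": f"{course_id}-{learner_id}",
    }

# e.g. with the `requests` package (not executed here):
#   requests.post(f"{API_BASE}/registrations",
#                 headers=auth_header(APP_ID, SECRET),
#                 json=registration_payload("ISAT", "jdoe", "Jane", "Doe"))

payload = registration_payload("ISAT", "jdoe", "Jane", "Doe")
print(payload["registrationId"])  # ISAT-jdoe
```

The appeal of this design is exactly what the post describes: one course, updated centrally, with per-client registration and reporting layered on through the API rather than through package uploads.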

The recent Electronic Entertainment Expo (E3) conference showcased the hottest trends in video games, from motion controls and social gaming to 3D and mobile options. With Gamification as a hot buzzword in training, it’s good to be aware of these developments (after all, we all want to Level Up, right?). For training gamification, I hear a lot about engagement, interaction, tablets, mobile, 3D, immersion, and occasionally social (usually social and gaming are separate discussions, but you do find a few social gaming concepts like checkins, badges, and leaderboards enter into training discussions). [blockquote type=”blockquote_line” align=”right”] a powerful element of gaming mechanics: assessing performance against shifting objectives in a dynamic team environment.[/blockquote] These things are great, but what I don’t hear is discussion around a powerful element of gaming mechanics: assessing performance against shifting objectives in a dynamic team environment.

Currently, I don’t see any instructional design construct that can come close to fairly assessing performance when the user is placed in fluid scenarios and must play a different role in a dynamic manner. The change in role is not just based on variances in the situation. It is largely dependent on how other players, whether fellow team members or opposing players, are interacting in the system. Massively Multiplayer Online (MMO) games seem to have game mechanics and algorithms that allow the system to continually rate each user’s contributions as they perform within the context of the system and the actions of other players, despite frequently changing roles. Current training development systems don’t do this, which is a serious gap. The trends are clear: [list style=”arrow” color=”blue”]
  • No worker today works independently; we work in teams and need to be evaluated as an individual knowing how to perform in a team context
  • The pace of change is accelerating; objectives shift over time; evaluation mechanics must adjust to fairly assess performance against these changes
  • We change our individual actions based on all factors of a scenario which includes the actions of our other team members
[/list] All of these characteristics mirror the world we work in. Targets shift, and team member actions impact how we contribute. MMOs seem to have more of the secrets to measuring all these factors worked out than any traditional training development tool or measurement construct I have seen. I think exploration into these techniques can allow training designers to more fairly assess performance in a way that reflects the work landscape today: dynamic and team-based. Your thoughts?
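To make the mechanic concrete: I don’t know the actual algorithms MMOs use, but the core idea could be approximated by weighting each action against whatever objective and role were active when it happened. A toy sketch (all roles, weights, and events are invented for illustration):

```python
# Toy sketch of role- and objective-aware contribution scoring.
# Each event is scored against the objective and role active at that
# moment, so a "fair" score shifts as the scenario shifts.
ROLE_WEIGHTS = {
    ("defend", "tank"): 2.0,    # tanking while defending is high value
    ("defend", "healer"): 1.5,
    ("attack", "damage"): 2.0,
    ("attack", "tank"): 0.5,    # tanking during an attack phase helps less
}

def contribution(events):
    """events: list of (objective, role, effort) tuples; unknown
    objective/role pairs get a neutral weight of 1.0."""
    return sum(ROLE_WEIGHTS.get((obj, role), 1.0) * effort
               for obj, role, effort in events)

match = [
    ("defend", "tank", 10),     # phase 1: objective is to defend
    ("attack", "tank", 10),     # phase 2: objective shifts, same role
    ("attack", "damage", 5),    # player switches role mid-match
]
print(contribution(match))  # 2.0*10 + 0.5*10 + 2.0*5 = 35.0
```

Identical effort earns different credit depending on the objective and role in play, which is exactly the kind of shifting, context-aware evaluation traditional training assessments lack.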