We’ve all heard about Artificial Intelligence, but less commonly mentioned is Artificial Stupidity. The first time I heard the term was during the Learning Analytics webinar featuring Ryan S.J.d. Baker from the International Educational Data Mining Society. After several minutes of participants telling Ryan the audio quality was poor, Ryan checked his settings. The mic was set to let the computer interpret input and adjust as it saw appropriate. After a successful correction, Ryan noted it was a great example of artificial vs. genuine stupidity.

This really illustrated a powerful issue with analytics. In Ryan’s case it was a single dimension and a very simple miscalculation, yet it took many points of feedback before anyone went and checked whether the “computer thinking on our behalf” was, in fact, the source of the core problem. When questions arose during the session, such as “how are decisions made about proper intervention?” and “when would a teacher be preferred to the system itself?”, the response was essentially that validation is key.

This may illustrate a bias in the focus of Educational Data Mining (EDM) toward automation. Ryan noted that EDM’s focus is on automated discovery leveraging human judgement, a component focus, and automated adaptation (no human intervention); the focus is on the model itself. He contrasted EDM’s focus with that of Learning Analytics, which he described as supporting human judgement, facilitating automated delivery, evaluating systems as wholes, and informing and empowering learners and multiple stakeholders with information drawn from data. I see a few potential challenges with EDM as opposed to Learning Analytics. [list style=”arrow” color=”blue”]
    • First is the automated adaptation without human intervention. Yikes. We just saw a very simple, single-dimension case of “computer adapts based on its inputs” fail. Ryan noted that “validation is key”. I trust that sound engineering is a much more mature science, with far fewer variables, than big data and analytics, and if it can’t quite get things right within a very discrete range of variables, I don’t think I will be entrusting computers to do the bulk of “thinking and/or adapting” for some time. Heavy lifting on collecting, clustering, and summarizing data? Sure. Triggering actions based on analysis, without having someone validate that the formulaic result of the number-crunching makes any logical sense? No thanks. (A small sketch at the end of this post illustrates the distinction I have in mind.)
 
  • Second is the issue of component focus. I see a few issues here. Some of it is merely mechanical, something I have seen many times when switching from LMS system X to system Y, or HR platform A to B: mapping data tables. When systems upgrade and change, it is not uncommon for them to tweak their database schema. Even if you are ultimately porting the data to a warehouse or a federated model, when systems update there may be information that doesn’t map properly. The challenge is this: as we attempt to map more and more sources of data and fields, the mapping becomes much more complex and much more of a risk. Vendors or departments managing systems and databases are not always obligated to take, nor focused on, the systemic view of the impact of changes to their data. I understand what EDM means by a component focus, but there has to be interplay between components and the larger system for it to be sustainable. Just as no man is an island, in today’s environment no application or database is an island either (although many of them were built in a time when the focus was silos, not sharing).
  • Another major issue I see with EDM is the focus on the model itself. As we tie in more databases, fields, feeds, correlations, and triggered actions, we begin to build things so complex and so big that we may not be able to truly manage them, nor comprehend what the model was built for. There are great examples in this TED Talk from Kevin Slavin.
  • Kevin’s talk also illustrates another key issue: awareness. By blindly following models, or being trained to do so, users often don’t stop to question. Ryan not stopping to check his settings is a very small but illustrative example: folks don’t often assume that the model itself is broken and question it. Kevin shows in his TED talk where algorithms go unchecked. In my own business, I see employees blindly follow templates, processes, and machine prompts instead of thinking critically. This is mainly an issue of how they are trained (to follow a system or process) rather than being trained to truly understand the underlying logic or the benefits the system is supposed to provide. Deeper awareness enables folks to recognize when the model, its input data, or its results are compromised. If data feeds change, business rules change, or outlier results get introduced into the system, the model cannot evaluate and rewrite its own algorithms. In the world of change we live in, the rules, logic, and thresholds that make sense today have a short shelf-life before they need to be adapted; without intervention, even the best-validated model can quickly become obsolete. And with the complexity of the systems being developed, it will take more than a small core of authors to be aware of and manage the system.
[/list] Learning Analytics’ focus on informing and empowering stakeholders with information drawn from data seems, at this point in the development cycle of Big Data and Analytics, to be the more appropriate fit. The science of writing, adapting, and managing big data and decision algorithms is not yet mature enough to keep us out of harm’s way when a large amount of human involvement is removed. Computers are great at thinking within a defined set of logic and variables, and superb at crunching amounts of data that would prove problematic and cumbersome for a person. An individual’s judgement can see grey areas, and critical thinking can step outside prescribed algorithms and evaluate whether the model itself is broken. This is why I think decision support is currently the best primary role for big data in learning, and it seems that this is the focus of learning analytics. I suspect that as the science matures, we will move more toward EDM’s focus on automation. My bias could be largely influenced by a system I saw presented at Learning Solutions 2011, where Ray noted that the key success point for the project was presenting the information so managers and employees could determine the intervention, as opposed to purely prescribing actions triggered by data.
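To make that distinction concrete, here is a minimal sketch of what I mean by decision support. Everything in it (the Learner record, the flag_at_risk rule, the review queue) is invented for illustration and deliberately trivial; the point is where the loop ends: with information handed to a person rather than an action fired automatically by the machine.

```python
from dataclasses import dataclass

@dataclass
class Learner:
    """A hypothetical learner record; fields are illustrative only."""
    name: str
    quiz_avg: float         # average quiz score, 0.0 to 1.0
    logins_last_week: int

def flag_at_risk(learner: Learner) -> bool:
    """A deliberately simple 'model' that flags possibly at-risk learners."""
    return learner.quiz_avg < 0.6 and learner.logins_last_week < 2

# Decision support: findings go into a queue for a human (manager, instructor)
# to review, rather than automatically triggering an intervention.
review_queue = []

def analyze(learners):
    for learner in learners:
        if flag_at_risk(learner):
            review_queue.append({
                "learner": learner.name,
                "evidence": {
                    "quiz_avg": learner.quiz_avg,
                    "logins_last_week": learner.logins_last_week,
                },
                "suggested_action": "check in / offer coaching",
            })

if __name__ == "__main__":
    analyze([Learner("A. Example", 0.55, 1), Learner("B. Example", 0.82, 5)])
    for item in review_queue:
        print(item)  # a person reviews the evidence and decides; the system only informs
```

An EDM-style automated adaptation would replace that final review step with a triggered action; that is exactly the step I am not yet comfortable handing over.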

I just wrapped up week 1 of the Learning Analytics and Knowledge MOOC. The focus of week 1 was the pair of key questions “Why Learning Analytics?” and “Why Now?” My experience with this MOOC illustrates the key drivers at a micro level. A MOOC, or Massive Open Online Course, is designed to have no true center for all the activity. We use Twitter, a wiki, distributed contributions across participant blogs (on many different blogging platforms, in different languages), and a Diigo group. Even the core resources posted to the course wiki are many and varied in format: week 1 activities included many readings in PDF and web page formats, a few hours of video, and two live webcasts. It’s a microcosm of the web, a small sample of how the web works (and this isn’t even counting physical assets that provide data exhaust).

How does a MOOC participant manage to transform all this data to gain insights for his or her specific context? Some structure was provided in a wiki, and daily email alerts with summaries are sent to participants. But even with this support, it was noted on the first webcast that the way to approach the dailies was to skim and pursue items of interest. So it’s essentially manual labor; I wish I had some data mining and analytics skills to help me explore and discover the items most relevant to my focus, which is corporate learning application as opposed to academic (a rough sketch of the kind of filtering I have in mind is at the end of this post).

The challenge was truly illustrated because of business priorities and some family illness; I didn’t get to the content until the weekend. The data was piled pretty high, with some great contributions from my peers in the course (shameless plug for participants to read this thoughtful piece from Wolfgang Greller’s Weblog). I was already behind and had no system to analyze and prioritize. It was difficult to do anything more than baseline consumption of the required data, plus a few additional morsels, and with the amount of effort needed just to consume the data in that timeframe, time for reflection was inadequate.

Even on this relatively small scale, the information was quite a bit to manage. Despite daily update emails providing some structure, and only being a week in, the information comes fast, grows quickly, and is loosely structured. At its simplest level, this is why analytics is needed now: information is growing at a pace we cannot effectively make meaning from, let alone make decisions on, without analytics methods and tools.[blockquote type=”blockquote_line” align=”left”]At its simplest level, this is why analytics is needed now: information is growing at a pace we cannot effectively make meaning from, let alone make decisions on, without analytics methods and tools.[/blockquote]

The course itself had amazing examples of the demands on institutions and the strategic benefits being realized by organizations applying analytics effectively; it is clear that if data is harnessed and analyzed effectively, it generally supports better decision making. And in the current climate of accountability for results, we have to use every arrow in our quiver to deliver. The one thing I think is very different in a corporate setting versus an academic setting is this: whatever the other metrics, like completion and retention, it is clear in an academic setting that learning is at the core of the institution and the ecosystem is focused on supporting learning. Corporate settings often don’t have this focus. Learning is one part of a larger business ecosystem, and it is often treated as a very separate entity from the other parts of the business.
Using learning analytics to advance business goals, as well as using external data from the business as part of the learning analytics process, may present more cultural challenges in the corporate context.
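As a footnote to the wish for better triage tools above: here is a rough sketch, assuming course items (posts, readings, tweets) are available as titles and summaries. The keywords and scoring are invented for illustration; a real approach would likely pull the course RSS feeds and use something smarter than keyword counts, such as TF-IDF or topic modelling.

```python
# Hypothetical focus terms for a corporate-learning lens on the course firehose.
FOCUS_KEYWORDS = {"corporate", "business", "workplace", "performance", "manager"}

# Stand-in data; in practice these would come from the course feeds.
items = [
    {"title": "Analytics dashboards for university retention",
     "summary": "Institutional reporting and student success metrics"},
    {"title": "Using learning analytics with business KPIs",
     "summary": "Tying training data to workplace performance measures"},
]

def relevance(item):
    """Count how many focus keywords appear in an item's title or summary."""
    text = (item["title"] + " " + item["summary"]).lower()
    return sum(1 for keyword in FOCUS_KEYWORDS if keyword in text)

# Surface the most relevant items first so a time-pressed participant
# can start with what matters to their context and skim the rest.
for item in sorted(items, key=relevance, reverse=True):
    print(relevance(item), item["title"])
```

Even something this crude would move the dailies from “skim everything” toward “start with what matters to my context”.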