Monday, July 16, 2012

Educational Analytics: A #bbw12 Idea that Might Not Be Ready for Prime(number) Time

JD's note: This was originally going to be my final blog post from #bbw12. Turns out I had a lot to say about analytics. Khan Academy Keynote review later in the week.

I am sitting on a porch lakeside in Michigan trying to get motivated to write the last blog on #bbw12. And as we get farther and farther away from the sultry weather, the dazzling light shows, and the amazing food, I realize that there are only a few more things to cover...Away we go...

But first...

Interlude: G-drive and Chromebooks go great together
When I powered up my Chromebook yesterday (yes, I have been thinking about typing for over a day now), I was happy to see an upgrade for the old Chromium OS. I clicked restart and, when I logged back in, didn't notice much difference. But as I started to flip through G+ photos to prepare for the blog and some updates to the IHSFA website (yeah, I got a webmaster gig), look what I saw:

Screen capture, saved to G-Drive, Edited in Aviary, Linked to Blog - Cloud Win
As I first mentioned way back in my "To the Clouds" review of G-Drive and apps, I longed for my G-Drive to be integrated into the Chromebook's file system so that other apps would see the files with save-and-open functionality. Now it's a reality. I haven't played enough to see if it holds across users, but this boosted my productivity a lot.
End Interlude

An Introduction to Analytics: What they are, why they matter and what educators need to know

I wish that title described what this blog was going to be about, but I am not sure I was able to glean from this session what its succinct description promised. So here's what I got:

Ellen Wagner, @edwsonoma, gave the presentation. All credit to her, she made me think a lot about analytics. The session began with a picture on the screen of a tornado barreling down the road -- the simple message: analytics are not going anywhere. We need to learn how to deal with them in education.

She began by describing the levels and roles of analytics. It is not just seeing patterns in data...in fact, that might be considered the lowest level of analysis. It is also looking for actions that can be taken based on those patterns to influence outcomes. Even better, it is being able to predict actions that WILL BE TAKEN based on the digital breadcrumbs left by actions taken in the past. Sounds like science fiction? It probably should: this is what Google, Amazon, and the other big tech companies have been working on for years (note from the speaker: "I won't even go to a shopping website that doesn't have a decent suggestion engine"). As data migrates to the cloud and computers gain the power, locally or cloud-based, to crunch the numbers, these predictions become a lot more useful than knowing that JD will buy anything featuring JoCo or @feliciaday.
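To make those levels concrete, here is a toy sketch of my own (not from the talk, and the quiz scores are made up) showing the jump from the lowest rung -- describing patterns that already happened -- to the predictive rung, forecasting what comes next from the breadcrumbs:

```python
# Descriptive vs. predictive analytics on one (made-up) student's
# weekly quiz scores. Descriptive = summarize the past; predictive =
# fit a trend to the breadcrumbs and project forward.

scores = [62, 68, 71, 75, 79]  # weeks 1-5, hypothetical data

# Descriptive: what already happened.
average = sum(scores) / len(scores)

# Predictive: least-squares trend line, projected to week 6.
n = len(scores)
xs = range(n)
x_mean = sum(xs) / n
y_mean = average
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, scores))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean
predicted_week6 = intercept + slope * n

print(f"average so far: {average:.1f}")        # the pattern
print(f"predicted next score: {predicted_week6:.1f}")  # the forecast
```

The real prediction engines Wagner described are vastly more elaborate, but the shape is the same: yesterday's data in, tomorrow's behavior out.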

OK: lots of data. Lots of computing power. We will have the ability to predict and work in advance instead of constantly playing a reactionary game of catch-up. No problems so far...

Question One: To Mine or Not to Mine
Wagner draws a distinction between research, which is predominantly empirical analysis, and business use, which is largely predictive data mining. "We have been trained that data mining is bad," she notes. She then goes on to explain differences in structure, collection, and tools based on these two different outlooks. But she raised a question that was never answered: is data mining bad, or were we all taught wrong in our master's classes? It is not enough to say, "Business does it." If there are inherent problems with data mining, those need to be addressed BEFORE we adopt these processes as the educational norm.

She reiterates that educational data is already being tracked and crunched now and that there is no way to avoid it. She talks about analytics maturing into the biggest thing to hit the classroom since computers.


She points out that we (and by "we" she means "the data whisperers" -- her AWESOME term -- who do this analytic magic) are not sure where to start, which metrics will be most useful, or which permutations to use. No wonder some educators are nervous!


She lays out some of the challenges: siloed data (LMS, transactions and outcomes, latent data, demographic data, perceptual data, financials, operations... *whew*). She lays out some of the methodological differences that need to be addressed (including that data mining question above). She refers to all of this, and the half-hearted attempts to analyze it, as leading to "bowls of data spaghetti."
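The silo problem is easy to see in miniature. A toy sketch of my own (the systems, student IDs, and values are all invented): each silo covers a different, overlapping slice of students, so even the simplest join forces decisions about what a missing value means -- and that's before anyone argues about metrics:

```python
# Three hypothetical silos, each keyed by student ID but covering
# different students -- the seed of the "data spaghetti" problem.

lms_logins   = {"s001": 42, "s002": 7}            # LMS activity silo
gradebook    = {"s001": 88.5, "s003": 91.0}       # outcomes silo
demographics = {"s001": "12th", "s002": "11th"}   # SIS silo

# Union of every ID seen anywhere, then a best-effort merge.
all_ids = sorted(set(lms_logins) | set(gradebook) | set(demographics))

merged = {
    sid: {
        "logins": lms_logins.get(sid),       # None = gap in that silo
        "grade": gradebook.get(sid),
        "grade_level": demographics.get(sid),
    }
    for sid in all_ids
}

for sid, row in merged.items():
    print(sid, row)  # note how every student is missing something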

Question Two: Which Is the Dog? Which Is the Tail?
Another challenge she draws out but does not pin down is the move to personalize education. She acknowledges that personalization does (and may continue to) play directly against the need to extract normalized and usable data for the prediction engines. Wow. When put that way, I have a sinking feeling that all of these data acolytes might just be paying lip service to student-centered learning -- say it ain't so, Gates Foundation!

Her penultimate example was Moneyball -- a baseball team using analytics to analyze what it takes to win so that it could hire accordingly. "They solved their problems through recruiting...we will have to do it through other means." Don't get excited: she didn't say what those means were.


Reflection: Lights and Shadows

The Good: I am intrigued by the power of real-time data being applied in a way that gives teachers a heads-up about learning as it happens. She made a compelling case that we could change lessons, projects, and expectations mid-stream so that learning is turned around before it becomes a post-mortem analysis of "what went wrong." I am a little intimidated by the amount of training necessary to teach this kind of on-the-fly flexibility. I don't think it will come naturally or easily, but I could see it being a game-changer.
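The mid-stream heads-up could be as simple as watching a rolling window of recent scores and raising a flag before the unit ends. A minimal sketch in my own framing (not from the talk; the window size, threshold, and score stream are all invented):

```python
# Flag a student mid-stream when the rolling average of their recent
# scores dips below a threshold -- a toy version of "heads-up as it
# happens" instead of post-mortem analysis.

from collections import deque

WINDOW = 3          # look at the last three scores
THRESHOLD = 70.0    # flag when the recent average falls below this

def watch(stream, window=WINDOW, threshold=THRESHOLD):
    """Yield (position, rolling_average) alerts as scores arrive."""
    recent = deque(maxlen=window)   # automatically drops oldest score
    for i, score in enumerate(stream):
        recent.append(score)
        if len(recent) == window and sum(recent) / window < threshold:
            yield i, sum(recent) / window

# Made-up score stream for one student across a unit: a mid-unit slump
# triggers alerts at positions 4 and 5, while the unit is still running.
alerts = list(watch([85, 80, 72, 60, 58, 90]))
print(alerts)
```

The hard part, as the session made clear, isn't the arithmetic -- it's knowing which metrics and thresholds actually mean anything, and training teachers to act on the flag mid-stream.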

The Bad: At some point after the Moneyball story, while discussing the ways we will use this data, she joked, "We are not using this [data] to get rid of you [teachers]...that's another story." She got quite a few chuckles. Combine that with her admissions above that those doing the research on analytics are still determining the proper metrics, debating permutations, and discussing how to normalize the varieties of data from so many different sources...

Analytics will be a game-changer in education -- the same kind of paradigm shift as when computers were introduced to the classroom. But I unboxed my first school computer when I was in kindergarten, 32 years ago. It took a while for the change to happen, and a number of educators are still debating the value of that revolution or the form it should take with our students.

Analytics is in its toddler phase when it comes to education. Yet the Holy Mantra of Educational Analytics is already being invoked to determine teacher compensation and retention, school value-add, and whether there is a need for educators at all (the next blog will address the Khan Academy Keynote). How can we laugh at the idea of teachers being worried about the spectre of analytics when that ghost is being invoked by those in power as a mystic force containing all the answers -- answers which always seem to serve the agenda of the politician, the educational technology company, or the administrator doing the invoking?

The expert said it: 
There is a fundamental difference between doing research on analytics and applying those analytics to help our learners succeed -- based on everything I heard, we are firmly in the former, but some are pretending that we are well into the latter.

We do educators a disservice when we say they are afraid of data.
  • We're afraid of that data being manipulated and used to push agendas which harm students, our institutions, and our colleagues. 
  • We sense that the analysis is incomplete and that those patterns aren't clearly known, despite what some say. 
  • We are willing to learn and understand what these patterns are telling us about students and learning. 
But we also want the tools and ability to call foul when the numbers don't add up. 

We need the analytics to be open and transparent to all. 

We need the data whisperers to be unbiased...but if they can't do that, I suggest that they always be biased toward student choice and independence...at least that puts the power in the hands of the least advantaged -- a noble goal for a bunch of number crunchers.