
Reflections on ELI 2011

By Lee Ann Gillen & Cleo Magnuson

Washington Reagan airport with Washington Monument and Jefferson Memorial in background

The 2011 ELI Conference was held in Washington, DC, on February 14–16. The theme of this year’s conference, Educating in the Open: Philosophies, Innovations, and Stories, referenced not only openness to resources such as program code, learning objects, courseware, and other educational content, but also openness to new ideas, emerging technologies, and new practices. ELI’s new Seeking Evidence of Impact (SEI) program was featured in many presentations. There were also many hands-on sessions that showcased emerging technologies such as iPads for teaching and learning.

Some of the highlights included (1) David Wiley’s talk, “Openness, Learning Analytics, and Continuous Quality Improvement”. Dr. Wiley is an Associate Professor of Instructional Psychology and Technology at Brigham Young University in Provo, Utah. One of his talking points focused on how much data is actually available within the learning management system (LMS). This data holds the key to providing educators with quantitative evidence that can inform curriculum redesign using both performance and behavioral data.

As Wiley discussed, much of the data sought in research studies is assessment data, which demonstrates evidence of student performance (for example, test scores and final grades). However, he argues that there are also other useful data types, such as “construct-relevant behavioral data”. For example, this type of data may include how often a student logs in, how long they stay, what they read and work on, and what social-network pathways they establish. By mining this data, educators learn not only about the class as a whole but also more about the individual student.

Wiley showed an example of a “waterfall” diagram that illustrated the relationship between login frequency and time on task for high school students within an LMS. Each data point on the diagram represented one day, and the length of time each student stayed logged in was represented by a circle or pinpoint: the longer the student stayed logged in, the darker the circle. It was clear that the students who were logged in longer were the students with higher performance levels. The take-away is that the instructor can use this information to provide intervention.
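The kind of time-on-task aggregation behind Wiley’s waterfall diagram can be sketched in a few lines. The sketch below is purely illustrative: the session records and their field layout (student, login time, logout time) are assumptions, not an actual LMS schema, and real LMS exports would need their own parsing.

```python
# Hypothetical sketch: summing per-student, per-day time on task
# from LMS session logs, as in Wiley's "waterfall" example.
# The session tuples below are invented illustration data.
from collections import defaultdict
from datetime import datetime

sessions = [
    ("alice", "2011-02-14 09:00", "2011-02-14 09:50"),
    ("alice", "2011-02-14 20:00", "2011-02-14 20:40"),
    ("bob",   "2011-02-14 21:00", "2011-02-14 21:10"),
]

def minutes_on_task(sessions):
    """Sum logged-in minutes per (student, day) pair."""
    totals = defaultdict(float)
    for student, start, end in sessions:
        t0 = datetime.strptime(start, "%Y-%m-%d %H:%M")
        t1 = datetime.strptime(end, "%Y-%m-%d %H:%M")
        day = t0.date().isoformat()
        totals[(student, day)] += (t1 - t0).total_seconds() / 60
    return dict(totals)

print(minutes_on_task(sessions))
# {('alice', '2011-02-14'): 90.0, ('bob', '2011-02-14'): 10.0}
```

Each resulting total corresponds to one circle on the diagram; darker circles would simply be the larger values.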

Wiley then took this one step further, examining how both assessment (performance) data and behavioral data can form a feedback loop that informs curriculum redesign. The data within the LMS can tell the instructor who needs tutoring, and on what. View David Wiley’s presentation as part of the “Free Conference Streaming Sessions”. To read another perspective on David Wiley’s talk, see MIT’s Ed Tech Times blog: http://oeit.mit.edu/blog/david-wiley-philosophy-strategy-and-accountability
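The feedback loop Wiley describes could look something like the following sketch: combine a behavioral signal (minutes on task) with an assessment signal (a score) and flag students who fall short on either. The record format, thresholds, and names here are all hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical sketch of Wiley's feedback loop: flag students for
# tutoring when either engagement or performance is low.
# Thresholds and records are illustrative assumptions.

def flag_for_intervention(records, min_minutes=60, min_score=70):
    """Return students whose weekly minutes on task or score is low."""
    flagged = []
    for student, minutes, score in records:
        if minutes < min_minutes or score < min_score:
            flagged.append(student)
    return flagged

records = [
    ("alice", 90, 85),   # engaged and performing well
    ("bob",   10, 55),   # low engagement and low score
    ("carol", 120, 62),  # engaged but struggling with the content
]
print(flag_for_intervention(records))  # ['bob', 'carol']
```

The two flagged cases call for different interventions, which is Wiley’s point: behavioral data distinguishes a disengaged student from an engaged one who needs help with the material.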

Cleo Magnuson and Thomas Angelo

(2) A featured presentation that addressed the SEI initiative was Thomas Angelo’s “Gathering Evidence: How Less Can Be More”. Dr. Angelo is Pro Vice-Chancellor and Director, Curriculum, Teaching and Learning Centre at La Trobe University in Australia. Quotations in this section are from Angelo’s presentation slides. One key point of his talk was to “design backward, and plan forward”. He urged participants to ask certain questions before they began to gather evidence, among them: (a) “why? – what improvements or changes do you want and what evidence do you hope to gather”; (b) “who? – are the main actors / stakeholders you are hoping to convince with that evidence?”; (c) “what? – will you do with the data or evidence you generated and how will you package or share it?”; and (d) “how? – will you get your target audience(s) interested, engaged, and invested in attending to and using the outcomes?” He advised using a gap-analysis approach, that is, “find the gaps, mind the gaps, and close the gaps” in your own understanding. Finally, he left us with his seven axioms for gathering useful evidence.

1. “Don’t ask if you don’t really want to know.” You can’t bury data you don’t like; it will come back to haunt you. If you’re going to ask, you had better be ready to do something about it.

2. “Don’t collect more data than you can easily and quickly turn into useful information.” There is an optimal amount of data to collect, so use sampling. Collecting less data, but the right kind, is critical. Have a plan for how you will handle the data before you collect it.

3. “Don’t simply adopt evaluation methods and techniques from others, adapt them to your subject and students.”

4. “Before asking a question, always ask yourself: How might responses to this question help me and others improve learning? (If you really can’t answer that, then the question is probably not worth asking).”

5. “Take advantage of the ‘Hawthorne Effect’”: when you shine a light on something (examine it closely), the individual being examined becomes more mindful of their actions, therefore thinks more about their behavior, and often performs better. For example, if you ask students how long they study, they might study longer because they are aware you are looking at that data. So, “you should let respondents know why you are evaluating and how you hope it will benefit them.” It is also useful, whenever possible, to piggyback on existing data or data-collection opportunities (registration, for example).

6. “Remember: If an evaluation is worth doing, it’s worth making sure that respondents know how to answer.” Respondents often don’t know what kind of information you are asking for, or the kind of data you want back from them. For example, instead of asking for one example, ask for three or four; that way you will get at least one good answer. We may need to model the kind of answers we want – not the content, but the type of answer. It also helps for respondents to know why you need these answers, so they understand why they should take the time to respond.

7. “Make sure to close the ‘feedback loop’ by letting respondents know not only what you’ve gleaned from their responses – but how you and they can use that information to improve their learning and success.” One reason students don’t take evaluations of teachers or teaching activities seriously is that they never see the results of the study; there is no closure.

View Thomas Angelo’s presentation as part of the “Free Conference Streaming Sessions”.

(3) The New Media Consortium’s Horizon Project is a “comprehensive research venture established in 2002 that identifies and describes emerging technologies likely to have a large impact over the coming five years on a variety of sectors around the globe”. The Horizon Report, a collaboration between the EDUCAUSE Learning Initiative (ELI) and the New Media Consortium, has traditionally been presented at the annual ELI conference. It discusses key trends and critical challenges for the next five years and selects six technologies to watch across three adoption horizons: (1) the next 12 months, (2) two to three years out, and (3) four to five years out. This year the near-term horizon encompassed electronic books and mobiles; the two-to-three-year horizon, augmented reality and game-based learning; and the far-term horizon, gesture-based computing and learning analytics. For each of these emerging technologies, the Horizon Report gives an overview of the application; its relevance for teaching, learning, research, or creative inquiry; a sampling of the application across various disciplines; the application in practice, with examples of its use in higher education settings complete with descriptions and hyperlinks; and finally a further-reading section. Previous years’ Horizon Reports can also be reviewed online and make for some very interesting reading.