
Monthly Archives: March 2012

Objectives and Innovation

In the consultations I often provide for various online programs, I’ve seen a particular problem over and over again with the integration of technology: educators begin from a focus on the technology itself. At conference after conference I hear educators talking about reaching students through new technologies, and once again the focus is on the technology. I’d like to give some real-life examples of how this can be short-sighted and problematic. Two questions should guide the integration of any technology: what problem is it meant to solve, and/or what objective does it map to?

Plenty of instructors are adding mobile components to courses because “students want to use their cell phones” or because they read that it’s an emerging technology. What these educators miss is that students want to access their coursework using a mobile device instead of a computer or laptop. It does not mean they want you to create an assignment that requires the use of a mobile device! If you are requiring students to have a mobile device for a particular course, there had better be a measurable objective associated with that requirement. One example would be the need for majors in Geographical Studies to use GIS applications; that requirement is tied to program objectives for their careers.

Here are two examples of courses that required the use of mobile devices: one a good integration, and one a poor one. School A offered a course on Ethical Uses of Technology for Educators. This course required students to have a mobile device. The objectives associated with this requirement were developed to ensure that teachers became familiar with mobile devices and the unethical ways they could be used (advertently or inadvertently) in a classroom. The students in this course were given activities that required them to test how easy it would be to use a cell phone in unethical ways, and to reflect on how this would impact the classroom. This is a good integration of mobile technologies.

The second example is a poor use: the course was on American Music. There were four objectives for the course, all of which required an understanding of different aspects of music. A mobile component was added because the developers wanted an innovative course, and the intent was to enable students to upload and download music on their handheld devices. Not a single one of the course objectives had any reason to require this skill, or anything related to mobile technologies in general. The activity was included so that the professors could research whether students would use a mobile device. This is an example of a very poor understanding of integration. To be clear, a better approach would have been to ensure the course was hosted on a site that could be accessed and interacted with via a mobile device.

Support for third-party applications can also become a problem relatively quickly, and once again I’m referring to unnecessary third-party programs such as the various Web 2.0 tools. Instructors get angry when the helpdesk can’t or won’t provide support for whatever application they choose to use, but there are thousands, if not millions, of them out there. At smaller colleges, where courses are taught exclusively by the faculty members who developed them, instructors should ensure that they are familiar with an application before requiring students to use it. At large colleges, where courses are developed by a team and taught by adjuncts, the problems are much bigger, and third-party applications need to be selected more carefully. Adjuncts assigned to teach the course may not be familiar with the program, or they may have their own favorites, and they may not be willing to learn a third-party program just because the developer likes it. It is also unfair to expect the helpdesk to learn them all and be prepared to assist students. Again, third-party apps should be chosen when they are needed to solve a particular problem, when the helpdesk is willing to support them, and/or when there is a particular course objective tied to the use of that application.

Access and course objectives should always be the first considerations. Activities and assessments need to be directly related to those objectives, and technologies should be chosen with both in mind. True student-centered teaching does not require particular technologies because they are cool, but because they will assist students in achieving the course objectives and provide greater access; otherwise we may be placing undue demands on the students.

Here are some important questions that can help guide the integration of technology:
1. Is the addition of the required technology needed in order for students to achieve course objectives?
2. Will the technology decrease or impede access in any way?
3. Is support available for students who have difficulty with the technology?
4. Will learning to use the technology detract from students learning the required content of the course?

We all need to make our courses more collaborative and more engaging for our students. We also need to expose students to the various technologies they will encounter in the work environment. We just need to ensure that the technologies we choose help, not hinder, learning.

Learning Analytics

A few weeks ago, I attended a symposium on learning analytics, jointly sponsored by NERCOMP and Educause ELI. It was a day well spent. The morning session consisted of presentations on learning analytics; the afternoon session was hands-on. Each team was asked to design an application that would perform some form of learning analytics. At stake was a $10,000 check from a venture capitalist (all in good fun, of course). At the end, each team presented its application and the group voted. My team developed an application that would follow students’ eyes as they read course materials and interacted with the course; the idea was that this would reveal the effectiveness and use of various course materials. (It tied for first place.)

What I came away with were lots of questions and lots of ideas about the use of learning analytics. The power of learning analytics is that it allows us to truly understand our students and to target interventions with greater accuracy. One institution shared that data gathered from their LMS showed a very distinct correlation between failure rates and the dates of students’ first logins to a course. This knowledge allowed them to develop interventions that could be put into effect in the first weeks of the course.

The presentation that most interested me was on the development of software that could “read” essays and determine the extent of comprehension. I can see this being an extremely useful tool, especially for MOOCs and other courses with large enrollments.