As an instructional systems designer and education researcher, I attended sessions at the Sloan-C Emerging Technologies for Online Learning conference with the goal of informing my professional development. In some organic fashion, my interests coalesced into three main areas:
1. How do people learn: what's the latest in educational psychology and how is it applied to online education?
2. How is instructional design and development enhanced by new technologies?
3. Scholarship of online learning and teaching: What interesting research is being done? Using what methods or instruments?
Instinctively, I feel that these topics form the foundation for learning and instruction for all audiences. So here are a few highlights of what I've learned in each area from the conference.
1. How people learn.
Dr. Michelle Miller's session, Leveraging Cognitive Psychology to Create Compelling Online Learning Experiences, helped me recall some things I've learned about cognition. She shared some useful tips on applying concepts of memory and attention to online learning.
- Forget about traditional concepts of short-term memory. Stay within working memory and attention limits. Promote practice and automaticity (Bargh).
- Passive exposure is not much use. Effortful, attentive practice is what counts.
- Forget about perceptual learning styles. Instead pay attention to multimedia learning research (e.g., Richard Mayer's).
- Harness the testing effect and encourage spaced study. See tests as learning activities.
- She recommends the gamification of learning.
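To make the spaced-study tip concrete, here is a minimal sketch of a Leitner-style review scheduler, a common way of operationalizing spacing; the box intervals are my own illustrative choices, not anything Dr. Miller specified:

```python
from datetime import date, timedelta

# Leitner-style spacing: a correct answer promotes an item to a box with a
# longer review interval; a miss demotes it back to box 0 for daily review.
INTERVALS = [1, 2, 4, 8, 16]  # days between reviews for boxes 0..4 (illustrative)

def next_review(box: int, correct: bool, today: date) -> tuple[int, date]:
    """Return the item's new box and its next scheduled review date."""
    box = min(box + 1, len(INTERVALS) - 1) if correct else 0
    return box, today + timedelta(days=INTERVALS[box])

# A new item answered correctly moves to box 1 and comes due two days later.
box, due = next_review(0, correct=True, today=date(2014, 1, 1))
```

The point of the sketch is the pattern, not the particular numbers: each retrieval attempt is itself a learning event (the testing effect), and the growing intervals enforce spacing.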
My conclusion about research in this intriguing area is that we are still trying to figure out how best to motivate students to learn and how to hold their undivided attention in a digital, multimodal age. With student and faculty variability, the motivation to learn is an absorbing area of research.
2. How is instructional design and development enhanced by new technologies?
Instructional systems design has changed a great deal with the availability of a whole slew of tech tools. Assessments still require a lot of careful crafting, but the process of implementing them is changing. Remote and online testing challenges instructors to ensure authenticity and academic integrity, and vendors like ProctorU, Kryterion, and Software Secure are vying for the business of providing online proctoring of assessments. Knowing about all these tools, and which to use, will become increasingly useful for competency-based online professional education.
I attended two sessions on instructional development projects that were time- and labor-intensive:
- Dr. Wasim Barham, Southern Polytechnic State University, presented his use of a game-based virtual lab simulation for the Strength of Materials lab for civil engineering students.
- Dr. Cammy Huang, now at Google, presented an interactive, simulation-based teaching approach for human physiology that she developed while at Stanford University. It won the 2013 MERLOT Biology Classics Award. Check out the Virtual Labs: http://virtuallabs.stanford.edu/demo/
I attended one session about an app for data mining: Data Depository for EdLab's applications (by Teachers College, Columbia University). I'm fascinated by learning analytics; I think it appeals to the research part of me. EdLab's concept is similar to Google Analytics, but Manav Malhotra, the presenter, says it is more flexible and capable of tracking learner behavior in a "holistic and granular fashion." I'm waiting for it to be made available for public use. Meanwhile, I wonder how I can encourage educational data mining for faculty research on teaching.
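I don't know EdLab's internals, but the general idea behind this kind of learner-behavior tracking can be sketched in a few lines: log each interaction as a timestamped event, then aggregate at whatever grain you need. The field names and actions below are my own hypothetical choices for illustration:

```python
import time
from collections import Counter

# Hypothetical event log: each record captures who did what, to which resource,
# and when — granular records that can be rolled up for a holistic view.
def log_event(log: list, learner_id: str, action: str, resource: str) -> None:
    log.append({"learner": learner_id, "action": action,
                "resource": resource, "ts": time.time()})

def events_per_learner(log: list) -> Counter:
    """One possible roll-up: how many events each learner generated."""
    return Counter(e["learner"] for e in log)

events = []
log_event(events, "s01", "view", "lecture-3")
log_event(events, "s01", "quiz_submit", "quiz-1")
log_event(events, "s02", "view", "lecture-3")
print(events_per_learner(events))  # Counter({'s01': 2, 's02': 1})
```

Even a toy schema like this makes the analytics questions concrete: which resources get revisited, which learners go quiet, and when.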
Three words came to mind when I thought about what these presenters produced: Belief. Effort. Passion. These instructors/designers/developers believed in what they were doing: they wanted to connect with learners, to motivate them, and to bridge the learning distance. They invested a great deal of effort to produce these materials with a team. Their drive and passion sustained them. They talked about still wanting to do more, despite all they have done. I left wanting to work on a project of this magnitude. I want to do so much.
3. Scholarship of online learning and teaching.
There was one session on researching team cognitive constructs that entailed the use of meta-analysis. It wasn't so much about online learning, I think, but about shared cognition in learning spaces and communities. I could see it was a difficult study. The meta-analysis explored "the predictability of different shared cognition constructs on team performance." As Dr. Miller said, passive exposure is not much use. :) I learned about meta-analysis in this session, but I'll need to conduct one myself before deep learning occurs.
John Turner, the presenter, used an instrument by Gall, M. D., Gall, J. P., and Borg, W. R. (2010) [Applying Educational Research: How to Read, Do, and Use Research to Solve Problems of Practice (6th ed.). Boston, MA: Pearson.] to rate the quality of the articles. It looks like an instrument for future use.
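For my own future reference, the core computation in a basic fixed-effect meta-analysis is just an inverse-variance weighted average of the study effect sizes; the numbers below are made up for illustration and have nothing to do with Turner's actual data:

```python
# Fixed-effect meta-analysis sketch: pool study effect sizes weighted by
# inverse variance, so more precise studies count for more.
def pooled_effect(effects: list, variances: list) -> tuple[float, float]:
    """Return the pooled effect estimate and its standard error."""
    weights = [1.0 / v for v in variances]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5  # standard error of the pooled estimate
    return est, se

effects = [0.40, 0.25, 0.55]       # hypothetical per-study effect sizes
variances = [0.010, 0.020, 0.015]  # hypothetical sampling variances
est, se = pooled_effect(effects, variances)
```

Rating article quality with an instrument like Gall, Gall, and Borg's comes before this step: it decides which studies' effects and variances make it into those lists at all.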