Saturday, September 23, 2017

Hungry for the Truth

"Reese's Peanut Butter Cups to Be Discontinued by Hershey's Before..."

Have you seen a headline like that recently?  Not only did I see it in my Facebook feed, but I also heard morning show DJs talking about the issue.  That makes it real, right?!

As a teacher, I have spent countless hours coaching students to validate information.  Look at the website address.  Investigate the author.  Beware of ads.  Research the mission of the website.  In short - do your research.

When the delicious chocolate-peanut butter treat was threatened in headlines, and when Dan Russell's November 2015 blog post, "Fake or real?  How do you know?", challenged me to determine "real or fake", I followed my own advice.  I let Google do the interrogation.
Szerelmi gyilkosságok (3. évad) by Lwp Kommunikacio.
Made available under CC BY 2.0

I read the blogs of others on the search to answer the challenge posted by Russell (2015).  I opened web search results to determine whether I could trust the source.  I researched authors.  I dug and read and researched.  And in the end, I fell back on my math/science background (you can't prove theories true; you can only prove them false).  I determined that all signs supported the artifacts being fakes.

A healthy dose of skepticism a day keeps the frauds away.

In fair disclosure, I incorrectly surmised that a quote attributed to Thomas Jefferson was fake.  In a numbers game, however, 7 out of 8 is a decent record when tallying how often I avoided being misled by online content.

So my takeaway to share?  HURRY - go buy those Reese's Peanut Butter Cups!  Not because they are going to disappear from shelves, but because in this world of fake news and Photoshop, you are going to need some sustenance to fuel you while you research to uncover the truth.

A Reese's Peanut Butter Cup, Big Cup by Evan-Amos.
Made available via Wikimedia Commons, CC0 1.0
References:
Russell, Dan (2015, November 11). Fake or real?  How do you know? [Blog post]. Retrieved from: http://searchresearch1.blogspot.com/2015/11/search-challenge-111115-fake-or-real.html

Tuesday, September 12, 2017

SAMR’s "Teach Above the Line" Goal Misses the Mark

Disclaimer: While this post starts with a heading that seems to bash the ed tech model SAMR, read on to see that it is not the tool, but the use of the tool, that is being evaluated.


Welcome to take three of this blog post - a critique of SAMR.  
Round one was a soapbox slam on the tendency of this model to be used as a challenge for teachers to “Teach above the line.”  (I have a hard time with any educational challenge that doesn’t start with learning.)  
Round two, I became so buried in research trying to uncover the impetus and intent of SAMR that my work read more like a research paper than a blog.

Take 3 - SAMR Uncovered
With a teacher’s drive to transform education, Dr. Ruben Puentedura developed the SAMR model to help teachers understand the applications and effects of technology use in the classroom.  A relatively simple four-tier system, SAMR is further divided into two levels.  
The first level of SAMR is where technology is used to adapt manual tasks (Substitution - Tier 1) and to improve upon manual tasks (Augmentation - Tier 2).  The effect size (amount of learning and growth that can be attributed to a teaching technique) is relatively low for these two uses of technology.  

The second level is where Puentedura feels real transformation exists.  Modification (Tier 3) is the opportunity for technology to totally change the way typical tasks are executed (think blogs, where peer and public commentary drives editing and revision instead of teacher-centered instruction over a handwritten or even typed journal).  Redefinition, the highest tier in the second level, describes a scenario where technology opens the door for entirely new learning tasks not previously possible with textbooks and research-based Internet use.
By Lefflerd (Own work) [CC BY-SA 4.0 (http://creativecommons.org/licenses/by-sa/4.0)], via Wikimedia Commons


The Value of SAMR
I mentioned that research has been done to measure effect size on learning of each tier of SAMR. While this research isn’t easy to find and is not explicitly revealed in the infographic by Dr. Puentedura, the effect size increases through the progression of Substitution up through Redefinition. Effect size research can help an instructor create lessons and identify resources that will maximize student learning, and SAMR, in turn, could be used as a reminder of these effect sizes when a teacher evaluates an ed tech tool.
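For readers unfamiliar with the statistic, effect size is typically reported as a standardized mean difference such as Cohen's d: the gap between a treatment group's average and a comparison group's average, divided by their pooled standard deviation. As a quick sketch (the group names and test scores below are entirely hypothetical, not drawn from Dr. Puentedura's research), the calculation looks like this in Python:

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Cohen's d: standardized difference between two group means."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    # Pooled standard deviation (weighted by each group's degrees of freedom)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled

# Hypothetical post-test scores for two lesson designs
redefinition_lesson = [82, 88, 91, 79, 85, 90]
substitution_lesson = [74, 80, 77, 71, 83, 76]
print(round(cohens_d(redefinition_lesson, substitution_lesson), 2))  # prints 2.0
```

A larger d means more of the difference in student performance can be attributed to the intervention; this is the number behind claims that the upper SAMR tiers produce more learning growth.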

As a clear example of the applications of technology in instruction, SAMR can also be used to help create professional development opportunities for teachers.  The trend of micro-credentialing could occur at each tier of SAMR.  This could support the development of technology pedagogy as described in the TPACK model.  (I’ll refrain from diving into TPACK, but here is a link to a video on the topic created by educator Travis Bohon.)

The Shortfalls of SAMR
Too often, it seems that professional development in the area of ed tech leans toward how to use specific technology tools (totally an opinion; I have done no research on this). I think this tendency overshadows models like SAMR and creates a parallax: viewers overly concerned with HOW to use ed tech tools instead of WHY to use them come to see SAMR as a way to determine the worthiness of an app instead of the appropriateness of an app.  

At risk of offending the author, I’ll illustrate this shortfall with a post from Getting Smart - a website I value as a professional.  In her 2013 piece, "Using SAMR to Teach Above the Line", ed tech guru Susan Oxnevad shares her experience attempting to find a model for technology integration that would stick. Enthusiastic about Apple's adoption of SAMR, Oxnevad describes examples of her personal exploration of lesson design and technology integration using Dr. Puentedura's model. At the close of her article, Oxnevad celebrates the SAMR model and applauds a Chicago-area instructional coach who challenged educators in her district to "Teach above the line." The "line" referred to by the instructional coach and Oxnevad is the division between the Substitution/Augmentation levels of SAMR and Modification/Redefinition: the line that delineates useful technology applications from transformative technology applications.
Teaching “above the line” ignores pedagogy, which would have teachers choose tools that best support a learner.  Instead, this mantra pushes teachers to find good tools instead of finding good strategies.

Even Dr. Puentedura writes:
It is important to note that no particular "quality" label should be attached to any of the tiers. Thus, the introduction of a Tier I tool rather than a Tier IV tool may be perfectly appropriate, if it best suits the pedagogical goals at hand. (Puentedura, 2003)

I’ll use the words of Associate Professor of Library & Information Science Lucy Santos Green to sum up my thoughts on this shortfall:
If the misuse of technological models hurts our ability to be effective technology leaders, then the emphasis we place on technology over pedagogy may negate our influence altogether. (Santos Green 2014)

In closing, SAMR is a reference tool, not an evaluation tool, and learning needs to be the focus of instructional planning. I’ll leave you with a video where students explain SAMR but leave viewers with the very important reminder, “...the ultimate outcome for integrating technology should be simple: maximizing student success.”  



References:
Puentedura, Ruben (2003). A Matrix Model for Designing and Assessing Network-Enhanced Courses. Retrieved from http://www.hippasus.com/resources/matrixmodel/index.html

Santos Green, Lucy (2014). Through the Looking Glass - Examining Technology Integration in School Librarianship. Knowledge Quest, 43(1), 36-43. Retrieved from: http://www.lucysantosgreen.com/uploads/6/8/3/3/6833178/through_the_looking_glass.pdf

Wednesday, July 5, 2017

Integrated Design - A Model Mash-Up

Old-School Instructional Design
If you have done any work with instructional design models, you likely know Merrill's First Principles of Instruction:
Merrill's First Principles of Instruction by Bailey, L. (2016, May 27)
And, in the event that you didn't read my previous blog, you should know that Merrill arrived at these five guideposts by looking to uncover similarities among a plethora of instructional design models (Merrill, 2002). One of the most accessible models, in terms of simplicity without sacrificing valuable instructional concepts, Merrill's principles lack one major detail: assessment. 
Not included in Merrill's work is the design model first introduced by Jerrold Kemp (Kemp, 1985). Like a tasty, chocolate-coated treat, this model was later updated to include a candy coating of assessment (Kemp, Morrison & Ross, 1994).
(Papadakis, 2014)
Mad scientist.svg by J.J. at the English language Wikipedia, CC BY-SA 3.0
What might happen if we combined the work of Merrill and Kemp into a model that can be adapted for a single-subject classroom or a multi-subject program? (insert evil laugh here)


The Mash Up
Kemp offered thorough guidance with the nine elements in the inner circle of his instructional design model. It would feel negligent to take away this guidance, so at a low-elevation examination of this mash-up, the nine core components remain as a security blanket around Merrill's first step: Task.

With this new model, designers are encouraged to take a comprehensive look at the desired task. Using the nine elements, the task can be thoroughly evaluated and dissected to prepare for the instructional delivery.

As provided for in Merrill's model, the instructional delivery (as opposed to the planning/design at the "Task" stage) starts with Activation of prior knowledge.  Included for designers is the suggestion that a formative assessment might be helpful to ensure students have the prerequisite knowledge.
After delivery of Activation, another formative assessment may be needed to ensure successful "activation".
This pattern repeats through the Demonstration phase, again including formative assessment to ensure development of knowledge and skill is accurate.  
I'll take a moment to quote a college professor, "Only perfect practice makes for perfect."  This professor asserted that it is easy to inadvertently practice a skill wrong and develop at best bad habits and at worst erroneous neural paths that would make it difficult to re-learn the correct way.
My favorite phase of Merrill's design is the one teenage and adult learners beg for: Application.  "Application" is where learning becomes relevant and answers the famous question, "When will I ever use this outside of high school?!"
It is by intentional design that Kemp's nine elements of instructional "planning" (my word, not his) are left out of the Activation, Demonstration and Application phases.  These phases and their elements are all planned for in the "Task" phase.  Merrill's stages are pulled out of the Task phase in order to suggest timing for formative assessment.
However, Merrill's final stage, Integration, comes back to engage with Kemp's nine elements because truly understanding how a task integrates with other subjects and/or tasks can take careful review and collaboration.
This collaboration element is revealed at the highest elevation of my mash-up model:
Trends in education are emphasizing collaboration and integration.  Competency-based education, place-based education, and project-based education are just examples of educational models that push teachers and students to identify connections in curriculum.
With my mash-up, the "Integration" phase intentionally hangs over the capsule of the phases for a particular task.  Integration serves as the attachment point for other concepts (on a micro scale), other course work, and even other content areas (on a macro scale).  Surrounded by summative evaluation, this outer layer encourages educators and instructional designers to consider how the concepts relate before designing a final assessment of mastery.

Critical Review
By "Chris" at "Friends in the Freezer"
It would be easy to cite failed combinations in the world, as clearly we can't all be the mixologists who work for Ben and Jerry's.  So, I'll take these final moments of your attention span for a critical comparison of my mash-up against the models tested and touted by Kemp and Merrill.

References:
Bailey, L. (2016, May 27). Digital Natives, Media and Learning: Implications for the Future of Army Training. www.armyupress.army.mil. Retrieved from http://www.armyupress.army.mil/Journals/Military-Review/Online-Exclusive/2016-Online-Exclusive-Articles/Digital-Natives-Media-and-Learning/

Kemp, J. E. (1985). The instructional design process. New York: Harper and Row.

Kemp, J. E., Morrison, G. R., & Ross, S. V. (1994). Designing effective instruction. New York: Macmillan.

Merrill, David M. (2002) First Principles of Instruction. Educational Technology Research and Development, Volume 50, (Number 3), 43-59

Papadakis, Jenny (2014) The Kemp Model of Instructional Design. Retrieved from: http://etec.ctlt.ubc.ca/510wiki/The_Kemp_Model_of_Instructional_Design

Saturday, June 24, 2017

Toe-may-toe or Toe-mah-to - Are the creators of instructional design models just being persnickety?

In homage to my classroom mantra, "Fake it 'til you make it", when I was recruited by my employer to turn our entire two-year, multi-subject curriculum into one that could be delivered through online modules, I embarked on research to learn as much as I could about instructional design.

What did Google reward my efforts with? Instructional Design, a website compiled by Greg Kearsley and Richard Culatta, both education gurus working to bring innovation to education.  To be honest, I wasn't sure if I had found a gold mine or a land mine.

Twenty-four models of design philosophy later, my mind was spinning.  Could there really be more than two dozen unique models of instructional design?  Apparently my questioning was as redundant as the instructional design models.  Thirteen years before my quandary, in a 2002 article in the journal Educational Technology Research and Development, M. David Merrill asked, "Are all of these design theories and models merely alternative ways to approach design?" (Merrill, 2002)


Seventeen pages and 40 referenced authors later, Merrill concludes, with a handful of exceptions which he lists for further research, "No theory or model reviewed includes principles or prescriptions that are contrary to those described in this paper." (Merrill, 2002)

In the scientific spirit of testing a theory's validity by trying to disprove it, let's compare Merrill's identified commonalities to the instructional design model offered by Lev Landa: algo-heuristic theory. In this theory, proposed nearly twenty years prior to any other constructivist theory reviewed by Merrill, Landa suggests that true learning occurs when students piece together the concepts of a process to better understand how to apply knowledge (Culatta, 2015).

Training Cartoon - Image of Sloth and Froth - Perspectives on Algo Heuristic Theory. by Shafali Anand

Created for a page of her website dedicated to algo-heuristic theory, Shafali Anand's cartoon illustrates the seemingly antagonistic components of the method: instructors first help students uncover learning through experience (heuristic means to learn for one's self...I had to look it up too), then solidify that learning through the establishment of rules, or an algorithm (hence, "algo"), that helps students replicate the process in the future without having to re-learn through experience.

I offer the explanation of the model below:



Did you catch it?  Are you mind blown?

Shouts Shock Sing According To Cry Call Sing by Max Pixel is licensed under CC0 Public Domain
If you missed the overlap between Landa's theory and Merrill's, let's review Merrill's principles to help you achieve a "light bulb moment".

If you will recall, Merrill spent his free time reviewing a surplus of instructional design models in an attempt to "identify and articulate the prescriptive design principles on which these various design theories and models are in essential agreement" (Merrill, 2002). The info-graphic below summarizes his findings.
Merrill's First Principles of Instruction by Bailey, L. (2016, May 27)
Now, let's compare Merrill's common elements of instructional design with Landa's algo-heuristic model:
  1. Merrill asserts that a commonality among instructional design models is a task; and this is clearly illustrated in Landa's work.
  2. While you can assume students would have to activate prior learning - at least subconsciously - in order to complete the task, Merrill's second phase of "activation" is not overtly present in Landa's algo-heuristic theory.
  3. The models reconnect with a demonstration and then application of the desired skill.
  4. Merrill does not include any mention of reflection in his findings, but thankfully, Landa reminds instructors of the value of reflection as a metacognitive skill that reinforces learning.
  5. Finally, just as Merrill's instructional principles predict, Landa's model ends in integration - the opportunity for students to use their learning as it applies in a new context.

Merrill never reviewed Landa, but clearly Merrill had Landa's number.  The real question Merrill didn't examine is the "completeness" of any of the models.  He identified the commonalities, but he did not take any time to identify gaps outside of his principles.

Before I leave you with the words of Merrill, let me leave you with a cautionary evaluation of both Merrill's principles and Landa's algo-heuristic model.  Both models of instructional design focus on making learning meaningful for the learner.  With a focus on learning toward the higher end of Bloom's Taxonomy and student-centered approaches, both gentlemen describe ideal experiences with no room for error.

The gaping hole in both models comes when an instructor looks for the data trail to indicate student understanding and accurate formation of a mental model for the lesson.  Neither structure provides for formative or summative evaluations.  While Landa does carve a role for the teacher in demonstrating a method for achieving desired results, neither takes into account the responsiveness a teacher can provide through casual monitoring and assessment.

So, if you are on an epic journey to find the holy grail of instructional design models, the shortcut is to spend some time with my friend, M. David Merrill.  Just remember: even as Indiana Jones encountered booby traps on his quest for the holy grail, you also will fail if you fall into the trap of thinking Merrill has captured a complete picture of instruction.

His work is a terrific start, though:


References:
Bailey, L. (2016, May 27). Digital Natives, Media and Learning: Implications for the Future of Army Training. www.armyupress.army.mil. Retrieved from http://www.armyupress.army.mil/Journals/Military-Review/Online-Exclusive/2016-Online-Exclusive-Articles/Digital-Natives-Media-and-Learning/

Culatta, Richard (2015). Algo-Heuristic Theory (L. Landa). Retrieved from: http://www.instructionaldesign.org/theories/algo-heuristic.html

Merrill, David M. (2002) First Principles of Instruction. Educational Technology Research and Development, Volume 50, (Number 3), 43-59

Thursday, April 27, 2017

Methods for Online Teaching and Learning (EDU 654) - Personal Reflection

The close of my first semester as a graduate student (attempt number two) has arrived and in turn, the expectation of reflection on my experience and learning.

The course outcomes for Methods for Online Teaching and Learning can be synthesized to the development of online instructional strategies and the development of best practices in encouraging multi-directional communication and feedback between students and instructors.  If I begin my journey here, it is easy to see what I have learned (and what I still need to learn!).

Online Instructional Strategies
Online instruction is a dichotomous world of asynchronous events and synchronous events. Current learning models throw environment into this mix - where a student may be learning from home or from a brick and mortar school, or from...  The models are varied and the best "use" of the instructor can be argued, but my learning distilled two important characteristics of effective online instruction: student engagement and class discussions.

Discussions - It could be argued that discussions are one way to engage students, but my reading and experiences in EDU 654 made it obvious that discussions are the wormhole to deeper learning; they also allow teachers to build community, perform formative assessment, and help students form the connections with material that enable them to do more than regurgitate learning.  Choosing the format (casual versus formal), the location (in a learning system versus social media), and even the roles of participants (leader, participant, questioner, antagonist, summarizer, etc.) has a great impact on the educational outcome.  As if those decisions aren't big enough, teachers still have to find ways to engage students in the conversation.

Which leads me to characteristic two of an effective online learning environment:

Engagement - Learning style inventories and movements such as Universal Design for Learning have come into play because research has solidified the belief that student learning is proportional to student engagement.  (Proportional - my first attempt at a master's degree was a master's in math.)
Engaging students in online learning environments is more challenging from the perspective that an instructor can no longer use volume/tone of voice and proximity to maintain a student's focus.  However, because online learning can be anonymous, social stigmas that inhibit participation in face-to-face learning can be reduced to allow a broader audience.  Ultimately, the same rule for student engagement in a face-to-face classroom applies to an online environment: consider the needs of your students and antagonize and/or incentivize their participation while meeting their needs.

Self-Assessment of My Development of Online Instructional Strategies - Experience has taught me how to build relationships and use learning styles to personalize learning in a traditional classroom. This skill set translates fairly easily to an online environment.  Where I would like to continue to develop personally is in finding ways to get students to take more of a leadership role in asynchronous discussions.  Building personal connections and seeking information relevant to their own ideas feels like a great way to maximize student learning.  I would also like to continue to develop stronger skills in facilitating synchronous, online learning experiences.  This is a deep skill set that requires TPACK at a mastery level that I have NOT yet achieved.

Communication and Feedback
While not part of Methods for Online Teaching and Learning, one of the best examples of the power of communication and feedback on learning is the video Austin's Butterfly: Building Excellence in Student Work. First, this video is a clear demonstration of the power of meaningful feedback.  Through continued pursuit of a high standard, a first grade student constructs an image of a butterfly that begins as a rough image of a winged insect but, with the help of peer feedback, becomes a textbook-worthy illustration.  You read that right: I said a first grade student!  That means the feedback came from first grade peers as well. Peer feedback is the second reason this video illustrates my learning.  The instructor's coaching to build a community of high expectations led to peer feedback that elicited outstanding results.

The connection between this video and what I learned in Methods for Online Teaching and Learning comes down to evaluating the best possible role of the teacher in the classroom.  Is it the sage on the stage?  Is it the guide on the side?  Can it be both? Communication and feedback are the key to answering these questions.  Communication pathways between students and between teacher and students are what establish the classroom culture.  If cultivated well, the vital feedback necessary for growth and learning can come from peers or the teacher.  The role of the instructor then becomes that of facilitator - a sometimes leader, a sometimes participant, an all-time instructional leader.

Self-Assessment of My Development of Communication and Feedback in My Classroom
A lot of the activity in the classrooms I supervise is evaluated with performance rubrics.  This inherently requires communication between student and teacher to help the student ensure the performance or artifact of learning demonstrates proficiency.  What doesn't come as naturally is the peer feedback and communication.  As I continue in my role of curriculum and instructional design manager, I will endeavor to provide the professional development necessary for my staff to establish this multi-dimensional world of communication and feedback.  Based on my learning from this course, I believe this will amplify the learning in the classrooms.

In Conclusion
Teaching is an art form.  There is no scientific formula that makes a perfect teacher.  Online and blended instruction only add to variables that a teacher must balance for an effective classroom. Methods for Online Teaching and Learning illustrated this art form and its complexity.  As it is only my first course in online pedagogy, I am in awe of the remaining techniques to be learned.

Thursday, April 20, 2017

Feast or Famine: Providing Students with the FEEDback They Want Without Starving Yourself of Free Time


From the joy of the little gold star to the dread of red ink, we have been trained from an early age to seek feedback from our instructors as a means of growth.  Whether teaching to a live and/or lively class of 26, to a blended class of 30, or to an online class of 12, providing feedback can be a challenge.  Gone are the times that right/wrong will suffice for feedback.  With the growing trends of competency-based education and growth mindset, ensuring our students have meaningful guidance is at the forefront of our teaching role.
"Feedback is an important intervention for the online educator because it is an opportunity to develop the instructor-learner relationship, improve academic performance, and enhance learning." Leibold and Schwarz (2015)
Delegate
In this age of instant gratification, it is not enough for students to get meaningful guidance eventually; students are seeking feedback that they can use to improve NOW. Douglas et al. (2016) found that a delay in feedback from the teacher resulted in negative class ratings from students because they felt the feedback came too late to actually use.

The good news is that research is pointing to students to help relieve the grading burden. Ching and Hsu (2013) conducted a study that engaged students in providing formative feedback to each other in a project-based online learning environment. While the work of Ching and Hsu (2013) did show that coaching students on how to give effective feedback would be necessary, the results clearly indicated that students appreciated the feedback and were able to use the suggestions from their peers to improve their projects.  

And for those teachers worried about being judged for the use of "child labor", Liu and Carless (2006) have great research that supports the use of peer feedback to promote learning to higher levels:
"One important way we learn is through expressing and articulating to others what we know or understand. In this process of self-expression, we construct an evolving understanding of increasing complexity. One aspect of this process is providing learners with opportunities to explore and articulate criteria and standards in the context of working on specific assessment tasks. In order to clarify notions of quality, learners need to analyse real, illustrative exemplars. This is where examining the work of peers offers meaningful opportunities for articulating discipline-specific knowledge, as well as criteria and standards." (Liu and Carless, 2006)
Big Picture Grading
Once you are able to train your students to give feedback and reduce your workload, the next "letting-go" moment comes in with the use of big-picture, rubric-based feedback.  Whether you are reviewing the steps a student took to complete a math problem or circling fragments in an essay, the inclination to go line-by-line in student assignments to provide specific feedback may be a time-waster.  Jones and Blankenship (2014) found that students preferred and felt more able to use big picture feedback.  The students in this research study indicated that in-text feedback was less helpful than comments on their overall work.  When used with a rubric - which had more finite details - students reported that big picture feedback gave them ideas on how to improve.

Tech Support
Finally, when you have embraced a new paradigm of grading, you may be ready for one last suggested tweak to your practice.  Orlando's (2016) research into the use of video for providing feedback shows that students felt they retained more of the suggestions when they heard the teacher's feedback than when they read it. Instead of inserting comments for specific incidents of feedback, imagine using a program like Screencastify to record your thoughts and comments for students.
The cautionary tale behind this suggestion, however, is that Borup et al. (2017) found no link between the use of video feedback and increased social presence when compared to text feedback.  And, while Borup et al. (2015) showed that video feedback fostered supportive and conversational exchanges, both students and teachers preferred the efficiency of text feedback.
In Summary
While changing times have increased the demand for feedback, challenging students to become part of this process has the benefit of reducing teacher workload and amplifying student learning. Furthermore, whether through pen and paper or video dialogue, using rubrics and big picture feedback to guide student performance results in more digestible, applicable information for students while also saving teachers from having to insert line-by-line commentary.  

References:

Borup, J., West, R., & Thomas, R. (2017). An analysis of instructor social presence in online text and asynchronous video feedback comments. The Internet and Higher Education, 33, 61-73.
Borup, J., West, R., & Thomas, R. (2015). The impact of text versus video communication on instructor feedback in blended courses. Educational Technology Research & Development, 63(2), 161-184.
Ching, Y.-H., & Hsu, Y.-C. (2013). Peer feedback to facilitate project-based learning in an online environment. International Review of Research in Open and Distance Learning, 14(5), 258-276.
Douglas, T., Salter, S., Iglesias, M., Dowlman, M., & Eri, R. (2016). The feedback process: Perspectives of first and second year undergraduate students in the disciplines of education, health science and nursing. Journal of University Teaching and Learning Practice, 13(1).
Jones, I. S., & Blankenship, D. (2014). What do you mean you never got any feedback? Research in Higher Education Journal, 24.
Leibold, N., & Schwarz, L. M. (2015). The art of giving online feedback. Journal of Effective Teaching, 15(1), 34-46.
Liu, N.-F., & Carless, D. (2006). Peer feedback: The learning element of peer assessment. Teaching in Higher Education, 11(3), 279-290.
Orlando, J. (2016). A comparison of text, voice, and screencasting feedback to online students. American Journal of Distance Education, 30(3), 156-166.

Wednesday, March 22, 2017

Is there hope for synchronous learning?



"Sit and Get"
Is there more to synchronous classes than sit-and-get learning?  I mean, you are online - frequently without video to hold you accountable for paying attention - trying to listen and learn, but the lure of social media is strong.  

Listen to author and motivational speaker Simon Sinek talk about the dopamine release we get from social media:

Simon Sinek – Social Media & Dopamine from Daniel Maurer on Vimeo.


How is an online teacher even supposed to compete with a brain chemical response so powerful it can lead to addiction?


Even searching for journal articles on how to improve synchronous learning was a less-than-stimulating experience.  While the topics seemed to vary - how to effectively use chat, how to use video, and so on - the overall theme I encountered was article after article of researchers seeking a panacea for student engagement.  Not that the student engagement problem is exclusive to online learning. Kathy Dyer (2015) wrote a blog post for the Northwest Evaluation Association citing 20 years of research in brick-and-mortar classrooms on the connection between student engagement and student learning. Mere digital proximity, however, makes student engagement an extraordinary challenge.

Visual Stimulus

One promising technique for increasing student engagement in online classes comes from a case study that tested the use of visual stimuli in synchronous learning. Lai et al. (2016) showed that humorous images used to indicate whether a student response is correct both increased engagement and reduced student stress levels.  In fact, the humorous images even increased participation among students who received feedback that their responses were incorrect.

I can personally attest to a positive learning experience with such images. I recently co-led a synchronous class that utilized the web app Quizizz. Both right and wrong answers were followed by immediate feedback in the form of these humorous images. I had fun while participating in the quiz. It might have been the most engaging online learning experience I have had. Even good jokes get old, though. There has to be more to visual stimuli than memes.

In a summary of educational research, Acosta-Tello (2015) reported that short, focused video clips and/or slide show presentations can help engage students in online classes. Acosta-Tello (2015) warns, however, that "... it is vital to supplement the PPT (PowerPoint) with a live running commentary by the professor. PPT slides, therefore, should contain as few words as possible and could include graphics or a background."

"Running commentary by the professor"!?!

"Asleep at the Wheel" by Aaron Jacobs,
licensed under CC BY-SA 2.0

While Acosta-Tello (2015) surely did not intend this to be read as "the teacher should talk the whole time," I definitely heard Charlie Brown's teacher in my head when I read it. I had flashbacks to professional development sessions where the facilitator read his slideshow to the educators in the audience. There is a reason for the PechaKucha movement!

All jesting aside, you can trust that the use of visuals to support learning is backed by research suggesting the technique increases engagement.

Immediate Feedback

Just as Lai et al. (2016) found success with immediate visual feedback, Rodríguez-Bonces and Ortiz (2016) found success with immediate peer feedback delivered via the chat feature of the learning management system. Students readily reported that they appreciated the opportunity to engage with their peers in a "live" format and liked receiving immediate feedback on their actions from classmates. In fact, the only complaint was that students wished there could be MORE time in class (Rodríguez-Bonces & Ortiz, 2016)! More time in class, wished no student ever! That must have been a great facilitated chat discussion!

Universal Design for Learning
One final research paper is worth noting for those seeking the elixir of student engagement. An instructional design tool used most notably in special-education K-12 settings, Universal Design for Learning (UDL), has been shown to increase engagement in online learners simply by...wait for it...meeting student learning needs. The 51 teachers in the study participated in focused training on UDL techniques to help them plan lessons that provided options and supports for personalized learning (Coy & Marino, 2014).

Now that paper is chock-full of shocking information - directed teacher training and personalized learning supports increase student engagement?!

The KISS Principle

Hours' worth of research on synchronous learning only to come full circle to Education 101: adults and youth learn best when instructors focus on the supports each student in the classroom needs to maximize learning. We didn't really need to wait until 2017 for Personalized Learning to be formally defined by the United States Department of Education. These are just common-sense teaching practices that can be used in any setting.


References:
Acosta-Tello, E. (2015). Enhancing the online classroom: effective use of synchronous interactive online instruction. Journal of Instructional Pedagogies, Volume 17, 1-6


Coy, Kimberly and Marino, Matthew T. (2014). Using Universal Design for Learning in Synchronous Online Instruction. Journal of Special Education Technology, Volume 29 (Number 1), 63-74


Dyer, Kathy (September 17, 2015). Research Proof Points – Better Student Engagement Improves Student Learning. Teach.Learn.Blog The education blog. Retrieved from https://www.nwea.org

Lai, Chia-Hung; Liu, Ming-Chi; Liu, Chia-Ju; Huang, Yueh-Min (2016). Using Positive Visual Stimuli to Lighten the Online Learning Experience through in Class Questioning. International Review of Research in Open and Distributed Learning, Volume 17 (Number 1), 23-41



Rodríguez-Bonces, Mónica; Ortiz, Kris (2016). Using the Cognitive Apprenticeship Model with a Chat Tool to Enhance Online Collaborative Learning. GIST Education and Learning Research Journal, (Number 13), 166-185