Learning to learn online: A work in progress in helping students to learn self-regulation


Today’s post is by one of our Institute Fellows, Dr. Susan Lowes, who is the Director of Research and Evaluation at the Institute for Learning Technologies, Teachers College, Columbia University. Enjoy her post about Learning to Learn Online.

In the early days of online learning at the K-12 level, one of the biggest concerns was the high rate of attrition. My particular interest is in what I call virtual classrooms—where the courses are paced by week, most communication is asynchronous, and there is a heavy emphasis on student-student communication. Virtual classrooms saw a lot of attrition, from students dropping out to students slowly fading away to students falling so far behind they couldn’t catch up. This made the courses harder to teach and harder to participate in. As an evaluator researching the effectiveness of these courses, I saw many complaints from students who posted to discussion forums but never got answers or from participants in group work who waited forever for fellow group members who never showed up.

At the time, there was a great deal of discussion of screening, including not only how to screen but whom to screen in or out. This led to the development of predictive instruments, such as Peggy Roblyer's ESPRI, and to self-paced "orientation" modules that asked students to reflect on their own ability to self-manage their learning. One issue with this type of orientation is that it is voluntary, so it may well be the students who need it least who actually complete it. Another is that such modules do not really replicate the experience of an online course, and it is difficult for students who have never taken one to imagine how its demands will affect them.

I had two issues with the notion of screening, one philosophical and one from my own experience surveying students in these kinds of courses. The philosophical one was that many of us who embarked on online learning did so because we hoped it would open up opportunities for learning to those who had previously, for whatever reason, been excluded. (These reasons could range from the lack of availability of higher level courses, or courses in specific subjects, to discomfort with face-to-face classrooms.) However, the screening instruments suggested that certain behaviors, in particular the ability to manage your own time, were necessary to be a successful online learner. If this was the case, then many would be excluded.

The big surprise for me, though, was when I found that a huge majority of students who were asked to list the greatest benefit of taking an online course wrote that it had helped them to learn to manage their time well. In other words, they had to have had the opportunity to take the course if they were to learn this “soft” skill. If we had only allowed well self-regulated students into the course in the first place, the attrition rate might have been lower but this skill would not have been learned.

I, and many others, became convinced that rather than screening students out, we had to change our focus and figure out ways to keep students in. Although Peggy Roblyer hoped that instruments like hers could be used to identify where help was needed, in practice very few schools had the time to do this type of analysis. However, some schools, including those I studied, began to put considerable effort into supporting students, in particular by having someone on site to monitor student progress. This focus on the student's external support system helped a lot, but the more I thought about it, the more I thought that we also needed to focus on the student's internal environment. The bottom line is that students need to learn how to learn online. I felt there was too much of an assumption that this simply happens by osmosis.

There are a lot of aspects to learning online, from learning to read on the screen to learning how to communicate clearly in a discussion forum, but experience told us that a fundamental aspect is what psychologists call “self-regulation” — in education, the ability to take control of and evaluate your own learning.

There is a fairly extensive literature on self-regulation, much of it from a few decades ago, and one of the most tested approaches, developed by Julian Rotter in 1966, is called "locus of control." Locus of control is based on a social learning theory that posits that individuals who feel they can control their own environment are likely to adapt more easily to new situations and new environments than those who feel they are controlled by outside forces. Locus of control scores fall on a continuum, from high internal to high external. Those who feel very much in control of what happens to them are said to have a high internal locus of control, while those who feel that what happens to them is determined by outside forces are said to have a high external locus of control.

It seemed likely that the concept of locus of control could be useful for assessing students who were being asked to adjust to a new type of learning in an unfamiliar virtual environment. In addition, it seemed possible that locus of control scores could not only be used as a diagnostic, identifying students who need help learning to learn online, but could also help students learn to learn online by providing them with an opportunity to reflect on their own learning. Rotter's locus of control instrument asks respondents a series of questions to see if they perceive certain actions or events to be more influenced by their personal decisions and choices (an indication of an internal locus of control) or by forces beyond their control (an indication of an external locus of control).
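To make the mechanics concrete, here is a minimal sketch of how a Rotter-style forced-choice quiz produces a score on the internal-external continuum. The items, keying, and cutoffs below are invented for illustration; they are not Rotter's actual 29-item I-E scale.

```python
# Hypothetical forced-choice items: (internal statement, external statement).
# Choosing the externally worded statement adds a point, so a higher
# total means a more external locus of control.
ITEMS = [
    ("My grades depend on how hard I study.",
     "My grades mostly depend on how the teacher grades."),
    ("Keeping up in an online course is a matter of planning my week.",
     "Falling behind online is usually due to things outside my control."),
    ("When group work stalls, I can get it moving again.",
     "Group work succeeds or fails depending on who I am grouped with."),
]

def score_quiz(choices):
    """Count external choices. `choices` holds one value per item:
    0 = internal statement chosen, 1 = external statement chosen."""
    if len(choices) != len(ITEMS):
        raise ValueError("one choice per item required")
    return sum(choices)

def interpret(score, n_items):
    """Place a score on the internal-external continuum
    (cutoffs here are arbitrary thirds, chosen only for illustration)."""
    if score < n_items / 3:
        return "more internal"
    if score > 2 * n_items / 3:
        return "more external"
    return "mid-range"

score = score_quiz([0, 1, 0])  # external statement chosen on one item
print(score, interpret(score, len(ITEMS)))  # prints "1 mid-range"
```

The point of the sketch is simply that the instrument yields a single number on a continuum, which is what makes it usable both as a diagnostic and as a prompt for student reflection.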

In research (or my research anyway), serendipity often plays a role. The first serendipitous moment was in Spring 2012, when I happened to come across an article on self-regulation that discussed Rotter's work. It occurred to me that his instrument could be used to show students where they were on the locus of control continuum and then give them an opportunity to reflect on the result. I have been evaluating the online courses created and delivered by Pamoja Education for the International Baccalaureate for a number of years, and I proposed this at a planning meeting in Summer 2012. The second serendipitous moment came when I found that one of the Pamoja Education staff had recently heard a program on Rotter in BBC Radio 4's Mind Changers series and loved it, so he enthusiastically embraced the idea, as did their Faculty Advisor, who also taught the online Psychology course.

The result was that in September 2012, we asked all incoming Pamoja students to take the locus of control quiz and think about their scores, which we emphasized were not set in stone but could change. We also noted that it seemed likely that having a high internal locus of control would be an asset in taking online courses but not necessarily for other aspects of life.

We had interesting findings, some of which were contrary to what we expected and all of which will need to be tested in the coming year with more complete datasets. For example, we found a statistically significant difference in locus of control scores between students in Mathematics and students in Psychology (p = .03), with the Mathematics students scoring more internal, but no statistically significant differences among the other 10 courses. We had expected to find that students who subsequently dropped their courses would have higher (more external) scores, but this was not the case. This may have been because the drop group was small, or it may be because the drop group was exhibiting self-regulation in making the decision to drop. We found that those with lower comfort levels with computers had higher (more external) scores (p = .053), perhaps because they felt overwhelmed by the technology demands of an online course, but we also found that those who met often with their site-based coordinators tended to have higher scores than those who met infrequently. As with the drops, this may have been because those who met less frequently were already more self-regulated and did not need to meet as often. Similarly, those who stated that time management was one of their concerns had lower (more internal) scores than those who did not, which suggests that having that concern is a necessary step in self-regulation.
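Group comparisons like the Mathematics-versus-Psychology finding above are typically made with a two-sample significance test. As a rough illustration, here is a Welch's t statistic computed over invented scores (not the Pamoja data), on a scale where higher means more external; the sketch stops short of the p-value, which requires the t-distribution CDF.

```python
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples
    (unequal variances allowed; uses sample variance, n - 1)."""
    va, vb = statistics.variance(a), statistics.variance(b)
    na, nb = len(a), len(b)
    return (statistics.mean(a) - statistics.mean(b)) / ((va / na + vb / nb) ** 0.5)

# Invented locus-of-control scores for illustration (higher = more external).
math_scores = [4, 5, 6, 5, 4, 3, 5]
psych_scores = [7, 6, 8, 7, 6, 7, 5]

t = welch_t(math_scores, psych_scores)
# A negative t means the first group (Mathematics here) scored lower,
# i.e., more internal on this scale.
print(round(t, 2))
```

In practice one would use a statistics package that also returns the p-value (for example, SciPy's `ttest_ind` with `equal_var=False`), but the arithmetic behind the comparison is just this.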

We asked the students to take the quiz again later in the year, but this time we also asked them to reflect on the following three questions in their course blogs:

  • Do you agree with your locus of control score?
  • If you disagree with the score, why do you disagree?
  • Does learning your score make you think differently about your approach to your online course?

We got an amazing set of reflective responses. Most students agreed with their scores and only about one-third said that learning their scores had made them think differently about their approach to the online course. Here are a few examples of their posts:

  • “The score makes me realize that taking this online course is not just about completing the work quick and easily but really putting thought behind it and then using those online experiences into real life ones. Example of this would be in my relationships or with other homework or at work or on my sports team.”
  • “I think that knowing my own score it makes me realize a bit that if I want to do well, I should put in more effort, and if I don’t really care, then I shouldn’t put in a lot of effort.”
  • “As a student, I believe that what we do in class is tipped to the internal factors and is more based on hard work and skill. However, in a more general perspective, I believe that the locus of control can be fairly external if you take the idea that our genes would be considered as an outside factor.”
  • “In fact, this score made me think about the guideline to face what I have done and what I will have to do. I need to focus more on myself and what it attributes about the course, thus that I can have better understanding of my attitude.”

Statistically, we found almost no change from year-beginning to year-end, but we suspect that may have been because we had too many non-responders for the second quiz.

This leads to the changes we made for this academic year.

The most important change was to add the reflection activity at the beginning of the academic year as well as at the end, because we realized that asking students to reflect without giving them a venue to do so was not pedagogically sound. We also built a discussion of self-regulation into the mandatory self-paced orientation and let teachers know that they could introduce a discussion of self-regulation into their courses that week if they wished. We then rewrote the reflection questions to make them simpler and more pointed:

  • Do you feel your score accurately reflects where you are on the continuum from external locus of control to internal locus of control? Why or why not?
  • Do you agree that having a high internal locus of control is an advantage for online learners? Why or why not?

We made three additional changes to deal with the three research-related problems we ran into. The first was the lack of responses. The quiz was not required (we did not want it to seem like a test), so about 20 percent of the students did not answer the first time around, and even more did not answer the second time around. This probably introduced bias and made it difficult to use the entire data set for analytic purposes, such as linking pre-quiz results to post-quiz results or correlating the results with information we had from the background surveys. The second problem was that many students did not complete the reflection activity: the reflections were to be completed as a separate blog activity, and if a class was not using the blogs regularly for other purposes, the students tended not to bother with this aspect of the exercise. The third problem was that students often did not remember their previous scores, even though they had been instructed to write them down, and so had difficulty reflecting on any change.

For this semester (Fall 2013), we, therefore, revised the protocol to account for these issues. First, since we still did not want to make the quiz required, we did a much more extensive follow-up. For example, students received several reminders via email and Pamoja Education made a huge effort to get site coordinators and teachers involved in reminding students. Second, we built the reflection questions right into the quiz itself, which meant that everyone who responded to the quiz also posted responses to the questions. And third, we set up a system where the quiz results were sent to the students via email right out of the survey system. We will see if that helps when we do the second round next spring. The results are currently being analyzed, but most students again appear to agree with their scores. Most important, it appears from the responses to the reflection questions that the students are taking the exercise seriously. Let me, therefore, end with a few of their responses from this fall:

  • “Yes indeed. I believe that my future is in my own hands and my actions are my responsibility. However, there needs to be a balance between external force and internal force because not everything is in your control some things you just need to let go.” [high internal]
  • “I feel that the result is quite accurate to how I view my life. I have responsibilities and chances to decide over my life to some extent and for my age, it is enough. If I make a mistake it is usually due to my own errors in thinking or doing. But I do also believe in a little luck in life, not everyone gets what they deserve in this world.” [mid-range]
  • “Yes, because there are a lots of things that are out of my control. That being said I don’t completely think that. One must work hard, and keep working hard if they really want something.” [high external]

About the Author

Susan Lowes is Director of Research and Evaluation at the Institute for Learning Technologies at Teachers College, Columbia University. She conducts research at both the university and K-12 levels, focusing on technology's impact on teaching and learning, and directs evaluations of multi-year projects funded by the U.S. Dept. of Education, the National Science Foundation, state and local departments of education, and private foundations. She is interested in online learning and evaluates online professional development initiatives for teachers and administrators, as well as online courses and programs for students. Her recent focus is on teaching students how to learn online, using locus of control, and on using LMS data to discover patterns of student-teacher interaction. Dr. Lowes is also Adjunct Professor in the Computers, Communication, Technology, and Education Program at Teachers College, teaching courses on online schooling and research methodologies. She received her Ph.D. in Anthropology from Columbia University.

Michigan Virtual Learning Research Institute

The Michigan Virtual Learning Research Institute (MVLRI) is a non-biased organization that exists to expand Michigan’s ability to support new learning models, engage in active research to inform new policies in online and blended learning, and strengthen the state’s infrastructures for sharing best practices. MVLRI works with all online learning environments to develop the best practices for the industry as a whole.