A study by Mullen and Tallent-Runnels (2006) found statistically significant differences between online and traditional students' reports of instructors' academic support, instructors' demands, and student satisfaction. They also acknowledged that the study was limited by its demographic data. In an original report funded by the Alfred P. Sloan Foundation, Dziuban, Hartman, Moskal, Brophy-Ellison, and Shea (2007) identified multiple dimensions of student satisfaction that can provide structure to studies of online learning. The present study added academic standing and ethnicity to address the demographic limitations noted by Mullen and Tallent-Runnels (2006) and used several student satisfaction dimensions identified by Dziuban, Hartman, Moskal, Brophy-Ellison, and Shea (2007) and Moskal, Dziuban, and Hartman (2009). The differences between academic standing and the studied dimensions were statistically significant: the largest effect was for Facilitated Learning and academic standing, which accounted for nearly 15% of the variance in student scores; Engagement and Information Fluency accounted for 4.5% and 5.1% of the variance, respectively. However, error accounted for the majority of the total variance in all tests, which implies the influence of other variables.
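The variance-explained figures above are effect sizes of the eta-squared kind (between-group sum of squares divided by total sum of squares). As a hedged sketch of that arithmetic, using invented satisfaction scores rather than the study's data, the calculation looks like this:

```python
# Illustrative only: eta-squared (proportion of variance explained) from a
# one-way ANOVA layout, using made-up satisfaction scores for three levels
# of academic standing. These numbers are NOT from the study.
groups = {
    "freshman": [3.1, 3.4, 2.9, 3.0],
    "junior":   [3.8, 4.0, 3.6, 3.9],
    "senior":   [4.2, 4.4, 4.1, 4.3],
}

all_scores = [x for g in groups.values() for x in g]
grand_mean = sum(all_scores) / len(all_scores)

# Between-group sum of squares: size-weighted squared deviations of each
# group mean from the grand mean.
ss_between = sum(
    len(g) * ((sum(g) / len(g)) - grand_mean) ** 2 for g in groups.values()
)
# Total sum of squares: squared deviation of every score from the grand mean.
ss_total = sum((x - grand_mean) ** 2 for x in all_scores)

eta_squared = ss_between / ss_total
```

An eta-squared of 0.15 would correspond to the "nearly 15% of variance" reported for Facilitated Learning; the fabricated data above produces a much larger value, which is purely an artifact of the toy numbers.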

(Bradford, 2011) This study sought to explore whether a relationship exists between cognitive load and student satisfaction with learning online. The study separates academic performance (i.e., “learning”) from cognitive load and satisfaction to better distinguish influences on cognition (from cognitive load) and motivation (from satisfaction). Considerations that remain critical to the field of instructional design, as they apply to learning online, are described and used to guide a review of the literature toward fulfilling the goal of the study. A survey was conducted, and 1,401 students responded to a 24-item instrument. Multiple analysis techniques found a positive, moderate, and significant (p < .01) correlation between cognitive load and satisfaction. Most importantly, the results showed that approximately 25% of the variance in student satisfaction with learning online can be explained by cognitive load. New constructs that emerged from a Principal Component Analysis suggest a refined view of student perspectives and potential improvements to guide instructional design. Further, a correlation, even a moderate one, had not previously been found between cognitive load and satisfaction. The significance of this finding presents new opportunities to study and improve online instruction. Several opportunities for future research are briefly discussed, and guidelines are offered for developing online course designs using interpretations of the emerged factors.
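The 25% figure follows from squaring the correlation coefficient: for a Pearson r near 0.5, r² ≈ 0.25 of the variance is shared between the two measures. A minimal sketch of that computation, using invented scores (not the study's data or instrument), might look like:

```python
import math

# Illustrative only: Pearson correlation between hypothetical cognitive-load
# and satisfaction scores, and the variance explained (r squared).
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Fabricated example scores on a 1-5 scale.
load =         [2.0, 3.0, 3.5, 4.0, 2.5, 3.0, 4.5, 1.5]
satisfaction = [2.5, 3.5, 3.0, 4.5, 3.5, 2.5, 4.0, 2.0]

r = pearson_r(load, satisfaction)
variance_explained = r ** 2  # proportion of shared variance
```

The point of the sketch is the r-to-r² step, not the toy data: a "moderate" positive r around 0.5 is exactly what yields the roughly one-quarter of variance the abstract reports.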

(Hirumi, Bradford, Rutherford, 2011) Like others, the US Army is examining the use of emerging instructional technologies, such as blended learning (BL), to optimize training. The problem is that there are neither established formulas nor published algorithms for determining which aspects of a course to put online and which to administer face-to-face (f2f) to facilitate blended learning. To address the problem, a team of instructional designers examined existing training and formulated, tested, refined, and transferred a process for analyzing and nominating specific aspects of military intelligence (MI) coursework for residential (f2f) and/or distributed (online) delivery.

(Bradford, Kehrwald, Dinmore, 2011) In this session, a conceptual framework is presented to provide organizations with a tool for self-evaluating their online learning initiatives. The tool is a methodology that leverages Ellis and Goodyear’s framework, activity theory (Cole & Engeström, 1993), and a new categorization of online learning described by Norberg, Dziuban, and Moskal (2011). The methodology is presented in sufficient detail to permit application to almost any online learning implementation. A case study will be forthcoming, as this framework will be initiated in 2012 at the University of South Australia.

(Dinmore, Kehrwald, Bradford, 2011) This paper outlines the ePortfolio implementation process at the University of South Australia. The eP system, powered by the open-source ePortfolio platform Mahara, is one element of an integrated suite of technology-enhanced tools for teaching and learning at the University and will be available to all students and staff from the second half of 2011. This ePortfolio system was chosen for its flexibility and its capacity to serve as the venue for many complex tasks. We have sought to conduct the implementation of the eP, for teaching and learning purposes, at a programmatic level across the institution. We recognise that for a system like this to operate optimally, it needs to be integrated within a program of study at every year level, and that piecemeal approaches to using ePortfolios, while of some value, do not ultimately allow the full potential of portfolio learning styles to flourish. This paper reports on the work-in-progress of our ePortfolio implementation.

Presented at the Summer 2009 Faculty Development Conference, University of Central Florida, Orlando, FL, May, 2009.

The concept of the “information age” implies we live in a different era than before, one in which information is the primary product of society. The raw reality of living and working in the information age is that the sheer volume of information being published, and potentially accessible through the Internet, is so large that it is increasingly likely you will not find the information you need, even with very good search strategies. When you are successful, however, you need to store that information in a place where others interested in the topic will also find it. This material intends to break new ground in using the social bookmarking system Diigo to support research efforts. An advanced prototype for online course material includes five modules that allow instructors to mix and match what they need to take advantage of this technology.

Does cognitive load play a role in student satisfaction for online students? A relationship study of 1,401 university students showed that a moderate, positive, and significant relationship exists. Analytics suggest a refined view of the student perspective that provides guidelines for instructors and designers.

(Phillips & Bradford, 2011) Teaching online is vastly different from teaching a comparable face-to-face class, even though this comparison is often made. The unique characteristics of teaching online require different approaches, tools, communication, and techniques. Today's literature highlights many best practices and teaching strategies, including case studies illustrating successful examples. However, each online class has such unique characteristics that it is difficult to produce one model illustrating all online classrooms. This presentation will guide you through the discovery and development of your online teaching persona, and attempt to operationalize the various tools and strategies you might utilize to effectively support your individual style while maintaining integrity with your teaching (and learning) philosophy. You will find this to be a unique, innovative faculty development approach to promoting instructor presence in an online course. The session will be of particular interest to online teaching faculty, instructional designers, and others with varied levels of experience. Those delivering online and face-to-face faculty development will be especially interested in attending. The takeaway will be an exercise that could take up to two hours to fully develop and operationalize these factors in your online course. Attendees will be asked to volunteer their ideas to support the materials being presented and bolster online teaching presence.

(Bradford & Cato, 2011) More US students are taking online courses this year than last, and class sizes are increasing. This increase pressures faculty to abandon rich assessments in favor of automated testing. An alternative strategy is to streamline grading using technology. This session reviews project history, technology, and experiences with giving and receiving feedback using technology.

(Bradford, 2011) In this session, a conceptual framework is presented to provide organizations with a tool for self-evaluating their online learning initiatives. The tool is a methodology that leverages Ellis and Goodyear’s framework, activity theory (Cole & Engeström, 1993), and a new categorization of online learning described by Norberg, Dziuban, and Moskal (2011). The methodology is presented in sufficient detail to permit application to almost any online learning implementation. A case study will be forthcoming, as this framework will be initiated in 2012 at the University of South Australia.

(Bradford, Smith, Grech, & Birbeck, 2012) Aims and Background: The reality of curriculum design is that educational complexity affords what can only be called a hidden curriculum. However, in Australian higher education the focus is increasingly on the rigor and coherency of curriculum design and alignment, particularly as accrediting bodies demand that education providers supply evidence of how the intended curriculum is enacted in the classroom. Acknowledging the potential for a gap between ideology and classroom reality provided the context for a project at the University of South Australia to support the Bachelor of Nursing program with design processes, while simultaneously providing a mechanism to surface the hidden curriculum and avoid curriculum creep. Methods: Informed by the field of curriculum design, a mapping alignment tool was devised and populated with institutional data. Program directors and course coordinators contributed to course-level design discussions covering implicit themes, health priorities, and domains of nursing practice that flow through the program. Adding this information to the tool permits tracking, analysis, and evaluation across the curriculum. Results: Tool usage identified gaps in curriculum design and led to functional reports that provide evidence relating to both the explicit and hidden curriculum. The tool also facilitated course design while recognising interdependencies with program-specific outcomes. Conclusion: The tool provides an excellent opportunity to improve design processes, as well as a vehicle to surface details that become lost in the complexity of an all-encompassing curriculum. Further, results of tool use are reviewed to improve the enterprise system’s data characterization of the intended curriculum and better close the reality gap.

This survey focuses on learning about the preferences and experiences of online college course takers. It was jointly developed by Chuck Dziuban, Ph.D., Patsy Moskal, Ed.D., and myself at the University of Central Florida, Orlando, FL, in May 2009. The instrument is hosted within an open-source survey system, LimeSurvey, which I installed in my personal website domain (i.e., heybradfords.com). It is best viewed with MS Internet Explorer. Unlike hosted services such as Survey Monkey, there are no limitations on the number of participants, and the final data can be exported directly into SPSS for analysis.

This instrument is designed to do two things: 1) learn about the technologies students use, and 2) provide students with feedback regarding their level of familiarity with a wide variety of currently available technologies that might better support the pursuit of their academic goals. This is a prototype, and work is still being done to more fully develop the student results. The instrument uses a number of interesting techniques: 1) grouped questions, 2) question items that carry assessment values, 3) logic branching to ask specific technical questions depending on other item responses, and 4) links to resources outside the instrument – in this case, web-based technical simulations. This instrument is designed using the open-source LimeSurvey, which is hosted on my personal website.

This document presents, in a comparative matrix, grounded learning taxonomies from the original authors spanning five decades. The document was modified with the consent of the original author, Atsusi Hirumi, to include the revised Bloom's Taxonomy by Anderson and Krathwohl (2001).

(Bradford, 2010) This prototype tool serves to compile information from multiple sources regarding a faculty member’s readiness to design and teach online. Once the information is compiled, the faculty member may be assessed as ready to begin immediately or directed into professional development programs. Initially, faculty members submit evidence of their experience through web forms. That information is dropped into this tool, where a criteria-builder interface may be used to connect submitted work to recognized rubrics, such as Quality Matters. Different experts may then review the submitted evidence and enter additional information or comments into sections of the tool, which are summarized into a final report for a review panel to make the final determination of readiness. The resulting file then becomes a complete, final record for the faculty member, which not only contains the recorded details but also includes the criteria by which the assessment was made.

(Bradford, 2011) This prototype tool serves to facilitate rubric-style grading for large classes by leveraging programming techniques. The tool includes many features and has been in development for three years. Designed to be used in conjunction with institutional learning management systems, it has been tested and used in large higher education classes for two years. User and student surveys have been used for one year to explore perspectives and experiences with using the tool and viewing its results. To request a copy of the tool, send an email to me at this address: GeorgeRBradford@gmail.com

(Bradford, Smith, Grech, & Birbeck 2012) This prototype tool serves to facilitate reports and information needs for accreditation, as well as for studying curriculum design. The tool focuses on improving course element alignment and surfacing the hidden curriculum. Development began in 2011 and became a grant project sponsored by the University of South Australia.

Teach@UniSA is a blended workshop offered to faculty three times per year, consisting of 2.5 days of full-time sessions with follow-on activities organized several months after the workshop's conclusion. This particular session, How do I get feedback about my teaching?, walks faculty through an evidence-based process to cue into the key student perceptions that result in strong satisfaction ratings. By first identifying where in their survey these items appear, faculty can then adopt a number of strategies to improve their teaching or to check on students' perceptions of their current experience.

This workshop was delivered to faculty at the University of South Australia to initiate a dialog about what Technology-Enhanced Learning (TEL) can be, how it is grounded in the literature, and sample tools to support material design and development.

[+] Workshop Session: My online teaching persona - How do I find him or her?

This workshop was delivered to faculty at the University of South Australia to improve faculty communications and interactions with their students when teaching in fully online or blended modes. This work is grounded in research.

This is a graphic of the revised Bloom's Taxonomy of the cognitive domain, to which researcher Barbara Clark had made some adaptations. I made additional adjustments to further differentiate the cognitive subdomains. This graphic is useful because it permits users to approach the formation of instructional objectives either from the end product (the outside ring content) or from the starting point (the inside ring content). The circular form also better communicates that, while some subdomains are more complex, to reach still greater levels of complexity the instructional designer will likely need to repeat the cycle of lower-level cognitive skill building.

Dr. Baiyun Chen and I developed this tool to facilitate an understanding of the associations and relationships between instructional objectives, assessments, assessment strategies, instructional strategies, and feedback strategies. For many, the relationships between these pieces of an instructional solution are too complex to grasp, and the connections each has with the others are difficult to visualize. This visual aid is used during faculty consultations as a cue to discuss the pieces of a lesson design and to invoke thoughtful consideration.

This represents a small collection of course designs developed with faculty at UCF. I have worked with faculty in Chemistry, Biology, English, Anthropology, Education (Educational Leadership, Instructional Technology, History of Education, Philosophy of Education), Nursing, Psychology, Engineering, and Business (Marketing and Finance). Current work might be shown during face-to-face meetings upon request. All of these designs use an asynchronous online delivery strategy, either in a fully web-only mode (type W) or a blended mode (type M). One of the English courses within this sample uses a wiki instead of Blackboard, and another that is not listed uses the same wiki host and incorporates the Social Bookmarking and Research design described elsewhere in my portfolio.

[+] Project Management Samples - Redesign of Faculty Development Program for Fully Online or Blended Delivery

I developed this service agreement by starting with pieces of service contracts from many businesses on the web. The agreement is highly structured and very detailed; the intent is to promote fairness to both parties to the contract. The financial portion, which is specific to each customer, is calculated using a specialized spreadsheet that permits service contracts to be formulated for customers in multiple municipalities separated by considerable distance. This agreement was used with several long-term customers.

Mr. Williams (not his real name) was a very special customer who suffered a serious illness that kept him hospitalized for many years. When he left the hospital, he couldn't wait to get working again, but his career field had changed greatly with the development of personal computers. This is the overview of a training program to develop the skills Mr. Williams would need to manage a small business in radio advertising. He would need to learn the basics of computers and how to use MS Word to write scripts and create contracts. He would use MS Excel to create and calculate invoices, and he would use Adobe Audition to digitally produce the radio ads. The entire training program took Mr. Williams two months to complete.

These are images from a specialized performance support system designed for an oral surgeon's office, whose staff were required to complete detailed notes for each patient's surgery. The tool is built in Microsoft InfoPath to permit the customer's office manager to key in the data quickly (the form is tabbed so keystrokes advance to the next field), and it can interface with a database in the future should the customer wish to store the data. With this version, the design permits printing a large amount of data on a single page, with room for dental office stickers. The solution reduced the time required to create the surgery notes to one-fourth of what it had been.

This is a presentation made to the senior staff at GATX Capital in San Francisco, CA, following the results of a front-end analysis of work processes in their business. The solution for their situation was to build an electronic performance support system: an integrated help system, tied contextually to the financial systems they used, that would improve productivity, reduce mistakes, and grow their business. The solution required full integration with the entire business operation, not only specific divisions. This presentation was designed to communicate the scope that the performance support solution would cover. It is based on work by Michael Cole and Yrjö Engeström at the University of California, San Diego.

This is the final report of a front-end analysis I did for SolArc in Tulsa, Oklahoma. SolArc is a software company that supports commodities trading, and at the time this work was done, they were looking for direction with their training models. As you will see from the report, there were opportunities beyond training to grow their revenue channels.

Lyondell in Houston, Texas, upgraded their financial enterprise system and required extensive documentation and training to support the new system. I developed this document to guide the teams of technical writers in their development of training materials. Templates using the elements in this guide were also developed to support their production efforts.

This is an initial prototype training program for internal training at RWD. It was intended to serve three audiences: technical writers, project managers and instructional designers, and sales engineers. I left RWD before completing the work.

This is the instructor guide to part of a training program on handling grievances and appeals at Blue Shield of California. The blue text is hidden text: when you print with settings that include hidden text, you get the instructor guide; when you print without hidden text, you get the participant guide. I was the project manager for the full customer service representative (CSR) training project, which included 9 modules. In addition to doing the work of the project manager, I was the lead instructional designer for the entire team (15 people working for about 8 months), and I wrote the instructional materials for several modules. This is a sample. We also developed train-the-trainer materials, and I delivered the train-the-trainer sessions, as well as all the pilots.

In this paper, the authors present the results of a review on four multicultural training programs, one of which is an in-depth analysis, with the aim of answering some of the following questions:

With so many programs available, how does one decide which program is best for the company?

Which program will provide the most information to employees?

Which program will allow employees to step away with more than a pamphlet of statistics and perspective trends?

Which program will provide a sense of what to do after the session has concluded?

The paper presents a brief description of each reviewed program, a table to compare and contrast the similarities and uniqueness of the different programs, a table to present the results of an in-depth analysis of one training program, and a short narrative to interpret the results in each table.

In this paper, the authors compare and contrast higher education in the United States and Sweden. To conduct this comparative analysis, the authors considered factors that capture an overall perspective, as well as a narrow but detailed view in two areas, to exemplify the similarities and differences. For the overall perspective, the team first presents a synopsis of the history of higher education in each country. Second, the team compares demographic details on the organization and operations of educational institutions in each country. To provide a narrow but detailed perspective, the authors present key elements a student would experience in following a degree program in Macroeconomics from the Bachelor of Arts through the Doctorate in both the US and Sweden. Finally, a review of online learning in Sweden is presented. This final review yielded many insights into the state of the practice of teaching and learning through online technologies, which place Sweden well behind the United States. The differences were great enough that the team elected against a comparative stance in this final section. The authors wish the reader to note that this comparative study will have a slant because information was gathered and analyzed using US standards for comparing colleges and universities.

At the Earth Summit in 1992, education received international attention as critical to the process of sustainable development during the 21st century (Blewitt, 2004). The most common definition of ‘sustainability’ was developed in 1987 by the Brundtland Commission, which defined it as “development that meets the needs of the present without compromising the ability of future generations to meet their own needs” (The World Commission on Environment and Development, 1987). In this review the authors first present a historical brief, which includes a narrative of the details that led up to and define the concept of sustainability, as well as the changes the history of sustainability has brought to global governance and to sustainability education at the international level. The narrative further describes the projected activities and plans of organizations and their educational or training needs as they pursue the implementation of sustainability projects. From these needs, the authors elaborate, in the section Education & Training Needs Arising from the Sustainability Movement, on the impact of sustainability education on Instructional Technology in the United States, as well as the resulting opportunities for instructional technologists. Finally, the review concludes with remarks regarding the sustainability movement and the field of instructional technology and design.

A study by Mullen and Tallent-Runnels (2006) found statistically significant differences between online and traditional students' reports of instructors' academic support, instructors' demands, and student satisfaction. They also acknowledged that the study was limited by its demographic data. In another study, Dziuban et al. (2007) identified multiple dimensions of student satisfaction that structure studies of online learning. The present study added academic standing and ethnicity to address the demographic limitations noted by Mullen and Tallent-Runnels (2006) and used several student satisfaction dimensions identified by Dziuban et al. (2007). The differences between academic standing and the studied dimensions were statistically significant: the largest effect was for Facilitated Learning and academic standing, which accounted for nearly 15% of the variance in student scores; Engagement and Information Fluency accounted for 4.5% and 5.1% of the variance, respectively. However, error accounted for the majority of the total variance in all tests, which implies the influence of other variables.

This essay presents three instructional development models and explores their application characteristics, foundational epistemologies, and strengths and limitations through comparison and contrast with one another. Further, the essay presents a limited number of arguments and criticisms against the use of instructional development models.

This analysis portfolio demonstrates our (author team: George Bradford, Taryn Hess, and Damon Regan) ability to analyze learning tasks using a variety of methods described by Jonassen, Tessmer, and Hannum (1999). This portfolio describes our application of three specific analysis methods after reviewing our approach to method selection and knowledge elicitation. It provides insights into our thinking as it changed throughout the process of developing this portfolio. We conclude with our reflections of the entire process.

In this paper, we (author team: George Bradford, Taryn Hess, and Damon Regan) present three treatment plans developed by following three different instructional design theories: Mayer’s Designing Instruction for Constructivist Learning; Schwartz, Lin, Brophy, and Bransford’s Flexibly Adaptive Instructional Design; and Reigeluth’s Elaboration Theory. These three theories were applied to develop instructional plans for the same course, titled Formative Research Methodology: Foundation, Applications, Issues. The learning module for the course designs is Unit 1 – Foundations of Formative Research Methodology. Each instructional plan (i.e., treatment plan) includes a detailed instructional strategy as well as an evaluation (i.e., assessment) chart. The team designer also includes reflection notes on the application of each design theory immediately following the evaluation chart sections. In addition, this paper includes a comparative analysis of the final outcomes of each theory as applied to the instructional topic of Formative Research Methodology. Finally, the paper concludes with an overall reflection covering the entire team project to develop this narrative report on the application of three instructional design theories.

This narrative was written to present (1) clear and concise descriptions of how the instructional unit is represented by the treatment plans developed in an earlier project; (2) concise and detailed reflections on experiences (challenges, key decisions, what was good, what was bad, and what more we need or wish to learn); and (3) clear and concise comparisons and contrasts between the original design concept and the final instructional lesson, with notes about those differences. To meet these aims, the narrative includes (1) a short discussion of the treatment plan; (2) a short discussion of the development process the Team employed; (3) reflections on the development process; (4) a short discussion of the lesson evaluation strategy and implementation; (5) reflections on the evaluation strategy and implementation; (6) a short comparative analysis of the treatment plan against the final lesson design; and (7) the Team’s conclusions regarding the project.

This document defines the strategy for usability testing of a prototype online learning lesson for Dr. Hirumi’s Advanced Instructional Design course at UCF. The prototype lesson is in development by The Team (i.e., George Bradford, Taryn Hess, and Damon Regan), and is titled, “Foundations of Formative Research Methodology” (hereafter referred to as “Online Learning Lesson”). This document defines the objectives and plan for implementing usability testing, including a schedule for completion.

This is the instructional unit web site that was designed to teach doctoral students the foundations of Formative Research. Taryn Hess did the site design, while Damon Regan and I made suggestions regarding colors and navigational cues. The three of us contributed to the final content. I was additionally responsible for the entire evaluation strategy and its implementation, which led to course site adjustments.

The literature consistently identifies three distinct areas that can influence the degree of success college faculty might enjoy in their efforts to design and deliver courses in a format for delivery through the Internet (such courses are also called “online” courses). The terminology for these areas varies, but here we combine some of the other terms parenthetically to demonstrate the variety: technical skills (computer literacy, Web skills, media production skills, Internet savvy, technical readiness), instructional design knowledge and practice (instructional systems design skills – ISD, pedagogical knowledge, instructional strategies, design competency, organization design), and institutional support (departmental, college, or university readiness, institutional technical infrastructure, technology help desks) (Arabasz & Baker, 2003; Lee & Hirumi, 2004; Meloncon, 2007). This brings us to the primary question, which addresses the issues of quality in online instructional content development and delivery: “What do experts in the field say are the best practices, important guidelines, and skills regarding instructional design for college faculty who would create and deliver educational content through the Internet?”

With the intense pressure to grow and deliver online learning programs, institutions of higher education must face a variety of issues and challenges. The question guiding this paper is, “What are the key issues for institutions of higher education seeking to implement online learning programs?” An answer to this question, even if incomplete, will lead the reader to a better understanding of the range of issues any institution will likely be required to address and thereby, at a minimum, provide an important perspective and some direction for planning such projects.

Consider the following visionary request to a web-based search-analysis engine: “Learn about new technologies that can be used as educational tools. Learn new pedagogical strategies and review existing strategies. Match new and existing pedagogical strategies with new technologies. Create realistic training scenarios for sales team.” Returning solutions to complex queries such as this is becoming possible, but such capabilities are not yet readily available to educators. Come to learn, discuss, and ask the next questions.

First, this paper considers a policy change to be made at the Florida Board of Governors level to reflect current conditions with regard to space planning and financial allocation. The proposal reflects the concern that student access to university programs can be expanded through the support of monies already available and soon to be allocated. The concept of the policy shift is to adjust the calculation method to reflect actual classroom space demands, and to change the allocation appropriately to expand distance learning programs across the SUS. The increased access students will gain during difficult economic times reflects the can-do attitude of Floridians, who retain focus on the educational mission by leveraging the economies of scale afforded by distance learning solutions. Second, this paper considers a program adjustment internal to UCF aimed at expanding faculty support services for distance learning programs. This section is intended to provide preparation and planning should the first policy adjustment be realized.

Dziuban et al. (2007) identified eight dimensions of student satisfaction for students who take college coursework in an online format that employs asynchronous technologies and corresponding instructional design. These eight dimensions can be said to comprise a framework for describing student satisfaction with asynchronous learning network (ALN) courses. If this model were shown to be robust, educational researchers would have a common framework for studying student satisfaction, the design of ALN-type instruction, and the integration of other (future) technologies. Further, researchers would have a terminology that is potentially more accessible than has been the case to date. In this exploration of the model, this researcher asks the following questions:

1. Can the Dziuban Model be used to study student perception of online instruction and learning?

2. Can students generate questions they would want to see asked about their experiences with online instruction and learning at UCF, and in this way provide a unique student perspective?

3. How and when would it be favorable to use student-generated questions in a study about student perceptions of online instruction and learning?

4. Can the results of the study be used later to explore students' perceptions of their online instruction and learning experiences?

This paper includes a review of the literature, a detailed discussion of the methods employed in this study, a presentation of results and findings, a discussion of those results and findings, and a conclusion. Further, this study includes three appendices: Appendix A presents the protocol used for data gathering in the first phase of the study; Appendix B presents the protocol used for data gathering in the second phase; and Appendix C presents the final set of questions produced through the study.