Lessons Learned: Do You Have to Bleed at the Cutting Edge?
After years of incorporating information technology in classroom teaching in the belief that it helps students learn, I have been asked whether I would recommend that other faculty members do likewise. Though using information technology is a good way for faculty members to rethink their teaching methods, most often my answer is no—not for the untenured and definitely not for those thinking of becoming totally engaged in teaching and technology. I like developing means to incorporate instructional technology in the classroom, and I think it helps students learn, but there are five factors that have stymied my attempts to do the job well. I discuss these five factors—resources, money, time, student evaluations, and support from colleagues—in the hope that faculty members will take them into account before spending time developing technology for teaching and learning.
Money
There is one question at the core of what K-12 principals and college and university presidents face: If we fund information technology initiatives, what will not be funded? As the Campus Computing Survey (Green, 2000) and the impact of the support service crisis indicate, IT at most schools, colleges, and universities is not going to save money, replace faculty, or create an influx of new funds (especially from online courses). Instead, IT raises serious questions about how technology is going to be paid for. The problem then becomes where to spend limited resources. As Green points out, colleges and universities often have no financial plan for funding instructional technology.
In addition, there is the question of financial rewards for instructors. If a teacher asks for funding for IT infrastructure (either hardware or software), he or she cannot also expect merit pay for time devoted to developing technology tools. Administrators can overlook the hard work that faculty members devote to developing technology initiatives, so these efforts may not be rewarded. There is little in the academic literature to substantiate how much time it takes to infuse technology into the classroom, particularly since some methods and some technologies are more time intensive than others. An administrator I know was surprised at the amount of time and support she needed to develop a 15-minute, technology-rich presentation. Only afterwards did she realize the efforts that faculty members regularly put in. To resolve this dilemma, I recommend that technology efforts be developed as a "contract" between faculty members and administrators, with agreed-upon goals, outcomes, and rewards. In this way, everyone knows up front what is to be done, what efforts are expected, how much time the project is going to take, and what the rewards are.