Public school teachers have a lot on their plate when it comes to measuring achievement. Student success is determined through assessments, graded materials and even technological savvy. The general consensus seems to be that in order to give K-12 students a fighting chance in the real world, teachers and administrators must stay on top of any and all technology trends. Is it worth the effort though?
The National Center for Education Statistics reported in 2009 that 97 percent of K-12 teachers had computers in their classrooms every day. In addition, 54 percent were able to bring a computer into the classroom. The overall ratio of students to classroom computers was 5.3 to 1.
Well, that was then and this is now. Since 2009, teachers have made the shift to include mobile devices like tablets and smartphones as part of the classroom culture. Computers are still there, but they are quickly playing second fiddle to smaller, faster and just-plain-cooler pieces of technology. While the inclusion of cutting-edge technology certainly grabs the attention of students, does it actually make a difference in academic success?
Does technology really provide more opportunities?
The problem with answering these questions is that not enough time has passed since Internet-based learning stormed K-12 classrooms. At a technology summit in early 2012, Troy Williams of Macmillan New Ventures told a packed conference room that companies like his do not "have the outcomes yet to say what leads to a true learning moment." He added that it would be another three to five years before those numbers could truly be analyzed. Matthew Pittinsky, a co-founder of the popular Blackboard software, agreed with Williams, saying that "these are really early days" when it comes to truly integrated technology intended to improve student success in K-12 and higher education settings.
In its widest definition, though, technology has always been associated with the creation of a level playing field for students. Bernard John Poole of the University of Pittsburgh wrote ten pillars of technology integration in K-12 schools, and his final point reads: "Recognize that technology is for all, and involves all, in the process of lifelong learning." Poole argues that teachers must receive ongoing training, and that parents must be equally involved, in order to promote student achievement through technological advances. While his points sound good on paper, they lead one to wonder whether he truly believes that technology is a necessity for learning, or whether it is only a means of capturing students' ever-waning attention spans.
At the public school level, all students have equal access to classroom computers and mobile devices. Even if these youngsters have no electronic access at home, upon entering a classroom they are able to interact with technology and keep up with their peers. That is all well and good, but does it matter? If all public K-12 classrooms got rid of computers and banned Internet-based learning, would it negatively impact academic success through the college years? Would it affect graduation rates? Would American kids fall behind the rest of the world?
I think that truly depends on how you look at it. Does the technology itself provide heightened learning experiences? I'd argue that it does not. Instead, implementing the technology is a necessary move to keep students interested in the subject matter. I am not saying that I am against rapid adoption of cutting-edge technology in learning and practice; I think there is no way to avoid embracing it and still turn out high numbers of world-ready graduates. I just think there is a danger in relying on the technology to convey learning materials in a vacuum. Look at how much technology has changed since the 2009 report I referenced above. Does this mean that the students growing up in public schools in 2013 will be better prepared for life than those of 2009? What about the students of 2017, and so on?
What do you think? Does technology improve learning?