MindEdge Online Learning

Bring Your Own Device…at Your Own Risk

By Joe Peters
Editor, MindEdge Learning

It’s hard to imagine anyone with a better command of learning and technology than the late Dr. Patrick Winston, an MIT faculty member who, for nearly five decades, was at the forefront of machine learning and artificial intelligence.

How did he view technology in the classroom? He didn’t. One of his rules for his Introduction to Artificial Intelligence course was, “No Laptops.”

Granted, Dr. Winston’s preferences do not establish a universal axiom. But if an educator of his renown, working with students on the vanguard of technology, questions the value of such devices while lecturing, then maybe those of us with more pedestrian credentials should take note.

This is not to suggest that technology has no role to play in education. To the contrary, technology has an immense capacity to bridge logistical, financial, and cognitive gaps. Even a Luddite must acknowledge that the Internet provides access to a far greater breadth and depth of information—and far more efficient retrieval of that information—than the stacks of a research library.

However, as we see laptops becoming more prevalent in elementary and middle schools through programs such as BYOD (Bring Your Own Device), we should recognize the potential for educational distraction as well as benefit.
Not long ago, the Paris-based Organisation for Economic Co-operation and Development (OECD), a coalition of 36 of the world’s most advanced countries, conducted a study of 15-year-old students who regularly use computers and the Internet at school. The study concluded that those tech-using students tended to underperform on international assessment tests.

While we should be careful about drawing broad conclusions from a single study, these results nonetheless indicate a need for circumspection. But instead of raising a red flag about the use of technology in the classroom, many school districts have gone ahead and bulldozed the flagpole.

What’s notable about many of the scholarly articles addressing BYOD and related programs is that their focus tends to be more on financial considerations than pedagogical ones. Inevitably, the justification for replacing the No. 2 pencil and three-ring binder with a laptop or tablet is a belief that we must start preparing students, even those in elementary school, for the jobs of the future.

In this regard, consider the data point provided to us by the Waldorf School of the Peninsula, a private K-12 school that serves the children of many Silicon Valley executives. The school attracts the progeny of the country’s technology incubator by being famously no-tech.

Again, a single anecdote does not establish a truth. But perhaps what these executives understand is that today’s software and hardware may bear little resemblance to tomorrow’s workplace. The famed “Moore’s Law” holds that computing capability (strictly speaking, the number of transistors on a chip) doubles roughly every two years. If that’s true, then by the time a fifth-grader sends out résumés as a college senior, he or she will be five technology generations removed from the tools of today. Expecting today’s laptops to prepare a 10-year-old for employment a decade hence is like using a World War I-era globe to teach modern geography.
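To make that back-of-the-envelope arithmetic explicit, here is a minimal sketch in Python, assuming the ten-year horizon and two-year doubling period described above (the variable names and figures are purely illustrative):

years_until_job_market = 10   # assumed: fifth grade today, college senior in ~10 years
years_per_generation = 2      # Moore's Law doubling period, roughly

generations = years_until_job_market // years_per_generation   # 10 / 2 = 5 generations
capability_multiplier = 2 ** generations                       # 2^5 = 32

print(generations, capability_multiplier)                      # prints: 5 32

In other words, under those assumptions, the hardware that college senior encounters on the job would sit five doublings, or roughly 32 times the capability, beyond the devices in today’s fifth-grade classroom.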

Further, most kids have an intrinsic attraction to the range of today’s computing devices. For many parents, the challenge is not getting their children to use these devices—it is getting them to stop using them. If we’re concerned about making sure our kids are ready for tomorrow’s workplace, relax—most of them are more than capable of learning how to use a new device in very short order.

If today’s educators and parents are embracing classroom laptops with excessive exuberance, part of the fault must lie with those of us in the tech industry who have been clamoring for years about “skills gaps.” Whether you are talking about political leaders being duped by simple phishing schemes or investment professionals struggling with a spreadsheet, the tools of today’s workplace can be overwhelming. But perhaps we technologists have focused too much on the bytes, and overlooked the basics.

At the risk of committing techno-heresy in the eyes of my IT colleagues, let me suggest the fault has not been in the “hard skills” of understanding digital certificates and regular expressions, but rather in the so-called soft skills of communication, adaptability, and cooperation.

The next time you approach an elevator lobby, a bus stop, or some other public setting, make note of how many people prefer to stare at the thing in their hand rather than make even a superficial gesture—a smile, a greeting, or an inquiry about the weather. For all their promise, today’s devices can be very isolating. In a grander context, we should ask whether these resources encourage us to reach beyond our comfort zones, or whether they are more like a digital cocoon.

As we face the evolution of artificial intelligence, arguably it is more important than ever for us and our children to prioritize the skills between our ears rather than those at our fingertips. As Dr. Winston counseled MIT students annually, in a talk entitled “How to Speak”: “Your careers will be determined largely by how well you speak, by how well you write, and by the quality of your ideas … in that order.”

For a complete listing of MindEdge’s course offerings on cyber security and CISSP®, click here.

[An earlier version of this article ran in the MindEdge Learning Workshop Blog on October 18, 2018.]

Copyright © 2020 MindEdge, Inc.