Largely unexamined in the large-scale shift to digital learning are the accompanying ethical considerations. Indeed, the issues and tradeoffs that school leaders and teachers face in using technology in schools — whether free or for a fee — are more complex than they have ever been. As Lisa Petrides and I write in a new op-ed for The Hechinger Report (“What’s the high-tech tradeoff for students and teachers?”):

In this new digital era for education, we should ask: What rules of the road are needed to ensure that decisions about technology are made in the best interests of students?

Among the places where our ‘rules of the road’ seem lacking, I’d count the following:

  • Access and use, which I define to include the myriad issues of equity of access, but also of our orientation toward student use of devices (i.e., are we teaching students to bend technology to their own goals – via coding or domains of their own – or merely to dutifully click through others’ software apps?)**
  • Intellectual property. Who owns student and teacher data? Who owns student and teacher work? What restrictions, if any, should be placed on school district re-use and sharing of student and teacher work? Platforms and products handle this differently (sometimes in clearly exploitative ways), but too often the issue remains wholly unaddressed.
  • Transparency. Traditional textbooks and tests are held to a higher standard of external review for accuracy and bias than software (especially adaptive software). When students do not all experience the same content, and when software algorithms are used to make decisions of consequence about students and educators, there is no room for proprietary black boxes.
  • Privacy and security. There should be no ambiguity about student (and teacher) data ownership, collection, and sharing, and there should be far more stringent penalties for failures to protect that data. The costs of pervasive surveillance of student behaviors and actions to civil liberties and self-determination are incredibly high. Ender’s Game is a dystopian vision for the future of education.
  • Conflicts of interest. As the developer of tools and services, the private sector will always play an instrumental role in the use of technology in schools. Having said that, the goals and motivations of the private and public sectors are different and not 100 percent aligned. What safeguards need to be in place to ensure a fair and open market for services, untainted by real or apparent conflicts of interest?

Surely, others may frame these issues differently or would add others to the list. (I even did so previously in a contributed piece to New America.) Nonetheless, I think we can agree that by not having sustained and meaningful discussions about the ethics of educational technology, we risk backing into some pretty awful outcomes.

While Lisa and I offer in our piece some ideas for steps that policymakers, educators, and technology companies can take to improve the situation, the need to raise general awareness of these issues — online, at conferences, within schools — is paramount to making progress. As we conclude:

We can regulate the use of technology while also promoting innovation. It is in everyone’s best interests to ensure that schools protect the digital rights of their stakeholders, putting the best interests of students and teachers at the center.

Won’t you join us?


** For a great look at one progressive approach to empowering students to take control of their technology use via open source, read The Open Schoolhouse by Charlie Reisinger.