You Won't Believe How Important This Topic Is! Social Media, AI, and Manipulation
The last two or three years have shown how powerful social media has become in our society. We’ve all been shocked to see how quickly and deeply Twitter, Facebook, and similar tools have worked their way into our individual lives and our culture at large. From a neuroscience perspective, however, it’s not surprising at all.
Our brains crave novelty. Discovering something new and interesting triggers a happy little reward circuit, and social media is a never-ending stream of interesting things. Our brains are also wired to make social information one of our highest priorities, so having a never-ending stream of new and interesting social content at our fingertips is naturally extremely engaging.
The surprise then isn’t that social media has had a large influence on us, but rather that we didn’t see it coming. (Actually, some did: I heard David Thornburg say this almost twenty years ago: “People keep saying we’re in the Information Age. We’re not. We’ve passed through the Information Age into the Communication Age. We need to understand the difference.”) What also shouldn’t be surprising is that there are people, businesses, and organizations that have become masterful at using neuroscience to create social media that is very powerfully manipulative.
We’ve all seen “clickbait” titles that prey on our curiosity like You won’t believe what happened next! While annoying, these are mainly harmless tricks in a media world competing for your attention. In the last few years, however, we’ve seen more and more items like Senator Belfry Advocates Eating Live Puppies, where the goal is to trigger a fearful and/or outraged response – and widespread sharing.
If you’re seeking to manipulate people, outrage and fear are very helpful tools. When these are strong enough, they trigger your emotional system to a level that significantly impedes your ability to think. You go into a reactive mode rather than a reflective one, and you feel like you just have to do something. People can be motivated to say things and act in ways they normally wouldn’t. (We’ve all had the experience of being in an argument and saying or doing something that we not only regretted, but that afterwards didn’t even make sense to us.)
If you are a regular user of Facebook or Twitter, think about how many of the messages you see are framed to be upsetting. Many political messages in particular are built around outrage or fear. Even more troubling, many of the groups or individuals who create this content have figured out that it simply doesn’t matter whether it’s true – people will click on it, read it, and share it regardless. A visit to www.snopes.com to read the Hot 50 stories will show how prevalent false stories are. (As of the time I’m writing this, only nine of the “Hot 50” articles are actually true.) We have a situation where there is a strong financial return for creating upsetting stories that feed into people’s pre-existing biases, and it’s a lot easier and cheaper to make up stories than to research real ones.
And the situation will only get worse. This posting was inspired by a tweet from Michelle Zimmerman (@mrzphd) about the subject of a presentation by writer Zeynep Tufekci (@zeynep):
Every time you use Machine Learning, you don’t know exactly what it’s doing. YouTube has found out that finding more extreme content keeps people engaged, like conspiracy, extreme politics.
Our social media and news feeds are increasingly managed by artificial intelligence and machine learning systems, which constantly monitor what gets and holds attention best. They combine the big data of the whole population of users with our own individual patterns to fine-tune exactly what should keep us most engaged. This AI doesn’t care about the nature of the content, whether it’s true, or whether it’s good or bad for you or society at large. It only cares about your attention. And AI will get more powerful at doing this every single day.
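To make that concrete, here's a minimal sketch of the kind of engagement-maximizing loop described above, written as a simple epsilon-greedy bandit. The item names and click probabilities are invented for illustration; real recommender systems are vastly more sophisticated, but the core incentive is the same: the algorithm only ever sees clicks, never truth.

```python
import random

# Hypothetical feed items mapped to the chance a user clicks on them.
# Note the "payoff" encodes attention, not accuracy or social value.
items = {
    "calm news summary": 0.02,
    "outrage headline":  0.12,
    "conspiracy video":  0.20,
}

shows = {name: 0 for name in items}
clicks = {name: 0 for name in items}

def choose(epsilon=0.1):
    """Epsilon-greedy: occasionally explore, otherwise show whatever
    has earned the best click rate so far."""
    if random.random() < epsilon or all(n == 0 for n in shows.values()):
        return random.choice(list(items))
    return max(items, key=lambda k: clicks[k] / shows[k] if shows[k] else 0.0)

random.seed(1)
for _ in range(10_000):
    pick = choose()
    shows[pick] += 1
    if random.random() < items[pick]:  # user clicks with the item's probability
        clicks[pick] += 1

# After enough rounds, the feed is dominated by whichever item
# holds attention best, regardless of its content.
print(max(shows, key=shows.get))
```

Even this toy version converges on the most attention-grabbing item and shows it far more often than anything else, which is the dynamic Tufekci describes on YouTube at scale.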
I want to be clear that I’m not saying social media and AI are bad. After all, this is a social media post inspired by other social media posts, and AI is an exciting topic for students and responsible for some amazingly positive things. (Michelle Zimmerman, mentioned above, has written a book for ISTE on the topic – http://iste.org/TeachAI.) What I am saying, however, is that part of digital literacy for our students has to be learning how the manipulative power of social media can be used against them, and how they can use that knowledge both to protect themselves and to avoid becoming manipulators.
One thing we should do is stop teaching social-emotional learning and digital literacy as separate topics. For many (if not most) of our students, their social-emotional lives and their smartphones are inseparable. Many SEL programs and digital literacy programs already touch on this, but I think the last few years have demonstrated it needs to be emphasized much more strongly. When we teach students about social-emotional learning, part of their learning needs to be the application of this knowledge to their use of digital media. If they understand how social media sources use their emotions to manipulate them, they can be better equipped to resist it. Likewise, they can hopefully use this knowledge to make informed, more positive decisions about what they post and share themselves, and be part of the solution rather than part of the problem.
Our students will spend their entire lives in a world where social media and machine learning are ubiquitous. We need to help give them control over these technologies, so it's not the other way around.
For most educators, September is the crazy-busiest month of the year. For someone involved in professional development, though, it's down time. After going full-out for the latter part of July and all through August, the busy training season shuts down the day school starts. September for me is like summer break, only with a lot less sunscreen. And more sweaters.
It's given me some time to reflect on our summer projects, and in particular the second summer of Maker Camps that Caitlin and I hosted for NCCE at the wonderful Pack Forest Conference Center in Eatonville. We held two sessions, with an introductory camp on August 7 & 8, and a camp for more experienced participants on August 9 & 10. About fifty educators participated, with a sizable number signing up for the two camps back-to-back.
Fourteen amazing instructors offered a range of sessions (here and here), including topics as diverse as design thinking, programming microcontrollers (such as Arduino and BBC Micro:Bit), 3D printing, connecting Making to the NGSS standards, and Making with simple materials. We also held extended evening “open labs,” where participants could spend time exploring and working with whichever tool, materials, or software they wanted. It was four busy, exciting days of activity.
A camp setting can provide a different kind of professional development than educators normally experience. Where many conferences or institutes can be a dizzying series of shorter sessions, we worked to give teachers the opportunity to spend a significant amount of time working in the same mode that their students should. Making should engage learners in extended, focused exploration with significant amounts of self-direction and agency. Put another way, if there’s no play, it’s not Making. (The NCCE conference does a lot of this, too!)
There’s a Latin phrase* that sums up the philosophy of our camps: Nemo dat quod non habet. It translates as “You can’t give what you don’t have.” Teachers can’t provide what they don’t have, either. Professional development experiences should reflect what the students will experience. Imagine an ideal classroom, where students are enthusiastically engaged in focused, challenging work that stretches their skills, knowledge, and creativity. They’re so self-directed that the teacher can just quietly circulate and provide assistance and advice on an as-needed basis. If that’s what we want in a classroom, that’s what the professional development should look like, too.
Don’t get me wrong. There are certainly times in both classrooms and professional development when direct instruction is appropriate, and research shows that in some situations even good old-fashioned lectures can be the most effective approach. I do presentations all the time. But it’s also kind of the default, and it can feel deceptively time-efficient because we can cover so much information so quickly. But it’s only efficient if the learners actually learn, and too much content is just the same as too little: it simply won’t stick for most students. And direct instruction is never a good way to teach skills if we don’t give ample time (hopefully with the instructor present) to practice.
So that’s why we designed our camps that way, as well as other trainings we’ve done. Plus, it’s kind of a dirty little secret, but I long ago discovered that it’s much, much easier to prepare and teach an engaging six-hour project-based workshop than a 50-minute conference presentation. And I feel like the participants come away with a far greater likelihood of applying what they learned than if I had lectured to them for the same amount of time. And as the instructors, we always learn a lot more that way, too!
Interested in participating in our camps next summer or want updates on any upcoming workshops? Sign up for our newsletter! It will only be published every month or two, because you don't need more email, and we're not that talkative.
*The real legal meaning of the term is written up here. But I’m still going to use it my way.