Friday, March 18, 2011

Thoughts on Ed Tech, Technology and Attention

I just came back from an interesting technology forum (a Massachusetts Superintendent's Forum on Transforming Teaching & Learning with Technology). There are a number of thoughts I have from the conference, but the first thing I wanted to blog was some thoughts on attention and technology that sprang from both the content and the form of the event.

Early in the conference, we went over some of the new National Education Technology Plan, in particular the need to focus less on content (which changes too fast anyway, the reasoning goes) and more on creativity, collaboration, critical thinking, and research, which will be the key to creating the highly adaptable citizens and employees we'll need in the future. All of this strikes me as true, although not fundamentally different from what much older reformers (think Dewey) have had to say about skills vs. knowledge, which means the valid and interesting objections that some (think E.D. Hirsch) have raised against those theories probably apply here as well.

The first presenter, a superintendent of a school district that's done a lot of work to put technology front and center, talked about the importance of shifting from direct instruction to more collaborative models to support these goals. Much was said about the mountains of information created and consumed every minute on the internet and how obsolete modes of instruction couldn't cut it anymore. I was struck, as many participants at ed conferences before me surely have been, by the irony that I was sitting through two hours of direct instruction hearing about how direct instruction was obsolete.

Looking around the room, I thought about us as a group of learners: superintendents, principals, technology coordinators, and teachers. I'd say about one in three or one in four people there had an iPad or a laptop out. A similar number had a smartphone on the desk in front of them, a number of which beeped and chirred during the event (though none actually rang). Of the screens in front of me, I saw only one that stayed on note-taking the whole time; the rest all wandered to email or other distractions at some point.

From what I could see, I was the only person actively taking notes with pen and paper. I'd actually run to CVS to buy a new pad of paper before the conference, knowing that I can't really focus in a big room without something to write on. With my pen and paper in hand, I was engaged through and through, writing notes both on things I thought useful and on things I found frustrating (I was frustrated by inane things, like the fact that a presenter bungled the definition of virtualization, and also by substantive things, like a video we were shown of a group of students taking several minutes to do a task on linked iPads that could have been done in 30 seconds with a poster and markers). At the front of my notepad I kept a list of useful references or concepts I wanted to look up later. When something a presenter said set off a spark of an idea about something I wanted to do at our school, I immediately sketched the idea on my paper. When it was time for Q&A, I had something to ask each presenter. After the presentation, I went up to talk further with the presenters, took down names and emails, and made plans to try to make further connections in the future. I came back to school energized and with a clear list of action items based on the connections I'd made at the conference. All of this is as it should be.

Which is to say that I know how to sit at a lecture in a conference and get something useful out of it. On another day, I might be writing that we should have had a more collaborative structure at the forum -- that we should have had BOF sessions break off, or at least some small-group work or paired conversations -- surely educators know how to do such things. There are lots of kinds of work for which that would probably be more useful, and it may be that it would have benefited some of the people at this forum as well. However, this conference was really about sharing best practices and showing what's possible, about disseminating information to people who are in a position to do something useful with it. Given that goal, direct instruction actually made a lot more sense than something more collaborative would have.

Given that direct instruction was the chosen format -- and I think direct instruction is a good format far more often than progressive educators tend to admit -- the skills I learned in school for learning from direct instruction -- how to take notes, how to keep my mind engaged even when I'm a little bored, how to maximize the usefulness of what I'm hearing -- all played a key role in my having a useful two hours. I imagine those skills serve lots of people well in their professional lives -- after all, lots of professions require learning at every level, and lots of learning happens through lectures and conferences.

That said, for the professionals around me, as for all of us, I think, the technology in their hands presented a real challenge to their learning. No one was off task the entire time -- these were generally people at the top of their field -- but lots of people were off task for good parts of the presentation, and there was far more rudeness (chatting, beeping technology, etc.) than I would have expected. It was hard not to think that technology was eroding our ability to learn from one another effectively.

I know that, for me, my decision not to break out my laptop was crucial to my engagement in the conference. Had I had a laptop, I would have immediately started writing emails when I had an idea for a connection I wanted to make, I would have looked up links to new software the moment I heard about it, and I might even have installed and tested that software on the spot -- all of which sounds like "engagement" but in fact would have distracted and ultimately disengaged me. What's more, I would have checked the New York Times to see if France had started bombing in Libya or if they'd made any headway on the reactors in Japan, and, had it turned out that something had happened, I would have read the articles in their entirety, not wanting to miss out on the events of the world. And those are just the distractions I would have initiated myself: I also would have been receiving emails and instant messages from people who needed help with tech at school, from friends and family updating me with photos of little ones or plans for dinner, and so on.

When I imagine our teenagers faced with these same challenges, it is, frankly, terrifying. When I think that average class times are somewhere around 45 minutes in many schools, I wonder where on earth they are going to learn focus. One solution would be to just eliminate the two hour lecture format from the world, but I'm not sure that's optimal for reasons I've already mentioned.

People talk about technology as if the key to mastering it is using it more, as if we need to make sure our kids have lots of experience in front of screens to be prepared for "21st century jobs," but I don't for a second think that's true. I would love a chance to survey top programmers (the people who actually do master technology) to learn more about their habits. I highly doubt that they spend more time on Facebook, or playing video games, or balancing three chats and two emails while working, than the rest of us do. My guess is that they work hard to find ways to fight distraction. I'm also guessing that they have an ability to focus that they may well have learned somewhere other than coding -- I wouldn't be surprised to find that good programmers include a higher-than-average share of musicians, painters, and serious readers.

When I think back to the education plan's learning objectives -- creativity, collaboration, critical thinking, and research -- only research involves computers in a fundamental way. There, it seems, learning to evaluate and understand digital resources has to play a key role in what we teach our kids. Knowing the difference between a credible source and a dubious one is vitally important for every adult, and schools need to give kids experience with both the good and the bad and help them learn the difference. But for the other skills -- creativity, collaboration, and critical thinking -- technology can be as much a hindrance as a help, if not more so. That's not to say I want to purge all the computers from our classrooms, but I don't think tech itself is the key to success in an increasingly tech-laden world. We will need students who know how to use a computer, sure. But, more importantly, we will need students who have more self-discipline, more ability to focus, and more ability to tune out distractions than students of the past. The road to those abilities may have more to do with music and art classes than it does with iPads and smartboards.
