Otakun recently pointed to an excerpt from Carr's latest book. The general point of the article is to examine how human memory is being conflated with computer memory, and whether or not that makes sense:
...by the middle of the twentieth century memorization itself had begun to fall from favor. Progressive educators banished the practice from classrooms, dismissing it as a vestige of a less enlightened time. What had long been viewed as a stimulus for personal insight and creativity came to be seen as a barrier to imagination and then simply as a waste of mental energy. The introduction of new storage and recording media throughout the last century—audiotapes, videotapes, microfilm and microfiche, photocopiers, calculators, computer drives—greatly expanded the scope and availability of “artificial memory.” Committing information to one’s own mind seemed ever less essential. The arrival of the limitless and easily searchable data banks of the Internet brought a further shift, not just in the way we view memorization but in the way we view memory itself. The Net quickly came to be seen as a replacement for, rather than just a supplement to, personal memory. Today, people routinely talk about artificial memory as though it’s indistinguishable from biological memory.

While Carr is perhaps more blunt than I would be, I have to admit that I agree with a lot of what he's saying here. We often hear about how modern education is improved by focusing on things like "thinking skills" and "problem solving", but the big problem with emphasizing that sort of work ahead of memorization is that the analysis needed for such processes requires a base level of knowledge in order to be effective. This is something I've expounded on at length in a previous post, so I won't rehash it here.
The interesting thing about the internet is that it enables you to get to a certain base level of knowledge and competence very quickly. This doesn't come without its own set of challenges, and I'm sure Carr would be quick to point out that such a crash course could instill a false sense of security in us hapless internet users. After all, how do we know when we've reached that base level of competence? Our incompetence could very well be masking our ability to recognize our incompetence. However, I don't think that's an insurmountable problem. Most of us who use the internet a lot view it as something of a low-trust environment, which can, ironically, lead to a better result. On a personal level, I find that what the internet really helps with is determining just how much I don't know about a subject. That might seem like a silly thing to say, but even recognizing that your unknown unknowns are large can be helpful.
Some other assorted thoughts about Carr's excerpt:
- I love the concept of a "commonplace book" and immediately started thinking of how I could implement one... which is when I realized that I've actually been keeping one, more or less, for the past 10 or so years on this blog. That being said, it's something I wouldn't mind becoming more organized about, and I've got some interesting ideas about what my personal take on a commonplace book would look like.
- Carr insists that the metaphor that portrays the brain as a computer is wrong. It's a metaphor I've certainly used in the past, though I think what I find most interesting about that metaphor is how different computers and brains really are. The problem with the metaphor is that our brains work nothing even remotely like the way our current computers actually work. However, many of the concepts of computer science and engineering can be useful in helping to model how the brain works. I'm certainly not an expert on the subject, but for example: You could model the brain as a binary computer because our neurons are technically binary. However, our neurons don't just turn on or off, they pulse, and things like frequency and duration can yield dramatically different results. Not to mention the fact that the brain seems to be a massively parallel computing device, as opposed to the mostly serial electronic tools we use. That is, of course, a drastic simplification, but you get the point. The metaphor is flawed, as all metaphors are, but it can also be useful.
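The contrast above can be sketched in a few lines of code. This is a toy illustration, not a biological model (the neuron functions, thresholds, and rates here are all invented for the example): a "binary" neuron that only reports on or off throws away information that a rate-coded neuron, whose pulse frequency scales with the stimulus, preserves.

```python
def binary_neuron(stimulus: float, threshold: float = 0.5) -> int:
    """The naive 'brain as binary computer' view: fires (1) or stays silent (0)."""
    return 1 if stimulus > threshold else 0

def rate_coded_neuron(stimulus: float, threshold: float = 0.5,
                      max_rate_hz: float = 100.0) -> float:
    """Each pulse is still all-or-nothing, but the firing *rate* carries
    extra information about how strong the stimulus is."""
    if stimulus <= threshold:
        return 0.0
    # Firing rate grows with how far the stimulus exceeds the threshold,
    # capped at some maximum rate.
    return min(max_rate_hz, (stimulus - threshold) * max_rate_hz * 2)

# Two stimuli that look identical to the binary neuron...
print(binary_neuron(0.75), binary_neuron(1.0))            # 1 1
# ...are distinguishable by firing rate.
print(rate_coded_neuron(0.75), rate_coded_neuron(1.0))    # 50.0 100.0
```

Even this tiny sketch shows why the metaphor both works and breaks down: you can model the pulse as a bit, but the moment timing and frequency matter, a single neuron is doing something no single logic gate does.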
- One thing that Carr doesn't really get into (though he may cover this in a later chapter) is how notoriously unreliable human memory actually is. Numerous psychological studies show just how malleable and faulty our memory of an event can be. This doesn't mean we should abandon our biological memory, just that having an external, artificial memory of an event (i.e. some sort of recording) can be useful in helping to identify and shape our perceptions.
- Of course, even recordings can yield a false sense of truth, so things like Visual Literacy are still quite important. And again, we cannot analyze said recordings accurately without a certain base set of knowledge about what we're looking at - this is another concept that has been showing up on this blog for a while now as well: Exformation.