Note: This is an article in the Agile University series I describe in this blog post. I'll add a table of contents as I write more articles, and in the meantime that link provides some context to this post.
I just finished the book Now You See It by Cathy Davidson, an interesting look at how the primary institutions in our lives- school and work- can be refreshed to take advantage of new technology and new understandings of how the human brain works. While I'm focusing primarily on the education side of the picture, there were also some lessons from Davidson's view of the workplace that can be applied to schools and universities, which I'll discuss later.
Davidson makes the argument that, despite our widespread critique of multitasking, the mind is made for it and even craves multitasking. As she put it, "The mind wanders off task because the mind's task is to wander." When we add modern technology to the mix- the internet, computers, and smartphones- collaborative and creative multitasking is more possible than ever before.
The complication as Davidson sees it is that our institutions- school and work- are designed for the pre-internet 20th century and don't address the question of how we can use technology to be better, create more value, and learn more effectively. They fit the 20th century division of labor, not a networked, collaborative 21st century. For example, society claims that kids these days are dumbed down by technology, but Davidson asks if perhaps the problem isn't with the kids, but with the system that is supposed to serve them. "For all the punditry about the 'dumbest generation' and so forth, I believe that many kids today are doing a better job preparing themselves for their futures than we have done providing them with the institutions to help them. We're more likely to label them with a disability when they can't be categorized by our present system, but how we think about disability is actually a window onto how attention blindness keeps us tethered to a system that isn't working." (10).
Davidson presents several solutions to bring these institutions into the 21st century. For example, she talks at length about the idea of game-ifying work and school to keep people's attention. She also advocates embracing flexible, virtual work environments that prize an individual's unique talents and work style. She recommends collaborative, endeavor-based work and learning, as well as trashing traditional grading and curriculum-based teaching and letting students take the lead in their own education and even their own grading, all while using technology to accelerate progress.
It's no revelation that we're stuck with an outdated education system, and Davidson makes the excellent point that for the most part you could put a schoolteacher from 1900 in a modern classroom and they would recognize it instantly. That classroom is a product of the 19th and 20th centuries, when society- based on the prevailing work and management theories of the day- thought that the most efficient way to use workers to create value was an assembly line approach characterized by individual tasks as specialized and specific as possible. Think about the stereotype of an early Ford car factory, for example. A motor might be rolling down the assembly line, and as it passes, a worker in line screws in a piece. The engine continues and is replaced by the next, and all day long, the only thing that worker is doing is screwing in that one piece in each new engine. The man is no better, no more capable than the machine. If the ideal was this kind of mindless, low level work, then schools were seen as an integral piece in training a work force capable of doing these repetitive tasks. And this assembly line approach didn't just apply to industry, but also to the office. Think about how compartmentalized a 20th century company was- HR, Engineering, Sales, Marketing; all walled off from one another and almost never talking with one another. Each employee had a specific task to perform, like a piece in a motor. If one piece didn't work, the system failed.
Employers needed single-minded workers who would do their one task exceptionally well without asking questions, thinking for themselves, or getting distracted. Schools had to produce those kinds of workers, and so we evolved a system that reflected the specialized, hierarchical separation of labor found in the workplace. Knowledge was divided up into arbitrary disciplines, and ranked in terms of importance. This hierarchy of disciplines, as Davidson notes, was sciences on the top, humanities on the bottom, with physical education, shop, and arts being slowly eliminated over the years. Even the kids began to be ranked (letter grades, as she notes, didn't exist as we know them until 1897) so that they could be sorted into those most apt for employment. After all, in an assembly line model, you're only as fast as your slowest piece.
Davidson writes on page 279, "School has been organized to prepare us for this world of work, dividing one age group from another, one subject from another, with grades dictating who is or is not academically gifted, and with hierarchies of what knowledge is or is not important (science on top, arts on the bottom, physical education and shop gone). Intellectual work, in general, is divided up too, with facts separated from interpretation, logic from imagination, rationality from creativity, and knowledge from entertainment. In the end, there are no clear boundaries separating any of these things from the other, but we've arranged our institutions to make it all seem as discrete, fixed, and hierarchical as possible."
Of course, we now know there are better ways to organize ourselves to productively create value for others, whether in a factory or in an office. Lean Manufacturing, Decentralized Management, and the Coventry Gang System have all shown this. But schools are still afflicted with that turn-of-the-20th-century mentality, the assembly line model of learning, despite the fact that the model most apt for the 21st century is a network. What sense does grading and sorting kids, or dividing knowledge up into arbitrary categories, make in a network model of society?
Grading and sorting kids is done all with the idea of getting them into a good college, and eventually, a good job. I agree with Paul Goodman in his fantastic Compulsory Miseducation. Why did we ever start to think that it was the job of the schools and universities to help employers find good employees? Shouldn't the employers worry about the best way to sort and rank potential employees, and universities bother themselves about the best way to sort and rank applicants? And Davidson's observation of the difference between an assembly-line model of production and a network model of production only reinforces this point. A weak link in an assembly line slows the whole system down, granted. But in a network, the opportunity is there for each person to find their ideal position around a central node or cluster and contribute to it in a nonlinear way that defies the assembly line way of thinking.
The best example I can think of comes from Davidson's book. Take people who have autism or Asperger's syndrome. They are not cut out for a typical work or school environment and, without special aid, typically fail miserably in school. That is to say, in the assembly line model where each "product" (student) coming out of the factory must be as close to identical as possible (as Davidson says, we've confused "high standards" with "standardization"). But that model ignores the special talents they have as a result of their disability. She mentions in her book a code testing company that employs nothing but people with autism or Asperger's. Employees in this company rigorously test new code from their clients for errors, bugs, and typos. People without autism or Asperger's are terrible at this job; they don't have the concentration or attention span to catch all the errors in the code. But those with autism or Asperger's excel at this kind of intense, detail-oriented work. The workers there, who are labeled by the assembly line model as defective products of the education system, have found their niche where they can contribute the most value to society. That is the network model at its best.
Schools are built to reward monotasking, but a central point Davidson makes is that our brain, despite what we may think, is not built to monotask. Multitasking is its natural mode of operating. Our minds abhor the boredom and single-mindedness that come with monotasking and instead defy disciplinary boundaries and crave novelty, stimuli, and collaboration. The best thing we can do is stop fighting our nature and take advantage of the possibilities our technology offers to build systems that embrace the way our minds work.
For example, Davidson points to the fact that in studies on distractions in the workplace, nearly half the time the distraction was internal. In other words, the subject was distracting themselves. Later in the book, she notes that on average, 80% of our neural energy is taken up just talking to ourselves (280). These two statistics support Davidson's conclusion that our minds are naturally hyperactive. Like a little kid on a sugar high, they constantly need something to distract them and keep them occupied, and they quickly get bored when confined to performing the same task repeatedly or for an extended amount of time.
While often easy to vilify, the natural overactivity of the human brain has an advantage. It is constantly cross-referencing other parts of our experiences and memories looking for new connections that could be of value to us or those around us. In other words, the same distractibility that frustrates us is also a source of creativity. And it apparently doesn't even take that much mental energy to switch gears mid-task to something new- only 5% of our at-rest energy (280). Even when we're seemingly zoned out and day-dreaming, the mind is incredibly active at making connections with what's around us and with our stored bank of experience. The brain is a natural multi-tasker.
What's clear is that our minds aren't the well ordered, logical, linearly-functioning machines 20th century thinkers thought they were, but are themselves networks of neurons constantly communicating with one another. We've been trained to think in terms of compartmentalized disciplines and functions, but our minds are naturally interdisciplinary, constantly seeking to connect new and old experiences in novel ways. Our distractibility is a side effect of this. The key may be in consciously controlling what our distractions are, to favor productive connections over unproductive ones. As Davidson writes, "What confuses the brain delights the brain. What confounds the brain enlivens the brain. What mixes up categories energizes the brain. Or, to sum it all up, as we have seen, what surprises the brain is what allows for learning." (286).
Unfortunately, our education system is designed to punish distractions and enforce an unnatural and stifling single-mindedness of the kind required of an assembly line worker, not the creative knowledge workers the world needs from our schools in the 21st century.
Davidson provides some ideas and examples of how the school and university environments could be reformed to be fit for the 21st century. This is a point where the book falls a little short, unfortunately. Davidson misses the opportunity to comprehensively describe her vision. There are some great ideas and anecdotes provided, but they lack cohesion and an easily understandable vision. For example, she describes several examples of what she might have called "endeavor based learning", which could have easily formed a central operating principle for her classroom makeover (as she calls it). But she fails to make that principle explicit in her descriptions. This idea of endeavor based learning is something I will discuss later in the essay.
Davidson does discuss the role technology can play in the 21st century classroom to enhance the interdisciplinary, collaborative, student-driven learning opportunities she sees as key to the future of learning (as do I, for that matter). I would have loved to see a few more examples here, as the central example she provided was a fantastic one. She oversaw a program at Duke University to equip the majority of the students with free iPods. In a partnership with Apple, the plan was to turn the entire campus into a learning lab on how a device like the iPod could enhance the students' educational experience on campus. Note that this was back in 2003, when the idea of an app itself was new, let alone that of a learning app.
The implementation was simple. All incoming freshmen got iPods. Any upperclassman could get one by proposing a use for it in one of their classes. In that case, everyone in that class would get an iPod. Professors could also propose uses, in which case all their students would get an iPod as well.
She uses this example because it has a happy ending, as the experiment was a huge success. Dozens of new apps were created and hundreds of new uses found for the device. More importantly, at least in my eyes, is the effect the process must have had on the students. In a limited but important way, it made them co-creators of their own learning experience. If the computer and the internet have democratized information, then they also have the potential to democratize the way we learn that information. We're certainly moving further in that direction even since the publication of Now You See It in 2010. But the democratization has mostly occurred outside of the classroom. Instead of percolating into the cracks of these old-as-rock institutions, classrooms have become more expensive (witness university tuition and fees, and per-student spending in public schools), and technology use in the classroom is typically superficial: smartboards instead of whiteboards, or the online homework and quiz systems in language classes that students universally despise.
I do think of how some universities are experimenting with the idea of webcasting the overcrowded first and second year classes- the ones taught in auditoriums to hundreds of students at once. But that raises the question, why even have the class in the first place? Why not just make a recording of the lecture and put it online? It would save everyone's time, effort, and money. If students had a question (and few do), the email address of the instructor could be provided so that the student could email their question or even set up an in-person meeting if necessary.
This is going a bit off the topic of Davidson's book, but modern technology has made the lecture redundant and wasteful. Or perhaps it's always been that way. Dr. David Ray of the University of Oklahoma made the point that the lecture has been obsolete since the invention of the printing press. The word lecture comes from the Latin verb legere, meaning to read, and the medieval Latin lectura, a reading. According to Dr. Ray, the origin of the lecture as we know it was in medieval monasteries, when books were precious because they had to be copied meticulously by hand. To create these copies, the head monk would read aloud from the original while the rest of the monks copied his words onto new parchment. We might even suppose that, as the monastery had at best only a couple copies of a given work at one time, one monk reading aloud (as all reading was done then, interestingly enough) would allow many other brother monks to partake of the "lecture" or reading of the book. As a system of learning (if it ever was one) it ceased to be meaningful when books could be cheaply and rapidly printed.
What's interesting is that some of the most vocal defenders of the lecture format are students themselves, who typically claim it to be an effective teaching method. I am very, very skeptical of this claim, and it is a hypothesis I'm going to be investigating in further writings. Small group discussions can certainly spark meaningful insights, but most lectures are a far cry from a discussion group and usually consist of the teacher writing notes on the board that would be more effectively communicated if he had simply passed out the book he had taken his notes from. Even better, technology allows each learner to find the codified information in formats conducive to their learning style: books, ebooks, audio programs, and even video programs. I knew many friends who, confused by the professor's lectures and explanations, taught themselves entire courses through free video series on YouTube.
And the lecture is so far removed from any practical application. I fail to see how listening to a teacher talk about an equation is more effective than actually doing the equation oneself, or even better, using that equation to do something useful or meaningful- like building an app or a widget, or even a prototype or model of one.
I know this is a big can of worms, and I'm leaving it unopened for now. But I will return to the topic in a later piece, rest assured!
One thing I'm curious to think about is, if a university were invented today, completely blank-slate, what would it look like? How would it be organized and maintained, and what would it prioritize and how? How would our new technology be used? What staples of the modern college would still find their way into a completely new model? These are tricky questions to answer, because to do so, one has to answer what the role of a university is. For example, there are a plethora of online university options available that take advantage of internet technology for so-called distance learning, but these seem to fill the relatively modern role of a university as a factory for credentialing professionals.
Davidson does provide a good example of what a student-directed class could look like in her own course at Duke University. In a class called Your Brain on the Internet, the idea was to offer an interdisciplinary exploration of pretty much exactly what this book is about- the role our rapidly evolving technology has on the way we live our lives. She provided a list of recommended reading, but otherwise let her students direct the course and what would be discussed. From her description, Davidson was more of a facilitator than a teacher.
The class was incredibly successful. Not only did the students later rank this course as one of the most impactful they took while at Duke, they even went so far as to organize extra classes when they felt they needed it. For instance, when a thinker they had been discussing happened to be in town, they organized an extra-curricular class (if extra-curricular has any meaning when the curriculum is set by the students anyway!) to hear straight from the horse's mouth what he thought. Imagine if more classes were organized this way!
Of course, well-meaning administrators could easily ruin a course such as Davidson's by making it part of a required, general education curriculum. My guess is that in that case, the course would fail miserably, as it would only pay lip service to the idea of being student directed. Students would see it as yet another tick box to check off on the long, arduous, overly prescribed path to getting their degree.
As it turns out, students weren't completely satisfied with Davidson's administration of the course. For all its progressiveness, it still boringly followed the typical manner of grading a course: a professor-graded midterm, term paper, and final exam. After this feedback from her original class, she wrote a controversial blog post about a potential alternative: contract grading combined with class-sourced grades.
Contract grading, which I'd never heard of, goes back to the 1960s or so. In that system, a student could agree to do proportionately less work in exchange for a guaranteed B or C, less than what would be required to get an A. So she proposed that students in the next iteration of her course could decide in advance exactly how much work to do and what kind of grade they were shooting for (she points out that the coursework required in the first version was not insignificant, so even someone shooting for a B or a C would still have their hands full).
When it came time to evaluate each other, the students would look at the amount of work their peers had agreed to per their contract and evaluate if they had fulfilled the terms of that contract. For example, if their contract stated that they had to write 10 blog posts during the course of the semester, did they do so? And were the blog posts of sufficient length and depth to be considered worthy of the name?
Having already discussed my views on the modern grading system, I think this is a refreshing new approach of the kind that I'd like to see more of. I don't see grading, ranking, and labeling as part of the role of a teacher- that's the job of an admissions officer or HR department. I see a student-directed grading initiative as an interesting compromise. It reinforces the self-directed nature of the class (and the inherently self-directed nature of learning) and reduces the workload of the professor so that they can focus on more important things. My guess is, the students were as strict as, if not stricter than, the professor herself in holding themselves to the terms they agreed to. I say that because the students undoubtedly felt a sense of pride and ownership of the course and what they learned in it, a feeling most never get in 16+ years of schooling, most of it mandatory and teacher-directed. We always take better care of what we own, and the students felt ownership of the course. They had a vested interest in maintaining its high standards.
In Compulsory Miseducation, Goodman writes that in the original medieval universities, grading and ranking the students was never considered part of the mission of the institution or its professors. Students of course had to demonstrate competence, that they were worthy to be included within a certain guild or group of professionals or scholars, just as we require lawyers today to take the bar exam and doctors to do a residency. But if they were good enough to be in (and I imagine the standards were fairly high), then they were all the way in. None of this nonsense of A's and C's.
"It is really necessary to remind our academics of the ancient history of Examination. In the medieval university, the whole point of the grueling trial of the candidate was whether or not to accept him as a peer. His disputation and lecture for the Master's was just that, a masterpiece to enter the guild. It was not to make comparative evaluations. It was not to weed out and select for an extra-mural licensor or employer. It was certainly not to pit one young fellow against another in an ugly competition. My philosophic impression is that the medieval thought they knew what a good job of work was and that we are competitive because we do not know. But the more status is achieved by largely irrelevant competitive evaluation, the less will we ever know."
How much more inspiring would our universities be if, instead of viewing themselves as a factory for employers and grad schools, they saw themselves as aiding students in creating a masterpiece, a perfection of their craft, and preparing them to enter a close-knit fraternity of curious and worldly professionals and scholars?
The last topic of the book I want to touch on is Davidson's description of Endeavor Based Work (EBW for short- she never uses the acronym, but I'll be mentioning it a lot more than she did). The term EBW actually comes from IBM in what seems to be one of the most stunning corporate management transitions in modern history. In short, IBM managed to transform itself from a stodgy, conservative behemoth into a progressive, agile, and virtual workforce and in the process avoided becoming a footnote in history. The part of their story I'm interested in is their idea of EBW, because it has the potential to change the way we think about how we learn.
EBW at its core is simple. Instead of compartmentalizing various functions into silos- HR, Engineering, Sales, Customer Support, etc.- EBW organizes its teams by projects. So a project has all the people on it from all the disciplines required to realize it. This is a feature of Agile, for example, and other decentralized management philosophies, but I love the name EBW because of how descriptive it is. Everyone on the team is responsible for the success of the whole project and contributes in their unique way to its success, like members of an orchestra. In fact, the analogy used in the book is that of a film crew. Everyone has a unique role or task, but all are united by the vision of the final film product. In the process, crew members may have to step up and do tasks outside of their specific role. But in the process of seeing the film take shape, everyone learns and grows infinitely more than if they were in an isolated silo performing just their specific, specialized task. And the final product is better for that. In fact, the film might not ever get made if the crew performed their work in isolated silos.
This idea of EBW seems to be part of Davidson's classroom makeover, though she never uses that term to describe it. However, I am convinced that it is the single idea described in her book that, if implemented in our education system, would do the most to positively revolutionize the way we learn.
Imagine if, instead of listening to a teacher drone on and on about seemingly unrelated subjects that have no context or meaning in one's life, students were given an endeavor to complete (or better yet, chose an endeavor to complete) that was meaningful and relevant to them. They could work on it alone or in teams. Completing it would require collaboration, engagement, hands-on learning, creativity, problem-solving, and initiative. Because it is a real task and a real problem, it would defy disciplinary classification, and the lessons learned would be real and meaningful.
Davidson provides several examples of this, but the most beautiful one is easily that of her own mother-in-law, Mrs. Davidson. Teaching in a remote, rural school in Mountain View, Canada, Mrs. Davidson challenged her students to find pen pals- specifically, pen pals in another town called Mountain View, anywhere in the world. The most creative solution to this problem would "win" the competition. The students first had to create their own world map and, using the minuscule resources available in the school and town, find other Mountain Views. Once they did that, they still had to figure out how to get in touch with a resident there.
The results were nothing short of breathtakingly inspiring.
"One kid remembered that Hang Sang, the elderly Chinese man who ran the local general store, the only store in town, had come over to Canada to work on the railroad, as had so many Chinese immigrants. Taciturn, with a thick accent, Mr. Sang was amused and delighted when one child suddenly wanted help writing a letter- in Chinese. The kid had somehow found out about a town called Mountain View in China. That was the child who won the contest, and long after the contest was over, he spent time with Mr. Sang, talking to him in the back of his store.
But of course they all won. The globe became smaller through the connections they made, and their town became larger. They learned geography and anthropology and foreign languages too. The project lasted not just one full year but many years, and some kids visited those other Mountain Views when they grew up. To this day, I don't drive through a town named Mountain View (there are a lot of them in the world, actually) without wondering if one of Mrs. Davidson's kids sent a letter there and, through the connection made, was inspired to go on, later, to become a professor or a doctor or a veterinarian (85)."
That was before the internet existed. What possibilities exist for these kinds of real-world, self-directed endeavors in our schools and universities now that, thanks to the internet, we have any kind of expertise, knowledge, experience, and personal connection at our fingertips?
While Cathy Davidson doesn't propose going as far as I think we need to go in reforming our education system, I think she offers several compelling pieces to the puzzle: endeavor based work in schools using technology as a force multiplier, as well as student-directed learning and peer-sourced grading. Davidson makes the point that this system would not only better fit the networked, collaborative society we live in today, but also work with our brain's neurology, not against it. It is a smart, humanizing alternative that would reinforce a completely different set of values than those currently taught in our education system. It would teach creativity instead of regurgitation, collaboration instead of compartmentalization, initiative and independence instead of docility, and meaningful, experiential learning instead of learning for an arbitrary test.
The Colony, Texas
...sees much and knows much