In gaining awareness of ever more profound patterns of relating and resolving tensions, we become increasingly able to use them for our own ends. Because this ability to effect our will upon the world is tied to inner transformation, the process leading to mastery in engagement with the world is the same process that leads to self-knowledge and self-reliance.
We can say that our conscious awareness is best directed towards understanding and controlling our own state, which is analogous to the automatic routines our other-than-conscious (embodied) self runs. We can say this because our conscious awareness is a finite resource whose utility is limited as we apply it to ever finer phenomena. Combined with an understanding of systems as bottom-up phenomena, this also explains why managing our state is so effective: we are tending to the relationship we have with ourselves and with the world.
A college friend and I, both interested in product design, recently challenged each other to begin keeping a "bug list". This is an idea we'd read about in a book by some of the founders of IDEO, the famous product design firm. A bug list is where you keep a list of all the problems you run into on a daily basis. The problems that seem the most promising become fodder for later product ideas. All good products start with a problem they're trying to solve. The bigger the problem (I've heard it described as solving shark bites versus mosquito bites) and the more people who have that problem, the more impactful the product will be.
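As a playful sketch of that heuristic- my own toy model, to be clear, not anything from the IDEO book- you could score each bug by how painful it is and how many people share it, and let the most promising entries float to the top:

```python
from dataclasses import dataclass

@dataclass
class Bug:
    description: str
    severity: int  # 1 = mosquito bite ... 10 = shark bite
    reach: int     # rough count of people who share the problem

    @property
    def impact(self) -> int:
        # Bigger problem x more people = more impactful product.
        return self.severity * self.reach

# Example entries are invented for illustration.
bugs = [
    Bug("Tangled headphone cords", severity=2, reach=1_000_000),
    Bug("No way to track down a lost wallet", severity=7, reach=500_000),
]

# The most promising entries become fodder for later product ideas.
for bug in sorted(bugs, key=lambda b: b.impact, reverse=True):
    print(f"{bug.impact:>10,}  {bug.description}")
```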
More fundamentally, the bug list is a tool to get ourselves in the habit of looking for problems to solve. Every time we complain, get frustrated, lose something, etc. during the day, we make a note of what the problem was. Or I would sit down at the beginning of the day and think of ten or more problems I had as I mentally ran through my routine or through the previous day's activities. The nature of our challenge was to each come up with ten new problems a day for two weeks. Then we would have a call to compare and figure out the best problems we would like to solve. That's close to 300 problems we came up with between the two of us over those two weeks. Most of the ideas were shit, of course- things that couldn't really be solved with products or even software, or that were more personal problems or societal problems. Or problems that would require a multi-million dollar research team to solve, rather than two young engineers working in their spare time. Many of the problems were small, too- problems few other people likely saw as an issue, or mosquito bites that wouldn't be worth the hassle of using a product to solve. Still, we developed a short list of things we'd like to try to solve at some point. Even if none of these ideas go anywhere, coming up with them has been a powerful process for me. I see three distinct advantages.

Society as a whole, at least in the West, is moving towards a decentralized, networked system rather than a centralized one. It is our education system, both public and higher, that most notably lags behind.
The internet communications revolution is one of the key pieces behind this. As Jeremy Rifkin notes in The Empathic Civilization, the internet is in and of itself a decentralized system. Rather than being routed through a centralized phone system as with a telephone call, an internet user accesses information from a decentralized network of servers located all over the world. And with previous communications systems, information could only be passed along linearly- in a book, on a floppy disk, or along a telephone line- whereas information uploaded to the internet can be accessed from anywhere in the world by any number of users. Information uploaded becomes part of a pool of shared value anyone with an internet connection can tap into. This is fundamentally democratizing, allowing unparalleled access to information as well as the chance for everyone to have a voice. We hear talk of the sharing economy as the model for the early 21st century, made possible thanks to the internet. It is a network model of society, where anyone can contribute value in an a-linear way and access the shared pool of value.

The decentralizing effects of the internet can be seen in all facets of modern life, to varying degrees. It is decentralizing the way we learn, collaborate, work, and manage our companies. I've talked sufficiently about the effects of modern technology on learning in other articles, and suffice it to say here that one can learn most anything they want at little to no cost through the internet. Even in cases where expertise, mentorship, or peer-based collaboration is required, the necessary connections can easily be found on the internet. We've also seen how technology is decentralizing the way people work and companies organize themselves. The growing number of independent, freelance creative professionals is a perfect example of this. Unthinkable decades ago, this is now commonplace: designers, creators, and consultants can independently promote and market themselves, and complete their contracts from their laptops anywhere they have an internet connection.

I've already talked about the example of IBM that Cathy Davidson uses in Now You See It. Faced with being made irrelevant by faster-moving, younger, smaller companies, IBM completely reinvented itself as a decentralized company under a network model. It uses technology to manage independent, cross-functional, interdisciplinary teams whose members are often scattered across all parts of the globe. The members of these teams do what they call Endeavor Based Work, where a single cross-functional team develops a project from start to finish, each member adding value in their unique way but also responsible for the final product. This is in contrast to a centralized organization, where the project, and ownership of it, would be tossed around from department to department like a hot potato: engineering, testing, manufacturing, quality control, marketing, sales, customer service, etc.

It's no coincidence that most software companies, for example, use some form of Agile. Agile is a decentralized, customer-centric management philosophy that focuses on creating value for customers through rapid prototyping and an empowered team structure. Rather than having a centralized hierarchy where orders are handed down from on high, in Agile the thought goes that those working directly on a project with customers are the ones who have the most accurate, up-to-date information that affects the direction of a project.
Hence, power is decentralized into the hands of the individual teams and their members. Managers and team leaders don't manage or lead in a traditional sense, but are there to facilitate the work of the team members as much as possible by removing obstacles and connecting the team to resources. It's no coincidence that an Agile team leader sounds a lot like the idea of a teacher-as-learning-facilitator I've discussed previously. The inversion of authority functions into facilitation functions is a key aspect of any decentralizing system.

Agile arose in the 90s as a way to solve the problems software companies of the day were having: they would often spend months or years developing a piece of software that didn't actually fit customers' needs and hence went unused, the software they built was too interdependent and could not easily be updated, and they spent an inordinate amount of time at the end debugging the code. As I write this, I can even see the parallels between this and the problems with our education institutions I've previously discussed: our schools and universities are not customer-centric, deliver learning "products" that their customers don't want, and have parts that are so interdependent that they prevent the institution from adapting, experimenting, and trying new things. Eventually, the pioneers of Agile codified their work and management philosophy in a manifesto of twelve principles.
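The core of that philosophy boils down to a short feedback loop: build something rough, show it to the customer, fold their feedback into the next iteration. Here's a minimal toy sketch of the loop- the names and the "customer" model are my own invention, purely for illustration:

```python
# A toy model of the Agile loop: ship a rough prototype early, collect
# customer feedback every short iteration, fold it into the next one.

class Customer:
    def __init__(self, needs: set):
        self.needs = needs

    def review(self, features: set) -> set:
        # Feedback = what's still missing, discovered in days, not years.
        return self.needs - features

def develop(customer: Customer, max_sprints: int = 30) -> set:
    features = set()  # the prototype starts rough and incomplete
    for sprint in range(1, max_sprints + 1):
        missing = customer.review(features)  # constant customer contact
        if not missing:                      # testing is built into development
            print(f"Delivered after {sprint - 1} sprint(s)")
            return features
        features.add(missing.pop())          # slightly better each iteration
    return features

develop(Customer({"login", "search", "export"}))  # prints: Delivered after 3 sprint(s)
```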
Rather than developing a product for a year in an isolated silo, hoping it is what the customers need, then kicking it down the line to the sales and marketing department, Agile teams invert the approach. The team works hand in hand with the customer, who often forms part of the team. At the very least, the teams typically include a "customer advocate" position whose job is to ask, "Is this really what the customer wants? Will the customer actually use this?" They build a rough, dirty prototype to begin with that may look nothing like the final product but that can be shown to the customer in a week or two rather than in two years. This is key because it means crucial feedback can be collected that can influence the next prototype, which is slightly better than the previous one. Over the course of dozens of iterations and evolving prototypes, the constant customer feedback ensures that the product, slowly but surely, moves towards completion, often much more rapidly than a traditionally-managed project. And resources are saved because by the time the product is finished, no extra testing is needed. The customer testing was built into the development, and the developers, as a result, know that the product delights its customers. That is obviously the best-case scenario, as it is hard for many companies to successfully implement Agile, especially larger ones with more established corporate cultures. But the data support the idea of Agile and its efficacy: Agile-managed projects are successful 50% of the time and come in within 50% of the planners' original timeframe. By comparison, traditionally (centrally) managed projects are successful just 14% of the time and are only accurate to within 350% of the original timeframe allotted (source).

The teams are purposefully kept small to facilitate communication and collaboration, usually no larger than 5-6 people. Teams larger than that have been shown to actually output less on average than smaller teams, due to the complexity inherent in having more people who need to be constantly communicating, collaborating, and developing and maintaining a consensus and direction. The number of pairwise communication channels grows as n(n-1)/2: five people must maintain ten channels, while ten people must maintain forty-five. It's much more effective to split larger teams up into smaller teams whose leader-facilitators then coordinate the combined actions of the two mini-teams. These teams typically set their own goals and the tasks to achieve those goals, then hold each other accountable in the execution. The team members, as with the IBM example, often comprise the spectrum of disciplines required to complete the project.

This form of customer-centric, decentralized management is not unique to software, however, but has cropped up and is cropping up in a variety of disparate fields. Toyota, for example, pioneered Lean Manufacturing. Later, the concepts of Lean were applied to managing new startups, and the Lean Startup movement arose. Software Agile is slowly giving birth to a new movement of modular, rapid, decentralized manufacturing called Extreme Manufacturing (after Extreme Programming in software). The fields of design thinking and human-centered design have emerged in recent years as well. And ironically enough, much of the inspiration for Agile came from the pioneering work of the architect Christopher Alexander, whose seminal works The Timeless Way of Building and A Pattern Language described architectural pattern languages as a tool for democratizing the development of livable homes, offices, and towns.
Inspired by Alexander, early Agile practitioners began developing their own library of software patterns to accelerate their work and increase their ability to collaborate across teams. Even the military began experimenting with decentralized management as early as the late 70s and 80s, and the post-9/11 fight against terrorism and the wars in Iraq and Afghanistan seem to have renewed the US military's commitment to decentralization. The question is not if, but when decentralization will begin cropping up in new fields of human endeavor, and those who are capable of facilitating the transition to decentralized systems will almost certainly be heralded as pioneers as we become accustomed to living in a networked society.

Centralized organizations are characterized by artificial divisions among work tasks or disciplines. In a corporation, that might look like Engineering, Human Resources, Sales, Marketing, Customer Service, etc. In an academic setting, it is the academic disciplines: biology, physics, psychology, and so on. It seems efficient on paper to have all the engineers talking with one another, all the biologists talking to each other, and all the psychologists talking together; and indeed it may be efficient for certain aims. For specialists doing isolated research in very specific subsets of their discipline, it makes sense in some ways to communicate and collaborate primarily with other specialists of that discipline. But the efficiency, relevancy, and customer-centricity of the organization as a whole suffer because the individual buckets of disciplines or functions aren't communicating across the various stovepipes of the organization.

Meanwhile, the power structure of a centralized organization is very top-down. The CEO sets the overall policy that the high-level managers pass on to mid-level managers and on to the direct reports at the bottom of the hierarchy. Ultimately, this leads to less buy-in from the people involved in the system. They don't feel ownership of the overall product or service they're offering, as they only ever see a small piece of the final product, and they have no real say in how it looks or how it is delivered.
It's obvious that our education system is still very centralized. Christensen et al. and Davidson describe it very well in Disrupting Class and Now You See It. While they're talking specifically about public education, universities are in the same boat. A typical college student advances through an assembly line's worth of required courses, all taught in a very monolithic way independent of learning styles, typically through a lecture and occasionally through discussions and labs. The students are treated like buckets into which more knowledge is poured by the professors, divided up into discipline-specific departments, until the students are sufficiently "full" of knowledge- both discipline-specific and of the so-called "general education"- that they've earned their very expensive set of credentials.

And while in the past the university at least had a somewhat decentralized power structure, in that professors largely governed themselves, universities have as a whole been rapidly centralizing their power structure into a more familiar, almost corporate hierarchy as professors have willingly given up their powers of self-governance, and in doing so have lost a sense of ownership and buy-in into the service-based goals of a university (beyond its function as a credentialing institution and means for personal economic advantage). Zemsky et al. describe this phenomenon of a rapidly centralizing university power structure in their work Remaking the American University. For most of the history of the university, professors organized themselves to elect department heads, handle admissions and counseling, and run the day-to-day operations of the university. But as the authors describe, a lattice of administrative workers arranged in a centralized, manager-centric model slowly emerged to make the university financially self-sufficient beginning in the 70s and 80s, as state appropriations for universities declined. As time went on, the administrators expanded their lattice of powers and responsibilities until gradually a whole class of non-professors was governing the university instead of the professors. They note that in the ten years from 1975 to 1985, the number of faculty increased by 6% while the administrative staff increased by 60% (23). Yet professors were fine with this, as it freed up more of their time for prestigious activities that would advance themselves professionally: research, publication, and professional service (25).

Unfortunately, as Zemsky et al. note, this gradual disengagement of the professors had poor outcomes for the institution as a whole. It lost its sense of being an organization providing a greater societal good and instead came to view itself as a purely consumer good offering students credentials in exchange for tuition money. Professors have, for the most part, been too busy pursuing their personal goals over the goals of the university as a whole, acting more like mercenaries than members of a community of scholars (26). "In the early twenty-first century, all that social activism is now gone or disappearing. Today colleges and universities are seen principally as gateways to economic security and middle-class status. Except for the occasional bout with political correctness, almost no one worries about higher education institutions leading young people astray.
If anything, the lament is that they have, in their pursuit of market advantage, become dispensers of degrees and certificates rather than communities of educators who originate, debate, and promulgate important ideas" (4). This is a direct result of the gradual centralization of the power structure of American universities, combined with the new role of the university as a path to middle-class economic prosperity.

But that middle-class opportunity as we knew it is disappearing fast, if it hasn't already disappeared completely. Liberal arts majors are hard-pressed to find jobs worthy of a bachelor's or master's degree after they graduate. Professional degrees such as business and engineering still confer some employment opportunities. But it seems like a given that even technical jobs will soon be outsourced to the developing world, where the same jobs can be done more cheaply and effectively- at least in a centralized model of education and work, so long as the engineers can cheaply and effectively execute the specifications set by the business team. But as I've mentioned, the inability of the functional "stovepipes" to communicate with one another often means this system doesn't work as well in practice as in theory. That said, with modern communications technology, decentralized teams can easily be managed and coordinated even when the members are scattered across the globe. So outsourcing isn't necessarily a phenomenon of centralization.

Zemsky et al. try to answer the question: why care that the role of universities has changed and that they've become the degree factories they are today? Besides the fact that the road they provide to prosperity isn't as clear as it may have been in the past due to globalization, they say: "The answer lies in what is lost when universities are shaped almost exclusively by the wants of students seeking educational credentials and businesses and governmental agencies seeking research outcomes. When universities are wholly dominated by market interests, there is a notable abridgement of their roles as public agencies- and a diminution of their capacity to provide public venues for testing ideas and creeds as well as agendas of public action... Finally, what is being lost is the idea that knowledge has other than instrumental purposes, that ideas are important whether or not they confer personal advantage" (7). Centralization has led universities to focus almost exclusively on the credentialing of our youth, to the exclusion of their other potential functions.

Another unfortunate side effect is that it provides a subpar educational experience. Christensen et al. describe one such effect this has on the educational experience in a university: "Consider colleges and universities, by illustration. Their major lines of organizational structure are typically drawn by academic field: departments of mathematics, physics, French, economics, classics, and so on. The reason for structuring universities in academic departments is to facilitate the faculty's ability to interact with others who share common interests and expertise and to help them publish in specialized academic journals so that they can achieve tenure. As a result of these structures, college education for most students entails repeated bouncing back and forth in a cumbersome way between departments and administration to get their education.
And colleges incur extraordinary overhead expenses to deal with the fact that few of them are organized in ways that optimize the flow of students through the requisite experiences" (172). And so students pay more for a subpar experience that doesn't quite prepare them to create value in a networked, 21st-century America. If their goal is to mimic the outdated centralized, assembly-line style management and work practices that resulted in the outsourcing of a large part of our economy in the past decades, then they are succeeding.

In the meantime, students who have never had an opportunity to experiment and explore their interests and identity are stuck in a system that discourages experimentation and exploration of disciplines, majors, and interdisciplinary collaboration, the latter being where the truly interesting and engaging work is usually done. They often spend two years taking "general education" classes before they get to any meaty class in what they're supposed to be excited about, which is their choice of major. These general education classes usually cover the same things as high school did, which discourages and depresses them and makes them realize they are in for More of the Same. And almost without fail (there are notable exceptions, but they are exactly that, exceptions), these classes are easy and poorly taught in large auditoriums of 200+ students, as the professors realize the students are there because they have to be and not because they want to be. So by the time they get to any interesting classes in their third year, or if they're lucky by the end of their second year, many students feel it's too late to change majors even if they realize they studied the wrong thing. The sunk-cost bias sets in; after all, each class they already took means hundreds if not thousands of dollars down the drain if it doesn't count towards their new major. And because general-education requirements differ between majors and departments, some of those classes may be wasted as well.

Worse, because of the academic nature of the courses, even in professional tracks like engineering and business and social science, someone can easily graduate from a university without knowing the least thing about how their profession actually operates. This is, once again, thanks to the centralized nature of the university. A centralized system requires standardized processes across the institution, and the easiest thing to do as the institution grows is to apply the existing processes to the new parts of the organization even if they aren't ideal. As professions typically taught in vocational schools or apprenticeships suddenly found themselves taught in universities, no effort was made to find effective ways to teach the practical skills associated with those professions. They were instead shoe-horned into the existing pedagogical methods, perhaps suited to teaching philosophy and rhetoric but ill-suited to what most students in a university study nowadays. And with the current system of economic incentives for universities, educational quality is typically ignored in favor of ever more robust and competitive recruitment processes (Zemsky et al. 44).

The Endeavor Based University would be decentralized education at its finest. The technology and infrastructure are already there such that anyone can learn practically anything from anywhere they have an internet connection.
At a fraction of the cost of a university degree, they could even take advantage of non-virtual resources like libraries, fab labs, hacker spaces, apprenticeships, and mentorship.
Universities and other educational institutions have the opportunity to recast themselves as network nodes in this decentralized network of learning- that is, natural accumulations of peers, expertise, resources, and facilities that can vastly accelerate an individual student's learning as well as advance a certain agenda: civic and democratic engagement, cross-cultural collaboration, humanistic values, etc.

Think about the portrait of a typical college student's experience that Goodman drew earlier. Imagine what a university education could be, instead. Imagine if the university were instead project-based or endeavor-based, and interdisciplinary instead of divided into disciplines. Each student would still have a major, but from the very beginning they would be working on real-world projects in small teams instead of sitting in lectures for the first two years that have little to do with what they might actually be doing after they graduate. That way, they could decide from the beginning if that major was what they really wanted to be doing, instead of having to wait until their third or fourth year to find out. These teams, rather than working on a contrived academic project, would be working on a real project with real value for someone. It could be working hand in hand with a professor on his research project, working with a real startup, or even a project a corporation or company had. Or it could be a completely student-driven initiative: a startup, a community improvement or activism project, or even a local political campaign.

The team would be interdisciplinary, as a real-world team would be. For example, imagine if the project were to try and launch a new product. Engineers, industrial designers, graphic designers, etc. would be needed, of course. But also business students, entrepreneurship majors, an accounting major. A psychology or sociology student could be recruited, whose role would be customer discovery and to be the customer advocate on the team. Or imagine a team that decided they wanted to run a local political campaign. That could include political science majors, sociology majors, a journalism student, even a systems or industrial engineer. Or if students wanted to work with a professor on a research project not based on disciplinary choice, either on a team or one-on-one, that would be an option as well.

Lecture classes as we know them would be gone. Instead of learning abstract knowledge, memorizing it for the test, then promptly forgetting it after the course was over, there would be just-in-time learning, roughly analogous to just-in-time manufacturing in Lean. In just-in-time manufacturing, rather than expensively storing hundreds of extra pieces in a warehouse until they're needed, costs are cut by using superior organization to deliver parts to the factory only when they're needed. In just-in-time learning, students would learn only what they needed to know to complete what they were working on at that particular moment in their project. If they needed to learn differential equations to design a particular engine piece, their learning facilitator would direct them to the textbook or online tutorial series they needed to learn that, and to a professor on campus who could help them if they got stuck. So instead of learning a bunch of theory that has no real application or context or meaning for them, they would learn the material much more effectively because it meant something to them. It was something they learned, not something a professor taught them.
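A toy sketch of what that just-in-time matching might look like- the catalog, entries, and matching logic here are entirely invented, just to make the idea concrete:

```python
# A toy just-in-time learning catalog: resources are pulled when the project
# demands them, instead of pushed years in advance. All entries are invented.
CATALOG = {
    "differential equations": [
        "textbook, chapters 1-3",
        "online tutorial series",
        "office hours with a math professor on campus",
    ],
    "CAD modeling": ["campus fab lab intro session", "tutorial playlist"],
}

def resources_for(project_needs: list) -> dict:
    # The learning facilitator's job, roughly: match each immediate need
    # to the nearest resource, human or otherwise.
    return {need: CATALOG.get(need, ["peer or mentor search"])
            for need in project_needs}

# Designing an engine piece this sprint? Learn exactly what it requires, now.
print(resources_for(["differential equations"]))
```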
Those differential equations would live on in their memory and experience as that engine piece they designed that was actually being used in a car somewhere. And the students would be creating real value in the world, something most of us have never had the opportunity to do, never been trusted to do, until after we graduate. This would create tighter bonds between the university, its students, and the community. The university would come to be seen as a community resource where the nation's youth could be employed to solve some of the world's, the nation's, or the community's most pressing problems, keeping it relevant in an age where someone can get just as good an education (though not the credentials) online at a fraction of the cost.

Rather than learning only a particular set of technical skills, which are likely to be out of date as soon as they graduate or soon thereafter, the students would learn those in addition to the real gems: learning how to collaborate, work in teams, set and achieve their own objectives, be self-starters, problem-solve an ambiguous real-world problem, create real value for others, build consensus, think critically and creatively, and take action to achieve real, measurable change. In other words, the higher-level skills that are needed in a creative, networked society, where outsourcing and automation are rapidly making those skills necessary to get a job- or, better yet, to create a job where one didn't exist previously.

Of course, this idea brings up many questions to answer, which I'll explore in future articles. For example, how does a liberal arts education fit into this idea? How do you measure the results of students' learning, or the efficacy of the university, in such a system? Among others. So stay tuned.

Dillon Dakota Carroll
Prague, Oklahoma

Schools as Nodes in a Learning Network, and Thoughts on Disrupting Class, Christensen et al.
1/2/2016

Disrupting Class is a 2011 book, written by Christensen, Horn, and Johnson, that attempts to show how schools can take advantage of computer- and internet-based software to provide an intrinsically motivating learning environment for kids. As they point out, students need customizable learning suited to their pace, learning style, and personal interests. That learning should also give them the most opportunities for success.
The complication is that schools, organizationally, cannot do the very thing students need the most. Due to the interdependence of the modern school, customized learning cannot be offered without prohibitive costs. Technology has only been bolted on as an afterthought and hasn't changed the core teaching methods. Christensen et al. see a future where teachers are learning coaches and facilitators, and software-based, student-centric tutoring programs allow students to learn the material they want, at their pace, with the grading and assessment built into the software. This won't come all at once, due to the entrenched infrastructure of our education system. Instead, it will happen in the gaps where traditional teaching isn't reaching, as these methods' effectiveness is proven, and it will eventually become the norm as costs fall and the role of teachers changes.

In certain ways, Christensen et al.'s ideas are similar to those of Cathy Davidson in Now You See It: using technology to create a network model of learning. In fact, the authors of the respective books use the same metaphors, though in different words. While Davidson talks about an assembly line versus a network model, Christensen et al. talk about them in terms of business models. The current system we have, as they describe, is a value-added process, or value chain, analogous to an assembly line. At each stage of the value chain, new inputs are added that create value for the end customer. The textbook manufacturers send textbooks to the schools, and the teachers in the schools use them to add value to the students in certain batches: 9th grade math, 7th grade science, etc. The model we need to have, on the other hand, is a value network. Instead of the producers creating value in a linear fashion that is consumed at the end point of the chain by the students, in a value network each of the consumers adds their own value into a vast pool that can be shared by everyone who's bought into that network. Think about YouTube as the consummate example. Anyone can add value to the network in the form of a funny, useful, or entertaining video, and the benefits are available for all to use. In fact, YouTube is essentially the model Christensen et al. envision: an internet-based learning platform where teachers, students, and parents can develop learning apps that can be shared with one another at low, or no, cost (132).

Currently, schools are only able to provide monolithic, one-size-fits-all learning because, as the authors say, "Today's system was designed at a time when standardization was seen as a virtue" (38). And in many ways, despite our newfound understanding of learning styles, developmental psychology, and more, we still believe this. The fact that we still bother with the same, monolithic standards for every single child in the US, which are the same measures we use to compare ourselves internationally, proves this. I won't go too far into the authors' explanation, found in chapter one, but they do an excellent job of showing how the various pieces of a modern school are too interdependent. You can't change one piece without changing all of them. They are interdependent temporally (with the age-based grade system), laterally (across disciplines- as they note, you can't change the way Spanish is taught without changing the way English is taught), physically (the school buildings are designed to facilitate only one kind of learning), and hierarchically (schools have various stakeholders to keep happy, often in conflicting ways) (33).
Because of the interdependence of the parts and the fact that, at its core, the system is designed to standardize, adding computers and software to the mix solves nothing. It brings marginal benefits, but nothing revolutionary, because the fundamental teaching methods haven't changed. The software solutions designed are little better than digital textbooks. As a result, customizing learning to aid students is prohibitively expensive. The authors make the point that in Rhode Island, educating a regular student costs about $9,300 per year. But educating a Special Education student, whose learning is supplemented with special materials, individualized instruction, etc., costs just shy of $23,000 per year- roughly two and a half times as much (34).

Ultimately, technology can't be the solution. It's a force multiplier that improves the efficacy of the solution you already have. If you have an ineffective solution that creates poor outcomes, technology can't save it. The analogy Ivan Illich uses in Deschooling Society is that expecting technology to solve our educational crisis is like the US military trying to bomb the Viet Cong into submission with bigger and more destructive bombs (77). It can never work because the fundamental approach is flawed. As John Boyd noted during his time in the Pentagon, “People should come first. Then ideas. And then hardware.” The fundamental way students, teachers, and people in general interact in our education system has to change before technology can be effective at multiplying the impact of learning. Ultimately, that's what Disrupting Class is about.

While Christensen et al. would probably disagree with much of Ivan Illich's philosophy of completely abolishing mandatory education, their ideas are very similar in important ways. In his book Deschooling Society, written before the internet, Illich provides an alternative vision to replace our current model of mandatory education. He describes four learning networks, free and open to all to use (91):

- Reference services to educational objects: open access to the things and places used for learning- libraries, labs, museums, tools, workplaces.
- Skill exchanges: lists of people willing to teach a skill they have, and the conditions under which they'll teach it.
- Peer-matching: a communications network that helps learners find partners who want to learn or practice the same thing.
- Reference services to educators-at-large: a directory of professionals, paraprofessionals, and freelance instructors, along with their conditions of service.
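Both Illich's learning webs and Christensen et al.'s value network have the same shape: every participant can both contribute to and draw from a shared pool. A toy sketch of that structure, with invented names and entries, just to make the contrast with a one-way value chain concrete:

```python
# A toy value network: every participant both contributes to a shared pool
# and draws from it- the YouTube model rather than the textbook supply chain.
from collections import defaultdict

pool = defaultdict(list)  # topic -> contributions anyone can tap into

def contribute(who: str, topic: str, item: str):
    pool[topic].append((who, item))

def draw(topic: str) -> list:
    return [item for _, item in pool[topic]]

# Consumers are also producers: teachers, students, and parents all add value.
contribute("teacher", "algebra", "video walkthrough")
contribute("student", "algebra", "worked problem set")
contribute("parent", "algebra", "flash-card app")

print(draw("algebra"))  # everyone draws from the same shared pool
```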
Illich's networks sound pretty close to the network model Disrupting Class promotes as an ideal, where students and teachers can trade personalized learning apps, expertise, and knowledge in a mutually beneficial way. The only real difference I see between Illich's and Christensen et al.'s ideas is that the latter still buy into the idea of standardization: all kids need to be learning more or less the same thing and get ranked and graded on those same things. If we accept that, then we still need most of the educational infrastructure we currently have: mandatory attendance and all the overhead and extra cost associated with keeping a quarter of our population under control for half the days of the year, not to mention the expense of a national system of testing and standards.

To this point, Christensen et al. describe two uses for testing. The first is for students to demonstrate mastery of the subject material, which is fine pedagogically when that aim is separated from our compulsive need to rank, sort, and compare students. Indeed, according to Cathy Davidson, the original letter grade system arose primarily as a shorthand among teachers to understand how well their own students were grasping the material. But alas, the second use Christensen et al. see is to compare students. "College admission decisions are built around test scores. The evaluation of which schools and districts are doing satisfactory jobs educating their students depends upon standardized exams. Even the assembly of honor rolls- whose purpose is to compare students- is largely based upon performance on exams" (111).

As I already wrote in a previous article, it seems silly and wasteful to test kids on behalf of universities and employers, and it sends the message that our schools are basically factories and feeders for these institutions. At what point did it become the responsibility of our school system to help those institutions choose whom to accept? But as Christensen et al. point out, colleges do need a way to make admissions decisions. If they want to use test scores to do that, there are plenty of ways to accomplish that aim without making it the responsibility of the public school system. Universities could easily have their own entrance exams, and at any rate, plenty of standardized tests like the ACT and SAT exist, all supposedly designed to test college-readiness.

The second point, that of evaluating the performance of schools and districts, is an issue inherent in a standardized, mandatory system of schooling. If we truly accepted a plurality of interests, passions, and learning styles among ourselves, and as a result rejected a standardized, mandatory system, then the onus would be on the teachers to make their classes interesting and relevant enough that kids would want to attend. Then it would be easy to see who the good instructors were: they would be the ones who could present the material in a way that was interesting and engaging to the students, and who actually had attendance. Vying for kids' attention in an open marketplace of ideas, instruction, and learning-facilitation would spur a search for innovative and effective pedagogical methods, such as the very ones described in Disrupting Class and practiced in creatively run institutions like Quest to Learn, The Met, and High Tech High. The entire city or town could be opened up to facilitate more natural, integrated, and holistic learning.
Christopher Alexander, ironically enough an architect, provides a visionary example of what our educational system could look like in his A Pattern Language. Envisioning a learning network remarkably similar to that of Ivan Illich, Alexander describes "another network, not physical like transportation, but conceptual and equally important, is the network of learning: the thousands of inter-connected situations that occur all over the city, and which in fact comprise the city's 'curriculum'". This city-as-curriculum is in fact a decentralized education "congruent with the urban structure itself", and he notes that "living and learning are the same." He continues: "In a society which emphasizes teaching, children and students- and adults- become passive and unable to think or act for themselves. Creative, active individuals can only grow up in a society which emphasizes learning instead of teaching" (99).

I mention Alexander here because one of the many inspiring architectural "patterns" (or solutions) in his compendium is that of the University as Marketplace (231), more or less similar to what I described above when discussing the potential results of abolishing a mandatory, standardized curriculum. "Concentrated, cloistered universities, with closed admission policies and rigid procedures which dictate who may teach a course, kill opportunities for learning. The original universities in the middle ages were simply collections of teachers who attracted students because they had something to offer. They were marketplaces of ideas, located all over the town, where people could shop around for the kinds of ideas and learning which made sense to them. By contrast, the isolated and over-administered university of today kills the variety and intensity of the different ideas at the university and also limits the student's opportunity to shop for ideas." The key aspects, he notes, are that anyone can take a course, and anyone with something worth teaching can offer one.
A university or school system run in this way could essentially be seen as a node in Illich's learning networks: a natural point where learners, mentors, researchers, masters, apprentices, instructors, and resources congregate.

Going back to the idea of testing, what do these standardized tests measure anyway? Their chief virtue is that the metrics they use are easy to measure and compare. But ease of use does not a good metric make. What higher-level, more important values are we leaving unmeasured as a result of our focus on our precious, multiple-choice, standardized tests? But, we say, how can our students be prepared for the real world if they don't learn certain basic skills? And implicit in that question is: how can we know if they've really learned those skills if we don't test and compare them?

One thing everyone agrees on is that the more intrinsically motivated one is to learn, the better. Indeed, Christensen et al. state several times in Disrupting Class that fostering intrinsic motivation should be a chief aim of our education system. But that intrinsic motivation cannot develop in an environment of coercion. Because the learning has no real context or meaning for students, it doesn't stick, and kids pass their time slowly learning the same things over and over again to pass the next test. Yet there's substantial evidence that when someone is truly motivated to learn and able to freely choose to do so, they can learn the same content on their own, or with minimal instruction, in a fraction of the time it would have taken in a coercive school setting.

Think about all the hair-pulling and gnashing of teeth that teaching reading inspires in our schools today, for example. Paulo Freire, an internationally renowned revolutionary pedagogue, became famous for his work as an itinerant teacher in rural Brazil. He would go from village to village teaching illiterate farmers how to read. These tenant farmers were forgotten by society and exploited by landowners, in large part because of their illiteracy. Because they never had access to resources to learn how to read, they grew up illiterate, and as a result couldn't do something as simple as sign their own name at a courthouse- something they had to be able to do if they wanted to take legal recourse to protect themselves against the endemic exploitation they faced. He found that, without fail, a month was all it took to teach these "dumb" illiterate farmers enough of the basics of reading and writing that they became self-sufficient autodidacts, starting with the words and topics that were important to them as exploited, poor, rural farmers- their Key Vocabulary, as Sylvia Ashton-Warner called it. From there, they could learn the rest on their own. Why? Because they had clear, strong motivations to do so. They needed, and knew they needed, to learn how to read to be free and to live well in their society.

I believe it is a mistake to assume that, with the overabundance of the written word in nearly all parts of the United States today, our youth would not come to the same conclusion that these poor Brazilian farmers came to. And again, it comes down to what we value. Do we want a pacified citizenry that duly does as it's told? Or do we want ingenious self-starters who identify a problem and take the initiative to correct it? The first is what the system we currently have produces. The second requires that we trust ourselves in a way that, as we've become slowly more institutionalized, we've forgotten how to do.
If the basic skills we aspire to teach in school are truly as important as we think, then we will by necessity learn them as a natural part of living. As a simple example, imagine a youth who has lagged behind his peers in learning how to read, for whatever reason. All his friends are on Facebook, and he wants to be able to use Facebook to talk to his friends. So he begins teasing out the patterns on his own, and if resources were made available to him through a free and open learning network, he would probably take advantage of them.

John Holt provides an admittedly much more compelling example, describing his experiences working as a teacher in a summer reading program designed to help at-risk, poor, mostly black kids with poor reading skills. "Leon didn't speak. When he did, he didn't say much. But what he said I will never forget. He stood up, holding before him a paperback copy of Dr. Martin Luther King's book Why We Can't Wait, which he had read, or mostly read, during that summer session. He turned from one to another of the adults, holding the book before each of us and shaking it for emphasis, and, in a voice trembling with anger, said several times at the top of his lungs, 'Why didn't anyone ever tell me about this book? Why didn't anyone ever tell me about this book?' What he meant, of course, was that in all his years of schooling no one had ever asked him to read, or ever shown him or mentioned to him, even one book that he had any reason to feel might be worth reading. It's worth noting that Why We Can't Wait is full of long intricate sentences and big words. It would not have been easy reading for more than a handful of students in Leon's or any other high school. But Leon, whose standardized Reading Achievement Test scores 'proved' that he had the reading skills of a second-grader, had struggled and fought his way through that book in perhaps a month or so. The moral of the story is twofold: that young people want, need, and like to read books that have meaning for them, and that when such books are put within easy reach they will sooner or later figure out, without being taught and with only minimal outside help, how to read them" (33).

Finally, there is Christensen et al.'s last explicit use for comparing students: honor rolls. I can think of no more vain or petty reason to test students than to sustain the practice of honor rolls or principal's lists in schools. They are an academic beauty pageant, and don't actually justify the need to test. It is one of the many carrots we offer students to gain their buy-in and cooperation: do well on these tests and you'll get a gold star; do poorly and you'll have to stay back and repeat the class.

All this is to say that if we take Christensen et al.'s fundamental thesis to its logical conclusion- that everyone is different, with their own learning interests, passions, pace, and learning style- then the idea of testing everyone on the same standard seems unnecessary and wasteful. If we accept that, then the whole apparatus of nationalized standards and testing, national curriculums, even mandatory attendance begins to totter. Why have the costly educational infrastructure at all, if these learning networks can be provided at a fraction of the cost?

Christensen et al. provide an example of a fictitious kid named Doug. A star soccer player, academically he is "falling through the cracks".
"She [the principal] has seen Doug in class a couple of times- he's perfected the art of appearing to take notes, but unlike most of his teachers, she knows he's not. He's doodling. Fantastic, elaborate doodles. That first glimpse of his notebook had horrified her- how long had he been getting away with this? But she had also instantly known he was talented. Maybe Doug belongs in a school with more unconventional programming- more art, more creative kinds of writing, more music. Too bad Randall Circle [the school] doesn't have the infrastructure or funding for that stuff" (208). Too bad, indeed. It's clear that the school isn't serving kids like Doug adequately and may even be doing more harm than good, as by the school's standards, Doug is a failure. But by other standards- athletic, artistic- Doug might very well be considered a success, or at the least very promising. Christensen et al. note that every kid (and really, every person) has a need to feel successful and competent (176). But Doug will never be able to feel competent in that system; the one school activity he does excel at (soccer) is disdainfully labeled as "extracurricular". Doctors learn that their first imperative is to, above all else, do no harm. Should we not hold our schools to the same standard? Would Doug not be better off if he were left to his own devices so that he could find the "unconventional programming" on his own? Later on, a fantastic AP calculus teacher named Escalante is described. "Escalante was an exceptional teacher. Why not capture Escalante's instructional magic on film and make it available to schools anywhere?... But these sorts of films have had little impact because they were simply carmmed into classrooms as a tool on top of the traditional teaching methods. Not surprisingly, never has a calculus teacher announced to the class, 'Kids, today is a great day. We have these films of a teacher in Los Angeles, and you just need a technician to run the projector. You don't need me any more'" (83). I almost feel like that quote needs no explanation. It is the equivalent of keeping our tax code convoluted to satisfy and employ an industry's worth of H&R Blocks, and it sounds like Christensen et al. are as frustrated by that as I am. Of course, change won't happen all at once. The interests- of teachers, unions, administrators, textbook suppliers, standardized testing companies- are too entrenched. A complete collapse of the system is unlikely, instead rapidly increasing costs and mediocre improvements seem to be on the horizon. In the meantime, the disruption will happen in the cracks of the current education system. In the meantime, I will echo one of the calls of Christensen et al.: we need more experiments, more pilot schools and pilot initiatives designed to push the boundaries of what we think we know about learning and teaching and show that other, more empowering ways of educating ourselves are possible. Perhaps universities have an advantage in that, as more autonomous institutions, they can test new ideas and change with more speed and agility. That is, if any of them are willing to say "enough!" to the current collegiate arms race long enough to care about the quality of their educational instruction. 
I'll end with a poignant observation that Ivan Illich made: "The social decision to allocate educational resources preferably to those citizens who have outgrown the extraordinary learning capacity of their first four years and have not arrived at the height of their self-motivated learning will, in retrospect, probably appear as bizarre" (34).

Dillon Dakota Carroll
Prague, Oklahoma

Note: This is an article in the Agile University series I describe in this blog post here. I'll add a table of contents as I write more articles, and in the meantime that link provides some context to this post.
I just finished the book Now You See It by Cathy Davidson, an interesting look at how the primary institutions in our lives- school and work- can be refreshed to take advantage of new technology and new understandings of how the human brain works. While I'm focusing primarily on the education side of the picture, there were also some lessons from Davidson's view of the workplace that can be applied to schools and universities, which I'll discuss later.

Davidson makes the argument that, despite our widespread critique of multitasking, the mind is made for it and even craves it. As she put it, "The mind wanders off task because the mind's task is to wander." When we add modern technology to the mix- the internet, computers, and smartphones- collaborative and creative multitasking is more possible than ever before. The complication, as Davidson sees it, is that our institutions- school and work- are designed for the pre-internet 20th century and don't address the question of how we can use technology to be better, create more value, and learn more effectively. They fit the 20th-century division of labor, not a networked, collaborative 21st century. For example, society claims that kids these days are dumbed down by technology, but Davidson asks if perhaps the problem isn't with the kids, but with the system that is supposed to serve them. "For all the punditry about the 'dumbest generation' and so forth, I believe that many kids today are doing a better job preparing themselves for their futures than we have done providing them with the institutions to help them. We're more likely to label them with a disability when they can't be categorized by our present system, but how we think about disability is actually a window onto how attention blindness keeps us tethered to a system that isn't working" (10).

Davidson presents several solutions to bring these institutions into the 21st century. For example, she talks at length about the idea of gamifying work and school to keep people's attention. She also advocates embracing flexible, virtual work environments that prize an individual's unique talents and work style. She recommends collaborative, endeavor-based work and learning, as well as trashing traditional grading and curriculum-based teaching and letting students take the lead in their own education and even grading, all while using technology to accelerate progress.

It's no revelation that we're stuck with an outdated education system, and Davidson makes the excellent point that for the most part you could put a schoolteacher from 1900 in a modern classroom and they would recognize it instantly. That classroom is a product of the 19th and 20th centuries, when society- based on the prevailing work and management theories of the day- thought that the most efficient way to use workers to create value was an assembly-line approach characterized by individual tasks as specialized and specific as possible. Think about the stereotype of an early Ford car factory, for example. A motor might be rolling down the assembly line, and as it passes, a worker in line screws in a piece. The engine continues and is replaced by the next, and all day long, the only thing that worker is doing is screwing in that one piece in each new engine. The man is no better, no more capable, than the machine. If the ideal was this kind of mindless, low-level work, then schools were seen as an integral piece in training a workforce capable of doing these repetitive tasks.
And this assembly-line approach didn't just apply to industry, but also to the office. Think about how compartmentalized a 20th-century company was- HR, Engineering, Sales, Marketing- all completely compartmentalized and almost never talking with one another. Each employee had a specific task to perform, like a piece in a motor. If one piece didn't work, the system failed. Employers needed single-minded workers who would do their one task exceptionally well without asking questions, thinking for themselves, or getting distracted. Schools had to produce those kinds of workers, and so we evolved a system that reflected the specialized, hierarchical separation of labor found in the workplace. Knowledge was divided up into arbitrary disciplines and ranked in terms of importance. This hierarchy of disciplines, as Davidson notes, put the sciences on top and the humanities on the bottom, with physical education, shop, and the arts being slowly eliminated over the years. Even the kids began to be ranked (letter grades, as she notes, didn't exist as we know them until 1897) so that they could be sorted into those most apt for employment. After all, in an assembly-line model, you're only as fast as your slowest piece. Davidson writes on page 279, "School has been organized to prepare us for this world of work, dividing one age group from another, one subject from another, with grades dictating who is or is not academically gifted, and with hierarchies of what knowledge is or is not important (science on top, arts on the bottom, physical education and shop gone). Intellectual work, in general, is divided up too, with facts separated from interpretation, logic from imagination, rationality from creativity, and knowledge from entertainment. In the end, there are no clear boundaries separating any of these things from the other, but we've arranged our institutions to make it all seem as discrete, fixed, and hierarchical as possible."

Of course, we now know there are better organizational schemes for productively creating value for others, whether in a factory or in an office. Lean Manufacturing, Decentralized Management, and the Coventry Gang System have all shown this. But schools are still afflicted with that turn-of-the-20th-century mentality, the assembly-line model of learning, despite the fact that the model most apt for the 21st century is a network. What sense does grading and sorting kids, or dividing knowledge up into arbitrary categories, make in a network model of society? Grading and sorting kids is done all with the idea of getting them into a good college, and eventually, a good job. I agree with Paul Goodman in his fantastic Compulsory Miseducation: why did we ever start to think that it was the job of the schools and universities to help employers find good employees? Shouldn't the employers worry about the best way to sort and rank potential employees, and the universities bother themselves about the best way to sort and rank applicants? And Davidson's observation of the difference between an assembly-line model of production and a network model of production only reinforces this point further. A weak link in an assembly line slows the whole system down, granted. But in a network, the opportunity is there for each person to find their ideal position around a central node or cluster and contribute to it in an a-linear way that defies the assembly-line way of thinking.

The best example I can think of comes from Davidson's book. Take people who have autism or Asperger's syndrome.
They are not cut out for a typical work or school environment, and without special aid they typically fail miserably in a common school setting. That is to say, they fail in the assembly line model, where each "product" (student) coming out of the factory must be as close to identical as possible (as Davidson says, we've confused "high standards" with "standardization"). But that model ignores the special talents they have as a result of their disability. She mentions in her book a code testing company that employs nothing but people with autism and Asperger's. Employees in this company rigorously test new code from their clients for errors, bugs, and typos. People without autism or Asperger's are terrible at this job, because they don't have the concentration or attention span to catch all the errors in the code. But those with autism or Asperger's excel at this kind of intense, detail-oriented work. The workers there, who are labeled by the assembly line model as defective products of the education system, have found the niche where they can contribute the most value to society. That is the network model at its best.

Schools are built to reward monotasking, but a central point Davidson makes is that our brain, despite what we may think, is not built to monotask. Multitasking is its natural mode of operating. Our minds abhor the boredom and single-mindedness that come with monotasking; they defy disciplinary boundaries and crave novelty, stimuli, and collaboration. The best thing we can do is stop fighting our nature and take advantage of the possibilities our technology offers to build systems that embrace the way our minds work. For example, Davidson points to the fact that in studies on distractions in the workplace, nearly half the time the distraction was internal. In other words, the subjects were distracting themselves. Later in the book, she notes that on average, 80% of our neural energy is taken up just talking to ourselves (280). These two statistics support Davidson's conclusion that our minds are naturally hyperactive. Like a little kid on a sugar high, they constantly need something to keep them occupied, and they quickly get bored when confined to performing the same task repeatedly or for an extended amount of time. While often easy to vilify, the natural overactivity of the human brain has an advantage: the brain is constantly cross-referencing other parts of our experiences and memories, looking for new connections that could be of value to us or those around us. In other words, the same distractibility that frustrates us is also a source of creativity. And it apparently doesn't even take that much mental energy to switch gears mid-task to something new- only 5% of our at-rest energy (280). Even when we're seemingly zoned out and day-dreaming, the mind is incredibly active, making connections with what's around us and with our stored bank of experience. The brain is a natural multitasker. What's clear is that our minds aren't the well-ordered, logical, linearly-functioning machines 20th century thinkers imagined, but are themselves networks of neurons constantly communicating with one another. We've been trained to think in terms of compartmentalized disciplines and functions, but our minds are naturally interdisciplinary, constantly seeking to connect new and old experiences in novel ways. Our distractibility is a side effect of this.
The key may be in consciously controlling what our distractions are, to facilitate productive connections over unproductive ones. As Davidson writes, "What confuses the brain delights the brain. What confounds the brain enlivens the brain. What mixes up categories energizes the brain. Or, to sum it all up, as we have seen, what surprises the brain is what allows for learning." (286). Unfortunately, our education system is designed to punish distractions and enforce an unnatural and stilting single-mindedness of the kind required of an assembly line worker, not of the creative knowledge workers the world needs from our schools in the 21st century.

Davidson provides some ideas and examples of how the school and university environments could be reformed to be fit for the 21st century. This is a point where the book falls a little short, unfortunately. Davidson missed the opportunity to comprehensively describe her vision. There are some great ideas and anecdotes provided, but they lack cohesion and an easily understandable through-line. For example, she describes some great examples of what she might have called "endeavor based learning", which could easily have formed a central operating principle for her classroom makeover (as she calls it). But she fails to make that principle explicit in her descriptions. This idea of endeavor based learning is something I will discuss later in the essay. Davidson does discuss the role technology can play in the 21st century classroom to enhance the interdisciplinary, collaborative, student-driven learning opportunities she sees as key to the future of learning (as do I, for that matter). I would have loved to see a few more examples here, as the central example she provided was a fantastic one. She oversaw a program at Duke University to equip the majority of the students with free iPods. In a partnership with Apple, the plan was to turn the entire campus into a learning lab on how a device like the iPod could enhance the students' educational experience on campus. Note that this was back in 2003, when the idea of an app itself was new, let alone that of a learning app. The implementation was simple. All incoming freshmen got iPods. Any upperclassman could get one by proposing a use for it in one of their classes; in that case, everyone in that class would get an iPod. Professors could also propose uses, in which case all their students would get an iPod as well. She uses this example because it has a happy ending: the experiment was a huge success. Dozens of new apps were created and hundreds of new uses found for the device. More important, at least in my eyes, is the effect the process must have had on the students. In a limited but important way, it made them co-creators of their own learning experience. If the computer and the internet have democratized information, then they also have the potential to democratize the way we learn that information. We've certainly moved further in that direction since the publication of Now You See It in 2010. But the democratization has mostly occurred outside of the classroom.
Instead of percolating into the cracks of these old-as-rock institutions, technology has mostly made classrooms more expensive (witness tuition and fees in universities and per-student spending in public schools), and the use of technology in classrooms is typically superficial: smartboards instead of whiteboards, or the online homework and quiz systems in language classes that students universally despise. Some universities are experimenting with webcasting the overcrowded first and second year classes- the ones taught in auditoriums to hundreds of students at once. But that raises the question: why even have the class in the first place? Why not just make a recording of the lecture and put it online? It would save everyone's time, effort, and money. If students had a question (and few do), the email address of the instructor could be provided so that the student could email their question or even set up an in-person meeting if necessary. This is going a bit off the topic of Davidson's book, but modern technology has made the lecture redundant and wasteful. Or perhaps it's always been that way. Dr. David Ray of the University of Oklahoma made the point that the lecture has been obsolete since the invention of the printing press. The word lecture comes from the Latin verb legere, meaning to read, and the Medieval Latin lectura, meaning a reading. According to Dr. Ray, the origin of the lecture as we know it was in medieval monasteries, when books were precious because they had to be copied meticulously by hand. To create these copies, the head monk would read aloud from the original while the rest of the monks copied his words onto new parchment. We might even suppose that, as the monastery had at best only a couple copies of a given work at one time, one monk reading aloud (as all reading was done then, interestingly enough) would allow many other brother monks to partake of the "lecture", or reading, of the book. As a system of learning (if it ever was one), it ceased to be meaningful when books could be cheaply and rapidly printed. What's interesting is that some of the most vocal defenders of the lecture format are students themselves, who typically claim it to be an effective teaching method. I am very, very skeptical of this claim, and it is a hypothesis I'm going to be investigating in further writings. Small group discussions can certainly spark meaningful insights, but most lectures are a far cry from a discussion group and usually consist of the teacher writing notes on the board that would be more effectively communicated if he simply passed out the book he took his notes from. Even better, technology allows each learner to find the codified information in the format most conducive to their learning style: books, ebooks, audio programs, and even video programs. I knew many friends who, confused by the professor's lectures and explanations, taught themselves entire courses through free video series on YouTube. And the lecture is so far removed from any practical application. I fail to see how listening to a teacher talk about an equation is more effective than actually working through the equation oneself, or better yet, using that equation to do something useful or meaningful- like building an app or a widget, or even a prototype or model of one. I know this is a big can of worms, and I'm leaving it unopened for now. But I will return to the topic in a later piece, rest assured!
One thing I'm curious to think about is: if a university were invented today, completely blank-slate, what would it look like? How would it be organized and maintained, and what would it prioritize? How would our new technology be used? What staples of the modern college would still find their way into a completely new model? These are tricky questions to answer, because to do so, one has to answer what the role of a university is. For example, there is a plethora of online university options available that take advantage of internet technology for so-called distance learning, but these seem to fill the relatively modern role of the university as a factory for credentialing professionals. Davidson does provide a good example of what a student-directed class could look like in her own course at Duke University, called Your Brain on the Internet. The idea was that it would be an interdisciplinary exploration of pretty much exactly what this book is about: the role our rapidly evolving technology plays in the way we live our lives. She provided a list of recommended reading, but otherwise let her students direct the course and what would be discussed. From her description, Davidson was more of a facilitator than a teacher. The class was incredibly successful. Not only did the students later rank this course as one of the most impactful they took while at Duke, they even went so far as to organize extra classes when they felt they needed them. For instance, when a thinker they had been discussing happened to be in town, they organized an extra-curricular class (if extra-curricular has any meaning when the curriculum is set by the students anyway!) to hear straight from the horse's mouth what he thought. Imagine if more classes were organized this way! Of course, well-meaning administrators could easily ruin a course such as Davidson's by making it part of a required, general education curriculum. My guess is that in that case, the course would fail miserably, as it would only pay lip service to the idea of being student directed. Students would see it as yet another box to tick on the long, arduous, overly prescribed path to getting their degree. As it turns out, students weren't completely satisfied with Davidson's administration of the course. For all its progressiveness, it still boringly followed the typical manner of grading a course: a professor-graded midterm, term paper, and final exam. After this feedback from her original class, she wrote a controversial blog post about a potential alternative: contract grading combined with class-sourced grades. Contract grading, which I'd never heard of, goes back to the 1960s or so. In that system, a student can agree to do proportionately less work in exchange for a guaranteed B or C- less than what would be required to get an A. So she proposed that students in the next iteration of her course could decide in advance exactly how much work they would do and what grade they were shooting for (she points out that the coursework required in the first version was not insignificant, so even someone shooting for a B or a C would still have her hands full). When it came time to evaluate each other, the students would look at the amount of work their peers had agreed to per their contract and evaluate whether they had fulfilled its terms. For example, if their contract stated that they had to write 10 blog posts during the course of the semester, did they do so?
And were the blog posts of sufficient length and depth to be considered worthy of the name? Having already discussed my views on the modern grading system, I think this is a refreshing new approach of the kind I'd like to see more of. I don't see grading, ranking, and labeling as part of the role of a teacher- that's the job of an admissions officer or HR department. I see a student-directed grading initiative as an interesting compromise. It reinforces the self-directed nature of the class (and the inherently self-directed nature of learning) and reduces the workload of the professor so that they can focus on more important things. My guess is that the students were as strict as, if not stricter than, the professor herself in holding themselves to the terms they agreed to. I say that because the students undoubtedly felt a sense of pride and ownership of the course and what they learned in it, a feeling most never get in 16+ years of schooling, most of it mandatory and teacher-directed. We always take better care of what we own, and the students felt ownership of the course. They had a vested interest in maintaining its high standards. In Compulsory Miseducation, Goodman writes that in the original medieval universities, grading and ranking the students was never considered part of the mission of the institution or its professors. Students of course had to demonstrate competence- that they were worthy to be included within a certain guild or group of professionals or scholars, just as we require lawyers today to take the bar exam and doctors to do a residency. But if they were good enough to be in (and I imagine the standards were fairly high), then they were all the way in. None of this nonsense of A's and C's. "It is really necessary to remind our academics of the ancient history of Examination. In the medieval university, the whole point of the grueling trial of the candidate was whether or not to accept him as a peer. His disputation and lecture for the Master's was just that, a masterpiece to enter the guild. It was not to make comparative evaluations. It was not to weed out and select for an extra-mural licensor or employer. It was certainly not to pit one young fellow against another in an ugly competition. My philosophic impression is that the medieval thought they knew what a good job of work was and that we are competitive because we do not know. But the more status is achieved by largely irrelevant competitive evaluation, the less will we ever know." How much more inspiring would our universities be if, instead of viewing themselves as factories for employers and grad schools, they saw themselves as aiding students in creating a masterpiece, a perfection of their craft, and preparing them to enter a close-knit fraternity of curious and worldly professionals and scholars?

The last topic of the book I want to touch on is Davidson's description of Endeavor Based Work (EBW for short- she never uses the acronym, but I'll be mentioning it a lot more than she did). The term EBW actually comes from IBM, in what seems to be one of the most stunning corporate management transitions in modern history. In short, IBM managed to transform itself from a stodgy, conservative behemoth into a progressive, agile, and virtual workforce, and in the process avoided becoming a footnote in history. The part of their story I'm interested in is their idea of EBW, because it has the potential to change the way we think about how we learn. EBW at its core is simple.
Instead of compartmentalizing various functions- HR, Engineering, Sales, Customer Support, etc.- into separate silos, like chimneys on a factory, EBW organizes its teams by project. A project has on it all the people, from all the disciplines, required to realize it. This is a feature of Agile, for example, and of other decentralized management philosophies, but I love the name EBW because of how descriptive it is. Everyone on the team is responsible for the success of the whole project and contributes in their unique way to its success, like members of an orchestra. In fact, the analogy used in the book is that of a film crew. Everyone has a unique role or task, but all are united by the vision of the final film. In the process, crew members may have to step up and do tasks outside of their specific role. But in seeing the film take shape, everyone learns and grows infinitely more than if they were in an isolated silo performing just their specific, specialized task. And the final product is better for it. In fact, the film might never get made if the crew performed their work in isolated silos. This idea of EBW seems to be part of Davidson's classroom makeover, though she never uses that term to describe it. However, I am convinced that it is the single idea described in her book that, if implemented in our education system, would do the most to positively revolutionize the way we learn. Imagine if, instead of listening to a teacher drone on and on about seemingly unrelated subjects that have no context or meaning in one's life, students were given an endeavor to complete (or better yet, chose an endeavor to complete) that was meaningful and relevant to them. They could work on it alone or in teams. Completing it would require collaboration, engagement, hands-on learning, creativity, problem-solving, and initiative. Because it is a real task and a real problem, it would defy disciplinary classification, and the lessons learned would be real and meaningful. Davidson provides several examples of this, but the most beautiful one is easily that of her own mother-in-law, Mrs. Davidson. Teaching in a remote, rural school in Mountain View, Canada, Mrs. Davidson challenged her students to find pen pals- the catch being that the pen pals had to live in another town called Mountain View, anywhere in the world. The most creative solution to this problem would "win" the competition. The students first had to create their own world map and, using the minuscule resources available in the school and town, find other Mountain Views. Once they did that, they still had to figure out how to get in touch with a resident there. The results were nothing short of breathtakingly inspiring. "One kid remembered that Hang Sang, the elderly Chinese man who ran the local general store, the only store in town, had come over to Canada to work on the railroad, as had so many Chinese immigrants. Taciturn, with a thick accent, Mr. Sang was amused and delighted when one child suddenly wanted help writing a letter- in Chinese. The kid had somehow found out about a town called Mountain View in China. That was the child who won the contest, and long after the contest was over, he spent time with Mr. Sang, talking to him in the back of his store. But of course they all won. The globe became smaller through the connections they made, and their town became larger. They learned geography and anthropology and foreign languages too.
The project lasted not just one full year but many years, and some kids visited those other Mountain Views when they grew up. To this day, I don't drive through a town named Mountain View (there are a lot of them in the world, actually) without wondering if one of Mrs. Davidson's kids sent a letter there and, through the connection made, was inspired to go on, later, to become a professor or a doctor or a veterinarian (85)." That was before the internet existed. What possibilities exist for these kinds of real-world, self-directed endeavors in our schools and universities now that, thanks to the internet, we have every kind of expertise, knowledge, experience, and personal connection at our fingertips? While Cathy Davidson doesn't propose going as far as I think we need to go in reforming our education system, she offers several compelling pieces of the puzzle: endeavor based work in schools using technology as a force multiplier, as well as student-directed learning and peer-sourced grading. Davidson makes the point that this system would not only better fit the networked, collaborative society we live in today, but would also work with our brain's neurology, not against it. It is a smart, humanizing alternative that would reinforce a completely different set of values than those currently taught in our education system. It would teach creativity instead of regurgitation, collaboration instead of compartmentalization, initiative and independence instead of docility, and meaningful, experiential learning instead of learning for an arbitrary test.

Dillon Carroll
The Colony, Texas

I want to share a small tip I've been having some success with to help establish a daily routine- something I know is important, but had never been able to do until recently! I would always set out with the best of intentions, plotting out to a tee how my morning and night would go and exactly what I would be doing. The sequence would spiral out of control like a cancer until I had the first three or four hours of the day plotted. Sometimes I'd even go so far as to plan out the entire day. And these wonderfully intentioned plans and routines never worked. The problem I'd always have was that I'd mess up once and misspend the day. By the end of the evening I'd realize, guilt-wracked, how unproductive I'd been, and I'd "binge-work", staying up till the wee hours of the morning to try and get as much of my backlogged work done as possible. But I never worked well after 2am anyway, and the next morning I'd sleep in and wreck my fledgling attempts at creating my own routine. Thinking about what is different with my efforts now, I can see that I've had, or am having, a mindset shift away from this self-destructive work-bulimia. And one thing that helped get me there (or maybe a consequence of getting there, who knows?) was to start thinking of my day as a series of "checkpoints", like the save points in a video game. Basically, I've chosen only three hard and fast times for my routine during the day. The idea is simple, and not new at all, but I think it is effective for reasons I explain below. My three checkpoints are:

- Writing: sit in front of my computer with Evernote open until 11.
- Exercise: put on workout clothes and walk outside at noon.
- Bedtime: go brush my teeth at ten pm.
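If it helps to see how little machinery the system actually needs, here is a minimal sketch of the checkpoints as code. The framing is just my illustration- three times and three labels, nothing more- not part of the tip itself:

```python
from datetime import datetime, time

# The three daily checkpoints described above. The labels and this whole
# encoding are only my illustration of the system, not a prescription.
CHECKPOINTS = [
    (time(11, 0), "Writing: at the computer with Evernote open"),
    (time(12, 0), "Exercise: workout clothes on, walk outside"),
    (time(22, 0), "Bedtime: go brush my teeth"),
]

def next_checkpoint(now=None):
    """Return the next (time, label) pair still ahead of us today."""
    if now is None:
        now = datetime.now().time()
    for t, label in CHECKPOINTS:
        if now <= t:
            return t, label
    return None  # all three have passed; the day resets tomorrow

if __name__ == "__main__":
    upcoming = next_checkpoint()
    if upcoming:
        print(f"Next checkpoint at {upcoming[0]:%H:%M}: {upcoming[1]}")
    else:
        print("All checkpoints met or missed- fresh start tomorrow.")
```

The entire state of the system is three times of day. That's the point: it's small enough that there's nothing to abandon when a day goes sideways.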
Here's why this works for me. First, it reduces or completely eliminates the anxiety of having an entire day planned out to a tee. Keeping track of three simple times instead is totally manageable. Once those times arrive, I know I need to drop everything and move on. I suppose I could ignore the deadline, but having only three hard-and-fast times throughout the day makes that seem like a cop-out, like I'm cheating myself. It totally de-stresses the process for me. Second, it forces me to focus on what's truly essential for me to feel like I had a good day. I want to write, and really make a serious thing of my writing, but could never make a habit of it. The same with working out. Yet the difference between the days when I do those two things- write and exercise- and the days when I don't is frighteningly stark. In short, if I got those two things done during the day, it wasn't necessarily a good day, but it certainly wasn't a bad day. Finally, a big reason for my binge-working and my inability to create new habits or routines was really just that I could never go to bed on time. Voilà my third checkpoint. I still have a checklist of routine things I want to get done each day that is more than three items- it's at 15 items, actually- but I try not to sweat it if I miss some of them in a day and instead focus on meeting the checkpoints. Because I know that even if I've totally wasted the day up to that point, if I can make a checkpoint, I'm more likely than not going to continue on track and start ticking off the rest of the items on my list. Like video game checkpoints, these allow me to start anew and try again at having a good, productive day. In essence, I have three opportunities each day to turn a bad day into a good one, or at least a productive one. It's like building mini-periods of reflection into the day, mini-sprints of work and recovery that give me the psychological opportunity to renew myself a little bit with each checkpoint. And even if I don't feel like doing a checkpoint, the bar is set so low that I know the easiest thing to do is just force myself to do it rather than deal with all the nasty regret and guilt. After all, I only have three real commitments during the day, and they're pretty easy: sit in front of my computer with Evernote open until 11, put on workout clothes and walk outside at noon, go brush my teeth at ten pm. Since motivation usually only comes after taking action, meeting those three checkpoints is like knocking over the first in a line of dominoes. I'm back on track for the day and feeling great. Or at least better. Zan Perrion provides some inspiration here in his book The Alabaster Girl. In it, he recommends a small ritual he calls Vespers, as in vespertine (occurring in the evening). The idea is that at some point before going to bed, we find a few quiet moments to ourselves to reflect and prepare for the next day. One could imagine that daily Vespers is like a checkpoint in a video game. We are playing the game of Life on the 'Hard' difficulty setting, but because we saved the game at that point yesterday evening, because we reconnected with what is truly important, we can always fall back on that point again any time in the future. Hope you found this tip to be useful!
Dillon Dakota Carroll
Dhaka, Bangladesh

Universal Principles Across Disparate Fields: Perspectives on Iterating One's Way to Success
9/19/2015

When I was a teenager, I became obsessed with the idea of being able to tune into my surroundings and learn so rapidly that I could adapt to and overcome any circumstance. There wasn't any unusual need or mission that spurred this fixation, just the normal teenage angst of finding one's way in the world. I was more socially inept than most teenagers- I can say that now with affection- so I suppose the idea of becoming some sort of super-adaptive ninja appealed to me. The teenage me reasoned thusly: if I could be really, really good at learning from my environment and adapting to what I learned, then I could circumvent all awkward social situations, always know how to make the best of fickle fortune, and ultimately acquire the quiet confidence and easy-going nature I so desperately desired. In other words, my sixteen-year-old self desired some superpowerful, panacea-like mental framework that would not only cure me of my psychological ailments but also transform me into the exact opposite, the antonym, the antithesis, of who I currently was. I would be the ultimate badass because I could learn any skill and overcome any circumstance. Let no one accuse me of not dreaming big. Ultimately, my juvenile pursuit of this one-mindset-to-rule-them-all, like so many searches, was forgotten to time as I became distracted with more immediate tasks like graduating from high school and university, achieving some semblance of success with girls, and figuring out what I actually wanted to do with my life (unfortunately my university didn't yet have a "super-adaptive ninja" major, but I hear one is in the works). Don't get me wrong. My intentions were good. But I didn't really know where to begin looking for such ideas, or how to piece together the fragmented research I was doing. I became vaguely aware of the importance of east-Asian meditative philosophies to my search, and of the idea of Mindfulness proposed by a Harvard researcher named Ellen Langer. That was about as far as I got. Fast forward to about two months ago.

Boyd and the OODA Loop

I was reading a book called Boyd: The Fighter Pilot Who Changed the Art of War. The subject of the biographical work is Colonel John Boyd, a near-mythological figure to those that knew him and know of his work. He did incredible things, and these accomplishments were all the more impressive because they were across such a wide variety of fields. He was perhaps one of the greatest fighter pilots the world has ever seen. Nicknamed "Forty-Second Boyd", he had a standing bet that he could defeat anyone in a dogfight in less than forty seconds. He never lost. He revolutionized the theory and practice of dogfighting and literally wrote the manual on flying fighters. He developed Energy-Maneuverability theory, which completely changed the way fighters were flown and designed; in fact, he used these theories to help design the F-15 and F-16. Later in his life, he stopped flying and instead devoted himself to purely intellectual pursuits. He investigated the nature of creativity, for example. Perhaps most famously, he invented the OODA loop. OODA is short for Observe, Orient, Decide, Act. It's a decision-making framework for taking in new information and reacting to it to make the best of real-world circumstances and achieve success. On the one hand, it seems too simple.
We do these things (observing, orienting, etc.) routinely; it doesn't take a genius to draw all four of them in a big circle, right? The importance is in how the loop is explained and applied. And more importantly for the ghost of the hopeful sixteen-year-old me, I felt the faint-yet-intoxicating spark of recognition, a spark that "lit my fire and fired my soul", to paraphrase Douglas Hofstadter (I Am a Strange Loop). And not just because it seemed like the key piece I'd been missing in my search for personal growth acceleration. In fact, I didn't even draw the connection to my long-lost quest of years past until I began writing this article. It sparked a burning recognition because I realized I already knew and understood the key concepts in the OODA loop.

Agile and Lean Startup

At least, I knew them as they were applied to entrepreneurship and product development. I'm referring specifically to Agile methodologies, and Lean Startup in particular, since that's what I'm most familiar with. In Lean Startup, the idea is to take a rapid approach to validating one's hypotheses about a new product, service, or feature. Part of the ambiguity of a new business idea is that you don't know what your customers think. Lean Startup is a framework for getting customer feedback on a product iteration as rapidly as possible- say, two weeks instead of two years. This is important because you could spend two years developing a product and launch it without getting any customer feedback. Your "loop" is two years long in that case. But what happens if the product fails because it is out of touch with what customers want? You've wasted two years. You've gained valuable feedback on what your customers want, but it came too late in the game to be useful. By accelerating the rate at which you get real customer feedback on something tangible- be it a prototype, experience, or mock-up- you instead gain actionable insights, or what's called validated learning. The learning is actionable because you're getting it fast enough to feed back into your product to improve it. And by getting these insights, you're effectively getting inside the mind of your customer. You're understanding who they are and, ultimately, how to build a product that solves a real problem of theirs and that they will want to buy. The nuance here is that it requires a deep empathic understanding of the customer and their problems, frustrations, and aspirations. You've empathized with and understood them so deeply that you've internalized who they are and what they care about.

Getting Inside the Loop

Now let's look at the OODA loop. Here's what Robert Coram had to say about it in Boyd: "Before Boyd came along, others had proposed primitive versions of an OODA Loop. The key thing to understand about Boyd's version is not the mechanical cycle itself, but the need to execute the cycle in such fashion as to get inside the mind and the decision cycle of the adversary. This means the adversary is dealing with outdated or irrelevant information and thus becomes confused and disoriented and can't function..." (p. 346). The faster you can perform all the steps, the faster you can assimilate new information and put it to use to achieve whatever your objective at that moment is. But as Coram noted, the key is to use each loop to understand the opponent. Your actions can't occur in a vacuum- they have to elicit a response or feedback of some kind from the opponent that serves to peel back a further layer of their mind and reveal how they think and how they work.
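To make the shape of the loop concrete, here's a minimal toy sketch in Python. The guessing-game setup and every name in it are my own invention for illustration- Boyd's point, as Coram stresses, is the tempo and the orientation step, not the mechanics of the cycle:

```python
import random

# A toy OODA cycle: iteratively narrowing down an adversary's hidden
# position. The setup and all names are invented for illustration only.

def observe(feedback):
    # Take in the raw result of our last action.
    return feedback

def orient(state, observation):
    # Fold new information into our model of the adversary.
    # This is the step Boyd cared most about.
    if observation == "too high":
        state["high"] = state["guess"] - 1
    elif observation == "too low":
        state["low"] = state["guess"] + 1
    return state

def decide(state):
    # Choose the action our current model says is best.
    state["guess"] = (state["low"] + state["high"]) // 2
    return state["guess"]

def act(guess, hidden):
    # Acting changes the situation and elicits fresh feedback.
    if guess == hidden:
        return "hit"
    return "too high" if guess > hidden else "too low"

hidden = random.randint(1, 100)               # the adversary's position
state = {"low": 1, "high": 100, "guess": None}
feedback, loops = None, 0
while feedback != "hit":
    state = orient(state, observe(feedback))  # no-op on the first pass
    feedback = act(decide(state), hidden)
    loops += 1
print(f"Converged in {loops} loops")
```

Even in a toy like this, the structure is visible: each pass through act generates feedback, and orient is where that feedback reshapes the model of the opponent. Speed up the cycle and you speed up the rate at which the model improves.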
Each loop is an iteration of action and reflection, allowing you to course-correct each time. If you're directly competing against someone, or something, moving through the loop faster and using it to gain validated learning about their behavior allows you to react more quickly, throw your opponents off balance, and evolve a winning strategy while everyone else is still struggling to get their bearings. The goal is to "get inside the other guy's loop" and out-iterate him to whatever success means in that instance. If you're moving through the entire loop twice as fast as the competition, then you're learning and changing your behavior for the better twice as fast. It's not just that you're doing twice as much as the opponent- it's that the action you do take is more effective, more deadly, and more in touch with actual circumstances, because you're assimilating and learning from the environment twice as quickly. You're gaining validated learning about his behavior and the environment more quickly than he is, and each loop gives you the opportunity to reflect and assimilate that information before beginning the loop anew. The ideal outcome in both the OODA loop and Lean Startup is the same: you come to know someone else so deeply and intimately that you know exactly how they'll respond to your actions and decisions. In Lean Startup it's your customer; in the OODA loop it's whoever you're competing against. As stated before, this is a function of speed paired with measured feedback from calculated real-world action. A sense for human psychology and a keen empathic sense also seem crucially important to reach this final, key stage of these cycles. Josh Waitzkin writes about discovering this experience in his excellent book The Art of Learning, where he unpacks a typically abstract instructional conundrum written by the 19th century sage Wu Yu-hsiang. I think these are all powerful theories because they put the emphasis on being agile and in tune with the ambiguous real-world circumstances that any new initiative faces when implemented. Empowering the people you're working with becomes paramount. Technology, size, and resources become less important. According to Coram, Boyd's mantra was "Machines don't fight wars, people do, and they use their minds" (p. 367). What strikes me is that both cycles, at their core, are about increasing the tempo at which you're able to act, learning from your actions, and adjusting to improve, all with the goal of getting inside the loop. The faster you learn, the faster you win. What's key is that the learning is validated. It's not about reading a book or talking to someone or taking a test and having "learned" in the academic sense. It's about learning by doing. Your learning is validated by the fact that you can actually see the effect your actions have in the world. This real-world learning can then be applied to change your behavior for the better. As my friend Eric Morrow puts it, "if your behavior isn't changing after an experiment you run, then you're wasting your time."

Universal Principles

As with the OODA loop, the power of Lean Startup comes from moving through the entire loop as quickly as possible. The faster you can move through the loop and get customer feedback on prototypes, the faster you can course-correct and iterate your way to a product that customers want and will pay for.
With each iteration, you're gaining insights validated by the people who are supposed to be paying you- in other words, the only people whose feedback about your product you should trust! If you halve the time it takes to make your way through the loop, you've effectively halved the time it will take to iterate your way to a successful product. You've doubled the rate at which you're learning and improving.
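The arithmetic behind that claim is worth spelling out. If we assume- purely for illustration- that reaching a product customers will pay for takes a roughly fixed number of validated iterations, then calendar time scales directly with loop length:

```python
# Toy arithmetic for loop tempo. The iteration count is an assumption
# made for illustration; real products don't come with a fixed number.
iterations_needed = 20

for loop_weeks in (2, 1):
    total = iterations_needed * loop_weeks
    print(f"{loop_weeks}-week loop: {total} weeks to reach "
          f"{iterations_needed} rounds of validated learning")

# A 2-week loop takes 40 weeks; a 1-week loop takes 20. Halving the loop
# halves the calendar time to the same amount of validated learning.
```

The simplification hides plenty- some iterations teach more than others- but the direction of the effect is the whole argument: tempo compounds.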
For all their superficial differences, these two decision-making methodologies from very different fields are talking about the same things once you reduce them to their core principles. In particular:

- Move through a cycle of action and feedback, and move through it as fast as possible- the tempo of the loop is the advantage.
- Learn from real-world results, not from plans or speculation: validated learning.
- Use each pass through the loop to reflect and course-correct before beginning anew.
- Use the accumulated learning to get inside the mind of whoever is reacting to you, whether customer or adversary.
These same principles lie at the heart of all Agile methodologies, not just Lean Startup. There's the emphasis on getting feedback from customers and clients through rapid prototypes or "potentially shippable features", and on moving through "sprints" as quickly as possible. Team structure is decentralized, as the team members need the freedom to design the best possible product based on client feedback, not mandates from an out-of-touch manager. What's interesting is that many of the disparate Agile methodologies evolved independently and have only been lumped together post-creation. While programmers invented the Agile philosophy and many of its most popular applied methodologies, like Scrum, Lean actually evolved from Japanese car manufacturing long before. Lean Startup, while inspired by Lean manufacturing, evolved from the particular issues entrepreneurs faced. The funny thing is, this isn't the first time I've recognized these same ideas packaged in the jargon of a different field or profession. I had the same feeling of recognition about a year ago when I read Christopher Alexander's The Timeless Way of Building and A Pattern Language. Alexander describes these same concepts in different words, applied to architecture and urban planning. Alexander's approach includes:

- a shared pattern language, where each pattern is a field-tested solution to a recurring design problem;
- piecemeal, incremental growth of buildings and towns instead of rigid master plans;
- testing each step against the real needs of the people who will live there, and repairing what doesn't work.
Then you have Design Thinking and Human-Centered Design, similar philosophies that evolved from yet other distinct fields. Both encapsulate nearly all the principles I've discussed with the OODA loop, Agile, and Alexander's pattern language. Their unique angle is that they focus above all else on empathizing with the user to fully understand their problems before working with them to develop and test rapid prototypes. So far, then, we have what appears to be a set of universal decision-making principles designed to guide us to success in uncertain and ambiguous circumstances. They vary superficially based on their unique applications in various fields of human endeavor. Entrepreneurs, soldiers, designers, and architects call them by different names, but at heart they're talking about more or less the same thing: a way of engaging with the world on ambiguous terms and reliably reaching good outcomes. Who's to say where these same ideas will crop up again? And in what untapped human pursuits could these principles be applied to create new value? After all, one of the easiest ways to innovate and create value is to take innovations and ideas from one field and cross-pollinate them into a new field. Lean Startup is just old concepts (the scientific method and hypothesis testing, plus the Lean manufacturing idea of maximizing value and minimizing waste) applied to a new field (the traditionally ambiguous and unscientific field of business). These principles don't dictate what the ideal outcome is; rather, they describe the process for reaching it: make a bunch of small tests, and use the data to gradually improve each time. Make each iteration of the process as short as possible, to learn as quickly as possible. Make each test as close to the real thing as possible. Accumulate a database of validated learnings that allows you to empathize with and understand whoever will be reacting to your test: the client, the end user, the enemy. Empathize with them and understand their problems and frustrations, the better to design a way to overcome those problems (or, in combat, to use them against your adversary). I mentioned that when I was a teenager, I obsessed over the idea of finding the ultimate mental framework that would allow me to learn anything, adapt to anything, and overcome any obstacle to achieve anything. I never did get anywhere close- ultimately I was too young, too scattered, too unsure of where to even start looking for such a framework. But now that I think about the suite of principles circumscribed by Agile, the OODA loop, Design Thinking, and pattern languages, I think I may have found something pretty damn close to it. I'm excited to think about how these principles can be applied to accelerate something new.