We began working in January of 2014 on Levaté, an accessory product for wheelchair users to help improve their independence. It'll likely be three full years before the product is ready to be manufactured, about a year and a half behind our original, optimistic schedule. We've slowly learned that developing a new product and launching a business around it takes longer than expected. Double any initial estimate of time and money.
Being in a pensive mood as I mulled over what we'd learned in the past thirty months, I began thinking about the broader lessons for my life. This is the list I came up with.
The Endeavor Based University is decentralized education at its finest. The technology and infrastructure are already there: anyone can learn practically anything from anywhere they have an internet connection. At a fraction of the cost of a university degree, learners could even take advantage of non-virtual resources like libraries, fab labs, hacker spaces, apprenticeships, and mentorship.
Universities and other educational institutions have the opportunity to recast themselves as network nodes in this decentralized network of learning, that is, natural accumulations of peers, expertise, resources, and facilities that can vastly accelerate an individual student's learning as well as advance a certain agenda: civic and democratic engagement, cross-cultural collaboration, humanistic values, etc.
Think about the portrait of a typical college student's experience that Goodman drew earlier. Imagine what a university education could be, instead.
Imagine if the university were instead project based or endeavor based, and interdisciplinary instead of divided into disciplines. Each student would still have a major, but from the very beginning they would be working on real world projects in small teams instead of spending their first two years sitting in lectures that have little to do with what they would actually be doing after graduation. That way, they could decide from the start whether that major was what they really wanted to be doing, instead of having to wait until their third or fourth year to find out.
These teams, rather than working on a contrived academic project, would be working on a real project with real value for someone. That could mean working hand in hand with a professor on their research, partnering with a real startup, or taking on a project for a company. Or it could be a completely student driven initiative: a startup, a community improvement or activism project, or even a local political campaign.
The team would be interdisciplinary, as a real world team would be. For example, imagine if the project were to try and launch a new product. Engineers, industrial designers, graphic designers, etc. would be needed, of course. But so would business students, entrepreneurship majors, an accounting major. A psychology or sociology student could be recruited to handle customer discovery and serve as the customer advocate on the team.
Or imagine a team that decided to run a local political campaign. It could include political science majors, sociology majors, a journalism student, even a systems or industrial engineer.
Or, if students wanted to work with a professor on a research project, regardless of discipline, either on a team or one-on-one, that would be an option as well.
Lecture classes as we know them would be gone. Instead of learning abstract knowledge, memorizing it for the test, then promptly forgetting it after the course was over, there would be just in time learning, roughly analogous to the just in time manufacturing of lean production. In just in time manufacturing, rather than expensively storing hundreds of extra pieces in a warehouse until they're needed, costs are cut by using superior organization to deliver parts to the factory only when they're needed.
In just in time learning, students would learn only what they needed to know to complete what they were working on at that particular moment in their project. If they needed to learn differential equations to design a particular engine piece, their learning facilitator would direct them to the textbook or online tutorial series they needed, and to a professor on campus who could help them if they got stuck. Instead of absorbing theory with no real application, context, or meaning for them, they would learn the material much more effectively because it meant something to them. It would be something they learned, not something a professor taught them. Those differential equations would live on in their memory and experience as the engine piece they designed, actually being used in a car somewhere.
And the students would be creating real value in the world, something most of us have never had the opportunity to do- never been trusted to do- until after we graduate. This would create tighter bonds between the university, its students, and the community. The university would come to be seen as a community resource where the nation's youth could be employed to solve some of the world's, the nation's, or the community's most pressing problems, keeping it relevant in an age when anyone can get an education of comparable quality (though not the credentials) online at a fraction of the cost.
Rather than learning only a particular set of technical skills, which are likely to be out of date by graduation or soon thereafter, the students would learn those in addition to the real gems: how to collaborate, work in teams, set and achieve their own objectives, be self-starters, solve ambiguous real world problems, create real value for others, build consensus, think critically and creatively, and take action to achieve real, measurable change. In other words, the higher-level skills needed in a creative, networked society where outsourcing and automation are rapidly making such skills necessary to get a job, or better yet, to create a job where none existed before.
Of course, this idea raises many questions, which I'll explore in future articles. For example: How does a liberal arts education fit into this idea? How do you measure students' learning, or the efficacy of the university, in such a system? So stay tuned.
Dillon Dakota Carroll
Note: This is an article in the Agile University series I describe in this blog post. I'll add a table of contents as I write more articles; in the meantime, that link provides some context for this post.
I just finished the book Now You See It by Cathy Davidson, an interesting look at how the primary institutions in our lives- school and work- can be refreshed to take advantage of new technology and new understandings of how the human brain works. While I'm focusing primarily on the education side of the picture, there were also some lessons from Davidson's view of the workplace that can be applied to schools and universities, which I'll discuss later.
Davidson makes the argument that, despite our widespread critique of multitasking, the mind is made for it and even craves multitasking. As she put it, "The mind wanders off task because the mind's task is to wander." When we add modern technology to the mix- the internet, computers, and smartphones- collaborative and creative multitasking is more possible than ever before.
The complication as Davidson sees it is that our institutions- school and work- are designed for the pre-internet 20th century and don't address the question of how we can use technology to be better, create more value, and learn more effectively. They fit the 20th century division of labor, not a networked, collaborative 21st century. For example, society claims that kids these days are dumbed down by technology, but Davidson asks if perhaps the problem isn't with the kids, but with the system that is supposed to serve them. "For all the punditry about the 'dumbest generation' and so forth, I believe that many kids today are doing a better job preparing themselves for their futures than we have done providing them with the institutions to help them. We're more likely to label them with a disability when they can't be categorized by our present system, but how we think about disability is actually a window onto how attention blindness keeps us tethered to a system that isn't working." (10).
Davidson presents several solutions to bring these institutions into the 21st century. For example, she talks at length about gamifying work and school to keep people's attention. She also advocates embracing flexible, virtual work environments that prize an individual's unique talents and work style. She recommends collaborative, endeavor-based work and learning, as well as trashing traditional grading and curriculum-based teaching and letting students take the lead in their own education and even grading, all while using technology to accelerate progress.
It's no revelation that we're stuck with an outdated education system, and Davidson makes the excellent point that you could put a schoolteacher from 1900 in a modern classroom and they would recognize it instantly. That classroom is a product of the 19th and 20th centuries, when society- based on the prevailing work and management theories of the day- thought the most efficient way to use workers to create value was an assembly line approach, with individual tasks as specialized and specific as possible. Think of the stereotype of an early Ford car factory. A motor rolls down the assembly line, and as it passes, a worker screws in a piece. The engine continues on and is replaced by the next, and all day long the only thing that worker does is screw in that one piece on each new engine. The man is no better, no more capable than the machine. If the ideal was this kind of mindless, low level work, then schools were seen as an integral piece in training a work force capable of doing these repetitive tasks. And this assembly line approach didn't just apply to industry, but also to the office. Think about how compartmentalized a 20th century company was- HR, Engineering, Sales, Marketing- departments that almost never talked with one another. Each employee had a specific task to perform, like a piece in a motor. If one piece didn't work, the system failed.
Employers needed single-minded workers who would do their one task exceptionally well without asking questions, thinking for themselves, or getting distracted. Schools had to produce those kinds of workers, and so we evolved a system that reflected the specialized, hierarchical separation of labor found in the workplace. Knowledge was divided up into arbitrary disciplines, and ranked in terms of importance. This hierarchy of disciplines, as Davidson notes, was sciences on the top, humanities on the bottom, with physical education, shop, and arts being slowly eliminated over the years. Even the kids began to be ranked (letter grades, as she notes, didn't exist as we know them until 1897) so that they could be sorted into those most apt for employment. After all, in an assembly line model, you're only as fast as your slowest piece.
Davidson writes on page 279, "School has been organized to prepare us for this world of work, dividing one age group from another, one subject from another, with grades dictating who is or is not academically gifted, and with hierarchies of what knowledge is or is not important (science on top, arts on the bottom, physical education and shop gone). Intellectual work, in general, is divided up too, with facts separated from interpretation, logic from imagination, rationality from creativity, and knowledge from entertainment. In the end, there are no clear boundaries separating any of these things from the other, but we've arranged our institutions to make it all seem as discrete, fixed, and hierarchical as possible."
Of course, we now know there are better ways to organize people to productively create value for others, whether in a factory or in an office. Lean Manufacturing, Decentralized Management, and the Coventry Gang System have all shown this. But schools are still afflicted with that turn-of-the-20th-century mentality, the assembly line model of learning, despite the fact that the model most apt for the 21st century is a network. What sense does grading and sorting kids, or dividing knowledge into arbitrary categories, make in a network model of society?
Grading and sorting kids is all done with the idea of getting them into a good college and, eventually, a good job. I agree with Paul Goodman in his fantastic Compulsory Miseducation: why did we ever start to think it was the job of schools and universities to help employers find good employees? Shouldn't employers worry about the best way to sort and rank potential employees, and universities about the best way to sort and rank applicants? Davidson's observation of the difference between an assembly-line model of production and a network model only reinforces this point. A weak link in an assembly line slows the whole system down, granted. But in a network, each person has the opportunity to find their ideal position around a central node or cluster and contribute to it in a nonlinear way that defies the assembly line way of thinking.
The best example I can think of comes from Davidson's book. Take people with autism or Asperger's syndrome. Without special aid, they typically fail miserably in a typical school or work environment- that is, in the assembly line model where each "product" (student) coming out of the factory must be as close to identical as possible (as Davidson says, we've confused "high standards" with "standardization"). But that model ignores the special talents they have as a result of their disability. She mentions a code testing company that employs only people with autism or Asperger's. Employees there rigorously test clients' new code for errors, bugs, and typos. People without these conditions are terrible at this job, lacking the concentration and attention span to catch all the errors in the code. But those with autism or Asperger's excel at this kind of intense, detail-oriented work. The workers there, labeled by the assembly line model as defective products of the education system, have found the niche where they can contribute the most value to society. That is the network model at its best.
Schools are built to reward monotasking, but a central point Davidson makes is that our brain, despite what we may think, is not built to monotask. Multitasking is its natural mode of operating. Our minds abhor the boredom and single-mindedness that come with monotasking; instead they defy disciplinary boundaries and crave novelty, stimuli, and collaboration. The best thing we can do is stop fighting our nature and take advantage of the possibilities our technology offers to build systems that embrace the way our minds work.
For example, Davidson points to studies of distractions in the workplace in which, nearly half the time, the distraction was internal. In other words, the subjects were distracting themselves. Later in the book, she notes that on average 80% of our neural energy is taken up just talking to ourselves (280). These two statistics support Davidson's conclusion that our minds are naturally hyperactive. Like a little kid on a sugar high, they constantly need something to keep them occupied, and they quickly get bored when confined to the same task repeatedly or for an extended amount of time.
While often easy to vilify, the natural overactivity of the human brain has an advantage. It is constantly cross-referencing other parts of our experience and memory, looking for new connections that could be of value to us or those around us. In other words, the same distractibility that frustrates us is also a source of creativity. And it apparently doesn't take much mental energy to switch gears mid-task to something new- only 5% of our at-rest energy (280). Even when we're seemingly zoned out and day-dreaming, the mind is incredibly active, making connections with what's around us and with our stored bank of experience. The brain is a natural multi-tasker.
What's clear is that our minds aren't the well ordered, logical, linearly-functioning machines 20th century thinkers took them for, but are themselves networks of neurons constantly communicating with one another. We've been trained to think in terms of compartmentalized disciplines and functions, but our minds are naturally interdisciplinary, constantly seeking to connect new and old experiences in novel ways. Our distractibility is a side effect of this. The key may lie in consciously controlling what our distractions are, to favor productive connections over unproductive ones. As Davidson writes, "What confuses the brain delights the brain. What confounds the brain enlivens the brain. What mixes up categories energizes the brain. Or, to sum it all up, as we have seen, what surprises the brain is what allows for learning." (286).
Unfortunately, our education system is designed to punish distractions and enforce an unnatural, stifling single-mindedness of the kind required of an assembly line worker, not the creative knowledge workers the world needs from our schools in the 21st century.
Davidson provides some ideas and examples of how school and university environments could be reformed for the 21st century. This is a point where the book falls a little short, unfortunately: Davidson misses the opportunity to comprehensively describe her vision. There are great ideas and anecdotes, but they lack cohesion and an easily understandable vision. For example, she describes some great examples of what she might have called "endeavor based learning," which could easily have formed a central operating principle for her classroom makeover (as she calls it). But she never makes that principle explicit in her descriptions. This idea of endeavor based learning is something I will discuss later in the essay.
Davidson does discuss the role technology can play in the 21st century classroom to enhance the interdisciplinary, collaborative, student-driven learning opportunities she sees as key to the future of learning (as do I, for that matter). I would have loved to see a few more examples here, because the central example she provides is a fantastic one. She oversaw a program at Duke University to equip the majority of the students with free iPods. In a partnership with Apple, the plan was to turn the entire campus into a learning lab exploring how a device like the iPod could enhance students' educational experience on campus. Note that this was back in 2003, when the idea of an app itself was new, let alone that of a learning app.
The implementation was simple. All incoming freshmen got iPods. Any upperclassman could get one by proposing a use for it in one of their classes; in that case, everyone in that class would get an iPod. Professors could also propose uses, in which case all their students would get iPods as well.
She uses this example because it has a happy ending: the experiment was a huge success. Dozens of new apps were created and hundreds of new uses found for the device. More important, at least in my eyes, is the effect the process must have had on the students. In a limited but important way, it made them co-creators of their own learning experience. If the computer and the internet have democratized information, they also have the potential to democratize the way we learn that information. We've certainly moved further in that direction since the publication of Now You See It in 2010. But the democratization has mostly occurred outside the classroom. Instead of percolating into the cracks of these old-as-rock institutions, classrooms have become more expensive (witness tuition and fees at universities, and per-student spending in public schools), and technology use in the classroom is typically superficial: smartboards instead of whiteboards, or the online homework and quiz systems that language students universally despise.
I do think of how some universities are experimenting with webcasting the overcrowded first and second year classes- the ones taught in auditoriums to hundreds of students at once. But that raises the question: why even have the class in the first place? Why not just make a recording of the lecture and put it online? It would save everyone's time, effort, and money. If students had a question (and few do), the instructor's email address could be provided so that students could ask by email or even set up an in-person meeting if necessary.
This is going a bit off the topic of Davidson's book, but modern technology has made the lecture redundant and wasteful. Or perhaps it's always been that way. Dr. David Ray of the University of Oklahoma made the point that the lecture has been obsolete since the invention of the printing press. The word lecture comes from the Latin verb legere, meaning to read, and the Medieval Latin lectura, a reading. According to Dr. Ray, the lecture as we know it originated in medieval monasteries, when books were precious because they had to be copied meticulously by hand. To create these copies, the head monk would read aloud from the original while the rest of the monks copied his words onto new parchment. We might even suppose that, as the monastery had at best only a couple copies of a given work at one time, one monk reading aloud (as all reading was done then, interestingly enough) would allow many other brother monks to partake of the "lecture," or reading, of the book. As a system of learning (if it ever was one), it ceased to be meaningful when books could be cheaply and rapidly printed.
What's interesting is that some of the most vocal defenders of the lecture format are students themselves, who typically claim it to be an effective teaching method. I am very, very skeptical of this claim, and it is a hypothesis I'll be investigating in further writings. Small group discussions can certainly spark meaningful insights, but most lectures are a far cry from a discussion group, usually consisting of the teacher writing notes on the board that would be more effectively communicated by simply handing out the book the notes were taken from. Even better, technology allows each learner to find the codified information in formats conducive to their learning style: books, ebooks, audio programs, and even video programs. I knew many friends who, confused by a professor's lectures and explanations, taught themselves entire courses through free video series on YouTube.
And the lecture is so far removed from any practical application. I fail to see how listening to a teacher talk about an equation is more effective than actually doing the equation oneself, or even better, using that equation to do something useful or meaningful- like building an app or a widget, or even a prototype or model of one.
I know this is a big can of worms, and I'm leaving it unopened for now. But I will return to the topic in a later piece, rest assured!
One thing I'm curious to think about is, if a university were invented today, completely blank-slate, what would it look like? How would it be organized and maintained, and what would it prioritize and how? How would our new technology be used? What staples of the modern college would still find their way into a completely new model? These are tricky questions to answer, because to do so, one has to answer what the role of a university is. For example, there are a plethora of online university options available that take advantage of internet technology for so-called distance learning, but these seem to fill the relatively modern role of a university as a factory for credentialing professionals.
Davidson does provide a good example of what a student-directed class could look like from her own course at Duke University. In a class she taught called Your Brain on the Internet, the idea was an interdisciplinary exploration of pretty much exactly what the book is about- the role our rapidly evolving technology plays in the way we live our lives. She provided a list of recommended reading, but otherwise let her students direct the course and what would be discussed. From her description, Davidson was more a facilitator than a teacher.
The class was incredibly successful. Not only did the students later rank this course as one of the most impactful they took while at Duke, they even went so far as to organize extra classes when they felt they needed it. For instance, when a thinker they had been discussing happened to be in town, they organized an extra-curricular class (if extra-curricular has any meaning when the curriculum is set by the students anyway!) to hear straight from the horse's mouth what he thought. Imagine if more classes were organized this way!
Of course, well-meaning administrators could easily ruin a course such as Davidson's by making it part of a required, general education curriculum. My guess is that in that case, the course would fail miserably, as it would only pay lip service to the idea of being student directed. Students would see it as yet another tick box to check off on the long, arduous, overly prescribed path to getting their degree.
As it turns out, students weren't completely satisfied with Davidson's administration of the course. For all its progressiveness, it still boringly followed the typical manner of grading a course: a professor-graded midterm, term paper, and final exam. After this feedback from her original class, she wrote a controversial blog post about a potential alternative: contract grading combined with class-sourced grades.
Contract grading, which I'd never heard of, goes back to the 1960s or so. In that system, a student could agree to do proportionately less work in exchange for a guaranteed B or C- less than what would be required to get an A. So Davidson proposed that students in the next iteration of her course could decide in advance exactly how much work they would do and what grade they were shooting for (she points out that the coursework required in the first version was not insignificant, so even someone shooting for a B or a C would still have their hands full).
When it came time to evaluate each other, the students would look at the amount of work their peers had agreed to per their contract and evaluate if they had fulfilled the terms of that contract. For example, if their contract stated that they had to write 10 blog posts during the course of the semester, did they do so? And were the blog posts of sufficient length and depth to be considered worthy of the name?
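The mechanics are simple enough to sketch. As a toy illustration (the contract tiers, deliverables, and thresholds below are invented for the example, not taken from Davidson's actual course), contract grading replaces curve-based comparison with a check of deliverables against an agreed-upon target:

```python
# Toy model of contract grading: each student commits to a workload tier up
# front, and peer evaluation checks whether that contract was fulfilled.
# The tiers and required deliverables here are hypothetical.

CONTRACTS = {
    "A": {"blog_posts": 10, "projects": 2},
    "B": {"blog_posts": 7, "projects": 1},
    "C": {"blog_posts": 4, "projects": 1},
}

def evaluate(contract_grade, delivered):
    """Return the contracted grade if every term was met, else 'incomplete'."""
    terms = CONTRACTS[contract_grade]
    met = all(delivered.get(item, 0) >= required
              for item, required in terms.items())
    return contract_grade if met else "incomplete"

# A student who contracted for a B and delivered 8 posts and 1 project
# has met every term of the B contract:
print(evaluate("B", {"blog_posts": 8, "projects": 1}))  # -> B
```

Note that the only comparison is between a student and their own contract, never between students; peers judge fulfillment (and quality, which this sketch omits), not rank.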
Having already discussed my views on the modern grading system, I think this is a refreshing approach of the kind I'd like to see more of. I don't see grading, ranking, and labeling as part of the role of a teacher- that's the job of an admissions officer or HR department. I see a student-directed grading initiative as an interesting compromise. It reinforces the self-directed nature of the class (and the inherently self-directed nature of learning) and reduces the workload of the professor so that they can focus on more important things. My guess is that the students are as strict as, if not stricter than, the professor herself in holding themselves to the terms they agreed to. I say that because the students undoubtedly felt a sense of pride and ownership of the course and what they learned in it, a feeling most never get in 16+ years of schooling, most of it mandatory and teacher-directed. We always take better care of what we own, and the students felt ownership of the course. They had a vested interest in maintaining its high standards.
In Compulsory Miseducation, Goodman writes that in the original medieval universities, grading and ranking the students was never considered part of the mission of the institution or its professors. Students of course had to demonstrate competence, that they were worthy to be included within a certain guild or group of professionals or scholars, just as we require lawyers today to take the bar exam and doctors to do a residency. But if they were good enough to be in (and I imagine the standards were fairly high), then they were all the way in. None of this nonsense of A's and C's.
"It is really necessary to remind our academics of the ancient history of Examination. In the medieval university, the whole point of the grueling trial of the candidate was whether or not to accept him as a peer. His disputation and lecture for the Master's was just that, a masterpiece to enter the guild. It was not to make comparative evaluations. It was not to weed out and select for an extra-mural licensor or employer. It was certainly not to pit one young fellow against another in an ugly competition. My philosophic impression is that the medieval thought they knew what a good job of work was and that we are competitive because we do not know. But the more status is achieved by largely irrelevant competitive evaluation, the less will we ever know."
How much more inspiring would our universities be if, instead of viewing themselves as a factory for employers and grad schools, they saw themselves as aiding students in creating a masterpiece, a perfection of their craft, and to prepare them to enter a close-knit fraternity of curious and worldly professionals and scholars?
The last topic of the book I want to touch on is Davidson's description of Endeavor Based Work (EBW for short- she never uses the acronym, but I'll be mentioning it a lot more than she did). The term EBW actually comes from IBM in what seems to be one of the most stunning corporate management transitions in modern history. In short, IBM managed to transform itself from a stodgy, conservative behemoth into a progressive, agile, and virtual workforce and in the process avoided becoming a footnote in history. The part of their story I'm interested in is their idea of EBW, because it has the potential to change the way we think about how we learn.
EBW at its core is simple. Instead of compartmentalizing functions into silos- HR, Engineering, Sales, Customer Support, etc.- like chimneys on a factory, EBW organizes teams by project. A project has on it all the people, from all the disciplines, required to realize it. This is a feature of Agile and other decentralized management philosophies, but I love the name EBW for how descriptive it is. Everyone on the team is responsible for the success of the whole project and contributes to it in their unique way, like members of an orchestra. In fact, the analogy used in the book is that of a film crew. Everyone has a unique role or task, but all are united by the vision of the final film. In the process, crew members may have to step up and do tasks outside their specific role. But in seeing the film take shape, everyone learns and grows infinitely more than if they were in an isolated silo performing just their specific, specialized task. And the final product is better for it. In fact, the film might never get made if the crew performed their work in isolated silos.
This idea of EBW seems to be part of Davidson's classroom makeover, though she never uses that term to describe it. However, I am convinced that it is the single idea described in her book that, if implemented in our education system, would do the most to positively revolutionize the way we learn.
Imagine if, instead of listening to a teacher drone on and on about seemingly unrelated subjects that have no context or meaning in one's life, students were given an endeavor to complete (or better yet, chose an endeavor to complete) that was meaningful and relevant to them. They could work on it alone or in teams. Completing it would require collaboration, engagement, hands-on learning, creativity, problem-solving, and initiative. Because it is a real task and a real problem, it would defy disciplinary classification, and the lessons learned would be real and meaningful.
Davidson provides several examples of this, but the most beautiful one is easily that of her own mother-in-law, Mrs. Davidson. Teaching in a remote, rural school in Mountain View, Canada, Mrs. Davidson challenged her students to find pen pals- specifically, pen pals in another town called Mountain View, anywhere in the world. The most creative solution to this problem would "win" the competition. The students first had to create their own world map and, using the minuscule resources available in the school and town, find other Mountain Views. Once they did that, they still had to figure out how to get in touch with a resident there.
The results were nothing short of breathtaking.
"One kid remembered that Hang Sang, the elderly Chinese man who ran the local general store, the only store in town, had come over to Canada to work on the railroad, as had so many Chinese immigrants. Taciturn, with a thick accent, Mr. Sang was amused and delighted when one child suddenly wanted help writing a letter- in Chinese. The kid had somehow found out about a town called Mountain View in China. That was the child who won the contest, and long after the contest was over, he spent time with Mr. Sang, talking to him in the back of his store.
But of course they all won. The globe became smaller through the connections they made, and their town became larger. They learned geography and anthropology and foreign languages too. The project lasted not just one full year but many years, and some kids visited those other Mountain Views when they grew up. To this day, I don't drive through a town named Mountain View (there are a lot of them in the world, actually) without wondering if one of Mrs. Davidson's kids sent a letter there and, through the connection made, was inspired to go on, later, to become a professor or a doctor or a veterinarian (85)."
That was before the internet existed. What possibilities exist for these kinds of real-world, self-directed endeavors in our schools and universities now that, thanks to the internet, we have any kind of expertise, knowledge, experience, and personal connection at our fingertips?
While Cathy Davidson doesn't propose going as far as I think we need to go in reforming our education system, I think she offers several compelling pieces to the puzzle: endeavor based work in schools using technology as a force multiplier, as well as student-directed learning and peer-sourced grading. Davidson makes the point that this system would not only better fit the networked, collaborative society we live in today, but also work with our brains' neurology, not against it. It is a smart, humanizing alternative that would reinforce a completely different set of values than those currently taught in our education system. It would teach creativity instead of regurgitation, collaboration instead of compartmentalization, initiative and independence instead of docility, and meaningful, experiential learning instead of learning for an arbitrary test.
The Colony, Texas
I wrote recently that, having almost finished my first book, I have already started on a second. My plan is to write articles and publish them here, on this blog, as I research. The idea is to stitch these articles into a book at the end of it all.
This second book is quite different from my first, but I'm very excited about it as it is on a topic dear to my heart: university education. In particular, the question I want to ask over the course of the coming months is:
How can the modern university be made more relevant, more humanizing, and more effective at facilitating the learning of students?
Before I explain what I mean by that, I'll explain why writing this is an important project for me to undertake.
I am incredibly lucky to have gone to a great university, studied a high-paying major I didn't hate, earned more scholarships than was probably fair, and taken advantage of the opportunity for life-changing extracurriculars, friendships, and study abroad programs.
In other words, my four years at the University of Oklahoma were excellent.
Underneath all that, however, there's a rankling remorse of sorts, one that I've often spoken of with my friends, many of whom have had a similar experience. We felt that the classes we had, with a few very notable exceptions, were the worst part of our college experience, and the place where we learned the least, despite the fact that they (and the accompanying homework, studying, and tests) occupied the largest part of each of our days. I do not think my friends and I were alone in thinking and feeling this, and validating (or invalidating) this hypothesis is one of the aims of my research to come.
Of all the problems facing universities today- increasing competition from non-traditional education sources, rising prices, astronomical student-to-teacher ratios- the fact that universities fail so miserably at what should be their core competency is the most sharply ironic, even poignant, problem of all.
Take Paul Goodman's description of the typical experience of a student in a college classroom, written in his book Compulsory Miseducation. Note that I've edited it down quite a bit for brevity, though it is still quite a long passage. Long, but both engrossing and elucidating, and it summarizes perfectly why I think this is a book that needs to be written. The emphasis in bold is mine.
Here is a young fellow in a college classroom... He is in his junior year. So, omitting kindergarten, he has been in an equivalent classroom for nearly fifteen continuous years, intermitted only by summer vacations or play. Schooling has been the serious part of his life, and it has consisted of listening to some grown-up talking and of doing assigned lessons. The young man has almost never seriously assigned himself a task. He's bright -- he can manipulate formulas and remember sentences, and he has made a well-known college...
It was written in 1964 but could easily have been written in 2016. In fact, it's almost certainly more true now than then.
And yet students continue to swamp university admissions offices in ever-increasing numbers, attesting to the allure of two things: first, what we Americans call the College Experience, my generation's rite of initiation into adulthood. Second, the slip of paper (expensive as it may be) certifying our professional and intellectual capabilities, and without which we would be barred from entering most of the professional world.
We can and should do better in our universities, and it doesn't take much effort to come up with dozens of reasons why. To name a few: the huge cost both for students and the community-at-large; the fact that 6 out of every 10 Americans attend a college or university in their life; the role these institutions have in the lives of their attendees as the path to prosperity and, increasingly, as a rite of passage into adulthood; and the role universities have in setting standards for our public schools.
In short, having the best possible educative system at a university at the lowest cost possible should be, at the least, of keen interest to all. This means, as I stated earlier, discovering how to make universities more relevant, humanizing, and effective at facilitating learning of the right kind, by which I chiefly mean self-directed and experiential. And evolving our universities and our public education system in general to meet these goals is probably, as I will argue later, one of the most important initiatives the American people can undertake in the 21st century.
I promised I would explain why I chose those three categories- relevant, humanizing, and effective at facilitating self-directed learning- as the goals which our higher education system should strive to achieve.
To be honest, they just seemed right. I have a certain idea of what principles and values our schools and universities should engender in their students, and how these institutions can evolve to do exactly that. The above categories work well enough for now, the start of my flight-of-the-mind, and I'm confident that a better way to organize these principles will emerge in due time.
Relevant. Universities are hopelessly out of date, not just since the digital age but since the invention of the printing press. Ironically, our universities have in most regards regressed, even compared to those of the medieval world. The modern university is desperately in need of an overhaul that takes advantage of modern technology and philosophy to cut both costs and obsolete learning practices; improve collaboration, empathy, and initiative; and prepare students to tackle the pressing issues of our generation.
Humanizing. This is perhaps the goal of a university we've most lost touch with, as our modern-day higher education institutions seem more concerned with increasing the donor dollars they receive. Yet this is also perhaps the most important goal of all, as it deals very directly with the kind of people we want to be and the kind of community we want to live in. For example, how can universities help students find themselves and their vocation, becoming free, independent, and self-driven individuals without losing touch with the responsibility we each have for ourselves and one another? Or, as the democratic education pioneer John Holt put it,
"The fundamental educational problem of our time is to find ways to help children grow into adults who have no wish to do harm. We must recognize that traditional education, far from having ever solved this problem, has never tried to solve it."
Effective at Facilitating the Right Kind of Learning. I know few people who find the lecture and the test, the hammer and the forge of traditional education, to be very inspiring or to spark any kind of real learning. On the other hand, evidence is mounting that the ideal learning situation is precisely the opposite of that found in a typical college classroom.
In particular, this ideal learning is self-directed, experiential, and interdisciplinary, and based on solving real-world problems in small teams. This loose formula is incredibly effective at not only facilitating learning of the subject matter at hand but also at sparking and fanning the flames of the self-actualizing values described above: independence, initiative, responsibility, and finding one's vocation.
Ironically, we see once again how universities have seemingly forgotten the wisdom of the past. One need look no further for inspiration than the age-old relationship between the master and his apprentices, which was alive until well-meaning mandatory education and anti-child-labor laws extinguished the tradition in the early twentieth century.
The overall vision of this book is that it be an honest and motivating picture of the untenable problems facing our education system in general and our universities in particular, a brief history of how these problems came to be, and, as described above, a discussion of the imperative changes universities must make not merely to survive, but to better serve their students and their communities. Indeed, the second half of the book-to-be will transition into discussing in depth the changes necessary and how to implement them in order to do students justice.
To create this vision, I took inspiration and synthesized ideas from a variety of fields, sources, and influences as disparate as the Agile and Lean product management philosophies of the past twenty-odd years, the ancient Greek idea and practice of Praxis, anarchic theories of architecture and urban planning from the 70's, democratic education pioneers from the 60's, my time working at the University of Oklahoma Economic Development department in 2013-2014, as well as two deeply respected and absolutely formative professors I had the honor of knowing while at the University of Oklahoma.
I have a general idea of what the book will say and what my next few articles will be, but part of the adventure will be seeing how that vision evolves and changes as I dive deeper into the subject and encounter questions, issues, and research I hadn't yet considered. For now, however, I'm going to explore the role modern universities play in the United States, with the first article on that topic published by the end of the week.
Dillon Dakota Carroll
December 2nd 2015
When I was a teenager, I became obsessed by the idea of being able to tune into my surroundings and learn so rapidly that I could adapt to and overcome any circumstance.
There wasn't any unusual need or mission that spurred this fixation, just the normal teenage angst of finding one's way in the world. I was more socially inept than most teenagers- I can say that now with affection- so I suppose the idea of becoming some sort of super-adaptive ninja appealed to me.
The teenage-me reasoned thusly: if I could be really, really good at learning from my environment and adapting to what I learn, then I could circumvent all awkward social situations, always know how to make the best of fickle fortune, and ultimately acquire the quiet confidence and easy-going nature I so desperately desired.
In other words, my sixteen-year-old self desired some superpowerful, panacea-like mental framework that would not only cure me of my psychological ailments but also transform me into the exact opposite, the antonym, the antithesis, of who I currently was. I would be the ultimate badass because I could learn any skill and overcome any circumstance.
Let no one accuse me of not dreaming big.
Ultimately, my juvenile pursuit of this one-mindset-to-rule-them-all, like so many searches, was forgotten to time as I became distracted with more immediate tasks like graduating high school and university, achieving some semblance of success with girls, and figuring out what I actually wanted to do with my life (unfortunately my university didn't yet have a "super-adaptive ninja" major, but I hear one is in the works).
Don't get me wrong. My intentions were good. But I didn't really know where to begin looking for such ideas, or how to piece together the fragmented research I was doing. I became vaguely aware of the importance of east-Asian meditative philosophies to my research, and the idea of Mindfulness proposed by a Harvard researcher named Ellen Langer. That was about as far as I got.
Fast forward to about two months ago.
Boyd and the OODA Loop
I was reading a book called Boyd: The Fighter Pilot Who Changed the Art of War. The subject of the biographical work is Colonel John Boyd, a near-mythological figure to those that knew him and know of his work. He did incredible things, and these accomplishments were all the more impressive because they were across such a wide variety of fields.
He was perhaps one of the greatest fighter pilots the world has ever seen. Nicknamed "Forty-Second Boyd," he had a standing bet that he could defeat anyone in a dogfight in less than forty seconds. He never lost.
He revolutionized the theory and practice of dogfighting and literally wrote the manual on flying fighters.
He developed Energy-Maneuverability (E-M) theory, which completely changed the way fighters were flown and designed. In fact, he used these theories to help design the F-15 and F-16.
Later in his life, he stopped flying and instead devoted himself to purely intellectual pursuits. He investigated the nature of creativity, for example.
Perhaps most famously, he invented the OODA loop.
OODA is short for Observe, Orient, Decide, Act. The OODA loop is a decision-making framework for taking in new information and reacting to it, so as to make the best of real-world circumstances and achieve success.
On the one hand, it seems too simple. We do these things (observing, orienting, etc.) routinely; it doesn't take a genius to draw all four of them in a big circle, right?
The importance is in how the loop is explained and applied.
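To see just how mechanically simple the cycle is, here's a toy sketch in Python. Everything in it- the guessing-game "world," the feedback signal- is invented for this illustration and has nothing to do with Boyd's own examples; it just shows the bare observe-orient-decide-act skeleton.

```python
class World:
    """Toy environment: a hidden target number we try to find."""

    def __init__(self, target):
        self.target = target

    def feedback(self, guess):
        # Observable signal: 0 means correct, 1 too low, -1 too high.
        if guess == self.target:
            return 0
        return 1 if guess < self.target else -1


def ooda_loop(world, low=0, high=100):
    """Cycle Observe -> Orient -> Decide -> Act until success."""
    iterations = 0
    while True:
        iterations += 1
        # Decide + Act: commit to the midpoint of what we believe possible.
        guess = (low + high) // 2
        # Observe: gather the world's response to our action.
        signal = world.feedback(guess)
        # Orient: fold the new information back into our model.
        if signal == 0:
            return guess, iterations
        elif signal == 1:
            low = guess + 1
        else:
            high = guess - 1


guess, n = ooda_loop(World(target=42))
print(guess, n)  # converges on the target in a handful of loops
```

The circle itself is trivial; as the next sections argue, everything interesting is in how fast you can turn it and what you do with each turn.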
And more importantly for the ghost of the hopeful sixteen-year-old me, I felt the faint-yet-intoxicating spark of recognition, a spark that "lit my fire and fired my soul", to paraphrase Douglas Hofstadter (I am a Strange Loop).
And not just because it seemed like the key piece I'd been missing in my search for personal growth acceleration. In fact, I didn't even draw the connection to my long-lost quest of years past until I began writing this article. It sparked a burning recognition because I realized I already knew and understood the key concepts in the OODA loop.
Agile and Lean Startup
At least, I knew them as they were applied to entrepreneurship and product development.
I'm referring specifically to Agile methodologies and Lean Startup in particular, since that's what I'm most familiar with.
In Lean Startup, the idea is to take a rapid approach to validating one's hypotheses about a new product, service, or feature. Part of the ambiguity of a new business idea is that you don't know what your customers think. Lean Startup is a framework for getting customer feedback on a product iteration as rapidly as possible- say, two weeks instead of two years.
This is important because you could spend two years developing a product and launch it without getting any customer feedback. Your "loop" is two years long in that case.
But what happens if the product fails because it is out of touch with what customers want? You've wasted two years. You've gained valuable feedback on what your customers want, but it came too late in the game to be useful. By accelerating the rate at which you get real customer feedback on something tangible- be it a prototype, experience, or mock-up- you instead gain actionable insights, or what's called validated learning. The learning is actionable because you're getting it fast enough to feed back into your product to improve it.
And by getting these insights, you're effectively getting inside the mind of your customer. You're understanding who they are and ultimately, how to build a product that solves a real problem of theirs and that they will want to buy. The nuance here is that it requires a deep empathic understanding of the customer and their problems, frustrations, and aspirations. You've empathized and understood them so deeply that you've internalized who they are and what they care about.
Getting Inside the Loop
Now let's look at the OODA loop. Here's what Robert Coram had to say about it in Boyd:
Before Boyd came along, others had proposed primitive versions of an OODA Loop. The key thing to understand about Boyd's version is not the mechanical cycle itself, but the need to execute the cycle in such fashion as to get inside the mind and the decision cycle of the adversary. This means the adversary is dealing with outdated or irrelevant information and thus becomes confused and disoriented and can't function (p. 346)...
The faster you can perform all the steps, the faster you can assimilate new information and put it to use to achieve whatever your objective at that moment in time is. But as Coram noted, the key is to use each loop to understand the opponent. Your actions can't occur in a vacuum- they have to elicit a response or feedback of some kind from the opponent that serves to peel back a further layer of their mind and see how they think and how they work.
Each loop is an iteration of action and reflection, allowing you to course-correct each time. If you're directly competing against someone, or something, moving through the loop faster and using it to gain validated learnings about their behavior allows you to react more quickly, throw your opponents off balance, and evolve a winning strategy while everyone else is still struggling to get their bearings.
The goal is to "get inside the other guy's loop" and out-iterate him to whatever success in that instance means to you. If you're moving through the entire loop twice as fast compared to the competition, then you're learning and changing your behavior for the better twice as fast. It's not just that you're doing twice as much as the opponent- it's that the action that you do take is more effective, more deadly, and more in touch with the actual circumstances because you're assimilating and learning from the environment twice as quickly. You're gaining validated learning about his behavior and the environment more quickly than him, and each loop gives you the opportunity to reflect and assimilate that information before beginning the loop anew.
The ideal outcome in both the OODA loop and Lean Startup is the same: that you come to know someone else so deeply and intimately that you know exactly how they'll respond to your actions and decisions. In Lean Startup it's your customer, in the OODA loop it is whoever you're competing against. As stated before, this is a function of speed paired with the measured feedback from calculated real-world action. A sense for human psychology and a keen empathic sense also seem crucially important to reach this final, key stage of these cycles.
Josh Waitzkin wrote an insightful passage on discovering this experience in his excellent book The Art of Learning:
...The 19th century sage Wu Yu-hsiang [wrote] a typically abstract Chinese instructional conundrum:
I think these are all powerful theories because they put the emphasis on being agile and in-tune with the ambiguous real world circumstances that any new initiative faces when implemented. Empowering the people you're working with becomes paramount. Technology, size, and resources become less important. According to Coram, Boyd's mantra was "Machines don't fight wars, people do, and they use their minds" (p. 367).
What strikes me is that both cycles, at their core, are about increasing the tempo at which you're able to act, learning from your actions, and adjusting to improve, all with the goal of getting inside the loop. The faster you learn, the faster you win.
What's key is that the learning is validated. It's not about reading a book or talking to someone or taking a test and having "learned" in the academic sense. It's about learning by doing. Your learnings are validated by the fact that you can actually see the effect your actions have in the world. This real-world learning can then be applied to change your behavior for the better. As my friend Eric Morrow puts it, "if your behavior isn't changing after an experiment you run, then you're wasting your time."
As with the OODA loop, the power of Lean Startup comes from moving through the entire loop as quickly as possible. The faster you can move through the loop and get customer feedback on prototypes, the faster you can course-correct and iterate your way to a product that customers want and will pay for. With each iteration, you're gaining insights validated by the people who are supposed to be paying you- in other words, the only people whose feedback about your product you should trust! If you halve the time it takes to make your way through the loop, you've effectively halved the time it will take to iterate your way to a successful product. You've doubled the rate at which you're learning and improving.
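The "halve the loop, halve the calendar time" arithmetic can be sketched as a back-of-the-envelope model. The numbers here- each loop closing 30% of the remaining gap between product and customer, a 5% gap counting as product-market fit- are arbitrary assumptions for illustration, not anything from Lean Startup itself.

```python
def weeks_to_fit(loop_length_weeks, gap_closed_per_loop=0.3,
                 fit_threshold=0.05):
    """Calendar weeks until the product-customer gap is small enough.

    Model assumption: each trip through the loop closes a fixed
    fraction of the remaining gap between the product and what
    customers actually want.
    """
    gap = 1.0  # start with a 100% gap: pure guesswork
    loops = 0
    while gap > fit_threshold:
        gap *= (1 - gap_closed_per_loop)
        loops += 1
    return loops * loop_length_weeks


fast = weeks_to_fit(loop_length_weeks=1)  # one-week iterations
slow = weeks_to_fit(loop_length_weeks=2)  # two-week iterations
print(fast, slow)  # the two-week loop takes exactly twice as long
```

Both teams need the same number of loops to reach fit; the only difference is how much calendar time each loop burns, which is why loop speed translates directly into learning speed in this model.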
For all their superficial differences, these two decision-making methodologies from very different fields are talking about the same things once you reduce them to their core principles. In particular:
These same principles lie at the heart of all Agile methodologies, not just Lean Startup. There's the emphasis on getting feedback from customers and clients through rapid prototypes or "potentially-shippable features", and moving through "sprints" as quickly as possible. Team structure is decentralized, as the team members need to have the freedom to design the best possible product based on client feedback, not mandates from an out of touch manager.
What's interesting is that many of the disparate Agile methodologies evolved independently and have only been lumped together post-creation. While programmers invented the Agile philosophy and many of its most popular applied methodologies, like Scrum, Lean actually evolved at Japanese car manufacturers long before. Lean Startup, while inspired by Lean manufacturing, evolved from the particular issues entrepreneurs faced.
The funny thing is, this isn't the first time I've recognized these same ideas packaged in the jargon of a different field or profession. I had the same feeling of recognition about a year ago when I read Christopher Alexander's A Timeless Way of Building and A Pattern Language. Alexander describes these same concepts in different words, and applied to architecture and urban planning.
Alexander's approach includes:
Then you have Design Thinking and Human-Centered Design, similar philosophies to the aforementioned which evolved from yet again distinct fields. Both encapsulate nearly all the principles I've discussed previously with the OODA loop, Agile, and Alexander's Pattern Language. Their unique angle is that they focus above all else on empathizing with the user to fully understand their problems before working with them to develop and test rapid prototypes.
So far we have what appears to be a set of universal decision-making principles designed to guide us to success in uncertain and ambiguous circumstances. They vary superficially based on their unique applications in various fields of human endeavor. Entrepreneurs, soldiers, designers, and architects call them by different names, but at heart they're talking about more or less the same thing: a way of engaging with the world on ambiguous terms and improving the odds of good outcomes.
Who's to say where these same ideas will crop up again? And in what untapped human pursuits could these principles be applied to create new value? After all, one of the easiest ways to innovate and create value is to take innovations and ideas from one field and cross pollinate them into a new field. Lean Startup is just old concepts (the scientific method and hypothesis testing, plus the Lean manufacturing idea of maximizing value and minimizing waste) applied to a new field (the traditionally ambiguous and unscientific field of business).
These principles don't inform what the ideal outcome is; rather, they describe the process for reaching it: Make a bunch of small tests, and use the data to gradually improve each time. Make each iteration of the process as short as possible to learn as quickly as possible. Each test should be as close to the real thing as possible. Accumulate a database of validated learnings that allow you to empathize with and understand whoever will be reacting to your test: the client, the end-user, the enemy. Empathize with them and understand their problems and frustrations, so you can design the best way to overcome those problems (or, in combat, use them against your adversary).
I mentioned that when I was a teenager, I obsessed about the idea of finding the ultimate mental framework that would allow me to learn anything, adapt to anything, overcome any obstacle to achieve anything. I never did get anywhere close- ultimately I was too young, too scattered, too unsure of where to even start looking for such a framework.
But now that I think about the suite of principles circumscribed by Agile, the OODA loop, Design Thinking, and Pattern Languages, I think I may have found something pretty damn close to it. I'm excited to think about how these can be applied and used to accelerate something new.
"I just don't think you all are going to make it. You're not in wheelchairs yourselves, so how can you know what a wheelchair user wants?"
That was the gist of what a visiting oil and gas entrepreneur told my business partner about our social business startup, which, as you can probably guess, aims to create innovative products for wheelchair users.
The problem with his statement is that if true, it completely invalidates the field of social entrepreneurship.
I don't think he was right. As I'll explain, social businesses work because they have the same engine under the hood as a traditional business: a way of bringing in revenue in a repeatable and scalable fashion from customers, or a business model. I'll also explain how, with the right mindset and methodology, social businesses can thrive even without being their own customers.
You can scratch other people's itches too
But first, let's talk a bit about why we hear this oft-repeated advice so much. The writers of Rework, Jason Fried and David Heinemeier Hansson, call it "scratching your own itch". Generally, this is great advice. If you're your own customer, figuring out how to make your customer happy starts with making yourself happy! As Fried and Hansson point out, for example, Nike was started by a track coach who wanted better running shoes for his team.
Heavenly Bread: A Social Business Case Study
Social ventures can also be a great example of this. Heavenly Bread, a social impact bakery in Tulsa, Oklahoma, had a twofold social mission. The first was to provide fresh, nutritious, preservative-free whole-grain bread to the community. The founder loved her bread and wanted to share it with the community- she is a perfect example of being her own customer in this sense.
But let's look at the other side to her social mission. Appalled by Oklahoma's exceedingly high rate of female incarceration (the highest in the nation, in fact) and the difficulty of reintegrating into society after incarceration, she decided to have an open employment policy. And in fact, her first employee was a formerly-incarcerated woman. Let's call her Jane.
While technically the founder's employee, from the perspective of a social business, Jane was receiving a service from Heavenly Bread. Heavenly Bread gave her a job when nearly no one else would have. The company was attempting to provide a certain segment of the population (formerly incarcerated women) with a socially impactful service (a steady job they likely couldn't get anywhere else, and without which they would likely return to jail). Having never been jailed, the founder was not a member of this population segment.
Despite not having lived the same experiences as her formerly-incarcerated employee, the founder was successful in providing the service to her. After perhaps half a year of working with Heavenly Bread, Jane successfully transitioned out of her job there and into a new company. She likely wouldn't have been able to get that job without her experience at Heavenly Bread, which not only helped her psychologically transition back into society but also gave her the post-prison work experience a potential employer would want to see before hiring her.
So here we have one company that has elements of both kinds of organizations. They are their own customers from the bakery side, but like many social businesses, they also tried to help improve the condition of a distinct segment of the population of which they were not a part.
Build products your customers want to buy
The goal of any new business, social businesses included, should be to find paying customers as soon as possible. This is the feedback mechanism that makes businesses so agile. If you think about it, when a customer gives you money it means that they think the product is valuable enough that they would spend their hard-earned money on it. Suddenly, you don't need to be your own customer- you have successfully scratched the itch of someone else. There are follow-up questions to ask (how many customers are there out there? Do you make enough of a margin off sales to cover costs and expand?) but you've done what is without a doubt the hardest part of starting a new organization: creating a solution people are willing to pay for.
Social-impact non-profits may be able to stay afloat on donations, but the point of a social business is that the organization can, at the least, become financially self-sustaining through customer-generated income. If you don't have an effective solution to your customers' problems, you're going to find out very quickly when no one buys the product. And those who do buy the product will gladly tell you what they really think about it, because they've got skin in the game: their money. When I work with startups, I always tell them that feedback from paying customers is hands-down the best. Otherwise, people will lie through their teeth about your product to avoid hurting your feelings.
Getting to the point of paying customers
Getting to a sale (or even a paid beta) is a huge milestone for a new business. But how do you get to that point? And as social innovators scratching someone else's itch, how can we collect quality feedback when we're too early stage for revenue and thus lack that as a feedback mechanism?
The answer: rapid, experiential prototypes tested with a scientific method.
This will be familiar to Agile or Lean Startup practitioners. This section is designed to explain why these principles are important and show some examples of them in action.
Rapid. No one builds the perfect product right from the beginning. The product always changes as it goes from idea to reality, so we want the cost of change to be as low as possible. There needs to be a focus on quick, rough prototypes so that as we learn new information, it's as easy as possible to build the next version.
The faster we build and test simple versions of the product, the faster we learn. The faster we learn, the faster we can adapt to what works, and the sooner we can get to a working solution. The quicker and dirtier the prototypes, the faster you move and the less time and money you waste. Want to know what some of the first prototypes were for the wheelchair lift my startup is developing? Stacked pallets in one case and stacked reams of copy paper in another. You can't get quicker and dirtier than that.
Scientific. We want to build our social ventures on a solid platform of objective data. This means being able to isolate variables and collect data so that as we build our prototypes, we know what is and isn't working and what needs to be changed.
Business plans, a necessary evil, are usually just a collection of guesses about how we think our business, industry, and market will perform. Once you start building rapid prototypes, you can take these hypotheses and test whether they are actually true. What we're really trying to get at with rapid prototypes is this: before we even launch our product or service, how can we prove objectively that we're building the right thing?
In the case of the pallet and paper prototypes of the wheelchair lift, we needed to know very early on whether wheelchair users wanted to be lifted from the seat while the chair remained on the ground, or whether they preferred to have their whole wheelchair lifted with them. So we stacked copy paper under their seats to test the former, and lifted them onto pallets to test the latter. In a day's work, we learned an important insight about how the product should function, one that might have taken months had we tried to mimic the actual lifting technology. In this case, all the wheelchair users we worked with preferred to have their whole chair lifted; they felt more stable that way.
This is the core of the Lean Startup cycle I teach in classes and workshops: a hypothesis you need to validate, the experiment you'll validate it with (a quick prototype and metrics to judge it by), and actionable insights. I say actionable because if the results don't change your behavior in some way, it's a bad experiment! Also, many people start with the prototype they want to build and come up with a generic, unspecific, or unusable hypothesis as an afterthought. This is backwards, and results in experiments that don't actually put the hypothesis to the test! It's imperative to decide what you want to validate first, and why that's important to know (in other words, how it changes your behavior). Having decided that, you can design your experiment and prototype.
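As a minimal sketch of that cycle, here's how I might write down an experiment card in code. The class and field names are my own invention for illustration, not a standard Lean Startup framework; the example uses the wheelchair-lift test described above.

```python
# Illustrative sketch: a hypothesis-first experiment card.
# All names are my own, not a standard API.

from dataclasses import dataclass

@dataclass
class Experiment:
    hypothesis: str     # what we want to validate, decided FIRST
    prototype: str      # the quick-and-dirty test vehicle
    metric: str         # how we judge the result
    if_validated: str   # how our behavior changes on a "yes"
    if_refuted: str     # how our behavior changes on a "no"

    def is_actionable(self) -> bool:
        # A good experiment changes behavior either way;
        # if one branch is empty, it's a bad experiment.
        return bool(self.if_validated) and bool(self.if_refuted)

lift_test = Experiment(
    hypothesis="Wheelchair users prefer the whole chair lifted",
    prototype="Pallets (whole chair) vs. stacked copy paper (seat only)",
    metric="Stated preference of every user we test with",
    if_validated="Design the lift to raise the entire wheelchair",
    if_refuted="Design a seat-only lifting mechanism",
)
assert lift_test.is_actionable()
```

The point of the `is_actionable` check is exactly the discipline described above: both follow-up actions are committed to before the prototype is built, so the prototype serves the hypothesis rather than the other way around.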
Experiential. I take experiential to mean two things: Experiential prototyping as a process that incubates empathy, and experiential prototypes as a type of rapid prototype.
Let's talk about the process first. By making the prototyping and product development process as experiential as possible, you're empathizing with your customers. In a social business in which you're not your own customer, the more you can empathize with your customer and understand their fears, motivations and desires, the better a solution you can design for them. This is really the core of the Human-Centered Design movement: you're tearing down the veil of "otherness" that separates the product designers from the product users.
In one great example of how to do this, Nordstrom tasked their innovation team with developing an app that added value to the sunglass-shopping experience. The entire team set up shop inside a Nordstrom store, in the sunglass department. The team interviewed shoppers until they had a decent idea of what the app needed to be. Then the coders, working at desks they set up around the sunglass displays, coded it in front of their potential users. They could build a working version and immediately turn it over to be tested. While they were working on the first running version, the rest of the team used Sharpies and copy paper to build mockups they could test with customers. That way, the programmers knew exactly what to build in the first version. I highly recommend watching the video, as it's a great illustration of all three of these principles at work.
Other ways to empathize more with customers:
These are just a few ideas, but hopefully they spark some of your own!
You can also make an experiential prototype as a type of rapid prototype. Let me give you an example.
A startup I worked with called Park Ave wanted to build a marketplace for buyers and sellers of parking spaces during special events like sports games and concerts. They had already started building the app, but before they sank six more months into developing it, they wanted to know if there was actually a market for it. So they went out that weekend and spent two days at a baseball game series, talking to potential customers and trying to sell them a parking spot for the next game in the series.
Notice that while they're not testing the technology (the app that would take months to develop), they are still testing the value proposition of the app (that parking at special events is enough of a hassle that users will pay to reserve a spot in advance). They're testing the experience of using the product. Almost no one cares how whiz-bang the technology is; they care about whether it makes them cooler in some way. So the most effective rapid prototypes are also experiential, in that they get at the root of how users would feel using the product if it actually existed.
After all, people use products because of how those products change their experience of the world. Entrepreneurs, even social entrepreneurs, tend to equate the value proposition of their product with the product itself, or with its features. That's only half the story. The value proposition is what the product does and why that's important to the user. Does it save them time or money? Does it make them feel cooler or more heroic? It's the experience of using the product that counts: not just usability, but also how the user's experience of the world changes through using the product.
In Park Ave's case, they learned through three or four cycles of week-long tests that there was not, in fact, a market for this product. They then pivoted to a new product. Their new idea resulted in a $10,000 paid beta with a major state university, which they are currently in the process of conducting. The point here is that they avoided six months of useless programming and product development and learned, through a few weeks' worth of rapid, experiential prototyping, that their product wouldn't succeed. Instead, they're spending their time building something they know their customer wants because they've already been paid for it.
Generally, it's a good thing to be your own customer. Assuming you're a decently self-aware person, much of the customer discovery is already done.
However, this advice shouldn't be taken to the extreme. Social businesses can, and do, succeed and thrive even in situations where the founders aren't their own ideal customer and are scratching someone else's itch.
Because social entrepreneurs use business as a vehicle to deliver social impact, finding paying customers is the ultimate validation that you're scratching the right place and the itch is going away. The key becomes getting to that point, perhaps even in the form of a paid beta or preorders, as soon as possible.
Agile and Lean Startup methodologies are crucial to getting your social business to that point and beyond. Namely, social entrepreneurs should aim to validate their critical assumptions about their business and customers by using rapid, experiential prototyping to test what works and what doesn't work quickly and cheaply, all while building an empathic understanding of the customer's fundamental experience of life and resulting trials and vicissitudes.
Dillon Dakota Carroll
The New Year is a pretty awesome time of year. It's a natural beginning for new endeavors and goals, perhaps the best beginning of them all. At least, the most celebrated. Yet the failure of New Year's resolutions is well documented. Perhaps part of the problem is that the New Year, by nature, only happens once a year. You only get one chance a year to fail; then you have to wait until the next New Year. An oversimplification, but bear with me. If the New Year is such a wonderful and inspiring beginning, then how can each week, or even each day, have the effect of a New Year? How can we renew ourselves regularly and feel the spark of inspiration we feel on December 31st, thus giving ourselves that many more chances to succeed?
Whether we fail through lack of motivation, lack of strategy, or a failure to form new habits, change is hard. Most people aren't prepared to actually change. Change means doing new things, and more importantly, not doing what we're already doing. I needn't point fingers further than myself to find the perfect example of this. In the past, my method for implementing change has been haphazard: throw lots of things and ideas up in the air and see what survives the fall. I suffer from the lack of a structure I can stand on to reach my goals.
I want to find a method that works for me. In particular, I'm thinking about how I can combine a consistent routine, a satisfying lifestyle, and Agile sprints to create a satisfying answer to the two issues above.
I've tried for a while to have a structured routine each day that allows me to accomplish the daily tasks I set before myself. It's been a source of frustration to fail so frequently at applying a daily structure. Ultimately, I see using Agile-style experiments as the way to make small adjustments and incremental progress towards this objective. I'll talk about that later. First, I want to talk about why I think having a rejuvenating morning routine can help create a sense of beginning.
Brett McKay, on his blog The Art of Manliness, wrote in an article on the importance of family traditions: "Traditions and rituals often tell a story about a family... [and] add to the rhythm and seasonality of life. Our world and universe are composed of cycles big and small – sunrise and sunset, death and rebirth, winter, spring, summer, and fall. Even the generations move in cycles. A circular conception of time and a desire to follow the natural rhythm of the days and the seasons is embedded deep within us, but has been flattened out in a modern age that creates its own timetable and concentrates only on the present."
He's not talking about routines per se- in fact, he specifically defines why a tradition is different from a routine (writes McKay: "they differ from routines and habits in that they are done with a specific purpose in mind and require thought and intentionality"), but I'd argue that having a fulfilling, regular routine that involves more than just showering and brushing one's teeth still provides many of the benefits he describes traditions conferring. What tells a story about a person more than the activities they make sure to do, every day? What marks the passing of the days and weeks more than the personal rituals one does to renew oneself?
Through the ritual power of tradition we tap into something timeless and greater than ourselves on holidays like Christmas and New Years. Perhaps having a routine you enjoy and that propels you towards your goals has a similar effect. It becomes a personal ritual that heralds a unit of time (a new day or a new week), creates a transition between cycles of work and rest, and becomes regenerative and recreative in itself. You might start to see each day or week in terms of cycles of work, relaxation, recreation; that is, in the original sense of the word recreate, to re-create oneself.
A routine may not be as memorable as a family-oriented tradition, but I'd say it may provide many of the same benefits.
Routines as Ecotones
The analogy that comes to mind is an ecotone. In ecology, an ecotone is a boundary between two distinct ecosystems. A healthy ecotone is a gradual transition from one ecosystem to the next, and therefore has characteristics of both neighboring ecosystems to varying degrees. Ecotones tend to be important habitats, and from a landscape perspective they are often the most interesting features. Think of lowland forests along a riverbank, or river deltas that empty into the ocean. What's important in this example is what happens when you remove the ecotone, or the transition: both neighboring ecosystems suffer as a result. The wetlands and lowland forests that form the transitions from land to water not only provide rich and unique habitats, they also stabilize the land against erosion and filter impurities out of runoff water. Without these ecotones, water quality plunges, animals lose their habitats, and a destabilizing level of erosion occurs.
Heard of the giant "dead zones" in the Gulf of Mexico where nothing grows or lives? They're caused by fertilizer runoff from all the land that eventually drains into the Mississippi River, which if I remember correctly is about a third of the continental United States. The nutrients in the fertilizer cause algal blooms, which consume the oxygen in the water, oxygen that everything else in the ocean needs to survive.
What's funny is that the fertilizer in the runoff wouldn't ordinarily make it all the way to the Gulf of Mexico if we hadn't destroyed the natural ecotones all along the shores of our water bodies. Centuries of environmental exploitation have meant that we've drained the wetlands and clear-cut the lowland forests, the very ecotones, or transitions, that protected the rivers, lakes, and oceans from pollutants.
All this to say that a consistent routine may provide the ecotones of our daily and weekly lives, creating a purifying and unifying transition between two discrete units, as in nature.
What do I do on a weekly basis to renew myself, to create weekly new beginnings?
There are two facets to the answer I have so far: creating psychological space where a beginning can incubate, and creating the structure to take advantage of it.
The first, easy to say but hard to implement: develop a life outside of work. No one can effectively work all the time. Having fun and working on other projects, hanging with friends, relaxation and recreation create the psychological space we need to see our lives and our work from a fresh perspective. Besides, they usually create more motivation to actually get the important things done while we're working.
The authors of Rework, Jason Fried and David Hansson, say it well: "[workaholism] leads to an ass-in-seat mentality—people stay late out of obligation, even if they aren’t really being productive. If all you do is work, you’re unlikely to have sound judgments. Your values and decision making wind up skewed. You stop being able to decide what’s worth extra effort and what’s not. And you wind up just plain tired. No one makes sharp decisions when tired. In the end, workaholics don’t actually accomplish more than nonworkaholics. They may claim to be perfectionists, but that just means they’re wasting time fixating on inconsequential details instead of moving on to the next task."
I've been MUCH better about this since striking out on my own and quitting my job at OU in August. If I wasn't working on weekends, I'd spend the time vegging out in front of a TV or computer, while silently panicking to myself that I wasn't doing enough. While this probably had more to do with it being my first job straight out of college (so I took it way too seriously), it definitely wasn't healthy.
Some of the best advice on this topic came from a book called The Now Habit by Neil Fiore. He advises planning recreation, social activities, fun, and relaxation before planning any work activities. The cool thing about this is 1) I'm my own boss, so I don't have to stick to a traditional 9-5, five-days-a-week work schedule; and 2) it kicks Parkinson's law into effect: work expands to fill the time available for its completion. I know what I need to get done each week to stay on track with my business, and all the better if I can get it done in, say, 10 hours instead of 40.
That leaves the structure.
For that, I've turned to the Agile methodologies I apply in my work. In other words, I'm going to try and think of the year in terms of week long sprints. At the end of each sprint, I can evaluate my progress and iterate to improve myself in the next sprint.
The part I'm excited about is running personal, Lean Startup style experiments in each sprint. In other words, each sprint I have one or more Yes/No questions I want to test. I have metrics to determine if I've "answered" the question yes or no, and I decide in advance how I change my behavior based on the answer to that question. The point is to learn quickly and inexpensively what works (so you can double down on that) or what doesn't work (so you can find a different way to do it, or decide to do something completely different instead- a pivot, using the entrepreneurship buzz-word). If your sprints are a week long, then that means that every week you're improving what you're testing. Plus, you're making decisions based on objective metrics and real data.
For example, let's say the first hypothesis I have is, "I can wake up at 7am to begin my morning routine". The test I might decide on in advance is that I have to do so all 7 days of the week to answer "yes".
If the answer to the question is yes, I might stick with that routine and wakeup time and continue forming it as a habit. Or I might try waking up even earlier.
If the answer is no, in the next iteration I'd try waking up at 8am.
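The wake-up experiment above can be sketched in a few lines of code, with the success threshold and both follow-up actions committed to before the week starts. The function name and action strings here are my own illustration, not part of any framework:

```python
# Sketch of one week-long sprint experiment. The threshold (all 7 days)
# and both next actions are decided in advance, so the weekly result
# maps directly to a pre-committed behavior change.

def evaluate_sprint(days_succeeded: int, days_required: int = 7) -> str:
    """Return the pre-committed next action for the 7am wake-up test."""
    if days_succeeded >= days_required:
        return "keep the 7am wake-up; consider trying earlier"
    return "try an 8am wake-up next sprint"

# Example week: woke up at 7am on only 5 of 7 days -> the answer is "no".
print(evaluate_sprint(5))  # try an 8am wake-up next sprint
print(evaluate_sprint(7))  # keep the 7am wake-up; consider trying earlier
```

The design choice worth noting is that the function returns an action, not just a pass/fail flag: as argued above, an experiment whose result doesn't change your behavior is a bad experiment.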
If you're familiar with the idea of 21-day or 30-day experiments, then this is similar. I've tried numerous 30-day experiments in the past, unfortunately with little success. My attention span just doesn't last that long, and I don't naturally think in terms of months; it's too long a unit of time. Weeks are more tangible, and the built-in weekly "retrospectives" mean you can learn, adapt, and iterate on a weekly basis instead of a monthly one.
The goal of an agile methodology is to increase a person's, a team's, or an organization's ability to adapt to change. From that perspective, I'm learning and adapting four times faster with week-long experiments versus month-long experiments. That's 52 possible experiments in a year, at least. Once I feel comfortable running one personal experiment a week, I might try doing more than one experiment at once.
I also developed some simple yes/no questions that serve as litmus tests for how well I'm balancing each component of my lifestyle on a weekly basis. I like the idea of these because they're simple, fast, and you can easily change the yes/no question to reflect how big of a priority that component is at that time in your life.
For example, these are some of the questions I developed:
Health: Did I do my morning routine each day?
Learning: Did I finish a book in the last week?
Writing: Did I publish a blog article?
Adventure: Did I spend at least 1 night away from home?
If the answer to any of these questions is "no", then the follow-up question is: why? And what can I do differently in the next sprint to correct it?
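To show how simple this weekly review can be, here's a sketch using the four questions above. The code structure and wording of the follow-up prompt are my own:

```python
# Sketch of the weekly yes/no litmus-test review: each lifestyle
# component gets one simple question, and every "no" generates
# a follow-up prompt for the next sprint.

questions = {
    "Health": "Did I do my morning routine each day?",
    "Learning": "Did I finish a book in the last week?",
    "Writing": "Did I publish a blog article?",
    "Adventure": "Did I spend at least 1 night away from home?",
}

def weekly_review(answers: dict[str, bool]) -> list[str]:
    """Return a follow-up prompt for every question answered 'no'."""
    return [
        f"{area} ({questions[area]}): why not, and what changes next sprint?"
        for area, yes in answers.items()
        if not yes
    ]

# Example week: everything done except publishing a blog post,
# so only the Writing question produces a follow-up prompt.
follow_ups = weekly_review(
    {"Health": True, "Learning": True, "Writing": False, "Adventure": True}
)
print(follow_ups)
```

Because the review is just a dictionary, swapping a question out as priorities change is a one-line edit, which is the main appeal of keeping these litmus tests this simple.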
The tricky part is figuring out the metrics that determine whether you answer yes or no. They have to be simple, or it defeats the purpose of the exercise, but they also have to encapsulate the most important output or result of that aspect of your lifestyle.
In my case, I know that if I spent a night away from home, then I was likely out camping or getting into trouble. That's in line with the type of experiences I want to have more of in 2015.
With my writing, the metric is simple: I can judge it by the number of blog posts I'm outputting each week.
Of course, I will be iterating on this system and working out its kinks over the course of the next couple of weeks. We'll see how it goes!
Wishing you the best for 2015,
Dillon Dakota Carroll