Category: Structure

Universitas Technologica: The Distributed University

“Why stay in college? Why go to night school? It will all be different this time.”
“Life During Wartime” – Talking Heads

 

“[T]he earliest universities, in the twelfth and thirteenth centuries – at Bologna and Paris – were not deliberately founded; they simply coalesced spontaneously around networks of students and teachers, as nodes at the thickest points in these networks…. [U]niversities had no campuses, no bricks and mortar. The term universitas referred to a group of people, not a physical place.”
(McNeely and Wolverton, Reinventing Knowledge, 2008, pp. 79, 80).

 

We have access to more knowledge than at any time in human history. We have access to more people than at any time in human history. I can remember the breathless pronouncements of distributed intelligence, the hive mind, and “colleges without walls” in the 1990s. Yet we still structurally operate the business of learning at all levels – from instruction to scholarship – in pretty much the same way as we did in 1995. While we may have models of distributed learning as exemplified by everything from MOOCs to online colleges, these are not distributed universities but are merely the scaling of existing, entrenched models of learning.

Teaching students, however, is only one part of the university’s mission. Indeed, its first mission was to serve as a center of scholarship for the faculty itself. Concentrations of scholars quickly attracted students, but that’s where it started in places like Paris, Bologna, and Oxford. Scholars used to travel to universities to “read” for a degree because that’s literally where the books were. After Gutenberg this was somewhat decentralized, but the idea of concentrating knowledge never really went away. Industrializing learning in the 19th century made it a logical efficiency to bring the students to where the books were. The internet increasingly makes it possible to explode those notions. I know I am increasingly frustrated when I can’t access a book or article instantly to move my work forward. There is no technical reason this can’t happen.

There is precedent for decentralized scholarship. The Republic of Letters of the Enlightenment was central to the advancement of scientific and philosophical thinking. Vannevar Bush conceived of the Memex in 1945 as a means of using technology to drive knowledge forward. Tim Berners-Lee originally conceived of the web as a way of coordinating scientists working at CERN with partner facilities around the world.

While physical universities of place were essential, especially early on, in the hosting and spreading of the internet, they may have sown the seeds of their own successors. This is because in so doing they have laid the groundwork to finally flip the idea of a “center” of learning.

But the more important caveat is that universities were, and are, critical “social nodes” for scholars and innovators to come together. The community of learning is perhaps the most important role that the best universities continue to play. At the same time, this reality also poses the greatest limitation and threat to a true universitas.

These physical communities invite siloes that run counter to the needs of the universitas technologica. While Harvard may be on the internet, at the end of the day Harvard is still in Cambridge. Most universities are even more divided, with sub-nodes of colleges insulating and competing with one another. These competitions extend to siloes that rise above the walls of the university itself into disciplinary organizations that create dogma and channel creative energies in very specific directions. Corporate interests, particularly in publishing, have served to further exacerbate the channeling and masking of knowledge.

The rationale for these “disciplines” is rapidly fading. Answers to complex issues are increasingly more likely to be found in the spaces between disciplines than within them. Furthermore, as I have argued in another context, constructing castles (for siloes are just that) invites attack, if for no other reason than their fixed nature. The more we have an Ivory Tower, the more ivory hunters will emerge to pillage it.

What we need is a completely new vision for our learning enterprises that redefines place and discipline. In 1994 Kevin Kelly wrote, “it is possible that a human mind may be chiefly distributed, yet, it is in artificial minds where distributed mind will certainly prevail.” (Kelly, p. 19) We now have the technological capacity to create such minds – a collective hive of intelligence – even if the platforms we use to communicate by and large tend to perpetuate rather than erode siloes of knowledge.

The most successful companies in the new economy, such as Google and Apple, have managed to create interdisciplinary hives. The university, ironically the originator of this concept, has surprisingly lagged behind in this regard after two centuries of specialization. Bryan Alexander recently argued that this is one of the reasons it finds itself under attack as enrollments have declined.

Ironically, the attacks on the edifices of higher learning come at a time when there is increasing consensus that our children will require ever more complex mental tools to navigate their futures. Now more than ever we need to reinvent communities of learning to harness distributed intelligence in order to solve ever more complex problems.

Instead, when we try to construct and redefine our communities of learning, we are often confronted by an atomized landscape, limited in its interdisciplinarity and ability to cope with the level of change and complexity likely in the near future. The needs of knowledge creation have outstripped the physical and organizational infrastructures of universities (and, by extension, the business specializations that they have in turn created) more than ever before. This reality has been a constant challenge to me over the last few years as I have been involved in numerous projects where different cultures, disciplines, and skills have had to come together to make them work (most haven’t, yet).

Organizationally, this problem became particularly acute with the dissolution of the New Media Consortium as that grouping provided a number of mechanisms for cross-disciplinary collaboration. It provided a center of gravity for activities that were not tied to discipline, institution, or even profession. However, even it often struggled with figuring out how to operationalize its hive mind against centripetal forces. Without it as a center, however, there was nothing stopping those forces from scattering the universitas that came together under its broad tent.

In the last year I have been involved in a number of interesting conversations and attempts to “irrigate the sands of the desert” at various levels and these might start to point us toward the visions of how the internet might finally disrupt the Ivory Tower and create a true “universitas technologica.”

On a very micro level, several of my colleagues and I decided last summer to start meeting regularly for free-form discussion of various readings brought to the table. Call it a “book club” or a graduate seminar, but it generated a lot of interesting trans-disciplinary thought. We met using Zoom videoconferencing and so were able to create a very effective universitas on a very small scale. We are currently discussing how to expand this effort, make it more public, and possibly complement it with more asynchronous tools. It has been a very rewarding effort that has introduced me to more concentrated new thought (and entire streams of literature) than at any time since I left grad school.

On the macro end, FOEcast is attempting to create a “distributed innovation network.” The idea here is to spin up nodes of innovation in teaching and learning with technology around the world and interconnect them. We hope to create a professional organization with no physical “center,” one that leverages the connective tissue of the internet and is united through a federative structure of ideas and principles. Like all attempts to colonize virgin territory, it is struggling to lay down roots. The vision is there, however, and we are probably closer to realizing it than we have been at any time in the last 20-30 years.

The power of both of these efforts is not due to any technological revolution. True, videoconferencing over the web has reached a level of simplicity and affordability that was not available a few years ago, but that’s not really what’s driving things here. What is unique to both the seminar and FOEcast is that we’re willing to cast off any preconceived notions of what an educational effort is. Indeed, I don’t think many participants in either effort had even thought of them as “educational” until very recently – at least not in the sense of their potential for being self-educational.

The last couple of years for me have been spent trying to tie together strands of brilliance and resources to create spaces (both real and virtual) that will accelerate the processes of reinventing education. So far, this has often been an exercise in frustration as I have constantly run up against cultural norms and institutional barriers.

What I have come to realize, however, is that we may collectively be starting to create a “Distributed University” or universitas technologica to replace the physical university that has evolved from its medieval genesis. Now that the readings are everywhere, recreating the university’s communities of learning in a distributed fashion is our next big task. Just as the internet itself was designed to be redundant and resistant to failure, we must construct a global interdisciplinary community to achieve the next steps in the evolution of scholarship and learning. Maybe we are finally seeing its emergence.

Pivoting the World

(Originally published on nmc.org April 2013)

I’ve had a lot of interesting strands come together this past week as I have been pondering the next steps in our technological evolution. For the last 20 years or more we have been obsessed with transforming the real world into virtual worlds. Books have become eBooks. Magazines have become blogs. Cocktail parties have become Facebook. Meetings have become webinars, and so on. Now, however, I’m beginning to think more and more about technology bending in on itself. Our next steps will be to bleed the virtual worlds we created back into the physical world and this will reshape how we iterate the physical world.

The virtualization of everything was a trend some embraced more than others. It brought with it new opportunities to connect (from Flickr to Facebook to Twitter). Perhaps its height was the creation of virtual worlds such as Second Life. While I agree that there are huge advantages to bringing in larger, more diverse crowds, there has always been the concern that we are sacrificing depth for breadth. Marshall David Jones expressed this sentiment in his riveting poetry slam.

Jones forcefully expresses some real contradictions inherent in living in virtual spaces. I was thinking about him while working on the Horizon Report Technology Outlook for Community Colleges, when I came across an interesting article in the Toronto Globe and Mail about a professor who is building “pet rocks” using Arduino devices to physicalize social media. Consider the implications of making virtual social spaces real again and of extending your community online as well as on very real personal levels. This has implications for how we interact in education as well.

I am working with my science faculty to create a STEM-focused campus here at Houston Community College (HCC). There has always been a certain amount of resistance from the science faculty when it comes to virtualizing aspects of their curriculum. At our level of instruction, teachers need to be able to put science in the hands of their students, not just on a screen. However, I think that their wariness of mixing the virtual in with their instruction is increasingly missing the mark. There are emerging technologies exemplified by the “pet rock” project that represent a potential tipping point. They are making the virtual real.

It doesn’t stop with “pet rocks.” There are similar things going on right now, from hackerspaces to Ingress, where people are taking what were formerly virtual spaces and turning them into reality. As I was contemplating the science labs in our new campus, I thought the “pet rock” professor’s approach would be a great model. Students would be able to design items virtually and then make them real using technologies such as 3D printers or micro-computing devices such as Arduino or Raspberry Pi to test real-world physical concepts. Imagine a moderated scientific hackerspace where app development meets experimentation and 3D printing meets iteration.

The promise of iteration provides a final strand to this argument. I have been thinking a lot about how to teach both our faculty and staff to become more entrepreneurial in how they approach their present and future jobs. I am convinced that this will be a critical skill moving forward as we are forced (and blessed) by technology to reinvent our jobs and ourselves repeatedly as we move forward.

Last year I stumbled across Eric Ries and his concept of the “lean” startup.

Ries makes a critical point here. He says, “if we can reduce the time between pivots we can increase the chances of success before we run out of money.” Or, in the educational context, before we run out of time. After all, in the immortal words of Billy Crystal in This Is Spinal Tap, “Mime is money.”

Ries argues that technology is helping us reduce the time between pivots and therefore, assuming we can pivot quickly, our chances of success in any entrepreneurial effort go up. Technology hardware is becoming cheaper and cheaper. Raspberry Pi and Arduino units go for much less than $50. You can get a 3D printer for $1000. Software is becoming more accessible too, as simple programs are easier and easier to create and implement using mobile devices or the aforementioned hardware devices.

Teaching and entrepreneurism have a lot in common. A colleague of mine likes to argue that one of the reasons it is so hard to manage faculty is that it is like managing an organization of entrepreneurs. The problem I often run into is that faculty fail to see how technology can improve their entrepreneurial efforts in the classroom. Instead, they often see it as limiting their scope for action or as a mechanism to oversee and control how they teach. However, there are a lot of parallels between what Ries is talking about and teaching. We have a limited quantity of time in which to interact with our students. Semesters end before you know it. The question is: how many times can you pivot before you lose them?

Let’s go back to the idea of scientific hackerspaces. Science is fundamentally about trial and error, and the best way to get students to understand this is by allowing them to experiment and fail. The problem is that every pivot requires an initial failure, and we don’t do a good job of rewarding failure in higher education. We are also constantly up against the clock and calendar when it comes to allowing our students to fail within the context of an academic semester. It’s much easier for the instructor to drill them on the “right” answer and enable them to “pass” some sort of evaluation. This is a tried-and-true method, but it doesn’t teach our students the key tenets of entrepreneurism that will be critical to their future success. If you have any doubt about this, watch this 2007 TED talk from Alan Kay:

The solution here is to make pivots much less costly in terms of time. Let’s assume our scarcity is the amount of time given to us in a given academic term (usually 48 contact hours). How quickly can we iterate physical concepts in that time and what can we do to maximize those iterations? This is where we get back to the promise of making concrete the virtual. If students can quickly program or design physics experiments virtually and then test them in the real world, we can maximize the number of pivots they could have in any kind of course requiring experimentation (and almost all courses, science or otherwise, should require some sort of experimentation). This would also have the beneficial side effect of teaching entrepreneurial, computing, and design skills critical to many occupations in today’s world.
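To make the "design virtually, then test physically" loop concrete, here is a minimal sketch of what a student's virtual first pass at an experiment might look like before any hardware is built. The scenario (a projectile launcher), the launch speed, and the ideal no-drag model are all invented for illustration; the point is only that a few lines of code let a student iterate through many candidate designs in seconds.

```python
import math

def projectile_range(speed_mps, angle_deg, g=9.81):
    """Ideal (no-drag) horizontal range of a projectile launched from
    ground level -- a quick virtual check before a physical test."""
    angle = math.radians(angle_deg)
    return (speed_mps ** 2) * math.sin(2 * angle) / g

# Iterate virtually: which launch angle carries farthest at 10 m/s?
best = max(range(5, 90, 5), key=lambda a: projectile_range(10, a))
print(best)  # 45 -- the ideal model predicts 45 degrees maximizes range
```

The student can then build the launcher, measure the real (drag-affected) ranges, and compare them against the model: each mismatch is a cheap, fast pivot rather than a wasted lab session.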

The problem with many traditional science labs is that the inflexibility of the equipment often steers students toward predictable outcomes. What the new technology/employment environment demands, however, is people willing to create unpredictable outcomes and to be able to assess them creatively. Fortunately, this is also a core tenet of the scientific method and therefore something that we need to encourage our students to do.

A final exciting possibility is that this opens up a completely new kind of mashup. Until recently we tended to think of mashups in terms of visual or audio media. We can now start to think of them in terms of creating physical objects. This takes the concept of mashups out of the English classroom and opens them up in new ways to the STEM disciplines. Remember that all fundamental breakthroughs in science are mashups. Newton’s Principia was a mashup of the thinking of Galileo, Kepler, and many other theorists who had been struggling with the new cosmology for a century before Newton combined their work and added his own innovations. Shouldn’t we be creating learning spaces that enable everyone to be a Newton?

Again, imagine a science campus that is more of a hackerspace than what we traditionally imagine science labs to be. In this space students and faculty will be enabled to create, fail, and recreate. It will require close cooperation between computer science and the natural sciences, but isn’t this what we are striving for in STEM in the first place?

I would leave you with one last question that is critical to the success of this kind of model: Can we graft an entrepreneurial maker culture onto our educational processes? I think we can. I think we have to.

Small Steps for Big Changes

(Originally published on pbk.com December 2017)

At the beginning of 2016, my former Instructional Technology Team at Houston Community College and I opened the Design Lab at the HCC Alief campus. It was designed to prototype the MakerSpace we were developing at the West Houston Institute. This space, which occupied a repurposed computer lab, used surplus furniture, and only required about $20,000 in new equipment, was designed to give us practical experience in running a MakerSpace. It became so much more than that.

Within months, students using the space developed everything from an augmented-reality sand table to drones to senior engineering projects for the University of Texas at Tyler engineering program. The space quickly became the first stop on every campus tour given by our president and soon thereafter attracted the attention of the chancellor, the trustees, and the local community. Most importantly, the space completely changed the conversation at the campus and the larger college around what NextGen learning should look like.

D-Lab when it was opened

What does the example of the HCC Design Lab teach us? Many of the projects we undertook at HCC used what is called the “Lean Startup” method. Under this methodology, small experiments are tried with the expectation that a high percentage of them will fail; iteration is assumed as part of the process. The strategy essentially argues that you “build” something small, “measure” its impact, and “learn” from your mistakes in order to “pivot” quickly, leveraging the advantages of technology.
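The build-measure-learn loop can be sketched in a few lines of code. This is purely an illustration: the experiment names, usage scores, and keep-threshold below are invented, and real "measurement" is of course observation of how people actually use a space, not a number in a dictionary.

```python
# Minimal sketch of the Lean Startup build-measure-learn loop.
# Experiment names, scores, and the threshold are invented for illustration.

def lean_iterate(experiments, keep_threshold=0.5):
    """'Build' each small experiment, 'measure' its impact, and 'learn'
    by keeping what works and pivoting away from what doesn't."""
    kept, pivots = [], []
    for name, measured_impact in experiments.items():
        if measured_impact >= keep_threshold:
            kept.append(name)            # success: keep and scale
        else:
            pivots.append(name)          # a cheap failure, not a disaster
    return kept, pivots

# Hypothetical measured usage for three pieces of equipment:
usage = {"tool_a": 0.05, "tool_b": 0.90, "tool_c": 0.80}
kept, pivots = lean_iterate(usage)
```

The design choice the method encodes is that pivots are expected output, not error states: the loop is built to produce a list of cheap failures alongside the successes.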

The Design Lab was itself a lean startup project, but even within the D-Lab we undertook many experiments, particularly with regard to the mix of equipment in the space. The most expensive piece of initial equipment we purchased was something called a Z-Space Tablet. This system uses 3D-glasses technology to let you view and manipulate objects in space. We thought this would be a cool piece of technology that students would use for 3D visualization before sending items to be 3D printed. The problem was that no one ever used it. It was too complicated and clunky for the neophyte user, and 2D screens worked fine, even for 3D projects. So we quickly moved it out of the D-Lab and used its space for other projects. In its place, we purchased a $300 vinyl cutter, which quickly became one of the most popular parts of the tool package we presented to users.

The Z-Space was a failure for our purposes, and we quickly moved on from it. We saw a need for a new kind of tool in the vinyl cutter and quickly pivoted to getting one. Both of these decisions were made using the Lean Startup method, leveraging the falling costs of technology and observing how the students in the space behaved. Both could be considered failures of the original design. Both represented what Lean Startup practitioners call “pivots,” and both were necessary failures that allowed us to achieve success in the space. Note the specific mention of the vinyl cutter in the Houston Chronicle.

The D-Lab a little over a year after it was opened

Sustainability and Next Generation learning share one trait: they both represent cultural changes in the way we approach teaching and learning and the very spaces we construct to do them in. Cultural changes are hard by definition. People fall into established patterns of thought and action that are very hard to dislodge. Doing this at scale is exponentially more difficult. The Lean Startup method presents a strategy that can be used to make incremental instead of wholesale changes to an organization.

It is easy to underestimate how hard cultural changes are going to be. Active learning represents a significant cultural shift from traditional practice. It’s asking a lot for an educational community to suddenly pivot to NextGen teaching and learning. Further complicating this is the reality that we do not have a shared understanding of the next steps on the path. We have an imperfect understanding of the nature of the technological changes sweeping through our classrooms and society as a whole. We also have an imperfect understanding of how humans learn. We can see general trends and certainly understand that teaching and learning need to change, but there is no general agreement on how to get there.

The D-Lab presents a useful model for how to approach this problem. Experimentation has to become the norm as we feel our way into the future. However, experimenting with a $150 million high school is not really an option. Nor will it work very well, because the scale of change is simply too large for most organizations to digest. Instead, the Lean Startup path is to make changes on a much smaller scale: a single classroom or lab, an outbuilding, or a library makeover. The investment is much smaller, and it is easier to retrain a core group of faculty or staff to maximize the use and effectiveness of the space and to allow them to experiment with new modalities of teaching and learning.

If the D-Lab is any guide, demonstration spaces will also act as a catalyst for change when the rest of the school sees the impact of what’s going on. With all of the pieces in place (a much easier proposition on a small scale), there will be demands for duplicating or scaling the activity. In my decade redesigning and experimenting with innovative learning spaces at HCC, this is precisely what my team did. Not everything was as successful as the D-Lab has been, but the costs of failure were much more easily digestible than they would have been on larger projects. And we learned crucial lessons that we applied to subsequent efforts. Furthermore, over time this built a community of support and started to change the culture of the institution, as faculty and staff more readily accepted entrepreneurism and felt empowered to attempt it themselves.

Without small-scale projects such as the D-Lab and others, we would never have built the West Houston Institute, which was recently named a finalist for the SXSW Learn by Design Award. As we discussed in a recent Knowledge Base Article, the West Houston Institute is designed to be one giant Lean Startup incubator that will adapt and reconfigure itself and its programs to meet changing demands, and also to act as a teaching tool for its students and visitors in the methods, skills, and mindsets that will carry them into the world of the future.

Hacking School

(Originally published on pbk.com September 2017)

For over a decade, authors as diverse as Frans Johansson (The Medici Effect – https://www.fransjohansson.com/books-by-frans-johansson/), Daniel Pink (A Whole New Mind – http://www.danpink.com/books/whole-new-mind/), and Steven Johnson (Where Good Ideas Come From – https://stevenberlinjohnson.com/where-good-ideas-come-from-763bb8957069) have argued that innovation comes from a diverse approach to solving problems. This means either developing a broad educational approach within a given individual or bringing together diverse people in a collaborative setting. As Johansson says in Medici, “Leonardo da Vinci, the defining Renaissance man and perhaps the greatest intersectionalist of all times, believed that in order to fully understand something one needed to view it from at least three different perspectives.”

The industrial age was one of specialization, and our schools reflect this. Instead of following da Vinci’s dictum that you have to apply a broad spectrum of views to any particular subject, the educational world for the last century has been one of increasing specialization. The further you go up the educational ladder, the more specialized it gets. But even in the lower grades there is usually a separation between “art” and “math,” for instance. These distinctions are the product of an industrialized educational system that purports to prepare our students for specific careers requiring specialized, in-depth knowledge of a particular subject. Alan Kay shows just how siloed thinking undermines the learning process in this TED talk from 2007.

 

The problem is compounded by magical thinking. Arthur C. Clarke once stated, “Any sufficiently advanced technology is indistinguishable from magic.” This is exactly what has happened to our relationship with technology. Most users have no idea how it works or why it works the way that it does. The same is true for most technologies in schools (and is becoming worse as technological systems become more complex). It is common for schools to approach technology as something expensive that needs to be protected from the students. It is also separated from instruction in the sense that it is used to facilitate certain kinds of experiences but is rarely the focus of instruction itself. As a result, both students and teachers tend to view technological systems as black boxes rather than as learning opportunities.

The struggle to align the educational world with technological realities is a direct product of the specialization that has characterized our educational experiences for a century or more. Is coding a technical subject taught in a Career and Technical Education environment? Is it an exercise in mathematics and logic best taught by engineers? Most controversially, should coding be taught as a foreign language class? The answer is all three and yet our curriculum insists on trying to insert it into traditional boxes that don’t really create effective coders. Good coders have to understand logic and the basic technical limitations of the various languages but also need to be able to speak to the machines in a natural way. These kinds of people are incredibly sought after in the technology world but our systems do a very poor job of producing them. And coding is only the first level of the problem. To get to an iPhone-level device you have to integrate coding with hardware engineering and design (and ultimately entrepreneurship). You need a multidisciplinary approach to practically any problem these days.

How do we work toward a system that creates these kinds of thinkers? One way to do this is to create environments where it is okay to experiment. In the 1950s early computer scientists at MIT and elsewhere often came from the model railroading community. Model railroads are complex electrical switching systems so the transition was a natural one. Bundles of wires often had to be cut, or hacked, and reconfigured in order to make a system work. This was true in computing as well. This is where we get the term “hacker.”

Over the last 20 years this term has returned to its definitional roots through technology spaces that allow users to build, break, and repurpose technologies. Advanced tools such as 3D printers and laser cutters have been added to the mix to allow the rapid fabrication of prototypes. Microcomputers such as Raspberry Pis and Arduinos provide rapid access to electronic brains. Hackerspaces, and their more commonly referenced cousins, MakerSpaces, have started making their way from community spaces into education. It is not always a natural fit.

MakerSpaces go against the grain of the industrialized educational model. Like coding, they raise questions about where they belong. Viewed purely as a set of tools, they can augment traditional programs in Career and Technical Education or the Fine Arts. In higher education they are most commonly placed in the context of engineering or other STEM-based programs. While these programs can greatly benefit from access to these technologies, it is only through broad access that the true potential of developing the next generation of da Vincis can be realized.

If he were a young student today, Steve Jobs might never have had access to this technology because he wasn’t an engineer. Fortunately, he had access to Steve Wozniak, who was an engineer. Together they developed what was to become Apple Computer, an early example of blending computing with design. MakerSpaces offer the opportunity to develop diverse kinds of communities within our schools. However, they can only reach their true potential if they are built with those goals in mind and are accessible and visible to the entire student body, not just those who are technically inclined. As a resource area like a library, they can provide a vital bridge for students between a specialized, industrial-focused curriculum and the realities of a post-industrial age. If they are tucked away supporting specialized programs and/or invisible, they are as useful as a library of unread books.

Squaring the Circle: The Learning Analytics Dilemma

(Originally published on nmc.org May 2012)

Learning analytics presents some serious dilemmas for the academic community. It is tempting to think that our advancing capabilities in data collection and processing make understanding the learning process an achievable goal. However, because we have never really managed to define what successful learning looks like, figuring out what and how to measure, much less how to use that data to change what we do, poses some major challenges. My experience teaching this semester, as well as my intellectual effort in imagining a learning analytics platform, has shown me that there are at least three major, interconnected hurdles to overcome before we can proceed: definition, doctrine, and culture.

Before we can go into the technicalities of developing a learning analytics system, we need to come to some definitional agreement over the ultimate purpose of higher education. Is it to teach a basic canon of knowledge? Is it to teach critical thinking skills? Or, is critical thinking not going far enough? Do we need to teach what Michael Wesch calls “knowledge-ability?”

Our only hope of developing meaningful variables to measure learning is by clearly defining our educational goals. Making meaningful comparisons across multiple classes, and even more so across disciplines, depends on coming to some sort of consensus around these often-divisive issues.

In addition to being clear about definitions, we also need to address the doctrinal dichotomy over whether learning is essentially a cognitive or a behavioral process. A behaviorist would argue that learning could, in theory, be broken down into a set of discrete steps or benchmarks that lead to success. This approach would lend itself quite well to the quantitative analysis that big data would seem to offer us. Its misuse is also, unfortunately, the basis for much of the standardized testing we have seen inflicted on the K-12 environment.

There is a powerful cognitive counter-argument, recently expressed by Gardner Campbell in addressing George Siemens’ LAK12 MOOC. The argument here is that learning is essentially impossible to quantify because it is a voyage of discovery with ill-defined markers along the way. The job of a teacher is to facilitate this voyage among his or her students. How do you measure the learning process using this approach? Can we develop a tool to measure curiosity, for instance? Furthermore, quantification risks taking education even further over to the dark side of standardized formulas for “learning.”

Philosophically, I wholeheartedly agree with Gardner’s position on this subject. I spent most of graduate school arguing for a cognitive model in international relations and it certainly appeals to my right-brained nature. Furthermore, I know my own learning process has been governed to a large extent by my insatiable curiosity. At the same time, that curiosity attracts me to the possibility of dissecting the learning process in a meaningful way. This is also in part because I continue to be frustrated as a teacher in trying to understand my inability to motivate the vast majority of my students to undertake curiosity-generated, self-directed learning.

Even if we can get past the definitional and cognitive/behaviorist issues, there is yet a third hurdle, and that is cultural/institutional inertia. My recently concluded semester teaching a government class is a good example of the perils of getting too far ahead of the curve or outside the norms of traditional teaching approaches. As usual, I started the semester with high hopes that trying something new would lead to greater success for my students. I refocused the syllabus around a skills-based approach and attempted to give my students the freedom to explore areas of interest to them. I also tried some novel technological solutions, such as incorporating Google+ and Google Docs into the workflow, with the goal of taking students mentally out of the classroom as much as possible. At the same time I wanted to give them exposure to technical tools that they will need to master in future work environments.

This experience left me disillusioned over my ability to break through cultural norms even in my own class, much less convincing other faculty to attempt similar efforts in their own classes. From the perspective of a faculty member, breaking down my assignments into their skill components was hard work. As a teacher with a built-in motivation (my interest in learning analytics) and an unusual propensity to take risks in class, I was able to convince myself to see this through. My experience with other faculty is that they are often understandably risk-averse and resistant to change. In other words, I’m not sure what I attempted was scalable even if it could be proven effective.

Furthermore, the changes required a significant paradigm shift on the part of both my students and myself. The students resisted, for the most part without complaint or feedback, my attempt to shift the paradigm of their learning experience. I ultimately gave up on trying to impress the significance of this approach on them. For the most part they were only concerned with the bottom line of what their final grade would be. Furthermore, freedom to explore did not agree with many of them as the class had a high failure rate. This was because many of them used their freedom to avoid doing the minimum amount of work necessary for success in the class.

The bottom line is that both students and faculty will resist this kind of cultural shift. It’s hard work and it involves an intellectual paradigm shift many won’t be willing to undertake. It’s hard to imagine a meaningful learning analytics project without a significant re-evaluation of teaching and learning being a part of the process.

To reinforce the point, I recently attended an iPad workshop put on by Apple. In it, they demonstrated how the new iBooks app, coupled with iBooks Author, could be used to create self-tests and flashcards for the readings. When I pointed out to the presenter that this was a classic McLuhanesque mistake and, furthermore perpetuated an outmoded form of teaching (rote memorization), he pointed out to me that, while he agreed with me, this was what most of the audience wanted to see pitched. Unfortunately, I could not argue with him on that point.

I am down but not out when it comes to learning analytics and the related mission of reinventing teaching and learning to meet the realities of the modern world. While I don’t see it as a magic bullet for student success, I continue to hope that it will provide critical insights to enable us to find a pathway to achieving those ideals. It also offers me an irresistible intellectual puzzle with the hope of making a real difference in the lives of our students.

As the saying often attributed to Yeats goes, “Education is not the filling of a pail, but the lighting of a fire.” We clearly need more matches and less water. At the same time we can’t overlook the theoretical, doctrinal, and cultural barriers that will impede meaningful measurement of the learning process, much less allow us to reshape the process based on the data produced. However, we ignore the coming seismic shifts in teaching and learning at our peril. There is still a lot of discussion and hard thinking to be done here. In order to square this circle we have to figure out how to be disruptors while preserving the essence of what makes education such a special experience.

The Grader’s Dilemma

(Originally published on nmc.org on August 15, 2013)

In every class I teach, I struggle with some fundamentals of our industrialized education system that I cannot change but which seriously inhibit creative exploration of available information. I always dread being asked, “How do I get an ‘A’ in this class?” or even worse: “How do I pass this class?” I never seem to get this question: “What can I learn in this class?” This should not surprise me because our students have been well trained by the time they get to college. They see little connection between school and learning. Instead, it’s about beating the system. This is particularly true for most of my students who struggled in their K-12 classes and may have only graduated by the skin of their teeth. A fixation on grades is second nature to them (as it is to my own kids).

Grades have a pernicious effect on true learning and creative thought. As someone who has always loved learning for learning’s sake, I find this particularly frustrating. I have always personally been able to push the administrivia of learning to the background, in part because I never really feared failing a class in high school or college. My students are not so lucky and, as a consequence, evaluation becomes central to their educational experiences. Daniel Pink, drawing on decades of psychological research, shows fairly conclusively that external motivators like money and grades destroy our capacity for creative thought.

As a consequence, I feel like the industrialized educational system is always working against my educational goals. It is a lot harder to evaluate skills and assess the final product when everyone is looking past that to what their final grade will be. I spend a lot of time and effort (it is often a central focus of my pedagogical strategy) trying to push grades to the background, or at the very least to trivialize their effects, but it’s a little like trying to hide green beans in chocolate cake. The taste can sometimes be disconcerting to the unprepared. I fight this battle with my own kids in K-12 and it’s hard even with sustained effort, and even more difficult within the constraints of a 16-week class that meets once or twice a week.

In my most recent effort, I attempted to make “kinder, gentler” grading the focus. I did this by implementing a cumulative point system. No one failed assignments. It was all a process of accruing points. Furthermore, I wanted to achieve a certain level of gamification using insights from Tom Chatfield such as creating many small rewards and providing them with rapid feedback.

To achieve these goals, I gave each student a code and created a spreadsheet, which, in turn, created a bar graph to show how many points everyone had accrued relative to the rest of the class and relative to the bars necessary to achieve a grade.
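A minimal sketch of what such an anonymized point tracker might look like follows. The student codes, point totals, and grade cutoffs here are invented for illustration, and a simple text bar stands in for the spreadsheet’s bar graph:

```python
# Hypothetical sketch of an anonymized cumulative-point tracker.
# Student codes, point values, and grade thresholds are invented examples.

GRADE_THRESHOLDS = {"A": 900, "B": 800, "C": 700, "D": 600}  # cumulative points

def grade_for(points):
    """Return the letter grade earned by a cumulative point total."""
    for letter, cutoff in GRADE_THRESHOLDS.items():
        if points >= cutoff:
            return letter
    return "F"

def bar_chart(scores, scale=20):
    """Render each student's accrued points as a bar, scaled to `scale` chars."""
    top = max(scores.values())
    lines = []
    for code, pts in sorted(scores.items(), key=lambda kv: -kv[1]):
        bar = "#" * round(pts / top * scale)
        lines.append(f"{code:>6} {bar:<{scale}} {pts:4d} ({grade_for(pts)})")
    return "\n".join(lines)

scores = {"S01": 940, "S02": 815, "S03": 655, "S04": 720}
print(bar_chart(scores))
```

Because each row shows only a code, students can see where they stand relative to the class and to the grade cutoffs without anyone’s identity being exposed.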

Achieving rapid feedback was one area in which I failed the students. I was not able to turn around the grading fast enough to give them immediate feedback. Therefore, I don’t know if this aspect of the class was given a fair shake. As I discuss below, one issue that I encountered was the time required in developing a point system that did what I wanted to it to do. The delays in getting this finalized and workable put my grading way behind. Applying lessons learned in future semesters should alleviate this issue somewhat.

The other area that Chatfield mentioned, that of frequent, small rewards, actually caused complaints at the end of the semester: spreading out the reward structure required constant attention on the students’ part. While we, as teachers, expect students to pay consistent attention to the class, the reality is that most students want to be able to check in only during the week of a test or a paper deadline and essentially coast through the remaining weeks of the semester.

Even though I made expectations clear, the students were shocked, about two-thirds of the way through the semester, to discover that they had failed to accumulate enough points. At that point, the point totals reflected their efforts posting on Google+ as well as their participation in in-class activities. To cite just one yardstick, I asked them to post on Google+ twice per week as a minimum activity. One post was in response to a prompt. The other was to bring in an outside source relating to the topic of the week. At that point in the semester (the 7th week), they should have had a minimum of 14 posts. About a third of the class had fewer than five. I gave them opportunities to catch up point-wise by commenting on others’ posts, but this was a real struggle for many of them. I did not allow them to go back and make new posts, as this would have been unfair to those who did the work on time.

The second major part of my evaluation strategy was to be mindful of precisely what I was attempting to evaluate. The students and I had a frank discussion about what kinds of skills they needed to be practicing in college and how the course could facilitate that. I introduced them to the National Education Association’s “Four C” rubric, as I think it nicely encapsulates what is important to get out of college, and suggested that we use these four variables (Communication, Collaboration, Creativity, and Critical Thinking) to evaluate their performance. I broke it down as follows:

  • Communication – How well did students present themselves? These points were given for things like the quality and frequency of posts on Google+, quality of presentations in-class, and the clarity of their portfolio website.
  • Collaboration – How well did students work together to achieve the common goal of learning? These points were given for things like group self-evaluations for in-class debates, presentations, and commenting on Google+ posts.
  • Creativity – How effective were students in pulling together evidence into an argument? These points were given for presentation arguments, debate arguments, and the creative use of evidence in Google+ posts.
  • Critical Thinking – How reflective were students about the arguments they were presenting? These were the hardest points to earn. Students had to demonstrate self-awareness in arguments, an ability to see both sides of an issue, and to be able to integrate that into their presentations, posts, and portfolio.
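One hypothetical way to sketch the translation of activities into Four-C points is a simple lookup table plus a per-category tally. The activity names and point values below are invented for illustration; only the four categories come from the NEA rubric discussed above:

```python
# Hypothetical mapping from graded activities to Four-C categories and points.
# Activity names and point values are invented; the categories come from the
# NEA "Four C" rubric.
from collections import defaultdict

# (category, points) awarded for each completed activity
ACTIVITY_POINTS = {
    "gplus_prompt_post":     ("Communication", 5),
    "gplus_source_post":     ("Communication", 5),
    "gplus_comment":         ("Collaboration", 3),
    "debate_self_eval":      ("Collaboration", 10),
    "presentation_argument": ("Creativity", 15),
    "reflective_portfolio":  ("Critical Thinking", 20),
}

def tally(completed):
    """Sum points per Four-C category for a list of completed activity names."""
    totals = defaultdict(int)
    for activity in completed:
        category, points = ACTIVITY_POINTS[activity]
        totals[category] += points
    totals["Total"] = sum(v for k, v in totals.items() if k != "Total")
    return dict(totals)

week = ["gplus_prompt_post", "gplus_source_post", "gplus_comment",
        "presentation_argument"]
print(tally(week))
# → {'Communication': 10, 'Collaboration': 3, 'Creativity': 15, 'Total': 28}
```

Even a toy version like this makes the bookkeeping burden visible: every assignment has to be decomposed into category-tagged point values before any totals can be computed.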

The problem that occurred as a result of attempting to achieve this level of granularity was that I created a four-headed hydra in trying to translate the Four Cs into a workable grading scheme. This was one of the key reasons that I didn’t get feedback to the class as quickly as I wanted to. I essentially ended up developing a manual learning analytics system to translate the variables into points. The net effect was that I forced myself into an incredibly complex grading system that took me half the semester to figure out. The extent to which I applied technology was limited to creating a 12-page workbook in Numbers (it did better graphing than Excel). Frequent rewards meant that I had to deal with extremely small numbers, and I never really achieved the level of finesse that I usually demand from my grading.

Over and above the logistical hurdles that I created for myself, I don’t think the grading system taught the students much, if anything, about their relative skills in the four areas. I did discuss how I was assigning points while discussing grades within the class. However, I could tell the students were almost entirely fixated on the total points, not how they could earn points in communication versus critical thinking, for instance.

My struggles with the grading system and my ultimate failure in developing a more constructive system was perhaps the most disappointing result from this part of the experiment and goes back to my original dilemma. I never figured out how to move the discussion beyond, “How do I pass this class?” to “What can I learn that might be useful to me?”

I am not going to be able to get beyond the fundamental realities of the need for a final grade in the class as long as the larger educational system uses the transcript/degree as the final yardstick of educational achievement. Perhaps a more successful application of gamification principles would at least make this part of the class more fun and trivial. The students themselves live with fundamental contradictions in their own attitudes toward their educational investments. On the one hand, they play the grading game with seeming indifference when it comes to the value of an individual class. On the other hand, they are totally fixated on their final grade as the only valuable thing they get out of wasting their time in a class they don’t want to take in the first place. New Media may give us some routes forward but until it undermines these fundamental contradictions in assessment, technology’s transformative power will be very limited. Solving the grader’s dilemma may require adjustments beyond the capacity of any individual educator to achieve but I’ll keep trying. I’m stubborn.


Just Let it Happen: Living With Open-Ended Outcomes

(Originally Published on nmc.org March 2016)

In “Teaching Creativity – Not Conformity,” I began by positing that we need to encourage exploration through questioning rather than providing answers to our students (and employees). Since then, Kevin Kelly has reemphasized the same theme by identifying the skill of questioning as one of the key trends that will shape humans in our technological future. We have been talking this talk for years now. The exciting thing is that technology is now giving us tools to start walking the walk.

In an age driven by “quantifiable” accountability, however, many of our present organizations continue to struggle with the concept of open-ended outcomes. If we stress a questioning approach, particularly in a programmatic context, it is often seen as a risk that is hard to quantify unless we provide some sort of predictable, often quantified, outcome. Uncertainty of outcomes is often viewed as not being scientific or data-driven. That is only true if we are looking for overly simplistic answers in defining success. We can still measure outcomes – but they are often tangential to the original creative exercise. Sounds a lot like teaching, does it not?[LINK: http://www.nmc.org/blog/teaching-creativity-not-conformity/]

In 2010, I began facilitating the New Media Seminar for faculty at HCC’s Northwest College, which focuses on how the world is changing and ways in which we as educators need to adapt to those changes. Initially, I was attracted to the concept developed by Gardner Campbell because I felt it provided a nice venue for free discussion about technology and the development of pedagogy. What I did not realize at the time was the extent to which it developed questioning about the status quo of what we are doing in the classroom. Many faculty members who have participated in the seminar found it strangely liberating compared with their daily routines, precisely because of its open-ended nature. It was always a challenge, however, to explain that power to those looking at the course from the outside.

I am often surprised at how unwilling people are to question the seemingly predictable flow of their existence in order to leave room for the unexpected outcome. I always start my undergraduate government classes asking my students why they are there. It amazes me how few of them have even pondered the question. A lack of purpose turns college into essentially a hazing experience whose main purpose is to demonstrate perseverance, not learning, and all too often faculty aid and abet that by passing them through classes without giving them a clear reason to engage in learning. Furthermore, as we design courses and programs it is often easier to just go with the flow rather than risk the unexpected or ambiguous. Like training, answers give us neat little boxes in which to put everything. Teaching has the power to augment us as human beings. Giving students and faculty the freedom to creatively explore without a set purpose is liberating to some, incomprehensible to some, and threatening to others.

Makerspaces perhaps pose the biggest threat to the status quo: if they live up to the true ideals of making, outcomes are truly uncertain. In January 2016 we opened the D-Lab, the first of our makerspaces at Northwest College. The critical aspect that characterizes our D-Lab is that it is essentially a “rule-free” zone designed to promote creative problem solving. Questioning is driving the very development of the space, as we have added and adapted equipment and supplies based on community inquiries. Many of our spaces are being designed around the concept of project-based learning and our new building, the West Houston Institute, was explicitly designed to foster innovation and creative thinking. [LINK: http://ideaspaces.net] The goal is to lower the barriers to entry as much as possible and then get out of the way.

The D-Lab –  Photo by Tom Haymes

We also recognize that all of these plans will falter if we cannot work to move HCC’s culture away from the rigidity that characterizes industrial modes of education. Training is a natural outcome of this kind of thinking. The D-Lab forms a gateway to a post-industrial mode that nurtures creative thought that is liberated by good teaching. To be clear, this is a hard task that will take years to manifest any appreciable impact institution-wide. Meanwhile, we are already seeing people gravitate to the freedom of the D-Lab in unexpected and unconventional ways.

Recently, my lab techs decided to build a virtual sandtable using an old projector and an Xbox Kinect. They improvised this rig using zip ties and rubber bands. Believing it looked unsafe, our AV contractor offered to provide a proper mounting solution (costing hundreds, if not thousands, of dollars and taking months to implement, no doubt). I don’t blame him. He’s never seen activity like this at our college before. The space is generating a new form of agile activity not previously seen at our college.

Virtual Sand Table in the D-Lab – Photo by Tom Haymes

The D-Lab has shown me that there is a genuine hunger for independent thought and the creative enterprise. Technology merely provides the mechanism to escape the rigid structures of the industrial world. This creative hunger is precisely what we need to harness both internally and for our students if we want to be successful in a world characterized by automation and increasingly complex problems. We are all creative teachers and learners. We just need to have that unlocked. I am seeing that happen every day in spaces like the D-Lab. It gives me hope that we will finally move away from the training mode to the creative mode.

We are currently in the process of developing business plans for the programs we envision at our new building, the West Houston Institute. Unlike the New Media Seminar, which cost relatively little other than time, and the D-Lab, which came together out of spare project funds, the Institute is a $50 million project before staffing costs. With scale comes scrutiny and a search for clarity by decision makers. While this approach is completely understandable and a responsible use of taxpayer money, a space designed around open-ended inquiry and uncertain, creative outcomes often challenges this kind of thinking.

It is relatively easy to figure out a hard set of numbers around a Collaboratorium or conference space that can be rented out to business. At the same time, it is more difficult to demonstrate concrete outcomes around how these kinds of spaces also foster open-ended activities such as brainstorming projects and technology competitions. Activities in the makerspace or informal collaboration areas face similar challenges. If they are not associated with a particular program, it can be difficult for people to wrap their heads around the spaces’ purpose. Is the idea to let people play around? To what end? Quantifying qualitative outcomes is one of the central challenges of educational measurement, and it is particularly acute in these kinds of circumstances.

The best argument to make is that this creates a very compelling learning environment, is a fun way to teach, and, perhaps most of all, builds critical 21st-century skillsets in our students (and faculty). This approach fits into the overall trend of refocusing on demonstrating skills rather than credentialing that is starting to be discussed throughout education.

One of my pet projects, the Teaching Innovation Lab (TIL) [LINK: http://ideaspaces.net/2016-innovation-loops-ideaspaces-presentation], addresses this challenge as well. The purpose of the TIL is to build upon the success of the New Media Seminar to create an open-ended venue for faculty to “play” with their teaching and share those experiences with their colleagues both inside and outside HCC. We also introduce rigor through data collection and analysis as well as structured workshops and other activities. Where faculty take it from there is up to them, however. Unlike many professional development projects, there is no defined fixed outcome. If it makes things better, it is a win.

I am excited about the potential for creative unpredictability of all of these efforts, but I can see where it runs counter to desires to establish standards around “good teaching” by accrediting agencies and state education boards. I do not believe there is one “right” way to teach, or at least I do not believe we are anywhere close to understanding what that is yet. Putting that notion into a business plan for the TIL is therefore presenting unique challenges. Some flexible thinking will be required all around. It is difficult to imagine quantifying exploration and unexpected outcomes. We can build a runway, but where flight takes our faculty and students is their magic, not ours. In other words, we can measure results in the context of skills acquired, but not necessarily predict them. That sounds amazingly like teaching to me.


Why is Augmenting Human Intellect So Difficult?


Augmenting Human Intellect seems like such an obvious dictum. When I first read Doug Engelbart’s proposal from 1962, my reaction was to scratch my head and think, “shouldn’t this be obvious?” Over the years, however, I came to realize just how hard a challenge he had set before us. Despite the best of intentions, most technology is implemented with a set of blinders. It is no longer sufficient to understand human physiology to create a mouse or even to understand human mental organization to create hyperlinks and collaboration tools.

Engelbart, when he did his “Mother of all Demos” in 1968, was only able to succeed (barely) because he controlled every aspect of the system he was using. None of us can say that today. We are hostage to software vendors, hardware vendors, textbook publishers, and, perhaps most importantly of all, organizational structures that fail to execute on “augmenting human intellect.” This is a shame because we increasingly have access to integrated technological solutions that would seemingly allow us to achieve the goal of constantly maximizing human potential. However, a combination of politics, technological naiveté, and poorly executed technological solutions often impair our ability to grow as teachers, learners, and even human beings.

In many ways Engelbart had it easy. He was at the early stages of this story. He only had to worry about discrete technological challenges. This is in no way meant to minimize his immense contributions, both conceptual and practical, in developing the mouse, the word processor, hyperlinks, and more. Revolutionary as they were, he could confine himself largely to the walls of the Stanford Research Institute. It is absolutely true that without his work and that of his team we would live in a diminished technological environment, but that environment has metastasized into something much, much larger and more complex.

We passed the pinnacle of what his practical vision led us to perhaps a decade ago. Operating systems were king and they were largely built upon concepts that he pioneered: gesture-based computing through a mouse or its descendants, the pointer or trackball; graphical-user-interfaces; an infinitely editable world. Internet 1.0 is largely a product of his (and Bush’s and Nelson’s) vision.

With the introduction of the iPhone, however, we started down another track. The iPhone differs from a traditional computer in that it requires a complex infrastructure of networks even to make it work. Furthermore, it is completely mobile, and this shifted the emphasis, begun initially with the Internet, decisively in the direction of systems. The Internet of Things is a direct descendant of this fork and is only accelerating the systemization of technology. Now we have to worry about connected doorbells getting hacked. On a more prosaic level, we have to worry about the doorbell connecting to the network reliably so that we can get an alert on our phones even if we are thousands of miles away. When those kinds of things don’t work, we feel limited, lacking in augmentation. The fact of the matter is that we were limited by our technologies long before this latest set of challenges, but they have brought home the scope of the challenges we face in creating integrated technology environments.

As Leonard Shlain argued, Art and Physics have long bounced against each other. Lately, moreover, we’ve seen an explosion of interest in design. Design is by its very nature holistic, so it is no accident that this is happening. We are now designing our virtual environments, and this is reflecting back onto our physical environments. We have started to recognize the limitations of all-in-one desks arranged in neat rows for our classroom experiences at the same time as we have recognized that a learning management chat room is a poor substitute for the real thing, or even Facebook. We have suddenly realized that even Facebook and Twitter lead to unexpected outcomes in communication because of their design, and the conversation has shifted around to such topics as “fake news.” The technology of education is fundamentally limited by an industrial mindset that argues that all learners are basically the same (or at least can be easily categorized) and that learning is most effectively broken down into 16-week segments, the outcome of which is one of five grades. These are every bit as much of a constraint on augmenting human intellect as tying a brick to your pencil is, and, I would argue, much, much more.

Today’s augmentation of human intellect therefore requires an understanding of technological systems and how they interact with one another. It also requires an understanding of how human organizations operate and how they are just as likely to spread ignorance as they are to spread enlightenment. They are also just as likely to make things more inefficient as they are to make things more efficient. The reason your HR system doesn’t talk to your finance system is usually not a technological problem, at least not one that isn’t solvable. It’s almost certainly a bureaucratic outcome of your HR department selecting software independently of your finance department, and of your IT department not understanding the implications of what is being specified – or not being consulted at all – before purchasing decisions are made. These are human failings amplified by technological failings, which are themselves products of the failure of the organization, itself a technology. The result is that the user attempting to achieve intellectual augmentation is instead forced to spend an inordinate amount of time dealing with things that are poorly automated. This is an augmentation failure and is in many cases even a step backward from the prior technologies being used.

The day-to-day drudgery inflicted on us by the priesthood, as Nelson so aptly puts it, eats away at our capacity to be creative, to be dreamers. For a brief time we were able to escape, in a limited fashion, the industrial world of drudgery, but this world is once again enveloping us, now augmented by its own seemingly inexorable technological and societal logic. We’ve given up some of our freedom in the service of efficiency and the need to make platforms more accessible. For instance, we used to code our own web pages and host our own web services. However, in order to spread this capability and to allow more advanced users to move on to even more advanced technologies, we’ve ceded freedom to designers and cloud providers. Even an open-source platform such as WordPress, upon which this is being published, requires that you give up a certain measure of freedom as compared to hand-coding a website. But it gives access to the web to a much larger pool of creators, so, in aggregate, it certainly augments human intellect.

The Engelbarts of today have to be systems thinkers. There are vast opportunities opening up through ubiquitous computing devices ranging from our cell phones, to beacons, to Raspberry Pis. We can build technology into everything. We can have technology everywhere. And, I hope, we can assemble, modify, reassemble, and hack technologies with increasing ease so that we can truly augment human intellect over a vast range of technological possibilities. We have to be able to go from the humble pencil to the networked blog to Twitter effortlessly. We need to spend our time creating rather than messing with poorly designed technological systems. We need to master the machines rather than letting the machines become our masters.

Engelbart shows us the way, but his real value lies in his admonition, implied as it is, that we step back and consider what we are trying to do before we consider how we do it. No one else will do that for you, and many will offer you “solutions” that actually make it harder to do what is necessary to augment human intellect. This is more of a challenge in today’s political, educational, and personal environments than it ever was in the past. We have technology to blame for that, but we also have technology to thank for the means to meet it. The difference lies in how we go about it, and Engelbart shows us the way. There’s lots of work to be done.

© 2020 IdeaSpaces
