
99% Invisible

It’s time to stop thinking about technology as technology, but it’s hard to get people to do that. Users are either fascinated by or fearful of new technologies, and these gut reactions make them blind to the “why” of it. In this three-part series, I want to spend some time looking at how design telegraphs intention to users, how we can design technology environments (both hardware and software) to telegraph augmentation instead of fear and frustration, and how we can explicitly use more overt communications strategies to augment users’ ability to leverage technology for teaching, learning, and productivity.

When designing a learning environment that involves technology, there are two guiding truths that are too often overlooked: first, the central element of effective technology is design; and second, the design employed is an implicit form of communication. Like a well-composed photograph, a well-thought-out design creates a pleasing and rewarding effect, but often the viewer can’t explain why. Architects and designers have an expression for this: “Design should be 99% invisible.” This applies as much to technology as to any building, artwork, or piece of furniture. (The reference is to the great podcast 99% Invisible, which I highly recommend to anyone interested in the topic of design in all of its applications.)

Unless you are a technician or technologist, the central function of technology should be to invisibly augment the hard tasks we face every day. This is a difficult message to convey to both technology professionals and the average user because it requires an understanding of the importance of design in technology utilization. It also requires a fundamental reorientation of perspective on technology: you have to go far beyond the relatively simple bar of functionality to the much more complex bar of invisibility. In other words, technology should never be about technology; it should be about what it enables us to do.

To illustrate this approach, I often refer to my own experiences with technology. In 1999, long before I knew about Douglas Engelbart, I made the deliberate decision to leave the IT support field, taking a serious pay cut to do so. At the time, I said I was leaving because I wanted to use technology, not be used by it. As a technology support person, you are a slave to the technology. Your job is to make sure it works, and your goals usually begin and end with the technology itself. As many of us know, getting away from this role is not as simple as walking away from it. Once people know you have a facility with computers, you automatically become tech support, whether that’s in your job description or not. Even as an instructional technologist, I’m constantly struggling to communicate to senior administrators that the primary role of my department is to support people, not technology.

My approach to computers has always been to look past them. I wanted to do things with them, not fiddle with them. The first BASIC program I wrote on the Apple ][+ was a routine that resolved combat using the Dungeons and Dragons rules. I was most interested in using the computer to simplify a real-world (sort of) problem, not in programming for its own sake. My approach to photography is similar: I want the camera to disappear so that I can focus on the image, which is a challenge.

As I like to say, it is only when the technology becomes invisible that true creativity can take place. My fixation on ends over means represents a critical distinction in how I approach technology, one that differs from most of the world. Again, technology doesn’t fascinate me. It’s what technology lets me (and those I’m trying to help) do that truly fascinates me. I don’t care if the computer works. I care whether the person is able to work, create, teach, etc., with it. Increasingly, I’ve found that the only way to achieve these lofty goals is through good technology design.

Every tool communicates to the user. A hammer says, “Beat something with me.” Even a two-year-old knows what to do with a pencil. He may not be able to write, but he can certainly scribble. These are natural interactions based on very functional design. Peter McWilliams illustrates this point through humor, recasting the simple as complex, in the now out-of-print McWilliams ][ Word Processor book.

The McWilliams book ironically demonstrates that, when it comes to what we think of as technological tools, the message our modern tools too often send is “Be afraid, be very afraid” instead of “Imagine what I can help you do.” Bad technology design is usually what gets in the way.

Combining functionality with good design is a rare trait in the technology world, and it is often what sets Apple apart. Over its history, Apple’s unique ability has been to bring good design to existing technologies and make them wildly successful. There were PCs before the Apple ][+, there were mp3 players before the iPod, and there were even tablets and smartphones before the iPhone and iPad. By adding good design to existing technology, Apple made it accessible. This is a lesson we should take to heart as we design classrooms, software interfaces, and other technology spaces on our campuses.

About eight years ago, I was given a report on “ubiquitous computing.” At the time, it was considered to be the coming thing, but I completely failed to grasp its significance. To me, “ubiquitous technology” was about technology everywhere and on-demand. What I failed to see was that “everywhere” means that technology is integrated into space designs, not just tacked on. In order to achieve this, technology design has to be invisible. It should not be conspicuous, but instead quietly augment what we are doing to the point where it makes a new world possible.

In How Buildings Learn, Stewart Brand argues that even architects struggle with this concept. They build buildings as monuments to themselves instead of concerning themselves with the basic functionality of their use. He argues that the most effective buildings are the ones that can be infinitely reconfigured to meet the needs of their inhabitants. Technology should be viewed in exactly the same way. Rigid networks, rigid computer security, and rigid implementations of big technology are all inhibitors to the use of the technology and the space in which it is placed. Classroom multimedia podiums are every bit as much a monument to technologists as a Frank Gehry building is to architects.

The key is to bring together technical and non-technical people and help them understand the critical importance of good design in executing projects that involve technology. This is an ongoing challenge for my team and me. When I try to get this point across to the designers of our technology systems, they find it a hard concept to grasp. Most are not even designers in the traditional sense of the word, so they end up designing technology environments in purely functional terms, without regard to design. The result is technology that is, to the end user, ugly and inelegant. We will never achieve real technological adoption until we fix that.

Contents

Part II: The Essence of Good Design
Part III: The Goldilocks Zone
Part IV: Toward a Holistic Approach to Learning Space Design

The Essence of Good Design

A few months ago, I delivered a talk on “Idea Spaces” at the New Media Consortium Summer Conference in Portland. It was a concept I had been struggling with for quite some time. There were many disparate threads to connect, and it was challenging to find time for reflection. Like many things in my life, the talk came rushing at me faster than I was ready for, and I was left to discern a coherent narrative amidst the complexity on a tight deadline.

Fortunately, on my flight to the NMC Summer Conference, I had time to watch two excellent documentaries. The first was Tim’s Vermeer, in which a Texas-based inventor, Tim Jenison, tries to replicate a 17th-century Dutch masterpiece and, in the process, discovers an ingenious optical tool that Vermeer likely used to paint his photographic, still-life pieces. The second film, Objectified, explores the importance of design in our lives. Both are highly recommended for anyone trying to build things, whether narratives or technologies (both hardware and software). The key idea I took away from these films was the importance of simplifying your final product without “dumbing” it down.

I decided then, on the plane, to challenge myself to remove at least five slides from my 25-slide presentation for the following morning’s talk, and judging from the reception, it turned out to be the best presentation I’d ever given. It was a tough exercise, but it contains an object lesson for anyone who designs technology solutions. When it comes to creating technology environments, our main challenge is to be simple and clear without being simplistic, and to communicate complexity in a clean, well-designed way. When making presentations, we often lose sight of the ultimate goal of our audience, which is to get something useful out of our story. The same is true when our audience is the users of our technology.

Elegant design can allow the user to interface with complex technologies in a way that is simple and approachable. This requires a discipline that is often lost on many technology designers who like to emphasize all of the neat things the technology solution can do. In the process, they lose track of the user’s goals and the cleanest narrative that is necessary to take them there. Like my task with this presentation, our work of designing technologies for education can produce overly complicated solutions because we lose sight of the end and the clearest path to get us there. Maybe it’s time to cut some slides from our technology designs.

The process for getting to clean lines is not always clear. While it is essential to engage the end users at every phase of the project, they should not, in the end, control the process. Steve Jobs famously eschewed focus groups because he understood a fundamental problem inherent in creating an elegant solution to a technology problem: users often have a poor understanding of what makes up effective design. This is counterintuitive, but what sometimes occurs is that a group of users will have a wide range of views on what they want a technology to do. Everyone starts adding their own pet need based on their own behavior patterns. The result is often a hodgepodge of features that muddies the essential end of the device in question.

At some point, you have to go in with a scalpel and start simplifying. As I have emphasized repeatedly in this series, a key way to think about this is narrative. Every piece of technology tells a story. Mostly that story is “how to use me,” but a truly effective design narrative will also inspire the user to invent new ways to use the technology instead of being preoccupied with just making it work. This is really the only way you get to the state of “augmentation” that Douglas Engelbart was striving for. In other words, narrative is as important to the function of classroom technology or learning management systems as it is to good storytelling. People who are in the story, however, usually fail to see the encompassing narrative except in retrospect.

Unfortunately, technicians, installers, programmers, and what we traditionally think of as IT are often equally poor at understanding the narrative of technology design. Storytelling is not part of most Computer Science curricula, and most people who are interested in storytelling tend to get degrees in English or one of the design fields. In a recent interview, Ted Nelson, the inventor of hypertext and an inspirational figure in the technology community, said, “Most technical people have virtually no idea how to present things to you.” This creates a serious problem, because design is presentation and storytelling.

What we should take away from this is that every technology design project should have someone on the team who is a relatively disinterested storyteller. They do not need to be technical experts, but they must understand the technology well enough to communicate and negotiate with the technical experts when those experts say, “It can’t be done.” This person should also have no technical stake in the outcome and should not be asking for feature sets. Their sole role should be to understand the story the technology is telling the user and to make sure that story is a fundamental part of the design.

Because, as it turns out, the world needs poets and artists, after all. Marshall McLuhan wrote fifty years ago in The Medium is the Message, “The serious artist is the only person able to encounter technology with impunity, just because he is an expert aware of the changes in sense perception.” Those who design technologies should leverage that expertise as much as possible.

The Goldilocks Zone

The design narrative of a technology or physical space dictates how it will be used (if it is used at all), and it should be simple and straightforward. In many cases, technology design can be almost subliminal, and once a paradigm is established, it is hard to break. If faculty members think classroom technology is complex, unreliable, and hard to use, that perception will change their approach to using the technology in the classroom, even when the technology has been upgraded to a more user-friendly design. By that point, the narrative has already been established and only a radical break will force them to reevaluate that narrative.

At the end of the day we are talking about a human interaction, not a technological one. Historically, technology frequently dictated its own narrative, sometimes unnecessarily and sometimes because of the limitations of the technology involved. However, with the advent of smaller, lighter technological solutions, it should no longer do so. We now have the opportunity to reintroduce the human narrative into the technological one; to bring the human back into human-computer symbiosis.

There is no longer any excuse for overly complex technology. All new systems need to be simple and intuitive to use. Getting to that point, however, means striking a delicate balance and landing somewhere in the “Goldilocks Zone,” the part of the spectrum where technology is complex enough to nurture creativity and learning, but not in a way that creates a barrier to entry.

In this part of the series, I want to demonstrate how we have attempted to find the Goldilocks Zone in our own classroom technology designs at Houston Community College. This has involved a complex process of continuous iteration and close cooperation between technologists, architects, furniture vendors, contractors, and, most importantly, the users. This cooperation continues to be a challenge because, as I am constantly reminded, making that kind of design leap is difficult unless you have a broad perspective on the problem. Few people have a perspective broad enough to form a narrative that encompasses the array of issues that come up in technology design.

In a recent interview, Ted Nelson compared Steve Jobs, a master of design thinking, to a movie director, because Jobs directed the Apple team the way a director directs the cinematographer, actors, and set designers in order to realize his vision of the movie’s narrative. Nelson also pointed out that Jobs’s real genius was an intuitive understanding of users’ needs rather than their wants. A needs assessment is therefore a good starting point for understanding our own design journey.

In our case, the assessment came down to three tasks that faculty said they needed to perform.

They are to:

  1. Turn the system on;
  2. Toggle between the podium PC and a laptop, if they brought their own; and
  3. Control the volume of a movie or audio clip they are playing.

If you think about it, these tasks could be accomplished with two rocker switches and a rheostat, which can be purchased for $5 at an electronics store. Amazingly, however, Crestron did not offer this simple solution in 2006-07, when we were installing our first iteration of classroom technologies. The interface at the right is what we ended up with, at $150 per room (note that none of these switches are rocker switches; they are pushbuttons).

Let’s think through how long it would take to accomplish the three tasks with this interface.

Task 1: Push the button marked PC and wait for the system to spool up (projectors take time to react, and the button has been known to get out of sync with the projector).

Task 2: You have to depend on an autoswitcher to work properly and flip the input to the laptop. You do this by plugging in the cable hanging from the side of the podium.

The video switch on the panel is only useful for VCRs and could not be repurposed as a laptop switch; very few faculty have ever used VCRs with these systems. The problem with this part of the system is that if it fails to work, the user has no idea why, and there is nothing they can do to troubleshoot other than to assume that the “magic” failed to happen.

Task 3: The volume control may be the only intuitive part of this interface.

The lesson here is that the narrative interface presented to the user appeared to be simple. However, in actual practice, it turned out to be opaque and unintuitive for the tasks it was designed to facilitate.
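
To make the contrast concrete, here is a minimal sketch of how little logic the three faculty tasks actually require. This is a hypothetical illustration only: the class and method names are invented, and it is not Crestron’s API or the code running in our rooms. It simply models the “two rocker switches and a rheostat” narrative.

```python
# Hypothetical sketch of the three-task classroom control narrative.
# Names are invented for illustration; this is not any vendor's API.

class SimpleClassroomControl:
    def __init__(self):
        self.power_on = False      # rocker switch 1: system power
        self.source = "podium_pc"  # rocker switch 2: podium PC vs. laptop
        self.volume = 0.5          # rheostat: 0.0 to 1.0

    def toggle_power(self):
        # Task 1: turn the system on (or off)
        self.power_on = not self.power_on

    def toggle_source(self):
        # Task 2: switch between the podium PC and a connected laptop
        self.source = "laptop" if self.source == "podium_pc" else "podium_pc"

    def set_volume(self, level):
        # Task 3: control playback volume, clamped to a sane range
        self.volume = max(0.0, min(1.0, level))


if __name__ == "__main__":
    room = SimpleClassroomControl()
    room.toggle_power()   # instructor flips the system on
    room.toggle_source()  # plugs in a laptop and switches to it
    room.set_volume(0.8)  # turns up a video clip
    print(room.power_on, room.source, room.volume)
```

That the entire user-facing story fits in a few lines is precisely the point: anything the interface adds beyond this is complexity the user has to absorb.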

The next option up at the time (~2007), for $1,500 more per room, was to use the PC to control the system through software called XPanel. $1,500 per room is not a trivial amount when your project requires outfitting 150 rooms (that’s $225,000 more). As I wanted to outfit all of my rooms uniformly, that was never a realistic option. Cost aside, this system is no more effective than the first at accomplishing our three tasks intuitively and elegantly.

Let’s go back to the three tasks: In the first screenshot, which is the desktop when you wake up the computer, where is the software to make the system work? (Hint: The red-green-blue icon in the toolbar.)

The way to turn on the system is not entirely apparent. Now, lest you say, “Why don’t you put an icon on the desktop labeled ‘click here’?” We did. Everyone moved it around, so you couldn’t find it most of the time.

The system is better for switching once you have the software launched, as the second screenshot illustrates. However, there are some issues with buried menus, and the volume control is less intuitive and somewhat fussy. At the end of the day, this system requires as much training and support as the “simple” system (and costs a whole lot more to boot).

We get as many support calls, if not more, from the three rooms in which we installed these systems as we do from rooms with the more basic system. The lesson here is that higher cost doesn’t necessarily result in a better-designed system.

Our new systems offer a much simpler interface at only slightly more cost. At first glance, it looks low-tech, but that was the intent. In Make It So, designers Nathan Shedroff and Christopher Noessel note, “new interfaces are most understandable when they build on what users already know.” (p. 19).

The closest analogy for the multimedia classroom is probably the TV remote. From the user’s perspective this interface is merely a TV remote attached to the wall. The design is simple, clean and efficient. Everything is out front. The three tasks all have their own set(s) of hardware buttons. Once we got that right, the number of calls from these rooms plummeted, and the users were much more satisfied with their experience in the classroom.

The other thing well-executed systems do is increase the reliability of the rooms. They are very simple to build, maintain, and upgrade. These systems rely on the TV for most of the switching and therefore have very simple functionality on the back end, which makes them far more reliable and quicker to troubleshoot. At the same time, they are expandable enough to encompass new technologies as they emerge and to add functionality users might need, without disrupting the basic narrative of the three tasks outlined above.

All of these factors make the narrative of the classroom much more about possibilities than limitations. Users can access the technology in an intuitive way. They can rely on it to work a high percentage of the time, and we can add on new features fairly simply, such as wireless instruction with iPads or other mobile devices as they become viable within the constraints of our network. This means that the faculty and students can focus on learning and innovating. In this way we are actually augmenting their experiences rather than hampering them.

Toward a Holistic Approach to Learning Space Design

In one of my favorite quotes from The Medium is the Message, Marshall McLuhan wrote, “The serious artist is the only person able to encounter technology with impunity, just because he is an expert aware of the changes in sense perception.” In other words, since artists fundamentally deal with bending a technology, whether it’s photography, paint, or the violin, they have to be acutely aware of the narrative that that technology produces.

Design operates on the same principle. Those who have a comprehensive sense of technology, design, and the narrative it creates understand how easily that narrative is disrupted, just as a dropped note or a misplaced brushstroke provokes a jarring departure from the vision the artist is seeking to convey.

Classrooms have a narrative structure that most people don’t see. Technology is part of that narrative, but so are the furniture, whiteboards, and even the color of the walls. What story does this room tell?

The narrative of this room is that the action is all in the front. The projection system and the whiteboard are there for the instructor’s use only, and all learning-focused interactions are within the instructor’s control. Your neighbors are largely irrelevant to the experience. The colors are interesting, but they take a secondary role to the dynamic of the space.

In short, the story here is: Look at me. I’m in front. I’m all that matters. Compare that to this:

This room, one of our pilot learning spaces, is a bit more collaborative than the other room, but it’s not perfect. The focal point is still at the front of the room with the projection system, and some of the desks are oriented in that direction. It straddles the line between a truly collaborative setup and a lecture hall, and the result is confusing from a narrative standpoint. I’ve observed that this room is still used about 80% for lecture and 20% for collaboration. When a new narrative is not obvious, the default position is usually to go back to what you know, and that is your traditional teaching style. The lecture method is also safer because, as a teacher, you control the narrative of what is going on in the classroom, and in an unfamiliar environment that control becomes doubly attractive.

The University of Minnesota (UM) has taken narrative disruption a step further and created a truly collaborative classroom where it is hard to fall back on old habits.

Active Learning Classrooms (2012) from Andy Underwood-Bultmann on Vimeo.

We are building five of these classrooms into our new spaces, and I’m excited about the promise that they have in reshaping the narrative of teaching so that it’s expressly focused on collaboration. In comparison to our pilot learning space, the UM’s Active Learning design gives students ample outlets to create their own content and work through topics at a peer-to-peer level. Whether they are hooking up to displays connected to pods, or taking advantage of the whiteboards, they have assumed control of interactions within the classroom, and as such, have assumed ownership of their learning. The teacher is still an important facilitator in this process, but has to give up control in order to let the students create their own paths to learning goals.

Weaving in a principle from one of my previous posts, the appearance of technology in a room dictates the narrative of the learning space. A key design goal here was to minimize the impact of technology in the classroom while maximizing its ability to augment the learning experience of the students. By turning the focus away from the front of the room, and instead, to the space between students, the narrative of the classroom becomes clear: the room has become a place for teaching and learning rather than a place with flashy and sometimes working technologies.

One key component here was moving the computer away from the podium and onto the wall. In this way, the podium reverts to its original role of being a piece of furniture rather than a storage place for technology. This is critical because it creates an opportunity for the instructor to decide how he or she will place and use it; the podium is a key mechanism that determines the power narrative within the classroom. Note that UM’s Active Learning classroom doesn’t have a podium in the traditional sense.

The role of the podium in a classroom deeply affects the narrative of the classroom. It is a symbol of power and authority. Some faculty even go so far as to hide behind it. Others try to minimize its role within the room because it disrupts the active/collaborative narrative that they are trying to establish. Once the podium is freed of technology, it can be used in the way that makes the most sense. Personally, I want to get it out of the way. However, the one thing that I have done in these classrooms that has caused the most resistance amongst the faculty has been to replace the podium with a lightweight lectern. For some faculty, this was a step too far and serves as a reminder of how difficult it is to reshape the larger narrative of the role of the teacher in the classroom.

This also reinforces the central question of design: When a student or a faculty member enters a space, what does that space communicate to him or her about what kind of teaching and learning happens there? The subtlety of this narrative can easily be subverted by any number of bad design decisions. Remember that my goal has been to create an environment where technology augments instruction. I’ve succeeded when the teacher can control the narrative within the class more effectively than before, and we’ve given him or her an expanded set of tools not available in the pre-technology classroom.

I do not think we have ever truly succeeded in this effort. For instance, can you spot the critical flaw in the room pictured? I have succeeded in my technology goals for the most part: the tech is off to the side. However, the alignment of the tables and chairs is rigid, since they cannot be moved easily. If everyone is facing the front, that implies authority, and the lecture method of instruction is the natural consequence of this arrangement.

There are ways to mitigate this. In one department, they’ve aligned the tables so that the students face each other rather than the front of the room. However, most do not, and no one else I know of spontaneously rearranges this furniture; it’s too heavy and inflexible. If we are moving to a learner-centered, constructivist narrative in our teaching and learning, this room does not communicate that message.

As you can see, many things can degrade the utility of a classroom if the design, technology, or other factors cloud the intended narrative. If we build learning spaces and technology around the old model of teaching, we should not be surprised if that is the kind of teaching that takes place. We must accept this reality if we stand any chance of reinventing the narrative of higher education going forward. Mindful design is key; a misplaced button, the wrong kind of furniture, or even the color of the walls can completely overshadow the message you are trying to send.

The last room illustrated was designed by a project manager, an architect, a furniture vendor, a technologist (me), an installation vendor, and a facilities manager. As such, it is the sum of its parts and nothing more, because it lacked a directorial vision centered on mindful design. Many of its elements failed to consider the desired behaviors of its users or the goal of learning because, other than me, no one on that team had ever taught or experimented with active learning strategies.

Creating modern learning spaces has to be a seamless collaboration of many entities with clear pedagogical goals in mind. As a member of the design team for two major campuses at Houston Community College, keeping this goal front and center is my full-time challenge and one that many of the other participants in the process do not consider a priority. Whether our next projects will achieve a truly mindful classroom design remains to be seen. I think we have yet to create a classroom that lives up to Steve Jobs’s design standards, but we’re getting closer.

Introducing the ELITE Strategy


As I have written many times before, the world we live in today is becoming more and more dynamic. This means that the old industrial silos of information and specialization are becoming a problem. That’s because, to borrow from Frans Johansson, we are increasingly working in the intersections between traditional specializations. What matters now is not so much the integrity of the design but how that design communicates, flexes, and supports the missions of the space it was built for. This has always been true to a certain extent, but never more so than today.

As I have learned from years of working with architects, living in an intersection is challenging. One of the main reasons for this is that both sides of the conversation often fail to realize that the other side is not understanding them in the way they intended. Furthermore, neither side clearly understands how technology, analysis tools, and the speed of change are impacting their jobs and the environments in which they work.

The ELITE Strategy is specifically designed to address this problem. ELITE, which defines a hierarchy of spaces, both physical and virtual, is based on the application of data on active and empowered learning as well as on innovation support as practiced by companies such as Google and Facebook. It forms the cornerstone of the Space aspect of The IDEASPACES project. In an educational environment, ELITE stands for Empowering Learners, Instructors, and Technological Empowerment. In a corporate environment the same rubric applies, but its focus shifts more toward innovation facilitation; there the acronym stands for Employee Learners, empowering Innovation Teams, and Technological Empowerment.

There is no question that empowering students and employees improves overall performance. In the case of students, Active and Empowered learning leads to a significant reduction in course failures, according to the National Academy of Sciences, so we should be constructing learning spaces to facilitate such activity. The data shows a simple fact: if students can control significant parts of their learning experience, they will do better in school. Most traditional classrooms do not give the student much control; he or she is left listening to information (much of which is freely available elsewhere and is often of questionable relevance). In the case of employees, companies such as Google, 3M, and others have successfully implemented employee empowerment time to support innovative projects.

Physical and virtual spaces have to support these kinds of activities, and in many cases they don’t. The reasons for this are twofold. First, those using the spaces (students, teachers, employees, managers, etc.) do not necessarily understand the difference between a design that supports their intended activities and one that obstructs them until they start to use the space. In some cases these spaces can be adapted to meet the needs of the moment; others are particularly ill-suited to that kind of adaptation. In any case, the goal should be to optimize the environment for active learning and innovation, not to place barriers in the way of such activity. On the other side of the problem, architects, interior designers, and software developers often have a poor understanding of what it takes to give Learners (traditional or employee) control over their environment so that they can shape it at will and on the fly. The LIT Strategy is designed to provide a common vocabulary for the analysis and design of spaces meant to facilitate learning and innovation at the individual level.

Organization is important for structuring learning and innovation. Under Active Learning, the role of the teacher, librarian, guide, or team leader is critical in steering innovation and learning down productive paths. Instructors and Innovation Team Leaders are perhaps the most critical link in establishing Liquid Networks, as Steven Johnson refers to them. In order to do this, they need to be empowered to create and recreate environments for the group, whether that be a classroom or a team, on the fly. Many of the strategies that empower Learners also empower this level. However, larger changes that impact groups of people will often amplify the learning effect, stimulate larger network activity, and transform school from a class into a community. The typologies and tools at this level are explicitly designed to create this kind of effect.

Finally, we have to flip the problem of empowered teaching, learning, and innovating by considering the impact of Technology on spaces. One of the limitations of most cutting-edge technology is that it tends to create environments shaped around the technology rather than around user empowerment. Often this results in spaces that are poorly designed for teaching and learning. The good news is that this technological imperative is becoming less and less of an issue as the technology itself trends toward more physically lightweight solutions. MakerSpaces are specifically designed around lightweight, flexible technological solutions.

Siloed environments like auto shops or metal shops are becoming more archaic as the technology in them is outmoded or replaced by more automated, less dangerous solutions. However, there is still a need for these kinds of spaces in many environments. If our focus is on empowering learners and innovators, these spaces should be made as receptive to those modes as possible by introducing elements of Learner and Instructor empowerment into them. They also have to be able to react to changes in technology and programmatic needs as easily and cheaply as possible.

The ELITE Strategy is designed to create environments that augment learning and innovative activities at all levels. It is specifically designed to maximize utility while minimizing costs. This has the added benefit of making the physical and virtual space robust in the face of the unexpected changes in needs and technology that are sure to disrupt education and business in the coming years. It also reduces M&O expenditures by creating spaces that require minimal remodeling or infrastructure adaptation to meet future needs.

ELITE can be used as a design tool or an analysis tool. As an analysis tool it provides a framework to analyze existing floorplans or virtual spaces to look for opportunities to easily and cheaply rethink spaces. The system is complemented by a set of cards that establish a typology and tool set for programming and design exercises as well as brainstorming future spaces. While the card exercise is directed primarily toward physical spaces, this strategy can also be used to design virtual spaces.
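
For those who think in concrete terms, here is a rough sketch of how an ELITE-style analysis pass might be encoded. It is purely illustrative: the criteria, names, and checks below are my own stand-ins, not the actual IDEASPACES card set or typology.

```python
# Hypothetical sketch of an ELITE-style checklist for reviewing an existing
# space. The criteria and names are invented for illustration; the real
# IDEASPACES cards define their own typologies and tools.

from dataclasses import dataclass

@dataclass
class Space:
    name: str
    movable_furniture: bool           # Learner level: can students reshape the room?
    writable_surfaces: bool           # Learner level: can they capture their own ideas?
    instructor_reconfigurable: bool   # Instructor level: can the room be regrouped on the fly?
    tech_unobtrusive: bool            # Technology level: does tech augment rather than dominate?

def elite_review(space):
    """Flag cheap, easy opportunities to rethink a space, level by level."""
    findings = []
    if not space.movable_furniture:
        findings.append("Learner: furniture cannot be rearranged by students")
    if not space.writable_surfaces:
        findings.append("Learner: no student-accessible whiteboards or displays")
    if not space.instructor_reconfigurable:
        findings.append("Instructor: room cannot be regrouped on the fly")
    if not space.tech_unobtrusive:
        findings.append("Technology: the technology dominates the room's narrative")
    return findings

if __name__ == "__main__":
    pilot_room = Space("Pilot learning space", movable_furniture=False,
                       writable_surfaces=True, instructor_reconfigurable=False,
                       tech_unobtrusive=True)
    for finding in elite_review(pilot_room):
        print(finding)
```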

Creating Spaces for Creativity

These articles were originally published on PBK.com in 2017.

What are Creation Spaces?

Creation spaces are spaces that facilitate creativity among their users. As technology creates opportunities for building and making new things that were previously the domain of industrial concerns, these spaces give students the ability to create tangible objects as part of their learning journeys. The traditional version of this kind of space is the art studio, which maintains its relevance and is enhanced through the addition of new technologies. However, new kinds of spaces are emerging to support specific kinds of creation. On the software side, One-Button Studios allow students to create videos and other multimedia content easily. On the hardware side, a wide range of MakerSpace environments allow students to create everything from robots to sculpture to furniture in a single environment.

How big an Investment are Creation Spaces?

Investments in creation spaces range broadly, from a few hundred dollars to millions of dollars for a fully outfitted MakerSpace. Space requirements range from a few dozen square feet for an in-class Creation Station to 15,000-20,000 square feet for a Fabrication Lab whose functional range extends to woodshop and metalshop services. Staffing costs likewise range from practically nothing for a One-Button Studio to the specialized instructor required to maintain safety in a woodshop-type environment. Middle-of-the-road environments like the Design Lab allow institutions to right-size facilities, maximizing the resources available to their students while minimizing costs and overhead.

In all of these spaces, equipment costs have declined steeply over time as equipment that previously required specialized staffing has become increasingly automated. In parallel, safety concerns have declined as that same equipment has become simpler and safer to operate, allowing a less specialized staffing model. For instance, laser cutters can perform many functions that used to be done with bandsaws, jigsaws, and other cutting equipment; they are automated, require less manual skill to operate, and are enclosed, so they pose few safety concerns. These trends expand access to creativity-enhancing tools to a wider audience while costing institutions less and less to build and operate.

What are the benefits to instruction?

The economic logic driving down costs in technology and staffing is also driving change in the private sector, making the skills acquired in this kind of environment very desirable on the job market. Companies are recognizing the benefits of in-house design and fabrication over the delays and costs traditionally associated with prototyping. In addition, we are seeing a wide range of entrepreneurial activity arising in response to the low startup costs associated with many of these technologies. Learning by doing and the development of a portfolio are highly desirable educational outcomes facilitated by these kinds of spaces.

One of the key aspects of creativity is access to the tools. You can’t control when good ideas happen. These spaces help students of all ages execute on their inspiration, whatever that might be. Therefore, those spaces need to be accessible when the magic happens, whether that is in the context of a classroom environment or in a dedicated space. As mentioned in the previous section, that challenge has become easier as the tools have become more accessible to non-craftspeople. However, the effectiveness of those tools is blunted if they are not easily accessible.

Another important aspect, especially at higher grade levels and in higher education, is the potential for these spaces to become interdisciplinary environments where students learn to work in complementary teams. As Frans Johansson points out in The Medici Effect, the Renaissance was in large measure a product of patrons such as the Medici bringing diverse thinkers from a broad range of disciplines together. The Industrial Age reversed this trend as people became more specialized within their disciplines. However, the Post-Industrial Age is reversing that trend yet again. Creation Spaces can become hubs where budding artists, entrepreneurs, engineers, and scientists come together to realize their collective visions.

Creation Station

Easily Accessible Basic Tools Can Broaden Horizons, Sharpen Skills

What is it?

A Creation Station is a micro-MakerSpace designed to fit within the confines of a classroom. The idea is to allow for spontaneous creativity and to support programmatic activities within the classroom. The equipment is tailored to be unobtrusive when not in use but to give teachers and students easy access to Maker tools.

How big an Investment are Creation Stations?

The idea behind these spaces is to maximize access to Maker tools for as many students as possible. The Creation Station is designed to fit into any traditional or NextGen learning space, giving instant access to the tools available there. Depending on the age of the students, a basic vinyl cutter ($200-300), a sub-$500 3D printer, craft supplies, some Raspberry Pis or Arduinos (inexpensive microcomputers and controllers that can be used for a range of educational projects), and a large set of Legos would support most projects at the K-5 level and beyond. Coupled with existing desktop space, this kind of setup could be deployed on top of a 4’x8’ table with room for storage underneath. Some teacher training will likely be necessary to maximize the utility of these spaces, but no specialized personnel beyond the usual IT support would be required.

Careful selection of the tools available is critical to the success and utility of a Creation Station. Since these spaces are designed to be used by non-specialized teaching staff, barriers to entry should be a primary consideration. We have seen too many unused 3D printers that have proven either hard to program or difficult to keep in operation; newer 3D printers are designed to be accessible to a wider audience. Basic vinyl cutters such as the Cameo system also provide easily accessible Making with minimal training and skill. The focus of the instructor should always be on the instructional goals, not on fiddling with the technology.

What are the benefits to instruction?

Introducing a Creation Station into a classroom environment gives a teacher the ability to introduce directed, experiential activities on the fly. This has significant impacts on learning outcomes. Professor Michael Prince, in a review of the Active Learning literature, writes, “Introducing activity into lectures can significantly improve recall of information while extensive evidence supports the benefits of student engagement.” (Prince, Michael, “Does Active Learning Work? A Review of the Research,” Journal of Engineering Education, July 2004)

Having Maker capability also aids in the fusion of disciplines, a critical skill in the future workforce. Robert and Michele Root-Bernstein wrote in a recent article for the Association for Supervision and Curriculum Development that “finding ways to foster arts education alongside science education—and, even better, finding ways to integrate the two—must become a high priority for any school that wants to produce students capable of creative participation in a science-dominated society like ours.”

Having ready access to a carefully selected set of tools within the classroom gives teachers and students the ability to express or understand critical concepts through making. Some of these items might fall under what we traditionally see as arts and crafts, but new tools such as automated 3D printers and vinyl cutters increase the complexity of projects that were traditionally the province of scissors, colored paper, and clay. They also begin to introduce students to concepts like coding and design that they can build on in later grades. Software tools like TinkerCAD and Scratch provide easy avenues for students to access these advanced tools.

Sample Equipment List (approximate cost $1500-3000 per room):

  • 3D Printers (Quantity 2)
  • Silhouette Curio Vinyl Cutting Tool (Quantity 1)
  • Arduino Starter Kit (Quantity 1)
  • Raspberry Pi Starter Kits (Quantity 3)
  • PCs or laptops (Quantity 1-3)
  • 650 Piece Lego Basic Bricks (or equivalent) (Quantity 1)
  • Little Bits Electronic Base Kit (Quantity 1-3)
  • Misc. Supplies (Filament, Vinyl, etc.)

Design Lab

What is it?

A Design Lab is a lightweight MakerSpace designed to fit into a classroom-sized space. Through a careful selection of tools it can perform many of the functions of a full-sized Fabrication Lab without the specialized infrastructure and safety concerns of a more robust setup. Another critical feature is that the space should be as accessible (and visible) to the campus as possible. Students will be curious about the activities taking place there and will want to participate, but only if they know that it exists.

Accessibility must be coupled with that visibility. While some programmatic activity is desirable, at the end of the day this has to be a project-driven space. Therefore, it must be accessible when students have the time and inspiration to work on their projects. Additionally, within the bounds of safety, students need to be given maximum freedom to explore projects. What seems like a silly T-shirt design today might lead to a complex 3D-printed artifact tomorrow. The experience of doing will lead to learning.

The library offers a good access model, and one approach might be to convert part of an existing library into such a space. However, the Design Lab is not a quiet space, so if quiet is a priority in a school’s library, it might make sense to put the lab in a separate room.

How big an Investment is a Design Lab?

The idea behind the D-Lab is to assemble a usable set of light tools into a resource space. Typically, 700-1,000 square feet is sufficient. It is important to leave a portion of the space open for people to spread out their projects; heavy-duty (hard-surface or butcher-block) tables are recommended for this area. A selection of 3D printers is usually expected. Other equipment can include vinyl cutters, PCs, soldering irons, miscellaneous electronic components, and a standard set of hardware tools. If minimal ventilation can be provided, a laser cutter is a good addition as well. All told, the equipment in this space should total no more than $20,000-30,000. Adequate power is critical. If this is a conversion project, a computer lab is a good choice for the base space, as such rooms typically have sufficient power for the various devices. If not, electrical upgrades through the addition of outlets might be required.

Ongoing expenses are fairly manageable. While it is important to have a full-time staff member overseeing the space, part-time student workers and even volunteers can be used to staff it. Minimal skills are necessary to run the equipment in this kind of space, and much of what is needed will likely be learned on the job in any case.

As far as supplies are concerned, a small supply budget is appropriate for programmatic activities and demonstration projects. However, the expectation is that students bring their own supplies into the space. At the college level, campus bookstores will often stock 3D printer filament, which is also widely available online from vendors like Amazon. Other supplies can commonly be found in home improvement stores like Home Depot and Lowe’s.

What are the benefits to instruction?

The Department of Education says, “Through making, educators enable students to immerse themselves in problem-solving and the continuous refinement of their projects while learning essential 21st-century career skills, such as critical thinking, planning, and communication.” The benefits of having a space such as the D-Lab in every school, particularly at the middle and high school levels, are significant. What is even more exciting is that every one of these spaces we have created has generated unexpected ideas and benefits for both the users and the schools in which they are located. Students have developed innovative solutions to everything from school signage to food production. These spaces are essential breeding grounds for ideas in the 21st century.

A student-built Food Computer at Houston Community College’s Alief D-Lab

SAMPLE EQUIPMENT LIST:

  • 3D Printers (6-8 of various types)
  • Laser Cutter (optional) – Glowforge could be a game changer in this area because it doesn’t require additional ventilation like more traditional laser cutters
  • PCs (6-8)
  • USCutter 34” Vinyl Cutter (1)
  • Arduino Kits (1)
  • Little Bits Kits (2-4 various kits)
  • Raspberry Pis (10-12) – students often purchase their own
  • Toolbox and basic tool set (1)
  • Butcher block or other hard surface work tables (3-5)
  • Misc. Supplies (filament, vinyl, etc.)

Fabrication Lab

What is it?

The Fabrication Lab is more typical of what most people think of when they hear the term “MakerSpace.” In addition to the tools typically found in the Design Lab, these spaces usually include advanced fabrication areas such as metal shops, wood shops, and machine shops that require safety protocols and specialized staffing.

The challenge is figuring out what that audience of collaborators needs and how best to design and set up a space to meet those needs. There are a number of developed models, on both the educational and the for-profit side. On the educational side, MIT has developed its FabLab concept; on the for-profit side, TechShop maintains a network of 10 for-profit MakerSpaces throughout the country. Both models are designed to create environments where a broad spectrum of tinkerers and artisans can come together.

Generally speaking, the Fabrication Lab is most appropriate for older learners in high school or higher education. It contains many pieces of equipment normally found in CTE (career and technical education) spaces. Schools should consider the benefits of creating MakerSpaces to replace traditional CTE spaces, as technology is rapidly moving toward a more generalized toolset and students would be better served through access to such a space. The US Department of Education recently sponsored a competition to help high schools and colleges convert existing CTE spaces to the MakerSpace format.

One additional factor that needs to be considered is to what extent this space will be open to the community. Opening the space up carries many benefits, not the least of which is bringing in talent with the capacity to enrich and broaden the learning experience for all concerned.

How big an Investment is a Fabrication Lab?

The investment in a Fabrication Lab is not trivial, and schools need to go into the development of this space with an understanding of the staffing required to operate it successfully. Like other MakerSpaces, the Fabrication Lab is most useful when it is treated as a resource like a library and is accessible to the community (either inside or outside the school) as much as possible. It should be treated as a collection of tools designed to facilitate creative activity. The TechShop design is a creative approach that segregates the low-risk equipment, which generally follows the model of the Design Lab, from the more dangerous wood shop, metal shop, and machine shop components. The MakerSpace at the West Houston Institute was designed using this model.

The West Houston Institute MakerSpace. Areas in green are generally accessible and contain equipment that can be facilitated by part-time employees. Access to the areas in red is limited because they contain equipment that could be hazardous and needs to be overseen by specialists in its operation. We anticipate that, over time, equipment will increasingly be deployed into the “green” areas, and provision should be made to convert one or more of the “red” areas into a more accessible zone.

Generally speaking, a MakerSpace containing all of the aforementioned components should take up 12,000-15,000 square feet. Maximum flexibility is required so that equipment can be swapped out as it becomes redundant or obsolete. This requires a robust power network and/or one that allows power to be easily moved from one part of the space to the other. Easy access to a street or driveway with a garage or a loading dock is important in order to allow equipment or materials to be moved in and out of the space. Access to an outside work yard is also a benefit to many kinds of projects.

The Fabrication Lab has the greatest flexibility when it comes to the final components in the design and can be highly customized for programmatic purposes. For instance, a metal shop or wood shop might not make sense for some structures but perhaps an expanded machine shop or a specialized area focused on automotive technology could be substituted for those areas in the plan.

Issues of access also need to be considered from a safety standpoint. Increasingly, you can do with a very safe laser cutter what was previously the province of much more dangerous tools like bandsaws and jigsaws. Ultimately, the space may need to be reconfigured to allow greater access to areas that have become substantially safer to operate.

The Main Assembly Area of the West Houston Institute before equipment is installed. Note the flexible ceiling power and the open deck that allows for maximum flexibility for future equipment changes. The Wood and Machine Shops are on the far side of this space but still visible through windows. The idea is to allow for scaffolding users from less complex tasks taking place in the Main Assembly Area (3D Printing, Electronics, Laser Cutting, etc.) into the more complex areas.

Equipment costs are falling rapidly, and through strategic selection the basic toolset for this space can be assembled for under $1 million. In some cases, such as 3D printers, the more expensive models are actually not as useful to the vast majority of users: the learning curve necessary to operate them is particularly steep, and the output is not significantly more useful than that of much cheaper models. In many cases, supplies are also more expensive and harder to obtain the more costly the piece of equipment is. Some of the more complex pieces of equipment might be appropriate in a Production Space but might not find much use in a space designed for maximum accessibility. Tools always need to be adapted to programmatic needs.

Wherever possible, the accessibility of the tools needs to be considered when drawing up an equipment list. Finally, because tool costs are dropping rapidly and the technology is shifting so fast, equipment should be one of the final items purchased in any building process; the cost of many of these technologies can drop significantly between specification and purchase. For instance, comparable 3D printers dropped in price from $3,500 to $500 over the course of the development of the West Houston Institute MakerSpace. These kinds of price adjustments, coupled with the emergence of new technologies that could do the same jobs at a fraction of the cost of the original solution, resulted in overall savings of 40 to 50% in the project’s equipment budget, totaling almost half a million dollars.

Staffing is also more complex in these kinds of spaces. In the “safe” zones, staffing the area with part-time students is still an option, but the more complex areas such as the wood shop, metal shop, and machine shop will require specialized staff, usually faculty, trained in the safe operation of the equipment there. The more staffing you provide, the more accessible the space will be, and therefore evening or night staffing might prove necessary. Certified volunteers might be an option in some areas, especially if you intend to allow community access to the space during non-instructional hours.

If your model allows it, off-hour use can become a profit center for the space. Arizona State University has developed a partnership with TechShop in Phoenix to operate such a space, shared by students and entrepreneurs from the community. It is possible to replicate this model without a private partner, and different economic models may be appropriate for different circumstances.

What are the benefits to instruction?

We have already discussed the benefits of Making extensively in the sections on Creation Spaces, Design Labs, and in the blog Hacking School. A broader set of tools will benefit a wider range of programs and curricula than the less complex spaces can. At the high school level, CTE programs can and should be rethought with Making in mind, as Making creates a more flexible path toward the same learning outcomes. It also future-proofs the spaces far more effectively than more technologically (and architecturally) rigid spaces do. At the college level, more advanced programs in the STEAM fields will gain access to a wider range of potential projects for their students. Additionally, bringing in entrepreneurs, engineers, artisans, and craftspeople from the community brings in a whole range of "teachers" capable of providing valuable lessons and even partnerships for students working in the space.

RESOURCES:

EQUIPMENT LIST: There are many configurations for Fabrication Labs, and this will have a significant impact on the equipment list. The best way to approach the equipment list is to start from a baseline and then customize based on programmatic needs. MIT has developed just such a baseline through its Fab Lab Standard. That equipment list can be found at: https://docs.google.com/spreadsheets/d/1U-jcBWOJEjBT5A0N84IUubtcHKMEMtndQPLCkZCkVsU/pub?single=true&gid=0&output=html

Back to Contents

 


The One-Button Studio

A One-Button Studio at Houston Community College

What is it?

Penn State University originally created the One Button Studio to eliminate barriers in the multimedia production workflow, especially for video. Its design is meant to provide students and faculty who have little experience in video production with the ability to seamlessly produce high-quality content for assignments, podcasts, online learning, and a host of other uses.

The workflow is designed to be extremely simple. The user enters the room with a USB flash drive and inserts it into a reader. There is literally a single button that, when pushed, fires up the lights, turns on the camera, and starts a countdown timer. The user steps in front of the camera and makes a presentation. At the end of the session, the button is pressed a second time and the system encodes the video into a compressed file. In a few seconds, users have a finished video on their flash drives that can be immediately uploaded to a wide range of online platforms.
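To make the workflow concrete, here is a minimal sketch of the control logic described above. This is illustrative Python only, not Penn State's actual software; the device interfaces, class name, and output file name are assumptions made up for the example.

```python
# Illustrative sketch only: not Penn State's actual control software.
# The device interfaces (lights, camera, encoder) and the output file
# name are assumptions invented for this example.

class OneButtonController:
    def __init__(self, lights, camera, encoder):
        self.lights = lights      # lighting relay, e.g., a home-automation controller
        self.camera = camera      # video capture device
        self.encoder = encoder    # compresses raw footage into a small file
        self.recording = False
        self.output_path = None

    def on_drive_inserted(self, drive_path):
        # Arm the studio only once a USB drive is present to receive the file.
        self.output_path = f"{drive_path}/presentation.mp4"

    def on_button_press(self):
        if self.output_path is None:
            return  # no drive inserted, nothing to do
        if not self.recording:
            # First press: lights on, camera rolling, countdown begins.
            self.lights.on()
            self.camera.start()
            self.recording = True
        else:
            # Second press: stop recording, compress, and drop the file on the drive.
            raw_footage = self.camera.stop()
            self.lights.off()
            self.encoder.compress(raw_footage, self.output_path)
            self.recording = False
```

The point of the sketch is that the entire user-facing interface reduces to two events, inserting a drive and pressing the button, which is what keeps the staffing and training requirements described below so low.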

The system consists of a ceiling-mounted lighting system, a Mac Mini computer, a microphone, and a digital video recorder. All of these are tied together with a simple set of controllers borrowed from home automation systems.

The biggest design and budget question institutions have to ask themselves is what kind of display system they want to give users. The simplest solutions put up a green screen (for chroma-key compositing) or a plain background behind the speaker. A wide-throw projector, however, when coupled with a second PC, gives users access to PowerPoint, the Internet, and other technologies to enrich their videos. Adding touch through a system such as an Epson BrightLink interactive projector allows users to seamlessly interact with that content.

How big an investment is the One-Button Studio?

In its simplest configuration you can set up a One-Button Studio in 80-100 square feet. Some electrical work is usually required to mount the lights to the ceiling and, if necessary, to mount the projection system. Electrical outlets will be needed for the computer(s) that drive the system as well. With those pieces in place the technology is relatively inexpensive. A basic system with a simple backdrop will cost around $7000. Adding a second computer and projection system adds $3000-5000 to that cost.

Since most of the parts in this system are off-the-shelf computer components, most IT personnel will be able to support it with minimal training. Operating the system requires little or no staff. One institution we have worked with simply keeps a key to the room at the front desk of the library, and students are responsible for checking it in and out. The system is self-operating, so no dedicated staff is required.

What are the benefits to instruction?

“We live and work in a visually sophisticated world, so we must be sophisticated in using all the forms of communication, not just the written word.” – George Lucas

New state and national educational standards are emphasizing the importance of creating across a broad range of media using technologies that are increasingly available to students. The problem to this point has been one of access, both in terms of cost and the skills required to operate video equipment. Full-blown studios can cost more than a million dollars and require specialized staff to operate them. Most institutions cannot afford this kind of investment at scale. As a result, students are unable to access professional video production. The One Button Studio solves the access problem by providing a relatively inexpensive solution that doesn’t require specialized staff, yet produces professional-looking content.

The uses of this technology in instruction are expansive. At Penn State, in the first two academic years after the One Button Studio was built, over 8,800 students – more than 10% of the main campus's student body per year – recorded well over 13,000 videos, equating to a staggering 658 hours of recording. At other institutions, students have created entrepreneurship pitch videos, mini-lectures (teaching is one of the most effective ways of learning), club announcements, and many other kinds of videos. Faculty members have recorded short videos on particularly thorny instructional items, such as solutions to complex equations or the technicalities of the Senate committee structure. These examples demonstrate that the One Button Studio is a tool that can be applied to a wide range of educational challenges and needs.

Unbound: Teaching in the New Media Landscape

(Originally published on the nmc.org website February-September 2013)

In the Spring Semester of 2013 I embarked on an experiment to see if I could teach a government class based on the principles that I have been advocating:

  1. Create opportunities for collaboration and teamwork among the students
  2. Allow the students to explore the material for themselves
  3. Create an iterative, yet cumulative, format that allowed students to be creative, to take risks, fail without “failing” and build on that for later success in class
  4. Gamify the course to create positive incentives for the students to push themselves
  5. Focus on skills mastery rather than content mastery

This series of four blogs under the broad rubric of "Unbound" chronicles that experience and analyzes the outcomes.

 

Unbound: The Role of Textbooks in the New Media Environment


Unbound: Observations on the Structure of the Class

As you may recall from my column in January, I launched a radical redesign of my American Government class this semester. I always start out the semester with a certain spirit of optimism, but I felt it was important in this case to talk about what worked and what didn’t work as I tried to leverage New Media to reshape how I teach.

I structured this class with five goals in mind:

  1. Create opportunities for collaboration and teamwork among the students
  2. Allow the students to explore the material for themselves
  3. Create an iterative, yet cumulative, format that allowed students to be creative, to take risks, fail without “failing” and build on that for later success in class
  4. Gamify the course to create positive incentives for the students to push themselves
  5. Focus on skills mastery rather than content mastery

 

In this installment, I'll focus on my efforts to meet the first three objectives. In addressing the first goal, my aim was to create incentives for my students to prepare for the kind of collaboration they are likely to encounter in the professional world. Very few things in the real world reflect the individualism of what we ask students to do in class, and with the interconnectedness of social technology, this is becoming even more pronounced. Therefore, with the exception of the final portfolio, I randomly assigned students to groups. I reshuffled those groups four times over the course of the semester.

So, how well did this work? Getting students to work effectively in groups has been one of my greatest struggles. There is a lot of data to suggest that peer learning is one of the most effective teaching methods. My own experience validates this. I had a class once (in a mythical fairy land – no really) that spontaneously formed study groups and pulled each other along. As a group, they made higher grades than any class I’ve ever taught, before or since.

This was a spontaneous result of the chemistry of the class. I have never been able to make this happen. More often, typical groupwork can be summed up with this graphic from Endless Origami.

To evaluate group work, I used ballots and self-evaluation techniques. However, students tended to inflate everyone's grades, giving even the biggest slacker full points. This was not a good way to figure out who was pulling their weight in the group. So, in classic economic fashion (I am a social scientist after all), I introduced scarcity into the equation: I limited the number of points any individual could give out to fewer than the number of other group members. It turns out that my students would have made excellent communists. Their response was to aggregate the points the group had as a whole and distribute them equally throughout the group.
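The mechanics of the scarcity rule, and of the students' pooling response, can be sketched in a few lines. The group size, point budget, and names below are hypothetical illustrations, not actual class data.

```python
# Hypothetical illustration of the scarcity rule and the pooling response;
# the names, group size, and point values are invented for the example.

def peer_budget(group_size: int) -> int:
    # Each member may hand out fewer points than they have peers to evaluate.
    return group_size - 2        # e.g., 5 members -> 3 points to spread across 4 peers

def pooled_split(group, budget_per_member):
    # What the students actually did: pool every member's budget and split it
    # evenly, so the scarcity carries no information about who pulled their weight.
    total = budget_per_member * len(group)
    return {name: total / len(group) for name in group}

group = ["Ana", "Ben", "Cal", "Dee", "Eli"]
budget = peer_budget(len(group))          # 3 points per member
print(pooled_split(group, budget))        # every member ends up with exactly 3.0
```

The even split defeats the purpose of the constraint: the instructor still learns nothing about relative contributions, which is exactly the outcome described above.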

Considering the self-avowed conservatism of most of my students, I think this sheds an interesting light on the relative value that they place on educational incentives. Grades are so disconnected from financial goals that they assume little or no value in the minds of most students. I could be reading too much into this considering the small sample size (and I’ve had other classes that behaved quite differently), but it’s a point worth considering.

The students' unwillingness to give each other meaningful evaluations didn't stop them from complaining about group members who did not pull their own weight, which made it extremely hard to evaluate the internal workings of a particular group. As a consequence, most of the learning here was of a soft variety, consisting mainly of counseling students about how to handle these kinds of situations based on my own experiences.

Regrettably, neither a shared sense of purpose nor the competitive incentives I'd laid out inspired students to explore the material through group work. In other words, I once again failed to recreate the effect of the class that pulled itself along collectively.

The second major departure in the class structure I implemented was to eliminate the requirement for a standard textbook. Instead, I gave them information frameworks such as my class notes and Wikipedia and then encouraged them to explore more widely. My goal was to give them open-ended information with a basic structure to allow them to follow their creative impulses.

I set this up on the very first day of class by engaging the class in a discussion of creativity and the role of information and education in today's world. I then showed them Ken Robinson's video on education and creativity.

We then discussed how we could construct the course so that artificial boundaries, such as those imposed by a standard textbook, did not limit our creative efforts. For structure, I provided my standard Study Guide, which is simply a list of questions organized by course topics, as well as the class notes, which answer those questions.

As I discovered the last time I did this, giving students the freedom to map their own intellectual journeys is usually a recipe for them not leaving home. While I set minimum work requirements, students continued to do only the minimum necessary. For instance, they would post random articles without meaningful commentary. Many students failed to even look at the basic starting points in my notes or Wikipedia, and then complained about a lack of structure in the course. The situation improved over the course of the semester, although some students never understood or took advantage of the freedom to let their curiosity guide their educational path.

This issue relates to a basic dilemma I seem to face every time I deviate from the standard course structure. Students are unprepared for novelty. They have been well trained in “this is what a college course looks like.” In theory, they love the idea of freedom from their shackles but relatively few of them know what to do with their freedom once it is given to them. This is a fundamental cultural issue we all face as we try to innovate our way out of some of the contradictions of the educational system we have created. Change in this area is likely to come slowly, especially among those students who have never understood the purpose of education or their responsibilities to their own educational enterprise.

The third structural element I built into the course comes from the insights of Dan Ariely’s work in The Upside of Irrationality and his discussion of motivation. He discovered that relatively meaningless tasks discourage people even if they are incentivized with extrinsic rewards such as grades (or money).

I tried to address this issue in two ways. First, I created a point system that was completely cumulative in nature. Students could accumulate points, but points were never taken away, and no assignment was graded as a ratio of achieved points to possible points. Second, I constructed the course so that the pieces would theoretically build upon one another. The weekly assignments led to the four projects, and the work produced in the course contributed to the Final Portfolio assignment, which I deliberately put in a format that was both public and would persist after the course was over. I also provided incentives for going back and citing earlier student work on Google+ in later postings or presentations.
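As a rough illustration of how the cumulative scheme differs from conventional ratio grading, consider the sketch below. The assignment names follow the course structure described above, but the point values are invented for the example and are not the actual course weights.

```python
# Invented point values for illustration only; not the actual course numbers.

assignments = [
    ("Week 1 Google+ posting",  8, 10),   # (name, points earned, points possible)
    ("Project 1",              20, 25),
    ("Week 5 Google+ posting",  0, 10),   # a missed week
    ("Final Portfolio",        40, 40),
]

# Cumulative scheme: points only ever accumulate toward grade thresholds,
# so a missed assignment simply fails to add points rather than dragging
# down earlier work.
cumulative_total = sum(earned for _, earned, _ in assignments)

# Conventional ratio grading, for contrast: every miss becomes an explicit
# zero in the percentage.
ratio_grade = sum(earned for _, earned, _ in assignments) / sum(
    possible for _, _, possible in assignments
)

print(cumulative_total)        # 68 points banked
print(round(ratio_grade, 2))   # 0.8
```

The design intent was psychological as much as mathematical: students only ever see their total rise, which is meant to reward going back and adding work rather than punishing early stumbles.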

Getting students to look above the individual task and take a longer view over the course of the semester has always been a challenge. I usually attempt this in some form, but this is the first time I explicitly wove it into the structure of the class. I was also hoping that the relative permanence and openness of the Final Portfolio would make it more of a tangible object than something that ends up in my filing cabinet.

It was a mixed success. On the negative side, students, as in my more traditional classes, failed to use the earlier parts of the course to gain insights into the later parts, such as recognizing how the priorities of Congressmen tend to undermine long-term goals in foreign or economic policy. Furthermore, almost none of them cited their fellow classmates' prior work. Even when I relaxed the rules and allowed students to go back and post in earlier topics on Google+ in order to bring up their point totals, very few of them exercised that option.

On the other hand, I was pleasantly surprised by the quality of the effort on the Final Portfolios (web pages). On the whole, the students improved their performance as they created their individual web pages. No one did worse on this assignment than they had done previously in their presentations and Google+ postings and a significant number of them did considerably better. This is in marked contrast to the traditional finals I had given in previous iterations of the class where students did not typically raise their overall grades through that one assessment.

Overall, the structure of the course did not significantly improve learning outcomes, but it also did not harm them. The class did about as well as past classes did in terms of what they learned and retained from the class. It was my hope, however, that the strong emphasis on group work, the focus on creative exploration, and the open-ended structure of the syllabus would foster better outcomes. Although quite a few students commented that they liked the class and had more fun with it, this was not, for the most part, reflected in the quality of their work. In the future, I may build on the one clear success, the Final Portfolio/web page, to create more assignments like this.

I am still scratching my head, however, about how to incite peer learning and group work as well as how to propel students to outline their own educational journeys. In the next installment of this series, I will discuss my experience and related struggles in creating a more explicit assessment rubric with the goal of gamifying extrinsic motivators for learning.

Top


The Grader’s Dilemma


Unbound: Teaching Questioning Instead of Answering

Pivoting the World

(Originally published on nmc.org  April 2013)

I’ve had a lot of interesting strands come together this past week as I have been pondering the next steps in our technological evolution. For the last 20 years or more we have been obsessed with transforming the real world into virtual worlds. Books have become eBooks. Magazines have become blogs. Cocktail parties have become Facebook. Meetings have become webinars, and so on. Now, however, I’m beginning to think more and more about technology bending in on itself. Our next steps will be to bleed the virtual worlds we created back into the physical world and this will reshape how we iterate the physical world.

The virtualization of everything was a trend some embraced more than others. It brought with it new opportunities to connect (from Flickr to Facebook to Twitter). Perhaps its height was the creation of virtual worlds such as Second Life. While I agree that there are huge advantages to bringing in larger, more diverse crowds, there has always been the concern that we are sacrificing depth for breadth. Marshall Davis Jones expressed this sentiment in his riveting poetry slam.

Jones forcefully expresses some real contradictions inherent in living in virtual spaces. I was thinking about him while working on the Horizon Report Technology Outlook for Community Colleges, when I came across an interesting article in the Toronto Globe about a professor who is building “pet rocks” using Arduino devices to physicalize social media. Consider the implications of making virtual social spaces real again and making your community extend online as well as on very real personal levels. This has implications for how we interact in education as well.

I am working with my science faculty to create a STEM-focused campus here at Houston Community College (HCC). There has always been a certain amount of resistance from the science faculty when it comes to virtualizing aspects of their curriculum. At our level of instruction, teachers need to be able to put science in the hands of their students, not just on a screen. However, I think that their wariness of mixing the virtual in with their instruction is increasingly missing the mark. There are emerging technologies exemplified by the “pet rock” project that represent a potential tipping point. They are making the virtual real.

It doesn't stop with "pet rocks." There are similar things going on right now, from hackerspaces to Ingress, where people are taking what were formerly virtual spaces and turning them into reality. As I was contemplating the science labs in our new campus, I thought the "pet rock" professor's approach would be a great model. Students would be able to design items virtually and then make them real using technologies such as 3D printers or micro-computing devices such as Arduino or Raspberry Pi to test real-world physical concepts. Imagine a moderated scientific hackerspace where app development meets experimentation and 3D printing meets iteration.

The promise of iteration provides a final strand to this argument. I have been thinking a lot about how to teach both our faculty and staff to become more entrepreneurial in how they approach their present and future jobs. I am convinced that this will be a critical skill moving forward as we are forced (and blessed) by technology to reinvent our jobs and ourselves repeatedly as we move forward.

Last year I stumbled across Eric Ries and his concept of the “lean” startup.

Ries makes a critical point here. He says, "if we can reduce the time between pivots we can increase the chances of success before we run out of money." Or, in the educational context, before we run out of time. After all, in the immortal words of Billy Crystal in This Is Spinal Tap, "Mime is money."

Ries argues that technology is helping us reduce the time between pivots and therefore, assuming we can pivot quickly, our chances of success in any entrepreneurial effort go up. Technology hardware is becoming cheaper and cheaper. Raspberry Pi and Arduino units go for much less than $50. You can get a 3D printer for $1000. Software is becoming more accessible too, as simple programs are easier and easier to create and implement using mobile devices or the aforementioned hardware devices.

Teaching and entrepreneurism have a lot in common. A colleague of mine likes to argue that one of the reasons it is so hard to manage faculty is that it is like managing an organization of entrepreneurs. The problem I often run into is that faculty fail to see how technology can improve their entrepreneurial efforts in the classroom. Instead, they often see it as limiting their scope for action or as a mechanism to oversee and control how they teach. However, there are a lot of parallels between what Ries is talking about and teaching. We have a limited quantity of time with which to interact with our students. Semesters end before you know it. The question is how many times you can pivot before you lose them.

Let's go back to the idea of scientific hackerspaces. Science is fundamentally about trial and error, and the best way to get students to understand this is by allowing them to experiment and fail. The problem is that every pivot requires an initial failure, and we don't do a good job of rewarding failure in higher education. We are also constantly up against the clock and calendar when it comes to allowing our students to fail within the context of an academic semester. It's much easier for the instructor to drill them on the "right" answer and enable them to "pass" some sort of evaluation. This tried-and-true method, however, doesn't teach our students the key tenets of entrepreneurism that will be critical to their future success. If you have any doubt about this, watch this 2007 TED talk from Alan Kay:

The solution here is to make pivots much less costly in terms of time. Let’s assume our scarcity is the amount of time given to us in a given academic term (usually 48 contact hours). How quickly can we iterate physical concepts in that time and what can we do to maximize those iterations? This is where we get back to the promise of making concrete the virtual. If students can quickly program or design physics experiments virtually and then test them in the real world, we can maximize the number of pivots they could have in any kind of course requiring experimentation (and almost all courses, science or otherwise, should require some sort of experimentation). This would also have the beneficial side effect of teaching entrepreneurial, computing, and design skills critical to many occupations in today’s world.

The problem with many traditional science labs is that the inflexibility of the equipment often steers students toward predictable outcomes. What the new technology/employment environment demands, however, is people willing to create unpredictable outcomes and to be able to assess them creatively. Fortunately, this is also a core tenet of the scientific method and therefore something that we need to encourage our students to do.

A final exciting possibility is that this opens up a completely new kind of mashup. Until recently we tended to think of mashups in terms of visual or audio media. We can now start to think of them in terms of creating physical objects. This takes the concept of mashups out of the English classroom and opens them up in new ways to the STEM disciplines. Remember that all fundamental breakthroughs in science are mashups. Newton's Principia was a mashup of the thinking of Galileo, Kepler, and many other theorists who had been struggling with the new cosmology for a century before Newton did his own mashup and added his innovations. Shouldn't we be creating learning spaces that enable everyone to be a Newton?

Again, imagine a science campus that is more of a hackerspace than what we traditionally imagine science labs to be. In this space students and faculty will be enabled to create, fail, and recreate. It will require close cooperation between computer science and the natural sciences, but isn’t this what we are striving for in STEM in the first place?

I would leave you with one last question that is critical to the success of this kind of model: Can we graft an entrepreneurial maker culture onto our educational processes? I think we can. I think we have to.

Small Steps for Big Changes

(Originally published on pbk.com December 2017)

At the beginning of 2016, my former Instructional Technology Team at Houston Community College and I opened the Design Lab at the HCC Alief campus. It was designed to prototype the MakerSpace we were developing at the West Houston Institute. This space, which occupied a repurposed computer lab, used surplus furniture, and only required about $20,000 in new equipment, was designed to give us practical experience in running a MakerSpace. It became so much more than that.

Within months, students using the space developed everything from an augmented-reality sand table to drones to senior engineering projects for the University of Texas-Tyler Engineering program. The space quickly became the first stop on every campus tour given by our president and soon thereafter attracted the attention of the chancellor, the trustees, and the local community. Most importantly, the space completely changed the conversation at the campus and the larger college around what NextGen learning should look like.

D-Lab when it was opened

What does the example of the HCC Design Lab teach us? Many of the projects we undertook at HCC used what is called the "Lean Startup" method. Under this methodology, small experiments are tried out with the expectation that a high percentage of them will fail, and iteration is assumed to be part of the process. The strategy essentially argues that you "build" something small, "measure" its impact, and "learn" from your mistakes in order to "pivot" quickly, leveraging the advantages of technology.

The Design Lab itself was a lean startup project but even within the D-Lab itself we undertook many experiments, particularly with regard to the mix of equipment in the space. The most expensive piece of initial equipment we purchased was something called a Z-Space Tablet. This system allows you to use 3D glasses technology to view and manipulate objects in space. We thought this would be a cool piece of technology that would be used by students to do 3D visualization before sending the items to be 3D printed. The problem was that no one ever used it. It was too complicated and clunky for the neophyte user and 2D screens worked fine, even for 3D projects. So we quickly moved it out of the D-Lab and used its space for other projects. In its place, we purchased a $300 vinyl cutter, which quickly became one of the most popular parts of the tool package we presented to users.

The Z-Space was a failure for our purposes, and we quickly moved on from it. We saw a need for a new kind of tool in the vinyl cutter and quickly pivoted to getting one. Both of these decisions were made using the Lean Startup method, leveraging the falling costs of technology, and observing how the students in the space behaved. Both could be considered failures of the original design. Both represented what Lean Startup practitioners call "pivots," and both were necessary failures that allowed us to achieve success in the space. Note the specific mention of the vinyl cutter in the Houston Chronicle.

The D-Lab a little over a year after it was opened

Sustainability and Next Generation learning share one trait in common: they both represent cultural changes in the way we deal with teaching, learning, and the very spaces we construct to do them in. Cultural changes are hard by definition. People fall into established patterns of thought and action that are very hard to dislodge. Doing this at scale is exponentially more difficult. The Lean Startup method presents a strategy that can be used to make incremental instead of wholesale changes to an organization.

It is easy to underestimate how hard cultural changes are going to be. Active learning represents a significant cultural shift from traditional practice. It's asking a lot for an educational community to suddenly pivot to NextGen teaching and learning. Further complicating this is the reality that we do not have a shared understanding of the next steps on the path. We have an imperfect understanding of the nature of the technological changes sweeping through our classrooms and society as a whole. We also have an imperfect understanding of how humans learn. We can see general trends and certainly understand that teaching and learning need to change, but there is no general agreement on how to get there.

The D-Lab presents a useful model for how to approach this problem. Experimentation has to become the norm as we feel our way into the future. However, experimenting with a $150 million high school is not really an option. Nor will it work very well because the scale of change is simply too large for most organizations to digest. Instead, the Lean Startup path is to make changes on a much smaller scale: a single classroom or lab, an out building, or a library makeover. The investment is much smaller and it is easier to retrain a core group of faculty or staff to maximize use and effectiveness of the space and to allow them to experiment with new modalities of teaching and learning.

If the D-Lab is any guide, demonstration spaces will also form a catalyst for change when the rest of the school sees the impact of what's going on. With all of the pieces in place (a much easier proposition on a small scale), there will be demands for duplication or scaling of the activity. In my decade redesigning and experimenting with innovative learning spaces at HCC, this is precisely what my team did. Not everything was as successful as the D-Lab has been, but the costs of failure were much more easily digestible than they would have been on larger projects. And we learned crucial lessons that we applied to subsequent efforts. Furthermore, over time this built a community of support and started to change the culture of the institution, as faculty and staff more readily accepted entrepreneurism and felt empowered to attempt it themselves.

Without small scale projects such as the D-Lab and others, we would never have built the West Houston Institute, which was recently named a finalist for the SXSW Learn by Design Award. As we discussed in a recent Knowledge Base Article, the West Houston Institute is designed to be one giant Lean Startup incubator which will adapt and reconfigure itself and its programs to meet changing demands and also to act as a teaching tool for its students and visitors in methods, skills, and mindsets to carry them into the world of the future.

Digital Space and Time

(Originally Published on pbk.com October 2017)

I was reading an interesting blog entry recently. In it, Ryan O'Connor argued that one of the dictums of modern architecture, "form follows function," should be replaced by "structure follows strategy." Read the full article here. In other words, he was saying that in a digital world, function is no longer a guide to form. He was making an argument about user interface design and the virtual world, but in the last decade we have seen the virtual world encroach upon the physical world in ways that will profoundly impact how we work, live, and learn. Furthermore, architecture is arguably an exercise, on a vast scale, in user interface design. Technology is creating a world that is vastly more customizable, and subject to change on far shorter timeframes, than ever before in human experience. This presents significant opportunities and challenges for both education and the architecture that supports it.

A telephone existed for one reason. It was there for sending voice signals from Point A to Point B.  “Form follows function” certainly applied to these kinds of analog technologies. The shape of a Walkman was driven by the cassette tape in it. The shape of the cassette in turn was driven by the nature of the magnetic tape that formed its core.

The digital age explodes all of these constraints and, over the last decade, has increasingly encroached on tangible, physical technologies from telephones to cars to buildings themselves. The implications of this will be profound. When everything is reducible to zeros and ones the shape of anything can suddenly be changed. The ripples of this new reality extend far beyond computers or even the iPods that replaced the cassette and CD Walkmans. Those were just the first step.

These shifts have implications for both architecture and education. Consider how much of education has been driven by the necessity of seeing the chalkboard (or whiteboard or display) at the front of the room and the limits that this technology imposed on a teacher trying to convey information. In turn, the necessity of moving large groups of students through the school day stems in part from the constraint of putting them into distinct learning spaces organized around an immobile blackboard. As a consequence, learning, hard enough in an unconstrained environment, is now required to happen according to schedule. The logistical need to segregate also feeds specialization in the academic environment. From 10-11 am you are in History, not English.

Taking this to a more abstract level, consider how much learning is driven by the medium of the book. The linear narrative of the textbook drives classes. Tests come as a seemingly natural consequence of that linear narrative.

In this way the textbook drives the structure of learning in a school and that, in turn, can drive the very structure of the building itself. We have specialized boxes that we call classrooms. They are a direct product of the technologies of the blackboard and the book in much the same way as a cassette Walkman was the product of the cassette tape.

Despite the literature emphasizing the importance of communities of learning, classrooms often serve to divide communities as much as create them, especially in the older grades where specialization becomes pronounced. However, the logic of the schedule, the textbook, and test-centric instruction continue to drive the design of schools.

The fundamental assumptions we make as we consider what learning spaces should be are all subject to disruption in a digital age. There is less and less logic to having a physical textbook when the information it contains can easily be acquired through digital means that are much cheaper and more adaptable to the needs of the teacher and learner [here is another article on the changing role of textbooks]. While the blackboard and whiteboard are still very useful tools, they can easily be supplemented and/or replaced by other means of transmitting information to the learner. The digital age effectively decouples information exchange from the physical space, allowing us to optimize those spaces with the human element as the primary consideration. This can happen either within the context of a particular space (digital displays, interactive touch, augmented reality) or by leveraging the cloud to put those environments onto mobile devices.

These technologies allow us to create buildings that functionally disconnect what we traditionally understand as a classroom from the learning experience. Learning spaces can become more communal and less driven by specialization. In essence, the school can become a bazaar of interconnecting ideas rather than an egg carton of disconnected concepts.

The world of work is being driven by increased demand for diverse skillsets. The need for highly specialized workers is declining and those with broad ranging skillsets are in high demand. Working backward from this fact, does it still make sense to teach in a segregated, highly-specialized environment represented by the classroom [click here for more on this subject]? Instead, we need to mold the tools around the needs of the teacher and the learner, not the reverse. Strategies of teaching and learning should drive the technology and space design that supports it. The rationale for constraining education based on technological limitations makes less and less sense every day.

We also have to be mindful that strategies of teaching and learning are still very much in flux. Schools are driven by the concerns of today even as they struggle with the implications of building for tomorrow.

Therefore, the other key takeaway from these technological shifts is that we need to build for flexibility wherever possible. Teaching and learning are likely to change considerably over the lifetime of the spaces we are now creating. We need to recognize the limitations driven by technology that we impose on our structures and constantly re-evaluate whether or not they are still relevant. Furthermore, we need to recognize that these constraints are likely to be eliminated at an exponential rate going forward. Schools will need to adapt to innovations just like everyone else and the extent to which we can create spaces that can be easily adapted over time will determine the long-term vitality of the structure.

Adaptable technology is giving people a vast range of choices, and this can become overwhelming, especially to those with many other decisions to make in their day-to-day existence. The strategy for mastering these new realities is to focus on the intent of the activity and to do so critically. While the technological means may be constantly shifting, the ends typically don't change much. We want learners to emerge from schooling with the knowledge, skillsets, and mindsets that will help them succeed in life. Let's put that up front and build backwards from there. We have the tools to do it.

Hacking School

(Originally published on pbk.com September 2017)

For over a decade, authors as diverse as Frans Johansson (The Medici Effect – https://www.fransjohansson.com/books-by-frans-johansson/), Daniel Pink (A Whole New Mind – http://www.danpink.com/books/whole-new-mind/), and Steven Johnson (Where Good Ideas Come From – https://stevenberlinjohnson.com/where-good-ideas-come-from-763bb8957069) have argued that innovation comes from a diverse approach to solving problems. This means either developing a broad educational approach within a given individual or bringing together diverse people in a collaborative setting. As Johansson says in Medici, "Leonardo da Vinci, the defining Renaissance man and perhaps the greatest intersectionalist of all times, believed that in order to fully understand something one needed to view it from at least three different perspectives."

The industrial age was one of specialization, and our schools reflect this. Instead of Da Vinci's dictum that you have to apply a broad spectrum of views to any particular subject, the educational world for the last century has been one of increasing specialization. The further you go up the educational ladder, the more specialized this gets. But even at the lower grades there is usually separation between "art" and "math," for instance. These distinctions are the product of an industrialized educational system that purports to prepare our students for specific careers requiring specialized, in-depth knowledge of a particular subject. Alan Kay shows just how siloed thinking undermines the learning process in this TED talk from 2007.

 

The problem is compounded by magical thinking. Arthur C. Clarke once stated, “Any sufficiently advanced technology is indistinguishable from magic.” This is exactly what has happened to our relationship with technology. Most users have no idea how it works or why it works the way that it does. The same is true for most technologies in schools (and is becoming worse as technological systems become more complex). It is common for schools to approach technology as something expensive that needs to be protected from the students. It is also separated from instruction in the sense that it is used to facilitate certain kinds of experiences but is rarely the focus of instruction itself. As a result, both students and teachers tend to view technological systems as black boxes rather than as learning opportunities.

The struggle to align the educational world with technological realities is a direct product of the specialization that has characterized our educational experiences for a century or more. Is coding a technical subject taught in a Career and Technical Education environment? Is it an exercise in mathematics and logic best taught by engineers? Most controversially, should coding be taught as a foreign language class? The answer is all three and yet our curriculum insists on trying to insert it into traditional boxes that don’t really create effective coders. Good coders have to understand logic and the basic technical limitations of the various languages but also need to be able to speak to the machines in a natural way. These kinds of people are incredibly sought after in the technology world but our systems do a very poor job of producing them. And coding is only the first level of the problem. To get to an iPhone-level device you have to integrate coding with hardware engineering and design (and ultimately entrepreneurship). You need a multidisciplinary approach to practically any problem these days.

How do we work toward a system that creates these kinds of thinkers? One way to do this is to create environments where it is okay to experiment. In the 1950s early computer scientists at MIT and elsewhere often came from the model railroading community. Model railroads are complex electrical switching systems so the transition was a natural one. Bundles of wires often had to be cut, or hacked, and reconfigured in order to make a system work. This was true in computing as well. This is where we get the term “hacker.”

Over the last 20 years this term has returned to its definitional roots through technology spaces that allow users to build, break, and repurpose technologies. Advanced tools such as 3D printers and laser cutters have been added to the mix to allow the rapid fabrication of prototypes. Microcomputers such as Raspberry Pis and Arduinos provide rapid access to electronic brains. Hackerspaces, and their more commonly cited cousins, MakerSpaces, have started making their way from community spaces into education. It is not always a natural fit.

MakerSpaces go against the grain of the industrialized educational model. Like coding, they raise questions about where they belong. Viewed purely as a set of tools, they can augment traditional programs in Career and Technical Education or the Fine Arts. In higher education they are most commonly placed in the context of engineering or other STEM-based programs. While these programs can greatly benefit from access to these technologies, it is only through broad access that the true potential of developing the next generation of Da Vincis can be realized.

If he were a young student today, Steve Jobs might never have had access to this technology because he wasn't an engineer. Fortunately, he had access to Steve Wozniak, who was an engineer. Together they developed what was to become Apple Computer, an early example of blending computing with design. MakerSpaces offer the opportunity of developing diverse kinds of communities within our schools. However, they can only reach their true potential if they are built with those goals in mind and are accessible and visible to the entire student body, not just those who are technically inclined. As a resource area like a library, they can provide a vital bridge between a specialized, industrial-focused curriculum and the realities of a post-industrial age. If they are tucked away supporting specialized programs and/or are invisible, they are as useful as a library of unread books.

The Hidden Future

(Originally Published on nmc.org, July 2017)

I seem to have been hit by the Futurism bug recently. I say "hit" rather than "bit" deliberately. I've been fascinated by Futurism ever since I read Isaac Asimov's classic Foundation trilogy as a child. The idea of psychohistory has stayed with me. Part of the reason is that I recognize prediction as largely an effort centered on the human rather than the machine. With psychohistory, Asimov argued that, given a large enough sample size, one could predict the future through the aggregated actions of vast numbers of people. Note that his version of futurism was based on societal factors, not technological ones. It is only when societies accept the new cultural norms imposed by technological change that we can say that change has actually occurred. When we focus on specific technologies, we miss the point.

In a recent post, Bryan Alexander remarked on how little the world has apparently changed since the 1980s. With the exception of smartphones, there is scant visible evidence of change when walking down the street in Boston or DC. The cars and buildings are similar to what existed in 1985. Bryan and I are almost exactly the same age, and we both graduated from high school that year. I suspect we both expected our hoverboards by 2015.

In supporting his argument, Bryan references a great book, The Shock of the Old by David Edgerton, which makes a similar point to Asimov: namely, that we miss the pace of change because we focus on technology, not the human element. Edgerton remarks on the repeated persistence of traditional technologies well past the time when histories seem to imply that they were long gone. Everything from the use of horses to the longevity of radio as a medium is cited in this excellent book.

Edgerton's book, written in 2007, somewhat misses a fundamental shift that has occurred in the way technology evolves. Most of the technologies that Edgerton highlights require heavy capital expenditure to change, and large swathes of this planet do not have access to the latest and greatest. I can understand why seemingly anachronistic technologies persist and/or are adapted into weird hybrids like the tuk-tuk in Asia (a hybrid of the rickshaw and the scooter). Modern taxis are simply not economically feasible in many cities in less prosperous parts of the world. It makes sense that the locals try to maximize the functionality of the limited resources available to them. This is what humans do; they adapt their tools to their needs.

At the same time, I wonder if this argument only makes sense when applied to physical media (and that may be shifting, too). The barriers to entry for virtual technology are much, much lower than those required to operate a New York City-style taxi in Phnom Penh. In the developing world, hacking can now be applied in revolutionary new ways. Several years ago, I read an article about how West African “hackers” were converting discarded PCs (did you ever wonder where that seven-year-old Dell winds up after your IT department carts it away?) into 3D printers.

Circuit boards are the physical manifestation of the rapid diffusion of “invisible” tech throughout the world. I carry around the rough equivalent of a 1985 Cray supercomputer on my belt. Those seven-year-old Dells are still more powerful machines than my iPhone 6 — and we’re throwing them away. Even more accessible, the Raspberry Pi delivers similar performance for just $35 — making it far more prevalent than an $8 million Cray ever could be (bench not included, however). Moreover, that is only the tip of the iceberg. The utility of these supercomputers on our belts and the basic ones embedded in almost every facet of our lives is that many of them are connected to a much larger “computer” that is the network. I don’t need to store the entire contents of Google Maps on my belt-mounted computer (or in my car, for that matter), as long as I’m connected.

The one area that Edgerton really doesn’t discuss is communications technology, and it is in this area that everything else is being upended. This is true both globally and locally. You can now hail a tuk-tuk on the streets of Phnom Penh using Uber. In many fundamental ways my own life has shifted, especially in the last few years, from what it used to be. I would put the blame for this squarely on one invisible technology: the Network. When I refer to “the Network,” I’m talking about a vast array of wired and wireless communications, from broadband internet in the home to Bluetooth in my car to cellular networks that pervade every fiber of my existence as well as the ubiquitous computing technology that binds it together. I think we often underestimate the pervasive ways that it changes everything we do.

This may surprise many of you, but I am a technological conservative. I rarely get excited about the “Next Big Thing” that will “change the world.” They rarely do — and I don’t adopt things until I can personally see utility that supersedes the old model by enough of a margin to justify the opportunity cost of changing my habits. I frustrate some of my more technologically progressive friends by resisting change until it suits me. I also get mad when I’m forced into a technological downgrade due to a company’s commercial interests. Some good examples of my recent frustrations in this area include Apple’s decision to get rid of most of the ports on their “Pro” laptops and Facebook’s forced transition to a separate app for Facebook Messenger.

That being said, I have noticed some significant changes in my behavior over the past half-decade or so, particularly in media consumption and transportation, both related to the Network. I am more likely to buy books on Kindle than I used to be, and that is usually because of a need for immediate gratification and the added benefit of synchronization across various devices. I recently took my first trip ever where I didn’t pack a paper book. While I still enjoy the serendipity of browsing through CDs, I find that I’m increasingly buying my audio in downloadable form. Both of these are partial shifts, impacting perhaps 50% of my media purchasing these days.

My video consumption, on the other hand, has shifted almost 100%. Other than sports, I watch almost nothing live anymore. I can't be bothered to conform my schedule to that of some random network scheduler. That means that, unlike 1985, there are no more "Must-See Thursdays" in my life. If I want to go out or my kids have a game, I can do that when I want, regardless of what's on. I don't feel the need to fuss with a DVR or record a VHS tape anymore, either. I watch what's available on Netflix, Amazon Prime, or HBOGo. I buy a very limited selection of movies, but more and more, I won't buy a DVD/Blu-ray unless it comes with a free digital download. In most cases, the physical media don't even get used.

In the area of transportation, I have to ask myself, how many of the fossil fuel-driven “dumb” cars that Bryan Alexander references have Google Maps running on their phone? I am a heavy user of Google Maps. Even when I know where I am going, I run the route before departing to warn me of unexpected traffic. In the past few years, rideshare services like Uber and Lyft have exploded onto the scene. They are heavily dependent on Google Map-like software to get their drivers where they are needed as quickly as possible. While not a Tesla, my Mazda 6 includes sensors that warn me of nearby objects when I back up or when someone is sitting in my blind spot; the car also adapts its cruise control to vehicles in front of me. When I am driving a car without these features now, I really notice it.

The one thing I don’t know, and that neither Bryan Alexander nor I can see with our eyes, is just how widely diffused these technologies are. There is plenty of evidence that “the Network” has proliferated far more rapidly than “physical” technologies, such as those described by Edgerton. While the developing world may have become a technological junkyard, those pieces are being repurposed into cutting-edge technologies, perhaps leapfrogging the ideas being produced in the technologically fat and happy “developed” world. While transportation in crowded urban centers remains challenging, friction-free dispatching and navigation services are now facilitating it.

What Edgerton argues persuasively is that when people see a need, they will adapt their tools to that need. “The Network” facilitates tool adaptation in ways unimaginable in 1985 or even when Edgerton was writing 20 years later. While I may not have my hoverboard, I can watch Marty McFly anytime without physical media (he inhabits my iTunes library). In the very near future, I will be able to experience him in a VR environment from my sofa so I can get the hoverboard experience without risking injury to my knees. The outside world will continuously adapt to these new realities. In the meantime, the street will look much the same, but those of us who have taken the red pill will see the matrix that imperceptibly lies beneath it.

Holistic Design

(Originally published on PBK Architects, November 2017)

Technology is driving a sea change in how we interact with objects around us. What was fixed is now fluid. What was permanent is now (potentially) adaptable. What was distinct is becoming integrated and increasingly blurred. This has profound implications for wide swathes of our society and teaching and learning are not exempt. Technologies occupy an increasingly wide spectrum of what we have to consider when designing learning environments and it is important to take a holistic view of the project being undertaken.

What exactly does this mean for school design? Do we need to understand how furniture, lighting, and technology integrate into a classroom? Absolutely. However, we also have to consider how that classroom integrates with the rest of the campus. Not only does technology change how learning happens within the classroom, it also expands the range of places where learning can take place. Furthermore, many learning spaces are not what we traditionally think of as “classrooms.” They include spaces like MakerSpaces, One Button Studios, and informal collaboration spaces. Empowered learning does not always take place in discrete modules. It is a broad ranging enterprise that includes a range of activities in a typical day. Spaces and technology need to complement these activities.

There is a lot of neurological research on how learning takes place that supports the idea that a multimodal approach to teaching and learning is the most effective way to teach our children. As the OECD wrote in 2007, “Far from the focus on the brain reinforcing an exclusively cognitive, performance-driven bias, [neuroscience] suggests the need for holistic approaches which recognise the close inter-dependence of physical and intellectual well-being, and the close interplay of the emotional and cognitive, the analytical and the creative arts.” (p. 7)

In the intervening decade, the challenge has been how to square this scientific conclusion with the realities of schools as we know them today. Instruction in most schools follows a very linear, one-size-fits-all paradigm instead of the dynamic, multimodal approach suggested by the OECD report. As we have discussed previously, technology now offers the possibility of liberation from the physical constraints of this educational model.

However, in order to maximize these technological opportunities our approaches to school design have to pivot. We need to adopt a holistic approach to our thinking about learning environments throughout the building. There are many areas in the traditional school where technological affordances could be leveraged to achieve transformative learning experiences.

One example of rethinking spaces holistically is the ongoing transformation of Career and Technical Education (CTE). Under the old model, CTE students were trained in very specific technologies. Accelerating technological changes challenge the efficacy of many CTE programs as schools struggle to keep their curricula relevant. A more holistic approach may solve this problem. This is precisely what has been advocated by the US Department of Education: "Making includes student-driven activities from a multitude of subject areas. While traditional shop classes may focus assignments on skills such as metalworking and carpentry, makerspaces are not subject or skill specific." In other words, a MakerSpace is by definition a holistic space.

Another area where there are opportunities to think holistically is the hallways. When we programmed the West Houston Institute, the team explicitly started with the proviso that there would be no hallways in the traditional sense of the word. We were able to maintain that vision on the first floor, which is itself designed very holistically across a broad range of spaces. Instead of constructing traditional hallways, we created a range of informal learning and collaboration areas. These range from comfortable pod seating to interactive stations that allow students to connect to a monitor and work together at a collaborative table. Ubiquitous computing resources exist throughout the space. These spaces interconnect and fuse the various areas of the building together. Overall, the West Houston Institute is very holistically designed, with making, collaboration, learning, and creative support all designed to work synergistically with one another.

Developing a holistic campus environment requires that the concept be a central part of the planning, design, and construction process from beginning to end. There are many factors that tend to drive projects to subdivide. First, it is common practice to have discrete parts of the school submit their programmatic needs separately. This is natural and necessary. However, those programmatic needs should be looked at holistically through a joint visioning session and a designated user representative whose responsibility it is to maintain a vision of the synergies throughout the campus.

The user representative should also be responsible for making sure that the centrifugal forces that characterize the construction process do not undermine the overall vision of the space. Compromises are a necessary part of any building project but they should always be considered against the opportunity costs they impose on the overall design and not just specific programmatic elements. As a further hedge to maintain these principles, a focus on flexibility will help the environment grow in a holistic fashion as the building adapts itself to its community of users post-occupancy.

There are many reasons to approach your next school or campus project holistically. Most significantly, the new generation of learning environments is being driven by technological changes that are making everything more fluid. These changes are making the spaces themselves more fluid as groups flow from one technological environment to another to suit their learning needs.

Technology augments learning in unexpected ways and is subject to rapid change. These two factors dictate that all learning environments need to be as flexible as possible and that the space should meet the learning, not the other way around. They also argue for a set of interconnected spaces that allow learning to migrate to where it is most effective at teaching our students the skills necessary for success. It is only through a holistic, flexible approach to teaching and learning that students will be able to thrive in a technologically driven world where change is the norm and adaptability to new realities is at a premium. Our spaces have to support this.
