Tuesday, August 27, 2013

Microsoft's Strategy

On Friday, Steve Ballmer announced his intention to retire. The news came as a bit of a surprise because Microsoft's board had always stood squarely behind him despite widespread calls for his resignation and the fact that Microsoft shed over half its value during his thirteen years at the helm. Microsoft's market capitalization was over $600 billion when he took over and less than $270 billion on Friday. Stock price isn't the only metric for a CEO's performance, but Microsoft is a publicly traded company and accountable to shareholders.

There has been a lot of speculation around what triggered his decision. As I see it, these are the three possible scenarios:

  • Ballmer and the board agreed that Microsoft needed a major change in direction and a new CEO. However, the new CEO would face enormous backlash if he or she came in and tried to change direction too quickly. It was decided that Ballmer would initiate the changes, moving Microsoft toward vertical integration and a functional structure, and then the new CEO would come in and finish the job. No one would be able to question Ballmer's love for Microsoft or the old way of doing things, so the changes would be accepted as good and necessary.
  • Ballmer, finally realizing that Microsoft needs a major change in direction, begins moving Microsoft toward vertical integration and a functional structure. The board gives its tentative approval for these initiatives, but panics when the road gets bumpy. OEMs are up in arms over the Surface, enterprise and the channel are up in arms over the changes in Windows 8, shareholders are up in arms over last quarter's $900 million write-down, and company insiders are up in arms over the new functional structure. In attempting to change direction, Ballmer has put the company at risk. Instead of facing a slow decline, during which huge profits are still being raked in, the board now sees the potential for a rapid collapse and cans Ballmer.
  • Ballmer, finally realizing that Microsoft needs a major change in direction, begins moving Microsoft toward vertical integration and a functional structure. But the board, feeling like these moves are too little and too late, decides that Ballmer needs to go. If change is necessary, they would rather start with a clean slate and a new CEO.

Does Microsoft even need a new direction? If they want to stay relevant in the consumer space, I think they do. Fifteen years ago, if someone sent me a text document, it was in the .doc format, and I needed Microsoft Word to open it. Documents were created in Word and sent as Word files. (If the recipient didn't have a copy of Word to open the document, that reflected poorly on the recipient and not the sender.) That was enough to ensure that I owned a fairly up-to-date version of Microsoft Office. Today, not so much. It is now bad form to assume that everyone has a copy of Word and most text documents I receive are sent in the .pdf format, even if they are still often created in Word. If I am exchanging text documents with someone and those documents need to be editable on both ends, then we negotiate the use of Word or Google Apps.

For me, the value proposition of owning Microsoft Office was to guarantee that I could seamlessly exchange documents with other people. In the 80s and 90s, I also used Office for creating documents (since I owned it and it did the job well), but I've since moved on to other applications that I enjoy using more. The old value proposition is going away and it is unclear what the new value proposition for using Microsoft products will be.

The biggest troll at Horace Dediu's blog, Asymco, is this fellow, obarthelemy. He recently posted this comment:

To me, Apple's innovation recipe is fairly simple: in a market that's tech-driven, make a product that's both easy to use and socially desirable.

That works wonder for markets that have historically been tech-driven, with ugly product and unfriendly UIs. I'm just unsure if there are any more such markets (TVs maybe, desktops maybe), and if Apple can work up a new recipe ?

That is Apple's value proposition! If there is something that you use every day but hate using, Apple will sell you their version that is attractive and easy to use, which means that you will enjoy using it and get more out of it. Lots of us see the value in that even if obarthelemy doesn't. Many people may not like Apple as a company, but most of them still wish that the companies they do prefer would care more about design and ease of use.

Google's value proposition is services that are free, useful, cool, uncluttered (Google may be an ad company, but they don't bombard you with ads or serve up noisy user interfaces), and that have a large user base, which is important when you want to interact with other people through those services.

Microsoft needs to offer something similar to its consumers. At one point, it was developers, developers, developers! Microsoft Windows was the platform that developers developed for, which meant a rich ecosystem of applications. That value proposition still exists today, but like the value proposition for Office, it is weakening and unlikely to survive the transition to mobile unless Microsoft takes steps to shore it up. Just putting Windows Phone, Windows RT, and Windows 8 on a shared kernel — and then bragging about it — is not enough. OS X and iOS already share a kernel, and Apple has put far more work into optimizing iOS for mobile, implementing APIs to facilitate integration and frameworks for creating well-designed apps, and pushing the envelope with their own apps.

With so many strong platforms out there to target, I'm not sure if Microsoft's old strategy of winning over most developers is even possible any more. But if they are going to try, there are some things that I'm going to want to see. I'm still waiting to see Microsoft Office and key third-party apps like Photoshop Elements for Windows RT. If Adobe won't port Photoshop Elements to Windows RT, that says a lot about Microsoft's lack of a value proposition. You can talk all you want about the power of a tablet that runs down-scaled desktop apps instead of up-scaled phone apps, but right now, the Windows App Store is full of apps that would run just as easily on Android or iOS, and those tablet apps won't even run on your phone. Without more powerful apps, Windows RT simply feels bloated and unoptimized.

Talk is cheap, and a strategy with a black box at its center is no strategy at all.

Friday, August 23, 2013

Fear of Black Boxes

A popular internet meme involves listing steps to reach an outcome, except one of the crucial steps is missing. An example might look something like this:
  1. Build product
  2. Give product away for free
  3. Grow user base
  4. ???
  5. Profit!
This would be funny if it didn't represent the kind of thinking that is prevalent in education today. I attended an ed tech meetup last night where we discussed adaptive learning. There is a widespread assumption that the educational components needed to make an adaptive learning system work are in place and that adaptive learning is poised to revolutionize education once the technological challenges can be worked out. What makes this assumption completely ridiculous and more than a little terrifying is that it is being made by people who can't be bothered to do the minimum due diligence to see if it is true or not.

One person walked up to me after the meetup and asked me if and how adaptive learning systems are being used in schools. He was the only one out of ten who decided that it might be worth engaging in some inquiry before pronouncing adaptive learning systems the next big thing. He was surprised to learn that schools have been using adaptive learning systems for as long as I can remember. A teacher gives you a quiz. The quiz is scored. The teacher reaches into the filing cabinet and pulls out the next worksheet based on that score. You do that worksheet. Lather. Rinse. Repeat. Adaptive learning systems are most commonly used for remedial math or for math skills practice throughout the year.
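
In software terms, that filing-cabinet routine is about as simple as it sounds. Here is a minimal sketch of the loop in TypeScript; the score cutoffs and worksheet labels are invented for illustration and don't come from any particular product:

```typescript
// A minimal sketch of the "filing cabinet" version of adaptive learning:
// score a quiz, then branch to the next worksheet based on that score.
// The cutoffs and worksheet labels below are hypothetical.

interface QuizResult {
  topic: string;
  score: number; // percent correct, 0-100
}

function nextWorksheet(result: QuizResult): string {
  if (result.score < 60) {
    return `${result.topic}: remedial practice`;  // reteach and drill
  } else if (result.score < 85) {
    return `${result.topic}: mixed review`;       // shore up weak spots
  } else {
    return `${result.topic}: extension problems`; // move ahead
  }
}

// Example: a 70% on a fractions quiz routes the student to mixed review.
console.log(nextWorksheet({ topic: "fractions", score: 70 }));
```

That branching logic is where all of the educational assumptions live, and it is exactly the part I'm arguing we haven't worked out.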

When I was curriculum coordinator in Holliston, I was sometimes called upon to assess a student and then suggest strategies to the teacher for working with that student. Erin had been a top math performer all through sixth grade, but was suddenly struggling to get C's in seventh grade. Erin had an eidetic (photographic) memory, and as I was working with her and probing her thinking, it became clear that her primary strategy for solving a math problem was to flip through her memories until she could find one where her teacher was solving a similar problem on the whiteboard. She would then mimic what the teacher had done. This learning strategy had served her well and it was all she knew, but suddenly it was no longer working and she was in tears over it. Her entire sense of self was on the line.

Increasing Erin's engagement level wouldn't help. Giving her more practice on skills she hadn't mastered or presenting concepts in alternative ways wouldn't help. She needed to transition from her dominant learning strategy to another one. While Erin's situation may sound extreme, it isn't. Her learning strategy stands out as different from most (I picked this example because most people would have no trouble seeing that her particular learning strategy doesn't scale very well), but most of us have primitive learning strategies that don't scale very well. We just don't see it because our learning strategies are similar to the ones that everyone else has. Just ask someone how to study for a science test. The most common approach is to re-read your notes or the textbook. In math, it would be to redo old homework problems. We do these things because we don't have better strategies. It's all we know. But do they sound like good strategies for learning math or science?

My point isn't that adaptive learning systems are incapable of diagnosing issues like Erin's. I don't think they can right now, but I have little doubt that eventually an expert system will come along that can pretty much do everything that I can do. My point is that, in trying to work with Erin, I encountered dozens of open questions... questions that the research base doesn't cover. To make an adaptive learning system work, we need to start filling in those gaps. Unfortunately, I see lots of gaps that no one is working on.

As educators, we drastically overestimate the number of best practices that have been identified. Most practices used in classrooms developed out of expedience. Teachers need to move students from point A to point B, and they need to do it quickly. Just like many businesses are focused on their next quarter, many teachers are focused on what they need to cover and what their students will be tested on at the end of the year. The result is practices that are designed to cover over issues instead of uncovering and fixing them. Uncovering and fixing issues takes time, and if you throw resources at it, you may not make next quarter's numbers. What makes this even more insidious is that even well-intentioned teachers will see a practice that everyone else is using and assume that it must be the best practice, and not simply a practice that we started doing because it gave us quick and dirty results. This is why most highly trained educators erroneously believe that the best approach for working with struggling students is to break things down for them and make learning more procedural.

For most people, what happens inside the classroom represents a black box that no one wants to peek inside. Policymakers use test scores to "motivate" teachers. They assume that if they increase the stakes and the level of accountability, the stuff that happens inside the classroom will take care of itself. They assume that teachers know what to do and just need to do it. But I can tell you, that is not the case. We don't know what to do. When you put pressure on us, we will do more of what we know how to do, but that really doesn't work. You can put a gun to my head, but flapping my arms faster and harder is not going to help me to fly.

Increasing student engagement is another way to avoid the black box. So is the idea of using technology to enable students to produce knowledge instead of consuming it. Walk me through, step by step, how these increase student learning without any other parts of the system changing, because I've got plenty of counterexamples to show that it doesn't happen automatically. If you want to have an impact on learning, you can't be afraid of the black box. Writing "then a miracle occurs" in a cartoon or "here be dragons" on a map may be cute, but it just isn't going to cut it.

Thursday, August 22, 2013

The Fallacy of Using Relevance or Interest to Improve Learning

There is a line of reasoning in education today that goes something like this:
  1. kids learn better when they are engaged;
  2. kids are more engaged when the subject matter is relevant and/or interesting;
  3. therefore, we should connect or embed stuff that we want kids to learn in stuff that they find relevant or interesting.
This approach certainly makes sense and works to some degree. If the stuff I'm learning is relevant or interesting to me, I will work harder, take more risks, and test my understanding more actively. But trying harder only takes you so far. If the tools and strategies you have at your disposal to learn something aren't working, doing more of the same is unlikely to make much of a difference.

The easiest way to prove my point is with a few counter examples. Let's tackle relevance first. Most people reach a point in their lives where finding a life partner is very relevant to them. So, we all become master daters at that point, right? No. Even though dating seems like an eminently learnable skill and we get plenty of useful feedback while doing it, most of us still suck at it. And I'm not even talking about the finding a life partner part (which we suck at too), I'm talking about the being on a date part: creating a good impression, picking up on cues, listening to the other person, selling yourself, being open and energetic... basic stuff that we should have learned on the playground (but didn't) if relevance was enough.

So how about interest? Many kids are interested in sports or playing an instrument, but how many of us become really good at them? When I was a student teacher, I had a student named Zack who was dedicated to becoming the best basketball player he could. I ran into him on the courts one day and he was practicing dribbling and shooting with his left hand so that his left hand would be as good as his right hand. He did that every day. It struck me because it was so unusual. Most people practice the parts of the game they are already good at. That's what I did. It's why I never did get good at basketball, and it had nothing to do with height (sigh) or a lack of physical talents.

Most people cite video games as an example of something that kids learn and get good at due to interest. The problem with video games is that the tasks are specifically designed so that you can get good at them through perseverance and by doing what you already know how to do. The game with the most buzz in education circles right now is Minecraft. Kids are learning how to do all kinds of new stuff by studying other players and imitating what they do. This is a tried-and-true strategy when it comes to learning a new game and, if all you care about is getting the kid to pass the next test, it usually works in school, too. But there comes a point where that strategy doesn't work anymore. I hear it all the time as a math teacher: I was good at math until I hit this topic and suddenly everything was over my head. Kids and adults become lost at that point because they don't know what to do next. Think about all of the dedicated sudoku players who plateau, not because their interest plateaued, but because they don't have the skills to take their game to the next level.

I'm not arguing that we should ignore relevance or interest; I'm just saying that they aren't the magic bullets that a lot of people make them out to be. This should be obvious, but isn't. I am far more interested in figuring out the learning skills, strategies, and habits of mind that the kids who get calculus have and the kids who hit a brick wall at that point don't... and then helping all kids get them. Not because I think that all kids should learn calculus (or any other topic where people seem to hit a brick wall), but because I'm guessing that those things might be good to have for learning all kinds of other stuff, including stuff that kids may even find relevant or interesting.

Tuesday, August 20, 2013

Markets as Match-Makers

If you ever wanted to create an app and put that app in front of millions of eyeballs, then Apple's App Store presents an amazing opportunity. I published an educational game, Chocolate Chip Cookie Factory: Place Value, to the App Store back in September and, with little to no marketing push on my part, have sold thousands of copies in dozens of countries. Someone in Hong Kong gifted it to 80 friends and the International School of Milan has recommended it to parents for use at home. It's a bit like pushing a button and having your product sold in every Walmart around the world. How cool would that be?

Now imagine that everyone and their uncle could do the same, and the local Walmart was a giant warehouse with miles of shelving. There may be a million people out there who would happily buy my game if they ran into it, but most never will. When creating a market, it is not enough to make sure that the shelves are amply stocked and people are coming through the doors; you also have to make sure that the right products and the right people manage to find each other.

People have been saying for years now that the Google Play Store was on the verge of overtaking the App Store in terms of developer focus and resources. The tipping point was always right around the corner. Eric Schmidt predicted that it would happen by the summer of 2012 (the same summer that Google TV would be installed on the majority of televisions sold in Best Buy). Developers would just have to develop for Android first once the Android market was so much larger than the iOS market... or so the logic went.

Android market share is now 4-5 times larger than iOS market share, but developers still aren't prioritizing Android. At best, apps are released on Android and iOS simultaneously. Rene Ritchie makes an interesting case that many iOS developers won't simply follow the numbers because of their passion for the Apple way. But what would happen if the App Store shut down completely and all of its users moved to Android? I would argue that many iOS developers would go out of business. Not because they would rather leave the mobile app game than develop for Android (though that certainly might be true for some), but because they could not compete in the new market.

Let's call Apple users and developers hipsters (the word has lost all meaning, so no harm in co-opting it here). The App Store is hipster nirvana; it is super easy for hipsters to find each other. Not only that, but most tech reviewers are hipsters as well, so they are constantly promoting the latest hipster apps. Apple is famous for curating the apps it chooses to sell on its App Store, but it also curates its users through its branding.

If everyone moved to the Google Play Store, how would the hipsters find each other? Google certainly isn't going to be promoting any hipster apps, and hipsters aren't going to want to comb through hundreds of utility apps to find the next Instagram. In markets, density counts, often more than mass.

Last month, Zach, the CEO of School Yourself, mentioned over lunch that he was searching for talented curriculum developers but couldn't find any. In fact, the quality of the educators that he was interviewing was really discouraging. I told him that I had stopped applying for or even noticing job postings for curriculum developers because they invariably wanted someone with much less experience or expertise than I had. Curriculum writing is viewed as a commodity, and large publishers often feel that it is easier and cheaper to train someone new. They do not want anyone who might disagree with the big-name subject matter expert or principal investigator leading the project.

Because it was so difficult to sift through all of the crappy openings looking for the few good ones, I eventually stopped looking. And because good curriculum developers stopped applying for openings, companies looking for really good curriculum developers stopped posting them. Poor match-making killed the market. Reviving it is going to be a little tricky.

Thursday, August 15, 2013

The Giant iPod Touch

In 2010, Apple announced the iPad. It was immediately derided across the internet for being just a giant iPod touch. But here's the funny thing: (1) until the Surface RT was released last year, the iPad was the only tablet on the market that wasn't just a giant iPod touch; and (2) many of the people who criticized it for being just a giant iPod touch have since gone out and bought giant iPod touches themselves and now argue vociferously that tablets don't need to be anything more than giant iPod touches.

Anyone who says that Apple took a smartphone operating system and scaled it up to fit on a tablet simply doesn't know what they are talking about. That is what Google did with Android. Apple spent ten years developing iOS as an operating system for tablets. They had no intention of entering the smartphone market. The only reason that Apple did not release the iPad in 2007 instead of the iPhone is that they couldn't find a viable go-to-market strategy for it. The multitouch interface was too different and consumers would have a hard time seeing why they would want one.

While they were struggling to figure out how to bring the iPad to market, someone had the bright idea to use iOS on a smartphone instead. The thinking went something like this:
  1. People already see the value proposition in a smartphone.
  2. More and more people are buying them.
  3. Smartphones will eventually displace cell phones.
  4. The user interfaces for current smartphones suck.
  5. A smartphone that was powerful and fun to use could easily disrupt the market.
  6. Once people were familiar with iOS and multitouch interfaces, the market would be ready to embrace the iPad and tablet computing.
The iPhone was the go-to-market strategy for the iPad, and iOS was scaled down to fit on a smartphone, not scaled up to fit on a tablet. The reason iOS feels so much simpler than Android (toy-like is the technical term) is that the vision Apple had for tablet computing was very app-centric. Apple envisioned the iPad as an appliance for apps. When the user launched an app, the hardware and operating system would disappear and the app would take over.

Because of the way that iOS and the iPad were engineered, apps in the Apple ecosystem flourished. I haven't read this anywhere, but I'm sure that the iPad was designed to run touch-based versions of Apple's own iWork and iLife suites from the very beginning, and that these apps were being designed in parallel with the hardware and operating system. And I haven't read this anywhere either, but I'm willing to bet that Surface was designed first and only now is the Office team being asked to port Office to it.

Because apps are flourishing in the Apple ecosystem and are cited as the number one reason for choosing an iPad over much cheaper Android tablets, the Google fans who once derided the iPad for being just a giant iPod touch are now saying that tablets don't need apps... they are simply media consumption devices and you only need the built-in functionality. Tablets are valued among Android users almost exclusively because you have more screen real estate to see things. In Apple's vision, tablets are valued because you have more screen real estate to also do more things. This runs counter to Andy Rubin's vision for Android apps, which is that there should be no difference between phone and tablet apps and that all tablet apps should work equally well on a smaller screen.

If there is one tech company that would not simply take an OS designed for one form factor and shoehorn it into another form factor, it is Apple. That's not how they roll. When Apple ported Safari over from OS X to iOS, they didn't just tweak the UI to make things touch-friendly. The browser was a core component of the user experience and, originally, most apps were going to be delivered as web apps built on top of WebKit, so a ton of thought and engineering was put into it. Since most web pages were designed for and tested in desktop browsers, page elements that accepted mouse input were generally spaced for large screens. But on a small screen with imprecise touch input, many of those page elements were going to be too close together. Instead of porting Safari over to iOS and calling it a day, Apple engineered Safari so that it will try to guess which page element you are trying to touch when you tap close to multiple targets. This is useful for touch input, but it is also critical because of how Apple has implemented its tap-to-zoom feature. When the user double taps on the screen to zoom in, the browser zooms into a specific page element, such as a <div> tag. These page elements are often nested and were never designed to accept mouse input in the first place, so Apple knew that they needed to build some kind of logic into Safari so that this was a useful feature and not an exercise in frustration. And all of this was built into frameworks so that developers could use it with little to no effort.
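
To make the idea concrete, here is a rough sketch, in TypeScript, of the kind of tap-target disambiguation described above. It is not Apple's actual WebKit implementation; it just illustrates the heuristic: when a tap lands near several small targets, pick the closest interactive element within a "slop" radius instead of treating the finger like a precise mouse pointer. The radius and the definition of "interactive" below are assumptions made for the sake of the example.

```typescript
// Hypothetical sketch of tap-target disambiguation, NOT Apple's implementation.
const TAP_SLOP_PX = 22; // assumed radius of finger imprecision, in CSS pixels

function isInteractive(el: Element): boolean {
  // Treat links, buttons, and form controls as tappable targets.
  return el.matches("a[href], button, input, select, textarea, [onclick]");
}

function distanceToRect(x: number, y: number, r: DOMRect): number {
  // Distance from the tap point to the nearest edge of the element's box
  // (zero if the point is inside the box).
  const dx = Math.max(r.left - x, 0, x - r.right);
  const dy = Math.max(r.top - y, 0, y - r.bottom);
  return Math.hypot(dx, dy);
}

function bestTapTarget(x: number, y: number): Element | null {
  let best: Element | null = null;
  let bestDist = Infinity;
  for (const el of Array.from(document.querySelectorAll("*"))) {
    if (!isInteractive(el)) continue;
    const d = distanceToRect(x, y, el.getBoundingClientRect());
    if (d <= TAP_SLOP_PX && d < bestDist) {
      best = el;
      bestDist = d;
    }
  }
  return best; // null means the tap wasn't near anything tappable
}
```

Double-tap-to-zoom needs the same kind of judgment call: instead of zooming a fixed amount, the browser has to pick the enclosing block element that best matches what the user tapped on, which is why it feels so much more useful than a dumb fixed zoom.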

On Android, as far as I know, browsers still behave like desktop browsers, which make no attempt to interpret user intent when the user taps close to multiple targets. And tap-to-zoom zooms a fixed amount instead of zooming in on a specific page element. This means that Android users tap-to-zoom to sort of zoom in on the content they want, and then use pinch-to-zoom to adjust the view. Android users have been complaining about this for years on forums. Another thing that they have been complaining about is the inability to use pinch-to-zoom on emails displayed using HTML in Google's own Gmail app. This has been fixed recently, but it means that pinch-to-zoom was hacked together for the Android browser and not built into some kind of WebView framework for all to use. Does any of this matter? Well, I think it is kind of telling when you analyze the usage data for Android and iOS. Android and iOS users both surf the web about the same amount when using cellular data, but iOS users surf the web about an order of magnitude more when using wifi data. My interpretation of this data is that, when you have access to wifi, you often have access to a computer, and most Android users prefer to surf on a computer when presented with a choice, while most iOS users prefer to surf on an iOS device. That has to say something about how well the browsing experience has been optimized on mobile devices.

When people laughed at the iPad for just being a giant iPod touch, they may have really meant that no one would want a giant iPod touch for $499, but that it would be a great deal for $199. I think that is a perfectly logical argument to make. There really isn't a strong argument for choosing an iPad over an Android tablet unless you simply prefer being immersed in the Apple ecosystem or you want to do stuff (i.e., run apps) on your tablet. Just don't say that the iPad is a giant iPod touch. You're being lazy.

Wednesday, August 14, 2013

Sustainable Systems

I think a lot about sustainability, systems, and sustainable systems. Back in the late 1980s, Japan was surging and there was general consensus that it was about to overtake the U.S. as the world's economic superpower. The Japanese were buying up American landmarks right and left, and absolutely killing it in consumer electronics, cars, and many other industries. Where Japanese companies were nimble, innovative, and driven by excellence, competing American companies were slow, stupid, and lazy. So what happened?

Conventional wisdom says that irrational exuberance created a bubble in real estate and stock prices, and when the bubble burst, the Japanese economy stalled and became mired in either the Lost Decade, the Lost Two Decades, or the Lost Two Decades and counting (depending on who you ask). Bubbles happen, especially during periods of extreme growth, but that doesn't explain why the Japanese economy wasn't able to absorb the shock and then move on. After all, the declining American economy was able to recover relatively quickly from the dot-com bubble and may even recover from the recent housing bubble before the Japanese economy pulls out of its decades-long slump.

I was in college at the time and it seemed obvious to me that the Japanese economy would never be able to reach superpower status, at least not without some major restructuring. It was designed to be a fast follower and not a leader. It didn't have the legs needed to sustain its growth if or when it did reach number one status.

The Japanese economy is centrally managed; a few key decision-makers at the top pick winners and losers. This means that resources get funneled into a few strategic initiatives, enabling you to make rapid progress in those areas. This works great if you know what those strategic initiatives should be because someone else is blazing the trail ahead for you. But it is not a system that is going to be able to pick out the next big thing because the next big thing is almost always something that comes out of left field. Is it biotechnology? Green energy? A system that relies on one person to keep picking a string of winners is never going to last. One day it is going to make the wrong bet and get left behind.

By contrast, the U.S. economy is a complete mess. Everyone is doing their own thing. Entrepreneurs and venture capitalists are pursuing all kinds of harebrained schemes, most of which will end up failing. It makes me furious to see how many dumb ideas get funded. At the same time, it gives me hope that my own dumb idea might get funded some day and change the world. You can think of pathfinding in the U.S. economy as crowd-sourced, or like an ant colony where hundreds of scouts are sent out so that one can find the next sugar pile. It seems incredibly wasteful, but any sustainable economic system is going to need something like it.

Another place where I see sustainability problems is in democratic systems. This is on my mind because of the Arab Spring, but I first started thinking about it after the breakup of the Soviet Union. Democracies are great, but they are incredibly fragile, especially when first taking hold. There are a number of things that a democratic system needs to guard against, but the one that I want to highlight is the tyranny of the majority. How is it possible that the Egyptian constitution was ratified by a simple majority vote? Designing the system this way almost guaranteed that minority groups would be disenfranchised and, with absolutely no protections in place, either leave or attempt to overthrow the government. At a minimum, supermajorities should be needed to ratify or amend a constitution. If you want a system to last, you need to design it with sustainability in mind.

Tuesday, August 13, 2013

Cooking and the Tao of Dave

I got off the phone with my coach about two hours ago. She had asked me to start thinking about what I wanted to work on for our next project, so I had brainstormed a potential list to prepare for our conversation. Recently, we've been talking about putting systems in place for my life. At the top of my list was to get back to cooking.

Cooking has deep roots in my family. My dad started out as a waiter/bartender for one of the major restaurants in Chinatown. He became friends with the owners and they opened up a small chain of Chinese takeout restaurants together in the Boston area. At home, my mom cooked the family fourteen meals from scratch each week. No leftovers. No cans. The only frozen foods were peas and corn. One of my friends came over to my house one day and she was absolutely blown away by our fridge. It looked like a mini-farmers' market inside. She's still talking about it.

I jumped on the cooking bandwagon early, before it became a full-on craze. I loved cooking shows on PBS. I think Jacques Pepin was my favorite. Because the idea of cooking for my family was so intimidating to me, I kept a list of recipes in my head that I wanted to prepare some day. When I finally had a place of my own with a real kitchen, I started cooking like crazy. One of my favorite things was walking through the supermarket for inspiration while I planned out a menu for a dinner party.

Over the past few years, as my life has gotten busy with work, I've been cooking less and less. Even worse, I don't think that I've tried a new recipe in the past two years. I now visit my parents about once every two weeks. They cook lunch and send me home with frozen chicken breasts and fresh produce. My parents used to cook an amazing repertoire of dishes. As they have gotten older, that repertoire has narrowed down to almost nothing. They eat the same things every day, and every two weeks, they give me broccoli, snow peas, tomatoes, and sometimes bok choy.

Cooking this stuff has become joyless for me. I can tell that I'm mailing it in when I stir-fry chicken and broccoli and only use soy sauce for seasoning. I can't be bothered to throw in a little garlic or oyster sauce. Because of this lack of joy, I'm cooking less and less and buying less of my own fresh produce from the supermarket. When I brainstormed things to work on with my coach, the first thing on my list was to cook my chicken, broccoli, snow peas, and tomatoes more often. I needed to get through this stuff because my mom gave it to me (she won't take no for an answer) and I don't want it to go to waste. A distant second was getting back to trying new recipes.

Talking to my coach, it slowly dawned on me that I could take two different approaches to this cooking project. The first approach would be to put systems into place that would enable me to cook more frequently. As I was cooking more frequently, I would use up what I had brought home from my parents and buy more of my own stuff to cook, and it would free up energy to do more things like expand my own cooking repertoire. Essentially, getting things done would lead to joy. The other approach would be to go for joy first. I would put systems into place to make sure that I was being inspired in the supermarket and trying new recipes. This would put me into a happy, centered place and then I would start cooking more regularly. Joy would lead to getting things done.

The realization I had with my coach is that my default approach is to take care of business first and then, if I can squeeze it in, seek joy. But when it comes to cooking and how I see my life, joy should be my priority. It's what I really want for myself. I need to get in the habit of taking the path to joy because a joyful Dave is a fully actualized Dave who gets things done and kicks butt. While getting things done can lead to joy, it is a less direct path that can be hard to sustain before the joy kicks in. It is also me not listening to what I want and who I want to be... which is a very bad thing. I'm not saying that everyone should go for joy. If cooking were not a joyful thing for me, then I should, at a minimum, get down to business and eat healthier. And if you have other priorities in your life (respect, power, challenge, peace, security, tranquility), then you should go for those. Joy happens to be my thing, along with being in the moment.

As the Fake Steve Jobs would have once said, Namaste. :)

Tuesday, August 6, 2013

Sustainability

One of the exercises that I'm doing for my coach is keeping track of my core values. A core value is something that runs through your life like a thread, showing up consistently in all kinds of ways. Even though you may not be aware of your core values, they should jump out at you in hindsight. One of my core values is education. It never occurred to me that I might want to be a teacher for twenty-one years until I was a grad student in chemical engineering at Berkeley. But once I did consider it, I could trace how important education had always been to me since I was a kid. The idea of becoming a teacher suddenly seemed like an obvious career choice.

Another is sustainability. When I bought my condo, it was the first place that really felt like mine and I needed to know that I could take care of it. For the first year or two, whenever I cleaned or repaired something, I would note that I could restore whatever I was cleaning or repairing to the same condition it had been in when I moved in. This felt really important to me and it brought a lot of pleasure and satisfaction every time I could sustain my own home.

And now that I've noticed it, I can't help but see how often sustainability shows up as a core value for me. If you read my last blog post on Bill Parcells and Steve Jobs, you'll know that a big part of my admiration for them was their ability to imprint their own unique DNA on any organization they led. Because if you can't reproduce a result, can you really sustain it? I also admired Parcells because his organizations were able to produce players and coaches who were then able to go and imprint that DNA on their own organizations; and I admired Jobs because he wanted to build organizations that would have the capacity to sustain themselves long after he was gone.

I've changed jobs many times in my career, and each time it was out of a desire to test a different hypothesis. I left Brookline after I had proven that I could motivate students and help them perform at levels far beyond what their academic records, or even self-beliefs, indicated. What I didn't know was whether these students had established a new normal. Most students don't push themselves very hard in school, so it is entirely possible that I had encouraged them to push themselves in my classroom, but that they would retreat again in the face of less conducive learning environments. I didn't want them to retreat. I wanted them to sustain this new way of being with new skills, attitudes, and habits of mind. I wanted students to say, "No! This is who and how I am, and I'm not going to compromise that just because it's hard." It's why I ended up taking a job at the Jewish Community Day School, where I could work with kids for three years.

Once I had proven that I could help kids establish new normals in my classroom, I wanted to know if I could help organizations, an entire school or district, do the same. This meant helping teachers establish new normals so that they could help kids establish new normals. And it never got this far, but the next step would have been seeing if any of those teachers or kids could do the same thing in a different school or district on their own.

Business people like to talk about disruption. For me, disruption establishes a new normal. When Apple introduced Mac OS, it didn't disrupt the personal computer market because it was popular or sold a lot. The pet rock was popular and sold a lot. It was disruptive because it established the GUI as the new normal. If Apple had introduced Mac OS in 1984 and then gone out of business in 1985, the GUI still would have taken over as the new paradigm. The genie was out of the bottle and there was no going back.

At a certain level, sustainability is important to me because I want to create new normals. I don't want to accept how the world is; I want to know that I can change the world to the way it should be. And I want to know that everyone else can do the same. This means creating things that can sustain themselves and don't go away when you stop driving them. But I think that my quest for sustainability is independent of my quest to shape the world around me. That is, unless you can explain why else I'm obsessed with getting the grout in my bathroom back to pristine condition.

Friday, August 2, 2013

Bill Parcells and Steve Jobs

It was a bit of a shock when I realized that I admired Bill Parcells. It was even more of a shock when I realized that Bill Parcells was the first and (at the time) only person I admired. While I also respected historical figures like Abraham Lincoln, those feelings were based on intellectual calculation... an evaluation of their accomplishments and sacrifices made. My feelings for Bill Parcells ran much deeper than that. They were visceral. I didn't expect or want to have them; they were just there — something for me to sort out and deal with later.

The year was 1994 and Parcells was in the process of rebuilding my beloved Patriots and guiding them to a Super Bowl. What I remember most was the aura he had about him. It felt like he had a system for turning around franchises and getting the most out of his players. There was the sense that, if you had a young team with tons of potential, you should bring in some Parcells guys for veteran leadership. They would instill a team ethic in the locker room and mentor the younger players on the field. Parcells guys were smart and tough, and knew how to win. Not everyone wanted to (or could) play for Parcells, but those that did became better players and developed an intentionality that they could then pass on to others.

It is amazing how many players and assistant coaches mentored by Parcells have turned around and become mentors themselves. There are so many that Bill Parcells is considered to be at the root of his own coaching tree. It was also significant to me that Parcells turned around three consecutive franchises: the Giants, Patriots, and Jets. He didn't stumble on the right chemistry; he brought it with him wherever he went and knew the recipe.

Although I knew Steve Jobs before I knew Bill Parcells, my admiration for Jobs came later, after he founded NeXT Computer and bought Pixar. Like Parcells, Jobs had an intentionality about him. Jobs developed his during his exile from Apple. He learned how to instill a culture of innovation and risk-taking and get the most out of people. I've never worked for Google or Apple, but I imagine that people go to Google to do their best work while people at Apple come to discover what their best really is.

Jobs hired Joel Podolny to develop Apple University to help sustain the culture at Apple. It is far too early to know how effective this initiative will be, but executives leaving Apple have not been as successful as the Parcells guys have been. Jon Rubinstein failed at Palm and Ron Johnson failed at JCPenney; Tony Fadell seems to be doing okay at Nest Labs.

However, where Jobs has an edge over Parcells is his ability to execute on a longer-term vision. No, Jobs did not invent the GUI. But he was able to recognize its potential and the future of computing when he saw it at Xerox PARC, something that hundreds of other people, including the executives at Xerox, failed to do. After Apple released the Macintosh in 1984, it took Microsoft eight years to develop Windows 3.1 (the first truly usable version of Windows) and another three years to develop Windows 95 (the first mainstream version of Windows). And keep in mind that Microsoft had early access to Mac OS as they were developing Word and Excel for the Macintosh. This means that it took Microsoft over a decade to develop a polished GUI when they had Mac OS to use as a blueprint and knew that the market existed. Can you imagine the balls it took to spend years developing a polished GUI from a bunch of tech demos when virtually no one believed in it? And then Apple did it again 23 years later with iOS and the iPhone/iPad.